Not Applicable.
The inventive concepts disclosed and claimed herein relate generally to digital image processing and, more particularly, but not by way of limitation, to finding and selecting points of interest on an edge of interest in a digital image.
In the remote sensing/aerial imaging industry, imagery is used to capture views of a geographic area so that objects and structures within the images can be measured and the geographic locations of points within the images can be determined. Such imagery is described in, for example, U.S. Patent Application Publication No. 2008/0231700. Photogrammetry is the science of making measurements of and between objects depicted within photographs, especially aerial photographs. A person, or User, may interact with the image to select Points of Interest that may be used to perform other functions, such as determining the distance between two points, determining the area outlined by a set of points, or determining the volume of a 3-dimensional shape. Usually, the greatest error introduced into the calculation of the function is the human error introduced by the User selecting the Points of Interest. For example, if a measurement from a building to the edge of a curb is desired and the User selects a point that is close to the edge of the building, but not the actual edge of the building, then the measurement will differ from the actual measurement by the distance the User selected away from the building. In the projected image, an offset of even a few pixels can significantly impair the accuracy of the measurement. Additionally, when multiple Users perform the same measurement, their results can differ significantly.
In light of the foregoing, there is a need for a system and process for allowing a User to select Points of Interest based on the edge elements that exist in a geo-referenced image, wherein the Points of Interest are determined without the error caused by human determination of the position of the edge of an element in an image.
A method for creating image products includes the following steps. Image data and positional data corresponding to the image data are captured and processed to create geo-referenced images. Edge detection procedures are performed on the geo-referenced images to identify edges and produce geo-referenced, edge-detected images.
In one embodiment, a computerized system includes a computer system for storing a database of captured oblique images with corresponding geo-location data and corresponding detected edge data. The computer system has computer executable logic that, when executed by a processor, causes the computer system to receive a selection of a geographic point from a User, search the database to find images that contain the selected point, and make the images that contain the selected point available to the User.
In another embodiment, a method of providing images to a User includes the following steps. A database stores captured oblique images having corresponding geo-location data and corresponding detected edge data. A selection of a geographic point is received from a User, and the database is then searched to find images that contain the selected geographic point. The images that contain the selected geographic point are then made available to the User.
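Purely by way of illustration, the following Python fragment sketches one plausible form of such a point-based search. The record layout (ImageRecord), the in-memory list standing in for the database, and the function names are assumptions, not part of the disclosure; an image "contains" a geographic point when the point falls inside the image's geo-referenced ground footprint, and a production system would more likely query a spatial index than scan every record.

```python
# Hypothetical sketch of the point-based image lookup described above.
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]  # (longitude, latitude)

@dataclass
class ImageRecord:
    image_id: str
    footprint: List[Point]  # geo-referenced ground footprint polygon

def point_in_polygon(pt: Point, poly: List[Point]) -> bool:
    """Ray-casting test: is the geographic point inside the footprint?"""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

def find_images(db: List[ImageRecord], pt: Point) -> List[ImageRecord]:
    """Return every image whose footprint contains the selected point."""
    return [rec for rec in db if point_in_polygon(pt, rec.footprint)]
```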
In yet another embodiment, a sequence of instructions is stored on at least one non-transitory computer readable medium for running on a computer system capable of displaying and navigating imagery. The sequence of instructions includes instructions for causing the computer system to display a pixel representation of a geo-referenced, edge-detected image, wherein the pixel representation includes one or more detected edges in the geo-referenced, edge-detected image; instructions for causing the computer system to allow the User to select one of the one or more detected edges by moving a cursor over a region of interest (which may be magnified), wherein the cursor is caused to snap-to a selected detected edge when the cursor is within a predetermined distance from the selected detected edge; instructions for causing the computer system to allow the User to accept the selected detected edge as an edge of interest; and instructions for causing the computer system to allow the User to determine and store one or more points of interest along the edge of interest.
In yet another embodiment, a system for preparing and utilizing geo-referenced images includes one or more image and data files accessible by a computer system capable of displaying and navigating digital imagery, the image and data files including a plurality of image files, detected edge information corresponding to the plurality of image files, and positional data corresponding to the plurality of image files; and image display and analysis software stored on a non-transitory computer readable medium and executable by the computer system. The image display and analysis software causes the computer system to allow a user to download and display, from the image and data files, a pixel representation of an image having a plurality of detected edges within the image, and to select a detected edge within the pixel representation by moving a cursor over the pixel representation, wherein the cursor is caused to snap-to a selected detected edge when the cursor is within a predetermined distance from the selected detected edge. The image display and analysis software also causes the computer system to allow the user to accept the selected detected edge as an edge of interest, and to allow the user to determine and store one or more points of interest along the edge of interest.
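The snap-to behavior recited above reduces, in essence, to a nearest-edge-pixel test against a distance threshold. The fragment below is a minimal sketch of that test, assuming the detected edges are available as pixel coordinates; the name snap_radius stands in for the "predetermined distance" and, like the function name, is an illustrative assumption.

```python
import math
from typing import List, Tuple

Pixel = Tuple[float, float]

def snap_cursor(cursor: Pixel, edge_pixels: List[Pixel],
                snap_radius: float = 5.0) -> Pixel:
    """Return the nearest detected-edge pixel when the cursor is within
    snap_radius (the 'predetermined distance') of it; otherwise return
    the cursor position unchanged so the cursor moves freely."""
    best, best_d = None, snap_radius
    for ex, ey in edge_pixels:
        d = math.hypot(ex - cursor[0], ey - cursor[1])
        if d <= best_d:
            best, best_d = (ex, ey), d
    return best if best is not None else cursor
```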
Thus, utilizing (1) the technology known in the art; (2) the above-referenced general description of the presently claimed and disclosed inventive concept(s); and (3) the drawings and detailed description of the inventive concepts that follows, the advantages and novelties of the presently claimed and disclosed inventive concept(s) are readily apparent to one of ordinary skill in the art.
Like reference numerals in the figures represent and refer to the same element or function. Implementations of the disclosure may be better understood when consideration is given to the following detailed description thereof. Such description makes reference to the annexed pictorial illustrations, schematics, graphs, drawings, and appendices. In the drawings:
Before explaining at least one embodiment of the inventive concept(s) disclosed herein in detail, it is to be understood that the inventive concept(s) is not limited in its application to the details of construction and the arrangement of the components or steps or methodologies set forth in the following description or illustrated in the drawings. The inventive concept(s) disclosed herein is capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting.
In the following detailed description of embodiments of the disclosure, numerous specific details are set forth in order to provide a more thorough understanding of the disclosure. However, it will be apparent to one of ordinary skill in the art that the concepts within the disclosure can be practiced without these specific details. In other instances, well-known features have not been described in detail to avoid unnecessarily complicating the description.
The inventive concept(s) disclosed herein is directed to methods and systems of creating image products, wherein image data is captured along with positional data corresponding to the image data. The image and positional data are processed to create a plurality of geo-referenced images. In one embodiment disclosed herein, edge detection procedures are performed on the plurality of geo-referenced images to identify edges and produce geo-referenced, edge-detected images, which are saved in a database. A computer system storing such a database of captured oblique images having corresponding geo-location data and corresponding detected edge data has computer executable logic that, when executed by a processor, causes the computer system to receive a selection of a geographic point from a user, search the database to find images that contain the selected point, and make the images that contain the selected point available to the user. Alternatively, the database may store captured oblique images without the detected edge data, and the edges within such images can be detected in real-time as the user is viewing the image(s).
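The disclosure does not tie the edge detection procedure to any particular operator. As a minimal sketch only, a Sobel gradient-magnitude detector of the following form could produce such detected-edge data; the function name and threshold are illustrative assumptions, and many other detectors (e.g., Canny) would serve equally well.

```python
import numpy as np

def detect_edges(gray: np.ndarray, threshold: float = 0.25) -> np.ndarray:
    """Boolean edge map from a grayscale image (float values in [0, 1]),
    via Sobel gradient magnitude -- one of many possible edge detectors."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(gray, 1, mode="edge")
    h, w = gray.shape
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(win * kx)
            gy[i, j] = np.sum(win * ky)
    mag = np.hypot(gx, gy)
    mag = mag / (mag.max() + 1e-12)  # normalize to [0, 1]
    return mag > threshold
```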
In one embodiment, a sequence of instructions is stored on at least one non-transitory computer readable medium for running on a computer system capable of displaying and navigating imagery. Referring now to the drawings, and in particular to
Referring now to
The image display and analysis software can be stored on a non-transitory computer readable medium and/or executed by the first and/or the second computer systems 21 and 22. Software packages providing instructions for displaying a pixel representation of an image are available commercially and are well known to those skilled in the art. The image display and analysis software can include methods for utilizing oblique images such as those described in U.S. Pat. No. 7,787,659, the content of which is incorporated herein by reference.
The computer readable mediums 30a and/or 30b include, for example, non-volatile read only memory, random access memory, hard disk memory, removable memory cards and/or other suitable memory storage devices and/or media. Input devices 26a and 26b, such as, for example, a mouse, keyboard, joystick, or other such input device, enable the input of data and interaction of a User with the image display and analysis software being executed by the first and/or the second computer systems 21 and 22. In one embodiment, the first computer system 21 executes the image display and analysis software, and the second computer system 22 executes a communication software package, such as a web browser, to communicate with the first computer system 21 and display the geo-referenced images. An output device such as the display device 28a can include, for example, a liquid crystal display or cathode ray tube, and displays information to the User of the second computer system 22. Additional output devices 28b can include a second display device, a printer, a speaker, etc. Communication connections 24 connect the second computer system 22 to a network 32, such as, for example, a local-area network, a wide-area network, the Internet and/or the World Wide Web, for establishing communication with the first computer system 21.
The second computer system 22 typically uses a computer monitor as a display device 28a. These display devices often cannot display an entire image with the necessary detail. This may be due to the resolution of the information in the image and the resolution and size of the display surface. When a full image is not displayable on the monitor in its entirety, the displayable image which is substituted for the full image is often a global image, i.e., the full image with resolution removed so that the entire image fits onto the display surface of the display device. Referring now to
The detail image, or magnified region of interest MROI 54, shows the details of the region of interest ROI 50, but when shown alone the global context of the details is lost. Thus, in one embodiment, instructions and algorithms known to those skilled in the art are utilized for magnifying a user-requested region of interest (ROI) 50 from the image 52 and displaying the full image, or a portion thereof, along with a subsection of the display screen showing a linear magnification of the user-requested ROI 50. This allows for magnification of a particular ROI in an image while preserving visibility of the larger image as shown in
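As a minimal sketch of such a linear magnification (the function name, coordinates, and integer zoom factor are assumptions), the region of interest can simply be cropped and enlarged by pixel replication for display in a subsection of the screen:

```python
import numpy as np

def magnify_roi(image: np.ndarray, x0: int, y0: int,
                w: int, h: int, factor: int = 4) -> np.ndarray:
    """Crop the region of interest and enlarge it by an integer factor
    using nearest-neighbor replication (a linear magnification)."""
    roi = image[y0:y0 + h, x0:x0 + w]
    return np.repeat(np.repeat(roi, factor, axis=0), factor, axis=1)
```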
In some applications, a non-linear magnification of the user-selected ROI 50 may be provided so that the connection between the detail image and the immediately surrounding portion of the global image is not obscured. Such methods are also known to those skilled in the art. While non-linear magnification creates a “lens” like distortion to the original image, it still provides increased detail for the ROI 50 while maintaining the connection to the immediately surrounding portion of the global image.
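One common way to obtain such a "lens" effect, shown here only as an illustrative sketch (the radius and strength parameters are assumptions), is a radial remapping that magnifies strongly at the center of the lens and falls off to the identity at its rim, so the magnified region blends into the surrounding global image:

```python
import numpy as np

def lens_magnify(image: np.ndarray, cx: int, cy: int,
                 radius: int = 80, strength: float = 2.0) -> np.ndarray:
    """Fisheye-style non-linear magnification centered on (cx, cy):
    detail grows toward the center while the rim matches the original
    image, preserving the connection to the surrounding global view."""
    out = image.copy()
    h, w = image.shape[:2]
    for y in range(max(0, cy - radius), min(h, cy + radius)):
        for x in range(max(0, cx - radius), min(w, cx + radius)):
            dx, dy = x - cx, y - cy
            r = float(np.hypot(dx, dy))
            if 0.0 < r < radius:
                k = (r / radius) ** strength  # compress sampling inward
                out[y, x] = image[int(cy + dy * k), int(cx + dx * k)]
    return out
```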
The User typically interprets the image and decides which features are to be measured. By positioning, for example, a mouse cursor, the User can point out the approximate location of the features to an algorithm. Semi-automatic feature extraction is known to those skilled in the art and is used for measuring height points as well as for measuring specific object corners. A User positions the cursor at some position in the image 52 and the cursor snaps-to the desired surface or object corner. However, the image around the selected point will usually contain gray value gradients caused by the object edges. Thus, the snap-to feature may be limited by the size and quality of the pixels.
The extraction of lines from digital images has also been researched for many years. Semi-automatic algorithms have been developed, for example, for the extraction of roads. Most algorithms of this kind are based on so-called “snakes”. Extraction of objects like house roofs can be improved by algorithms that extract areas of homogeneous gray values. The algorithms used to find the boundaries of a homogeneous area are usually based on a “region growing algorithm”. A common interactive approach is to allow the User to select an appropriate object model and approximately align the object model with the image. A fitting algorithm can then be used to find the best correspondence between the edges of the object model and the locations of high gradients in the image.
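A minimal sketch of the “region growing” idea mentioned above (the seed point, tolerance, and 4-connectivity are illustrative choices, not the disclosure's method) accepts neighboring pixels while their gray values stay close to the seed's:

```python
from collections import deque
from typing import Tuple
import numpy as np

def region_grow(gray: np.ndarray, seed: Tuple[int, int],
                tol: float = 0.05) -> np.ndarray:
    """Grow a region of homogeneous gray values outward from a seed pixel;
    4-connected neighbors join while within tol of the seed's value."""
    h, w = gray.shape
    mask = np.zeros((h, w), dtype=bool)
    seed_val = gray[seed]
    queue = deque([seed])
    mask[seed] = True
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(gray[ny, nx] - seed_val) <= tol):
                mask[ny, nx] = True
                queue.append((ny, nx))
    return mask
```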
In an embodiment of the present disclosure, the sequence of instructions stored on at least one computer readable medium 30a, 30b and/or computer readable medium 55 of the first computer system 21 for running on the first and/or second computer systems 21 and 22 capable of displaying and navigating digital imagery includes instructions for performing edge detection procedures on the user-requested region of interest and/or on the entire pixel representation of the image. Referring now to
Referring now to
In an embodiment, instructions are provided at a step 108 to hold the cursor at the detected edge DE 56 until the User indicates acceptance or rejection of the detected edge DE 56 as an edge of interest EOI 110 at a step 112. Upon rejection, the cursor is allowed to move freely at a step 114 and the process 98 branches to the step 100 until the cursor position is again detected within a predetermined distance of the same or another detected edge DE 56, at which point the cursor will again snap-to the detected edge DE 56. Upon acceptance of the detected edge DE 56 as an edge of interest EOI 110, the process 98 includes instructions at a step 116 to indicate, mark and store the accepted edge of interest EOI 110 on the image 52 as shown, for example, in
Referring now to
The User is then allowed to move the cursor in a step 128 in order to select additional points of interest POI 120. The process 118 queries the User at a step 130 as to whether all of the points of interest POI 120 have been selected. The User can indicate this by any means of communication with the second computer system 22, such as, for example, by selecting “All Done” or “Continue” from an opened dialog box. If additional points of interest POI 120 are desired, the process 118 branches back to step 100 of the process 98, and additional points of interest POI 120 can be identified using the procedure described above until the User has selected all of the points of interest POI 120 necessary to perform a desired function. If, in fact, the User has selected all of the necessary points of interest POI 120, the User may select a function to be performed or exit the magnified region of interest MROI 54, as in a step 132.
Often, the User takes measurements of and between objects depicted in the image 52 by selecting one of several available measuring modes provided within the image display and analysis software. The User selects the desired measurement mode by accessing, for example, a series of pull-down menus or toolbars or via keyboard commands. The measuring modes provided by image display and analysis software may include, for example, a distance mode that enables measurement of the distance between two or more selected points, an area mode that enables measurement of the area encompassed by several selected and interconnected points, a height mode that enables measurement of the height between two or more selected points, and an elevation mode that enables the measurement of the change in elevation of one selected point relative to one or more other selected points.
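The distance and area modes, for instance, reduce to standard geodesy once the selected points of interest carry geographic coordinates. The sketch below is illustrative only (the function names and the spherical-Earth simplification are assumptions): a haversine ground distance from latitude/longitude pairs, and a shoelace area for points already projected to planar meters.

```python
import math
from typing import List, Tuple

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius; adequate at short range

def distance_m(p1: Tuple[float, float], p2: Tuple[float, float]) -> float:
    """Haversine ground distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def polygon_area_m2(points_xy: List[Tuple[float, float]]) -> float:
    """Shoelace area for points already projected to planar meters."""
    n = len(points_xy)
    s = sum(points_xy[i][0] * points_xy[(i + 1) % n][1]
            - points_xy[(i + 1) % n][0] * points_xy[i][1]
            for i in range(n))
    return abs(s) / 2.0
```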
After selecting the desired measurement mode, the User selects a starting point of interest POI 120 and an ending point of interest POI 120′ on the image 52, and image display and analysis software automatically calculates and displays the quantity sought. The accuracy, precision and consistency of these measurements depend in great part on the procedure used to select the points of interest POI 120. By combining edge detection procedures with a snap-to function in a magnified region of interest, edges of interest can be identified and individual points of interest along those edges can be selected in a much more accurate and consistent manner than is currently available.
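Tying the earlier sketches together, again purely as an illustrative assumption (reusing the hypothetical detect_edges and snap_cursor functions sketched above), the combined selection step might look like:

```python
from typing import Tuple
import numpy as np

def select_point_of_interest(gray_roi: np.ndarray,
                             cursor: Tuple[float, float],
                             snap_radius: float = 5.0) -> Tuple[float, float]:
    """Detect edges in the magnified ROI, then snap the User's cursor
    to the nearest detected-edge pixel within the snap radius."""
    edge_map = detect_edges(gray_roi)              # sketch shown earlier
    ys, xs = np.nonzero(edge_map)
    edge_pixels = list(zip(xs.tolist(), ys.tolist()))
    return snap_cursor(cursor, edge_pixels, snap_radius)  # sketch shown earlier
```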
In one embodiment disclosed herein, edge detection procedures are performed on the plurality of geo-referenced images to identify edges and produce geo-referenced, edge-detected images, which are saved in a database on one or more of the computer readable mediums 30a, 30b and 55. The first and/or the second computer system 21 and 22 storing such a database of captured oblique images having corresponding geo-location data and corresponding detected edge data has computer executable logic that, when executed by a processor, causes the computer systems 21 and/or 22 to receive a selection of a geographic point from a user, search the database to find images that contain the selected point, and make the images that contain the selected point available to the user. Alternatively, the database may store captured oblique images without the detected edge data, and the edges within such images can be detected in real-time as the user is viewing the image(s) and/or when the MROI 54 is showing details of a region of interest 50.
As will be appreciated by persons of ordinary skill in the art, changes may be made in the construction and the operation of the various components, elements and assemblies described herein, or in the steps or the sequence of steps of the methods described herein, without departing from the spirit and scope of the inventive concept(s) disclosed herein.
From the above description, it is clear that the inventive concept(s) disclosed herein is well adapted to carry out the objects and to attain the advantages mentioned herein as well as those inherent in the inventive concept(s) disclosed herein. While presently preferred embodiments of the inventive concept(s) disclosed herein have been described for purposes of this disclosure, it will be understood that numerous changes may be made which will readily suggest themselves to those skilled in the art and which are accomplished within the spirit of the inventive concept(s) disclosed and claimed herein.
The present patent application is a continuation of U.S. Ser. No. 14/341,213, filed Jul. 25, 2014, which is a continuation of U.S. Ser. No. 12/972,088, filed Dec. 17, 2010, now U.S. Pat. No. 8,823,732, the entire contents of all of which are hereby incorporated herein by reference in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
2273876 | Lutz et al. | Feb 1942 | A |
3153784 | Petrides et al. | Oct 1964 | A |
3594556 | Edwards | Jul 1971 | A |
3614410 | Bailey | Oct 1971 | A |
3621326 | Hobrough | Nov 1971 | A |
3661061 | Tokarz | May 1972 | A |
3716669 | Watanabe et al. | Feb 1973 | A |
3725563 | Woycechowsky | Apr 1973 | A |
3864513 | Halajian et al. | Feb 1975 | A |
3866602 | Furihata | Feb 1975 | A |
3877799 | O'Donnell | Apr 1975 | A |
4015080 | Moore-Searson | Mar 1977 | A |
4044879 | Stahl | Aug 1977 | A |
4184711 | Wakimoto | Jan 1980 | A |
4240108 | Levy | Dec 1980 | A |
4281354 | Conte | Jul 1981 | A |
4344683 | Stemme | Aug 1982 | A |
4360876 | Girault et al. | Nov 1982 | A |
4382678 | Thompson et al. | May 1983 | A |
4387056 | Stowe | Jun 1983 | A |
4396942 | Gates | Aug 1983 | A |
4463380 | Hooks | Jul 1984 | A |
4489322 | Zulch et al. | Dec 1984 | A |
4490742 | Wurtzinger | Dec 1984 | A |
4491399 | Bell | Jan 1985 | A |
4495500 | Vickers | Jan 1985 | A |
4527055 | Harkless et al. | Jul 1985 | A |
4543603 | Laures | Sep 1985 | A |
4586138 | Mullenhoff et al. | Apr 1986 | A |
4635136 | Ciampa et al. | Jan 1987 | A |
4653136 | Denison | Mar 1987 | A |
4653316 | Fukuhara | Mar 1987 | A |
4673988 | Jansson et al. | Jun 1987 | A |
4686474 | Olsen et al. | Aug 1987 | A |
4688092 | Kamel et al. | Aug 1987 | A |
4689748 | Hofmann | Aug 1987 | A |
4707698 | Constant et al. | Nov 1987 | A |
4758850 | Archdale et al. | Jul 1988 | A |
4805033 | Nishikawa | Feb 1989 | A |
4807024 | Mclaurin et al. | Feb 1989 | A |
4814711 | Olsen et al. | Mar 1989 | A |
4814896 | Heitzman et al. | Mar 1989 | A |
4843463 | Michetti | Jun 1989 | A |
4899296 | Khattak | Feb 1990 | A |
4906198 | Cosimano et al. | Mar 1990 | A |
4953227 | Katsuma et al. | Aug 1990 | A |
4956872 | Kimura | Sep 1990 | A |
5034812 | Rawlings | Jul 1991 | A |
5086314 | Aoki et al. | Feb 1992 | A |
5121222 | Endoh et al. | Jun 1992 | A |
5138444 | Hiramatsu | Aug 1992 | A |
5155597 | Lareau et al. | Oct 1992 | A |
5164825 | Kobayashi et al. | Nov 1992 | A |
5166789 | Myrick | Nov 1992 | A |
5191174 | Chang et al. | Mar 1993 | A |
5200793 | Ulich et al. | Apr 1993 | A |
5210586 | Grage et al. | May 1993 | A |
5231435 | Blakely | Jul 1993 | A |
5247356 | Ciampa | Sep 1993 | A |
5251037 | Busenberg | Oct 1993 | A |
5265173 | Griffin et al. | Nov 1993 | A |
5267042 | Tsuchiya et al. | Nov 1993 | A |
5270756 | Busenberg | Dec 1993 | A |
5296884 | Honda et al. | Mar 1994 | A |
5335072 | Tanaka et al. | Aug 1994 | A |
5342999 | Frei et al. | Aug 1994 | A |
5345086 | Bertram | Sep 1994 | A |
5353055 | Hiramatsu | Oct 1994 | A |
5369443 | Woodham | Nov 1994 | A |
5381338 | Wysocki | Jan 1995 | A |
5402170 | Parulski et al. | Mar 1995 | A |
5414462 | Veatch | May 1995 | A |
5467271 | Abel et al. | Nov 1995 | A |
5481479 | Wight et al. | Jan 1996 | A |
5486948 | Imai et al. | Jan 1996 | A |
5506644 | Suzuki et al. | Apr 1996 | A |
5508736 | Cooper | Apr 1996 | A |
5555018 | von Braun | Sep 1996 | A |
5604534 | Hedges et al. | Feb 1997 | A |
5617224 | Ichikawa et al. | Apr 1997 | A |
5633946 | Lachinski et al. | May 1997 | A |
5668593 | Lareau et al. | Sep 1997 | A |
5677515 | Selk et al. | Oct 1997 | A |
5798786 | Lareau et al. | Aug 1998 | A |
5835133 | Moreton et al. | Nov 1998 | A |
5841574 | Willey | Nov 1998 | A |
5844602 | Lareau et al. | Dec 1998 | A |
5852753 | Lo et al. | Dec 1998 | A |
5894323 | Kain et al. | Apr 1999 | A |
5899945 | Baylocq et al. | May 1999 | A |
5963664 | Kumar et al. | Oct 1999 | A |
6037945 | Loveland | Mar 2000 | A |
6088055 | Lareau et al. | Jul 2000 | A |
6094215 | Sundahl et al. | Jul 2000 | A |
6097854 | Szeliski et al. | Aug 2000 | A |
6108032 | Hoagland | Aug 2000 | A |
6130705 | Lareau et al. | Oct 2000 | A |
6157747 | Szeliski et al. | Dec 2000 | A |
6167300 | Cherepenin et al. | Dec 2000 | A |
6222583 | Matsumura et al. | Apr 2001 | B1 |
6236886 | Cherepenin et al. | May 2001 | B1 |
6256057 | Mathews et al. | Jul 2001 | B1 |
6373522 | Mathews et al. | Apr 2002 | B2 |
6421610 | Carroll et al. | Jul 2002 | B1 |
6434280 | Peleg et al. | Aug 2002 | B1 |
6448964 | Isaacs et al. | Sep 2002 | B1 |
6597818 | Kumar et al. | Jul 2003 | B2 |
6639596 | Shum et al. | Oct 2003 | B1 |
6711475 | Murphy | Mar 2004 | B2 |
6731329 | Feist et al. | May 2004 | B1 |
6747686 | Bennett | Jun 2004 | B1 |
6810383 | Loveland | Oct 2004 | B1 |
6816819 | Loveland | Nov 2004 | B1 |
6826539 | Loveland | Nov 2004 | B2 |
6829584 | Loveland | Dec 2004 | B2 |
6834128 | Altunbasak et al. | Dec 2004 | B1 |
6876763 | Sorek et al. | Apr 2005 | B2 |
7009638 | Gruber et al. | Mar 2006 | B2 |
7018050 | Ulichney et al. | Mar 2006 | B2 |
7046401 | Dufaux et al. | May 2006 | B2 |
7061650 | Walmsley et al. | Jun 2006 | B2 |
7065260 | Zhang et al. | Jun 2006 | B2 |
7123382 | Walmsley et al. | Oct 2006 | B2 |
7127348 | Smitherman et al. | Oct 2006 | B2 |
7133551 | Chen | Nov 2006 | B2 |
7142984 | Rahmes et al. | Nov 2006 | B2 |
7184072 | Loewen et al. | Feb 2007 | B1 |
7233691 | Setterholm | Jun 2007 | B2 |
7262790 | Bakewell | Aug 2007 | B2 |
7348895 | Lagassey | Mar 2008 | B2 |
7509241 | Guo | Mar 2009 | B2 |
7728833 | Verma | Jun 2010 | B2 |
7832267 | Woro | Nov 2010 | B2 |
7844499 | Yahiro | Nov 2010 | B2 |
8031947 | Jacobsen | Oct 2011 | B2 |
8078396 | Meadow | Dec 2011 | B2 |
8103445 | Smith et al. | Jan 2012 | B2 |
8150617 | Manber et al. | Apr 2012 | B2 |
8422825 | Neophytou et al. | Apr 2013 | B1 |
8452125 | Schultz et al. | May 2013 | B2 |
8531472 | Freund et al. | Sep 2013 | B2 |
8705843 | Lieckfeldt | Apr 2014 | B2 |
20020041328 | LeCompte et al. | Apr 2002 | A1 |
20020041717 | Murata et al. | Apr 2002 | A1 |
20020114536 | Xiong et al. | Aug 2002 | A1 |
20030014224 | Guo et al. | Jan 2003 | A1 |
20030043824 | Remboski et al. | Mar 2003 | A1 |
20030088362 | Melero et al. | May 2003 | A1 |
20030103682 | Blake et al. | Jun 2003 | A1 |
20030164962 | Nims et al. | Sep 2003 | A1 |
20030214585 | Bakewell | Nov 2003 | A1 |
20040105090 | Schultz et al. | Jun 2004 | A1 |
20040167709 | Smitherman et al. | Aug 2004 | A1 |
20050073241 | Yamauchi et al. | Apr 2005 | A1 |
20050088251 | Matsumoto | Apr 2005 | A1 |
20050169521 | Hel-Or | Aug 2005 | A1 |
20060028550 | Palmer et al. | Feb 2006 | A1 |
20060061566 | Verma et al. | Mar 2006 | A1 |
20060089792 | Manber et al. | Apr 2006 | A1 |
20060092043 | Lagassey | May 2006 | A1 |
20060238383 | Kimchi et al. | Oct 2006 | A1 |
20060250515 | Koseki et al. | Nov 2006 | A1 |
20070024612 | Balfour | Feb 2007 | A1 |
20070046448 | Smitherman | Mar 2007 | A1 |
20070237420 | Steedly et al. | Oct 2007 | A1 |
20080120031 | Rosenfeld et al. | May 2008 | A1 |
20080123994 | Schultz et al. | May 2008 | A1 |
20080158256 | Russell et al. | Jul 2008 | A1 |
20080247663 | Jacobsen | Oct 2008 | A1 |
20090177458 | Hochart et al. | Jul 2009 | A1 |
20090202101 | Nielsen et al. | Aug 2009 | A1 |
20090208095 | Zebedin | Aug 2009 | A1 |
20090249257 | Bove | Oct 2009 | A1 |
20090271719 | Clare et al. | Oct 2009 | A1 |
20090300553 | Pettigrew et al. | Dec 2009 | A1 |
20090304227 | Kennedy et al. | Dec 2009 | A1 |
20100085350 | Mishra | Apr 2010 | A1 |
20100205575 | Arora et al. | Aug 2010 | A1 |
20100205576 | Majumder et al. | Aug 2010 | A1 |
20100278424 | Warner | Nov 2010 | A1 |
20100296693 | Thornberry et al. | Nov 2010 | A1 |
20110033110 | Shimamura et al. | Feb 2011 | A1 |
20110109719 | Wilson et al. | May 2011 | A1 |
20110196610 | Waldman et al. | Aug 2011 | A1 |
20130212537 | Hall | Aug 2013 | A1 |
20130246204 | Thornberry et al. | Sep 2013 | A1 |
Number | Date | Country |
---|---|---|
331204 | Jul 2006 | AT |
0316110 | Sep 2005 | BR |
2402234 | Sep 2000 | CA |
2505566 | May 2004 | CA |
1735897 | Feb 2006 | CN |
60017384 | Mar 2006 | DE |
60306301 | Nov 2006 | DE |
1418402 | Oct 2006 | DK |
1010966 | Feb 1999 | EP |
1180967 | Feb 2002 | EP |
1418402 | May 2004 | EP |
1696204 | Aug 2006 | EP |
2266704 | Mar 2007 | ES |
11328378 | Nov 1999 | JP |
2003317089 | Nov 2003 | JP |
2007149046 | Jun 2007 | JP |
PA05004987 | Feb 2006 | MX |
WO9918732 | Apr 1999 | WO |
WO2000053090 | Sep 2000 | WO |
WO2004044692 | May 2004 | WO |
WO2005088251 | Sep 2005 | WO |
WO2008028040 | Mar 2008 | WO |
Entry |
---|
Ackermann, Prospects of Kinematic GPS Aerial Triangulation, ITC Journal, 1992. |
Ciampa, John A., “Pictometry Digital Video Mapping”, SPIE, vol. 2598, pp. 140-148, 1995. |
Ciampa, J. A., Oversee, Presented at Reconstruction After Urban earthquakes, Buffalo, NY, 1989. |
Dunford et al., Remote Sensing for Rural Development Planning in Africa, The Journal for the International Institute for Aerial Survey and Earth Sciences, 2:99-108, 1983. |
Gagnon, P.A., Agnard, J. P., Nolette, C., & Boulianne, M., “A Micro-Computer based General Photogrammetric System”, Photogrammetric Engineering and Remote Sensing, vol. 56, No. 5., pp. 623-625, 1990. |
Konecny, G., “Issues of Digital Mapping”, Leibniz University Hannover, Germany, GIS Ostrava 2008, Ostrava 27.-30.1.2008, pp. 1-8. |
Konecny, G., “Analytical Aerial Triangulation with Convergent Photography”, Department of Surveying Engineering, University of New Brunswick, pp. 37-57, 1966. |
Konecny, G., “Interior Orientation and Convergent Photography”, Photogrammetric Engineering, pp. 625-634, 1965. |
Graham, Lee A., “Airborne Video for Near-Real-Time Vegetation Mapping”, Journal of Forestry, 8:28-32, 1993. |
Graham, Horita TRG-50 SMPTE Time-Code Reader, Generator, Window Inserter, 1990. |
Hess, L.L, et al., “Geocoded Digital Videography for Validation of Land Cover Mapping in the Amazon Basin”, International Journal of Remote Sensing, vol. 23, No. 7, pp. 1527-1555, 2002. |
Hinthorne, J., et al., “Image Processing in the Grass GIS”, Geoscience and Remote Sensing Symposium, 4:2227-2229, 1991. |
Imhof, Ralph K., “Mapping from Oblique Photographs”, Manual of Photogrammetry, Chapter 18, 1966. |
Jensen, John R., Introductory Digital Image Processing: A Remote Sensing Perspective, Prentice-Hall, 1986; 399 pages. |
Lapine, Lewis A., “Practical Photogrammetric Control by Kinematic GPS”, GPS World, 1(3):44-49, 1990. |
Lapine, Lewis A., Airborne Kinematic GPS Positioning for Photogrammetry—The Determination of the Camera Exposure Station, Silver Spring, MD, 11 pages, at least as early as 2000. |
Linden et al., Airborne Video Automated Processing, US Forest Service Internal report, Fort Collins, CO, 1993. |
Myhre, Dick, “Airborne Video System Users Guide”, USDA Forest Service, Forest Pest Management Applications Group, published by Management Assistance Corporation of America, 6 pages, 1992. |
Myhre et al., “An Airborne Video System Developed Within Forest Pest Management—Status and Activities”, 10 pages, 1992. |
Myhre et al., “Airborne Videography—A Potential Tool for Resource Managers”—Proceedings: Resource Technology 90, 2nd International Symposium on Advanced Technology in Natural Resource Management, 5 pages, 1990. |
Myhre et al., Aerial Photography for Forest Pest Management, Proceedings of Second Forest Service Remote Sensing Applications Conference, Slidell, Louisiana, 153-162, 1988. |
Myhre et al., “Airborne Video Technology”, Forest Pest Management/Methods Application Group, Fort Collins, CO, pp. 1-6, at least as early as Jul. 30, 2006. |
Norton-Griffiths et al., 1982. “Sample surveys from light aircraft combining visual observations and very large scale color photography”. University of Arizona Remote Sensing Newsletter 82-2:1-4. |
Norton-Griffiths et al., “Aerial Point Sampling for Land Use Surveys”, Journal of Biogeography, 15:149-156, 1988. |
Novak, Rectification of Digital Imagery, Photogrammetric Engineering and Remote Sensing, 339-344, 1992. |
Slaymaker, Dana M., “Point Sampling Surveys with GPS-logged Aerial Videography”, Gap Bulletin number 5, University of Idaho, http://www.gap.uidaho.edu/Bulletins/5/PSSwGPS.html, 1996. |
Slaymaker, et al., “Madagascar Protected Areas Mapped with GPS-logged Aerial Video and 35mm Air Photos”, Earth Observation magazine, vol. 9, No. 1, http://www.eomonline.com/Common/Archives/2000jan/00jan_tableofcontents.html, pp. 1-4, 2000. |
Slaymaker, et al., “Cost-effective Determination of Biomass from Aerial Images”, Lecture Notes in Computer Science, 1737:67-76, http://portal.acm.org/citation.cfm?id=648004.743267&coll=GUIDE&dl=, 1999. |
Slaymaker, et al., “A System for Real-time Generation of Geo-referenced Terrain Models”, 4232A-08, SPIE Enabling Technologies for Law Enforcement Boston, MA, ftp://vis-ftp.cs.umass.edu/Papers/schultz/spie2000.pdf, 2000. |
Slaymaker, et al.,“Integrating Small Format Aerial Photography, Videography, and a Laser Profiler for Environmental Monitoring”, In ISPRS WG III/1 Workshop on Integrated Sensor Calibration and Orientation, Portland, Maine, 1999. |
Slaymaker, et al., “Calculating Forest Biomass With Small Format Aerial Photography, Videography and a Profiling Laser”, In Proceedings of the 17th Biennial Workshop on Color Photography and Videography in Resource Assessment, Reno, NV, 1999. |
Slaymaker et al., Mapping Deciduous Forests in Southern New England using Aerial Videography and Hyperclustered Multi-Temporal Landsat TM Imagery, Department of Forestry and Wildlife Management, University of Massachusetts, 1996. |
Star et al., “Geographic Information Systems an Introduction”, Prentice-Hall, 1990. |
Tomasi et al., “Shape and Motion from Image Streams: a Factorization Method”—Full Report on the Orthographic Case, pp. 9795-9802, 1992. |
Warren, Fire Mapping with the Fire Mousetrap, Aviation and Fire Management, Advanced Electronics System Development Group, USDA Forest Service, 1986. |
Welch, R., “Desktop Mapping with Personal Computers”, Photogrammetric Engineering and Remote Sensing, 1651-1662, 1989. |
Westervelt, James, “Introduction to Grass 4”, pp. 1-25, 1991. |
“RGB Spectrum Videographics Report, vol. 4, No. 1, McDonnell Douglas Integrates RGB Spectrum Systems in Helicopter Simulators”, pp. 1-6, 1995. |
RGB “Computer Wall”, RGB Spectrum, 4 pages, 1995. |
“The First Scan Converter with Digital Video Output”, Introducing . . . The RGB/Videolink 1700D-1, RGB Spectrum, 2 pages, 1995. |
Erdas Field Guide, Version 7.4, A Manual for a commercial image processing system, 1990. |
“Image Measurement and Aerial Photography”, Magazine for all branches of Photogrammetry and its fringe areas, Organ of the German Photogrammetry Association, Berlin-Wilmersdorf, No. 1, 1958. |
“Airvideo Analysis”, MicroImages, Inc., Lincoln, NE, 1 page, Dec. 1992. |
Zhu, Zhigang, Hanson, Allen R., “Mosaic-Based 3D Scene Representation and Rendering”, Image Processing, 2005, ICIP 2005, IEEE International Conference on 1(2005). |
Mostafa, et al., “Direct Positioning and Orientation Systems How do they Work? What is the Attainable Accuracy?”, Proceeding, American Society of Photogrammetry and Remote Sensing Annual Meeting, St. Louis, MO, Apr. 24-27, 2001. |
“POS AV” georeferenced by Applanix aided inertial technology, http://www.applanix.com/products/posav_index.php. |
Mostafa, et al., “Ground Accuracy from Directly Georeferenced Imagery”, Published in GIM International vol. 14 N. 12 Dec. 2000. |
Mostafa, et al., “Airborne Direct Georeferencing of Frame Imagery: An Error Budget”, The 3rd International Symposium on Mobile Mapping Technology, Cairo, Egypt, Jan. 3-5, 2001. |
Mostafa, M.R. and Hutton, J., “Airborne Kinematic Positioning and Attitude Determination Without Base Stations”, Proceedings, International Symposium on Kinematic Systems in Geodesy, Geomatics, and Navigation (KIS 2001) Banff, Alberta, Canada, Jun. 4-8, 2001. |
Mostafa, et al., “Airborne DGPS Without Dedicated Base Stations for Mapping Applications”, Proceedings of ION-GPS 2001, Salt Lake City, Utah, USA, Sep. 11-14. |
Mostafa, “ISAT Direct Exterior Orientation QA/QC Strategy Using POS Data”, Proceedings of OEEPE Workshop: Integrated Sensor Orientation, Hanover, Germany, Sep. 17-18, 2001. |
Mostafa, “Camera/IMU Boresight Calibration: New Advances and Performance Analysis”, Proceedings of the ASPRS Annual Meeting, Washington, D.C., Apr. 21-26, 2002. |
Hiatt, “Sensor Integration Aids Mapping at Ground Zero”, Photogrammetric Engineering and Remote Sensing, Sep. 2002, p. 877-878. |
Mostafa, “Precision Aircraft GPS Positioning Using CORS”, Photogrammetric Engineering and Remote Sensing, Nov. 2002, p. 1125-1126. |
Mostafa, et al., System Performance Analysis of INS/DGPS Integrated System for Mobile Mapping System (MMS), Department of Geomatics Engineering, University of Calgary, Commission VI, WG VI/4, Mar. 2004. |
Artes F., & Hutton, J., “GPS and Inertial Navigation Delivering”, Sep. 2005, GEOconnexion International Magazine, p. 52-53, Sep. 2005. |
“POS AV” Applanix, Product Outline, airborne@applanix.com, 3 pages, Mar. 28, 2007. |
POSTrack, “Factsheet”, Applanix, Ontario, Canada, www.applanix.com, Mar. 2007. |
POS AV “Digital Frame Camera Applications”, 3001 Inc., Brochure, 2007. |
POS AV “Digital Scanner Applications”, Earthdata Brochure, Mar. 2007. |
POS AV “Film Camera Applications” AeroMap Brochure, Mar. 2007. |
POS AV “LIDAR Applications” MD Atlantic Brochure, Mar. 2007. |
POS AV “OEM System Specifications”, 2005. |
POS AV “Synthetic Aperture Radar Applications”, Overview, Orbisat Brochure, Mar. 2007. |
“POSTrack V5 Specifications” 2005. |
“Remote Sensing for Resource Inventory Planning and Monitoring”, Proceeding of the Second Forest Service Remote Sensing Applications Conference—Slidell, Louisiana and NSTL, Mississippi, Apr. 11-15, 1988. |
“Protecting Natural Resources with Remote Sensing”, Proceeding of the Third Forest Service Remote Sensing Applications Conference—Apr. 9-13, 1990. |
Heipke, et al, “Test Goals and Test Set Up for the OEEPE Test—Integrated Sensor Orientation”, 1999. |
Kumar, et al., “Registration of Video to Georeferenced Imagery”, Sarnoff Corporation, CN5300, Princeton, NJ, 1998. |
McConnel, Proceedings Aerial Pest Detection and Monitoring Workshop—1994.pdf, USDA Forest Service Forest Pest Management, Northern Region, Intermountain region, Forest Insects and Diseases, Pacific Northwest Region. |
“Standards for Digital Orthophotos”, National Mapping Program Technical Instructions, US Department of the Interior, Dec. 1996. |
Tao, “Mobile Mapping Technology for Road Network Data Acquisition”, Journal of Geospatial Engineering, vol. 2, No. 2, pp. 1-13, 2000. |
“Mobile Mapping Systems Lesson 4”, Lesson 4 Sure 382 Geographic Information Systems II, pp. 1-29, Jul. 2, 2006. |
Konecny, G., “Mechanische Radialtriangulation mit Konvergentaufnahmen”, Bildmessung and Luftbildwesen, 1958, Nr. 1. |
Myhre, “ASPRS/ACSM/RT 92” Technical papers, Washington, D.C., vol. 5 Resource Technology 92, Aug. 3-8, 1992. |
Rattigan, “Towns get new view from above,” The Boston Globe, Sep. 5, 2002. |
Mostafa, et al., “Digital image georeferencing from a multiple camera system by GPS/INS,” ISPRS Journal of Photogrammetry & Remote Sensing, 56(1):1-12, Jun. 2001. |
Dillow, “Grin, or bare it, for aerial shot,” Orange County Register (California), Feb. 25, 2001. |
Anonymous, “Live automatic coordinates for aerial images,” Advanced Imaging, 12(6):51, Jun. 1997. |
Anonymous, “Pictometry and US Geological Survey announce—Cooperative Research and Development Agreement,” Press Release published Oct. 20, 1999. |
Miller, “Digital software gives small Arlington the Big Picture,” Government Computer NewsState & Local, 7(12), Dec. 2001. |
Garrett, “Pictometry: Aerial photography on steroids,” Law Enforcement Technology 29(7):114-116, Jul. 2002. |
Weaver, “County gets an eyeful,” The Post-Standard (Syracuse, NY), May 18, 2002. |
Reed, “Firm gets latitude to map O.C. In 3D,” Orange County Register (California), Sep. 27, 2000. |
Reyes, “Orange County freezes ambitious aerial photography project,” Los Angeles Times, Oct. 16, 2000. |
Aerowest Pricelist of Geodata as of Oct. 21, 2005 and translations to English 3 pages. |
www.archive.org Web site showing archive of German AeroDach Web Site http://www.aerodach.de. from Jun. 13, 2004 (retrieved Sep. 20, 2012) and translations to English 4 pages. |
AeroDach®Online Roof Evaluation Standard Delivery Format and 3D Data File: Document Version 01.00.2002 with publication in 2002, 13 pages. |
Noronha et al., “Detection and Modeling of Building from Multiple Aerial Images,” Institute for Robotics and Intelligent Systems, University of Southern California, Nov. 27, 2001, 32 pages. |
Applicad Reports dated Nov. 25, 1999-Mar. 9, 2005, 50 pages. |
Applicad Online Product Bulletin archive from Jan. 7, 2003, 4 pages. |
Applicad Sorcerer Guide, Version 3, Sep. 8, 1999, 142 pages. |
Xactimate Claims Estimating Software archive from Feb. 12, 2010, 8 pages. |
Bignone et al, Automatic Extraction of Generic House Roofs from High Resolution Aerial Imagery, Communication Technology Laboratory, Swiss Federal Institute of Technology ETH, CH-8092 Zurich, Switzerland, 12 pages, 1996. |
Geospan 2007 Job proposal. |
Greening et al., Commercial Applications of GPS-Assisted Photogrammetry, Presented at GIS/LIS Annual Conference and Exposition, Phoenix, AZ, Oct. 1994. |
Pictometry International Corp., Patent Owner's Response in inter partes review of U.S. Pat. No. 8,823,732, IPR2016-00593, Filed Nov. 16, 2016. |
Bajaj, Declaration of Dr. Chandrajit Bajaj, as filed as Exhibit-2006 in inter partes review of U.S. Pat. No. 8,823,732, IPR2016-00593, on Nov. 16, 2016. |
Deposition of Mr. Harold Schuch of Nov. 2, 2016, as filed as Exhibit-2005 in inter partes review of U.S. Pat. No. 8,823,732, IPR2016-00593, on Nov. 16, 2016. |
Pictometry International Corp. et al.; Metropolitan Area Planning Council License Agreement; 2009; as filed as Exhibit-2009 in inter partes review of U.S. Pat. No. 8,823,732, IPR2016-00593, on Nov. 16, 2016. |
Xactware Solutions, Inc., Petition for inter partes review of U.S. Pat. No. 8,823,732, IPR2016-00593, Mail Date Feb. 8, 2016. |
Able Software Corp., R2V for Windows User's Manual (as filed as Exhibit-1003 in IPR2016-00593), Sep. 16, 2000. |
Pictometry International Corp., Electronic Field Study User Guide Version 2.7 (as filed as Exhibit-1004 in IPR2016-00593), Jul. 2007. |
Gleicher, Michael, Image Snapping (as filed as Exhibit-1005 in IPR2016-00593), 1995. |
Schuch, Harold, Declaration in support of Petition for inter partes review of U.S. Pat. No. 8,823,732, IPR2016-00593 (as filed as Exhibit-1006 in IPR2016-00593), Mail Date Feb. 8, 2016. |
Eagle View Technologies, Inc., and Pictometry International Corp; Complaint for infringement of U.S. Pat. No. 8,823,732 in litigation (Eagle View Technologies, Inc., and Pictometry International Corp. v. Xactware Solutions, Inc., and Verisk Analytics, Inc.) in the United States District Court District of New Jersey, case No. njd-1-15-cv-07025-RBK-JS, filed Sep. 23, 2015. |
Xactware Solutions, Inc., and Verisk Analytics, Inc., Answer to Complaint for infringement of U.S. Pat. No. 8,823,732 in litigation (Eagle View Technologies, Inc., and Pictometry International Corp. v. Xactware Solutions, Inc., and Verisk Analytics, Inc.) in the United States District Court District of New Jersey, case No. njd-1-15-cv-07025-RBK-JS, filed Nov. 12, 2015. |
Eagle View Technologies, Inc., and Pictometry International Corp; Amended Complaint for infringement of U.S. Pat. No. 8,823,732 in litigation (Eagle View Technologies, Inc., and Pictometry International Corp. v. Xactware Solutions, Inc., and Verisk Analytics, Inc.) in the United States District Court District of New Jersey, case No. njd-1-15-cv-07025-RBK-JS, filed Nov. 30, 2015. |
Xactware Solutions, Inc., and Verisk Analytics, Inc., Amended Answer, Affirmative Defenses, Counterclaims to Amended Complaint for infringement of U.S. Pat. No. 8,823,732 in litigation (Eagle View Technologies, Inc., and Pictometry International Corp. v. Xactware Solutions, Inc., and Verisk Analytics, Inc.) in the United States District Court District of New Jersey, case No. njd-1-15-cv-07025-RBK-JS, filed Dec. 14, 2015. |
Xactware Solutions, Inc., and Verisk Analytics, Inc., Motion to Dismiss Complaint for infringement of U.S. Pat. No. 8,823,732 under 35 U.S.C. 101 and supporting Memorandum in litigation (Eagle View Technologies, Inc., and Pictometry International Corp. v. Xactware Solutions, Inc., and Verisk Analytics, Inc.) in the United States District Court of New Jersey, case No. njd-1-15-cv-07025-RBK-JS, filing date Feb. 9, 2016. |
Xactware Solutions, Inc., and Verisk Analytics, Inc., Invalidity Contentions regarding U.S. Pat. No. 8,823,732 in litigation (Eagle View Technologies, Inc., and Pictometry International Corp. v. Xactware Solutions, Inc., and Verisk Analytics, Inc.) in the United States District Court District of New Jersey, case No. njd-1-15-cv-07025-RBK-JS, filing date Feb. 9, 2016. |
Mexican Patent Office, Official Action regarding Mexican Patent Application No. MX/a/2014/014451 and English translation, dated Nov. 2015. |
Bertan et al., Automatic 3D Roof Reconstruction Using Digital Cadastral Map, Architectural Knowledge and an Aerial Image; 2006 IEEE International Geoscience and Remote Sensing Symposium, IGARSS, 2006. |
Eagle View Technologies, Inc., and Pictometry International Corp., Response with Claim Charts to Invalidity Contentions regarding U.S. Pat. No. 8,823,732 in litigation (Eagle View Technologies, Inc., and Pictometry International Corp. v. Xactware Solutions, Inc., and Verisk Analytics, Inc.) in the United States District Court of New Jersey, case No. njd-1-15-cv-07025-RBK-JS, dated Mar. 10, 2016. |
Eagle View Technologies, Inc., and Pictometry International Corp., Response in Opposition to Motion to Dismiss Complaint for Infringement of U.S. Pat. No. 8,823,732 under 35 U.S.C. 101 (Eagle View Technologies, Inc., and Pictometry International Corp. v. Xactware Solutions, Inc., and Verisk Analytics, Inc.) in the United States District Court of New Jersey, case No. njd-1-15-cv-07025-RBK-JS, filing date Mar. 22, 2016. |
Pictometry International Corp., Patent Owner's Preliminary Response to petition for inter partes review of U.S. Pat. No. 8,823,732, IPR2016-00593, Filed Jun. 2, 2016. |
USPTO Patent Trial and Appeal Board; Institution Decision for inter partes review of claims 12-15, 21-23, 25-27, 33-38, 44, and 45 of U.S. Pat. No. 8,823,732, IPR2016-00593, Paper 13, Aug. 31, 2016. |
Unknown; Cohasset Town Report for the year ending Dec. 31, 2008; 2009; (provided to Pictometry International Corp. as supplemental evidence regarding Pictometry Electronic Field Study Guide Version 2.7 during the inter partes review of claims of U.S. Pat. No. 8,823,732, IPR2016-00593). |
Pictometry International Corp., EFS 2.7 Getting Started Guide, Jul. 2007 (provided to Pictometry International Corp. as supplemental evidence regarding Pictometry Electronic Field Study Guide Version 2.7 during the inter partes review of claims of U.S. Pat. No. 8,823,732, IPR2016-00593). |
Asimonye, Bernard; Geospatial Information Systems Council—Pictometry: Oblique Imagery Training—Announcement; Apr. 6, 2009; (provided to Pictometry International Corp. as supplemental evidence inter partes review of claims of U.S. Pat. No. 8,823,732, IPR2016-00593). |
Unknown; Genesee County GIS Working Group Meeting Minutes; 2007; (provided to Pictometry International Corp. as supplemental evidence inter partes review of claims of U.S. Pat. No. 8,823,732, IPR2016-00593). |
Unknown; Los Angeles Region Imagery Acquisition Consortium (LAR-IAC2) Product Guide 2; Feb. 2009; (provided to Pictometry International Corp. as supplemental evidence regarding Pictometry Electronic Field Study Guide Version 2.7 during the inter partes review of claims of U.S. Pat. No. 8,823,732, IPR2016-00593). |
Unknown; LARIAC1 Pictometry Training Announcement; Feb. 2009; (provided to Pictometry International Corp. as supplemental evidence inter partes review of claims of U.S. Pat. No. 8,823,732, IPR2016-00593). |
Unknown; Los Angeles County Extends its License Agreement with Pictometry for New Oblique Aerial Photos; Directions Magazine—Press Releases; Mar. 2006; (provided to Pictometry International Corp. as supplemental evidence regarding Pictometry Electronic Field Study Guide Version 2.7 during the inter partes review of claims of U.S. Pat. No. 8,823,732, IPR2016-00593). |
Unknown; Screen shot of properties page for “TXWILMAdministrative Training” PowerPoint File; Sep. 2008; (provided to Pictometry International Corp. as supplemental evidence regarding Pictometry Electronic Field Study Guide Version 2.7 during the inter partes review of claims of U.S. Pat. No. 8,823,732, IPR2016-00593). |
Unknown; Pictometry Training Descriptions, Unknown date; (provided to Pictometry International Corp. as supplemental evidence regarding Pictometry Electronic Field Study Guide Version 2.7 during the inter partes review of claims of U.S. Pat. No. 8,823,732, IPR2016-00593). |
Horan, Mike; Pictometry Administrative Training PowerPoint; Unknown date; (provided to Pictometry International Corp. as supplemental evidence regarding Pictometry Electronic Field Study Guide Version 2.7 during the inter partes review of claims of U.S. Pat. No. 8,823,732, IPR2016-00593). |
Unknown; Pictometry Announces Technical Advancements for GIS Professionals; GISuser; Dec. 2006 (provided to Pictometry International Corp. as supplemental evidence regarding Pictometry Electronic Field Study Guide Version 2.7 during the inter partes review of claims of U.S. Pat. No. 8,823,732, IPR2016-00593). |
Unknown; Pictometry Announces Technical Advancements for GIS Professionals; Directions Magazine—Press Releases; Dec. 2006; (provided to Pictometry International Corp. as supplemental evidence regarding Pictometry Electronic Field Study Guide Version 2.7 during the inter partes review of claims of U.S. Pat. No. 8,823,732, IPR2016-00593). |
Los Angeles County Department of Regional Planning; PowerPoint slides titled “Pictometry”; Apr. 19, 2007; (provided to Pictometry International Corp. as supplemental evidence regarding Pictometry Electronic Field Study Guide Version 2.7 during the inter partes review of claims of U.S. Pat. No. 8,823,732, IPR2016-00593). |
Office of Transportation Planning; Pictometry License Guidelines; Jan. 26, 2005; (provided to Pictometry International Corp. as supplemental evidence regarding Pictometry Electronic Field Study Guide Version 2.7 during the inter partes review of claims of U.S. Pat. No. 8,823,732, IPR2016-00593). |
Lanctot, Corey; Welcome to Your End User Training; Unknown date; (provided to Pictometry International Corp. as supplemental evidence regarding Pictometry Electronic Field Study Guide Version 2.7 during the inter partes review of claims of U.S. Pat. No. 8,823,732, IPR2016-00593). |
Association for Computing Machinery; Screenshots from ACM Digital Library regarding reference “Image Snapping” by Michael Gleicher; Sep. 2016; (provided to Pictometry International Corp. as supplemental evidence “Image Snapping” by Michael Gleicher during the inter partes review of claims of U.S. Pat. No. 8,823,732, IPR2016-00593). |
Canadian Patent Office, Office Action regarding Canadian Patent Application No. 2,819,166, dated Oct. 17, 2017. |
Mexican Patent Office, Office Action regarding Mexican Patent Application No. MX/a/2016/003154, dated Oct. 22, 2017. |
Xactware Solutions, Inc., Appellant Brief filed in Xactware Solutions, Inc. v. Pictometry International Corp., Fed. Cir. Case 18/1094 (consolidated with Case 18/1093) (Appeal of USPTO PTAB Final Written Decision in inter partes review of claims 12-15, 21-23, 25-27, 33-38, 44, and 45 of U.S. Pat. No. 8,823,732, IPR2016-00593), Mar. 5, 2018. |
Pictometry International Corp., Response to Canadian Patent Office's Office Action regarding Canadian Patent Application No. 2,819,166, dated Oct. 17, 2017. |
Canadian Patent Office, Office Action regarding Canadian Patent Application No. 2,819,166, dated Sep. 26, 2018. |
Pictometry International Corp., Appellee Brief (corrected) filed in Xactware Solutions, Inc. v. Pictometry International Corp., Fed. Cir. Case 18/1094 (consolidated with Case 18/1093) (Appeal of USPTO PTAB Final Written Decision in inter partes review of claims 12-15, 21-23, 25-27, 33-38, 44, and 45 of U.S. Pat. No. 8,823,732, IPR2016-00593), Apr. 18, 2018. |
Xactware Solutions, Inc., Reply Brief filed in Xactware Solutions, Inc. v. Pictometry International Corp., Fed. Cir. Case 18/1094 (consolidated with Case 18/1093) (Appeal of USPTO PTAB Final Written Decision in inter partes review of claims 12-15, 21-23, 25-27, 33-38, 44, and 45 of U.S. Pat. No. 8,823,732, IPR2016-00593), May 30, 2018. |
European Patent Office, Search Report and Written Opinion regarding European Patent Application No. 11849260.2, dated Aug. 18, 2017. |
Pictometry International Corp, Response to Aug. 18, 2017 Search Report and Written Opinion regarding European Patent Application No. 11849260.2, dated Mar. 8, 2018. |
European Patent Office, Examination Report regarding European Patent Application No. 11849260.2, dated Jun. 21, 2018. |
Carpendale et al., “Achieving Higher Magnification in Context”, UIST 04. Proceedings of the 17th Annual ACM Symposium on User Interface Software and Technology, Santa Fe, NM, Oct. 24-27, 2004, New York, NY: ACM Press, US, ISBN 978-1-58113-957-0, Oct. 24, 2004. |
Xactware Solutions, Inc., Petitioner's Reply to Patent Owner's Response in inter partes review of U.S. Pat. No. 8,823,732, IPR2016-00593, Filed Jan. 27, 2017. |
Mexican Patent Office, Official Action regarding Mexican Patent Application No. MX/a/2014/014451 and English summary, Jun. 2015. |
International Search Authority Korean Intellectual Property Office, International Search Report and Written Opinion regarding PCT/US11/065418, dated May 30, 2012. |
International Bureau, Preliminary Report on Patentability regarding PCT/US11/065418, dated Jun. 27, 2013. |
Pictometry International Corp., Response to Sep. 26, 2018 Canadian Office Action regarding Canadian Patent Application No. 2,819,166; dated Mar. 26, 2019. |
Canadian Intellectual Property Office, Office Action regarding Canadian Patent Application No. 2,819,166; dated Aug. 19, 2019. |
Pictometry International Corp., Response to Aug. 19, 2019 Canadian Office Action regarding Canadian Patent Application No. 2,819,166; dated Feb. 14, 2020. |
Pictometry International Corp., Voluntary Amendment and Response to Jul. 20, 2020 telephone conversation with Canadian Office regarding Canadian Patent Application No. 2,819,166; dated Jul. 24, 2020. |
Pictometry International Corp., Response to Jul. 3, 2018 European Patent Office Action regarding European Patent Application No. 11849260.2; dated Dec. 20, 2018. |
Number | Date | Country | |
---|---|---|---|
20200311461 A1 | Oct 2020 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 14341213 | Jul 2014 | US |
Child | 16846527 | US | |
Parent | 12972088 | Dec 2010 | US |
Child | 14341213 | US |