Claims
- 1. A method for detecting an intersection as displayed in two intersecting panoramic video frame sequences, the method comprising:
receiving data representing two intersecting panoramic video frame sequences, the sequences each including a plurality of frames depicting an image of the intersection between the video frame sequences; and
detecting an intersection frame from each of the panoramic video frame sequences using only the data representing the two panoramic video frame sequences, the intersection comprised of the intersection frame from each panoramic video frame sequence.
- 2. The method of claim 1 wherein detecting an intersection frame from each of the panoramic video frame sequences includes detecting the most similar frame images from each of the panoramic video frame sequences.
- 3. The method of claim 2 wherein detecting the most similar frame images includes:
segmenting each frame into at least one strip, wherein the frames are divided into rows and at least one strip is associated with each row;
determining an identifying value for each strip; and
determining a distance measurement between strips in corresponding rows in each frame to determine which frames from the two sequences are closest to each other, wherein determining the distance measurement includes processing the identifying value of each strip.
- 4. The method of claim 3 wherein the identifying value is an average intensity of the pixel values for the pixels contained in the strip.
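The strip segmentation and average-intensity identifying value of claims 3 and 4 can be illustrated with a short sketch (Python with NumPy; the row count, strip width, and grayscale input are illustrative assumptions of this sketch, not claim limitations):

```python
import numpy as np

def strip_values(frame, rows=4, strips_per_row=64):
    """Segment a panoramic frame into rows of strips and return the
    identifying value (average pixel intensity) of each strip.

    frame: 2-D grayscale array of shape (height, width).
    Returns an array of shape (rows, strips_per_row).
    """
    height, width = frame.shape
    row_h = height // rows
    strip_w = width // strips_per_row
    values = np.empty((rows, strips_per_row))
    for r in range(rows):
        for s in range(strips_per_row):
            # each strip is a rectangular block of pixels in one row
            strip = frame[r * row_h:(r + 1) * row_h,
                          s * strip_w:(s + 1) * strip_w]
            values[r, s] = strip.mean()  # average intensity (claim 4)
    return values
```

The per-row vectors of identifying values are what the later claims transform into Fourier spectra.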
- 5. The method of claim 3 wherein determining the distance measurement includes:
receiving Fourier spectra data associated with the strips in a corresponding row of each frame; and
comparing the Fourier spectra data to derive a distance measurement between the frames of each panoramic video frame sequence.
- 6. The method of claim 5 further including:
determining the intersection to be associated with the frames having the shortest distance measurement between them.
- 7. The method of claim 5 wherein the distance measurement is one of a Euclidean, cosine, and normalized distance.
- 8. The method of claim 7 wherein the distance measurement is weighted.
- 9. The method of claim 8 wherein the weighted distance measurement includes weights to reduce a high and low band of the Fourier spectra.
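One way to realize the weighted spectral distance of claims 5 through 9 is sketched below (the Gaussian band-weighting profile and the use of magnitude spectra are assumptions of this sketch, not claim limitations):

```python
import numpy as np

def weighted_spectrum_distance(values_a, values_b):
    """Weighted Euclidean distance between the Fourier spectra of the
    corresponding rows of two frames' identifying values, with weights
    that suppress the highest and lowest frequency bands (claim 9).

    values_a, values_b: arrays of shape (rows, strips_per_row).
    """
    dist = 0.0
    for row_a, row_b in zip(values_a, values_b):
        spec_a = np.abs(np.fft.rfft(row_a))
        spec_b = np.abs(np.fft.rfft(row_b))
        n = spec_a.size
        # band-pass weights: de-emphasize the DC/low end and the high
        # end of the spectrum (a Gaussian profile, chosen here only
        # for illustration)
        k = np.arange(n)
        weights = np.exp(-((k - n / 2) ** 2) / (2 * (n / 4) ** 2))
        dist += np.sqrt(np.sum(weights * (spec_a - spec_b) ** 2))
    return dist
```

Under claim 6, the intersection would then be associated with the frame pair that minimizes this distance across the two sequences.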
- 10. A method for detecting an intersection as displayed in two intersecting panoramic video frame sequences, the method comprising:
receiving a first set of data representing two panoramic video frame sequences, the sequences each including a plurality of frames depicting an image of an intersection, the first set of data including pixel data and location data, the location data associated with at least one frame from each sequence;
determining missing location data for any frame not already associated with location data; and
detecting the intersection using only the first set of data representing the two panoramic video frame sequences.
- 11. The method of claim 10 wherein detecting the intersection includes:
determining a rough intersection between the two panoramic video frame sequences, the rough intersection determined by comparing the location data for each frame;
determining a neighborhood having a set of neighborhood frames, the neighborhood including the rough intersection, the neighborhood frames including frames from each of the panoramic video sequences located within a parameter relative to the rough intersection; and
comparing data derived from the neighborhood frames to determine the most similar frame images from each of the panoramic video frame sequences.
- 12. The method of claim 11 wherein comparing data includes:
deriving strips from the panoramic video frames;
determining an identifying value for each strip; and
processing the identifying values to determine a distance measurement between images.
- 13. The method of claim 12 wherein the parameter is a minimum distance from the rough intersection frames.
- 14. The method of claim 12 wherein the parameter is a minimum time from the rough intersection frames.
- 15. The method of claim 10 wherein determining missing location data includes performing interpolation using location data associated with a frame.
- 16. The method of claim 10 wherein the location data is one of satellite-based geographical positioning system data, inertial navigation data, radio beacon data, and landmark triangulation data.
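The location-data steps of claims 11, 15, and 16 can be sketched as follows (Python with NumPy; linear interpolation and a two-coordinate location format are assumptions of this sketch — the claims also cover other location data and other interpolation schemes):

```python
import numpy as np

def fill_locations(locations):
    """Determine missing location data (claim 15) by linearly
    interpolating from the frames that do carry location data.

    locations: (n, 2) float array, e.g. latitude/longitude per frame,
    with NaN marking frames not already associated with location data.
    """
    filled = locations.copy()
    idx = np.arange(len(filled))
    for dim in range(filled.shape[1]):
        known = ~np.isnan(filled[:, dim])
        filled[:, dim] = np.interp(idx, idx[known], filled[known, dim])
    return filled

def rough_intersection(loc_a, loc_b):
    """Rough intersection (claim 11): the pair of frames, one from each
    sequence, whose locations are closest. Returns (index_a, index_b)."""
    diffs = loc_a[:, None, :] - loc_b[None, :, :]
    d2 = np.sum(diffs ** 2, axis=2)       # squared distance matrix
    return np.unravel_index(np.argmin(d2), d2.shape)
```

The neighborhood of claim 11 would then be the frames of each sequence within the claimed parameter of the returned rough-intersection pair.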
- 17. A method for detecting the orientation between two intersecting panoramic video frame sequences at an intersection as displayed in the two video sequences, the method comprising:
receiving data representing two intersecting panoramic video frame sequences, the sequences each including a plurality of frames depicting an image of the intersection between the panoramic video frame sequences;
detecting an intersection frame from each panoramic video sequence, the intersection frames containing the most similar frame images between the two intersecting panoramic video frame sequences, the intersection frames detected using only the data representing the two panoramic video frame sequences; and
detecting a relative orientation between the intersection frames of the panoramic video sequence using only the data representing the two panoramic video frame sequences.
- 18. The method of claim 17 wherein detecting a relative orientation includes:
segmenting a frame into at least one strip, wherein the frames are divided into rows and at least one strip is associated with each row;
determining an identifying value for each strip; and
processing the identifying values to determine the relative orientation between images.
- 19. The method of claim 18 wherein processing the identifying values includes:
receiving Fourier spectra data associated with the identifying values of the strips; and
determining relative orientation as the slope of the phase difference between the Fourier spectra associated with the corresponding rows of the frames.
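The phase-slope orientation of claim 19 rests on the shift theorem: a rotation of a panoramic frame circularly shifts its strip values, which multiplies each DFT bin by a phase linear in frequency. A minimal sketch (the unwrap-and-least-squares fit is one possible realization, assumed here for illustration):

```python
import numpy as np

def orientation_from_phase_slope(row_a, row_b):
    """Relative orientation (circular shift, in strips) from the slope
    of the phase difference between the rows' Fourier spectra.

    If row_b is row_a circularly shifted by s strips, each DFT bin f
    satisfies B[f] = A[f] * exp(-2*pi*i*f*s/n), so the phase difference
    is a line of slope -2*pi*s/n in f.
    """
    n = len(row_a)
    spec_a = np.fft.rfft(row_a)
    spec_b = np.fft.rfft(row_b)
    # phase of B[f] * conj(A[f]) is the per-bin phase difference
    phase_diff = np.unwrap(np.angle(spec_b * np.conj(spec_a)))
    k = np.arange(len(phase_diff))
    slope = np.polyfit(k, phase_diff, 1)[0]  # radians per frequency bin
    return -slope * n / (2 * np.pi)          # shift in strips
```

Multiplying the strip shift by the angular width of one strip converts it to an orientation angle.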
- 20. The method of claim 18 wherein processing the identifying values includes:
receiving Fourier spectra data associated with the identifying values of the strips; and
determining the relative orientation by deriving the maximum of the cross correlation between the identifying data of the first and second panoramic video images, the cross correlation values derived from the Fourier spectra.
- 21. The method of claim 20 wherein deriving the maximum cross correlation includes:
processing the Fourier spectra using complex conjugation to derive a set of data; and
taking the inverse DFT of the set of data.
- 22. The method of claim 21 wherein the distance measurement is weighted.
- 23. The method of claim 22 wherein the weighted distance measurement includes weights to reduce a high and low band of the Fourier spectra.
- 24. A method for detecting the orientation between two intersecting panoramic video frame sequences at an intersection as displayed in the two video sequences, the method comprising:
receiving pixel data representing two intersection panoramic video frames, wherein each of the two panoramic video frames is associated with a separate panoramic video frame sequence, the panoramic video frame sequences intersecting at the intersection, wherein the intersection panoramic video frames depict the most similar image between the video frame sequences; and
detecting a relative orientation between the intersection panoramic video frames of the panoramic video sequence using only the pixel data representing the two panoramic video frame sequences.
- 25. The method of claim 24 wherein detecting a relative orientation includes:
segmenting a frame into at least one strip, wherein the frames are divided into rows and at least one strip is associated with each row;
determining an identifying value for each strip; and
processing the identifying values to determine the relative orientation between images.
- 26. The method of claim 25 wherein processing the identifying values includes:
receiving Fourier spectra data associated with the identifying values of the strips; and
determining relative orientation as the slope of the phase difference between the Fourier spectra associated with the corresponding rows of the frames.
- 27. The method of claim 25 wherein processing the identifying values includes:
receiving Fourier spectra data associated with the identifying values of the strips; and
determining the relative orientation by deriving the maximum of the cross correlation between the identifying data of the first and second panoramic video images, the cross correlation values derived from the Fourier spectra.
- 28. The method of claim 27 wherein deriving the maximum cross correlation includes:
processing the Fourier spectra using complex conjugation to derive a set of data; and
taking the inverse DFT of the set of data.
- 29. The method of claim 28 wherein the distance measurement is weighted to reduce a high and low band of the Fourier spectra.
- 30. A method for determining an intersection and orientation between two intersecting panoramic video frame sequences, the method comprising:
receiving a first set of data representing two intersecting panoramic video frame sequences, the first set of data including pixel data and location data, the location data associated with at least one frame from each sequence, the sequences each including a plurality of frames depicting an image of the intersection between the video frame sequences;
determining location data for any frame not already associated with location data;
determining a rough estimate of the intersection between the two panoramic video frame sequences, the rough intersection determined by comparing the location data for each frame;
determining a neighborhood having a set of neighborhood frames, the neighborhood including the rough intersection, the neighborhood frames including frames from each of the panoramic video sequences within a parameter of the rough estimate of the intersection;
segmenting the neighborhood frames into at least one strip, wherein the frames are divided into rows and at least one strip is associated with each row, each strip comprised of a plurality of pixels;
determining an identifying value for each strip;
receiving Fourier spectra data associated with the strips in a corresponding row of each frame;
comparing the Fourier spectra data to derive a distance measurement between the frames of the panoramic video sequences; and
determining the relative orientation between the frames having the shortest distance measurement using the Fourier spectra data.
- 31. The method of claim 30 wherein determining relative orientation includes deriving the slope of the phase difference between the Fourier spectra associated with the corresponding rows of the frames.
- 32. The method of claim 30 wherein determining relative orientation includes deriving the maximum of the cross correlation between the identifying data of the first and second panoramic video images, the cross correlation values derived from the Fourier spectra.
- 33. The method of claim 30 wherein the identifying value is an average intensity of the pixel values for the pixels contained in the strip.
- 34. A computer program product for execution by a server computer for determining an intersection and orientation between two intersecting panoramic video frame sequences, comprising:
computer code for receiving a first set of data representing two intersecting panoramic video frame sequences, the first set of data including pixel data and location data, the location data associated with at least one frame from each sequence, the sequences each including a plurality of frames depicting an image of the intersection between the video frame sequences;
computer code for determining location data for any frame not already associated with location data;
computer code for determining a rough estimate of the intersection between the two panoramic video frame sequences, the rough intersection determined by comparing the location data for each frame;
computer code for determining a neighborhood having a set of neighborhood frames, the neighborhood including the rough intersection, the neighborhood frames including frames from each of the panoramic video sequences within a parameter of the rough estimate of the intersection;
computer code for segmenting the neighborhood frames into at least one strip, wherein the frames are divided into rows and at least one strip is associated with each row, each strip comprised of a plurality of pixels;
computer code for determining an identifying value for each strip;
computer code for receiving Fourier spectra data associated with the strips in a corresponding row of each frame;
computer code for comparing the Fourier spectra data associated with the corresponding rows of the frames to derive a distance measurement between the frames of the panoramic video sequences; and
computer code for determining the relative orientation between the frames having the shortest distance measurement using the Fourier spectra data.
- 35. The computer program product of claim 34 wherein the computer code for determining relative orientation includes computer code for deriving the slope of the phase difference between the Fourier spectra associated with the corresponding rows of the frames.
- 36. The computer program product of claim 34 wherein the computer code for determining relative orientation includes computer code for deriving the maximum of the cross correlation between the identifying data of the first and second panoramic video images, the cross correlation values derived from the Fourier spectra.
- 37. A computer program product for execution by a server computer for detecting an intersection as displayed in two intersecting panoramic video frame sequences, comprising:
computer code for receiving pixel information representing two intersecting panoramic video frame sequences, the sequences each including a plurality of frames depicting an image of the intersection between the video frame sequences;
computer code for segmenting a frame into at least one strip, wherein the frames are divided into rows and at least one strip is associated with each row, each strip containing multiple pixels;
computer code for determining an identifying value for each strip;
computer code for receiving Fourier spectra data associated with the strips in a corresponding row of each frame; and
computer code for comparing the Fourier spectra data associated with the corresponding rows of the frames to derive a distance measurement between each frame.
- 38. A computer program product for execution by a server computer for detecting the orientation between two intersecting panoramic video frame sequences at an intersection as displayed in the two video sequences, comprising:
computer code for receiving data representing two intersecting panoramic video frame sequences, the sequences each including a plurality of frames depicting an image of the intersection between the panoramic video frame sequences;
computer code for detecting an intersection frame from each panoramic video sequence, the intersection frames containing the most similar frame images between the intersecting panoramic video frame sequences, the intersection frames detected using only the data representing the two panoramic video frame sequences;
computer code for segmenting a frame into at least one strip, wherein the frames are divided into rows and at least one strip is associated with each row;
computer code for determining an identifying value for each strip;
computer code for receiving Fourier spectra data associated with the identifying values of the strips; and
computer code for determining the relative orientation between the intersection frames using the Fourier spectra data.
- 39. The computer program product of claim 38 wherein the computer code for determining the relative orientation includes computer code for determining the slope of the phase difference between the Fourier spectra data associated with the corresponding rows of the frames.
- 40. The computer program product of claim 38 wherein the computer code for determining the relative orientation includes computer code for deriving the maximum of the cross correlation between the identifying data of the first and second panoramic video images, the cross correlation values derived from the Fourier spectra data.
- 41. The computer program product of claim 40 wherein the computer code for deriving the maximum cross correlation includes:
computer code for processing the Fourier spectra data using complex conjugation to derive a second set of data; and
computer code for taking the inverse DFT of the second set of data.
- 42. An apparatus for determining an intersection and orientation between two intersecting panoramic video frame sequences, comprising:
a processor; and
a processor readable storage medium coupled to the processor;
said processor readable storage medium containing program code for programming the apparatus to perform a method for determining an intersection and orientation between two intersecting panoramic video frame sequences, the method comprising the steps of:
receiving a first set of data representing two intersecting panoramic video frame sequences, the first set of data including pixel data and location data, the location data associated with at least one frame from each sequence, the sequences each including a plurality of frames depicting an image of the intersection between the video frame sequences;
determining location data for any frame not already associated with location data;
determining a rough estimate of the intersection between the two panoramic video frame sequences, the rough intersection determined by comparing the location data for each frame;
determining a neighborhood having a set of neighborhood frames, the neighborhood including the rough intersection, the neighborhood frames including frames from each of the panoramic video sequences within a parameter of the rough estimate of the intersection;
segmenting the neighborhood frames into at least one strip, wherein the frames are divided into rows and at least one strip is associated with each row, each strip comprised of a plurality of pixels;
determining an identifying value for each strip;
receiving Fourier spectra data associated with the strips in a corresponding row of each frame;
comparing the Fourier spectra data associated with the corresponding rows of the frames to derive a distance measurement between each frame; and
determining the relative orientation between the panoramic video frames having the shortest distance measurement between them, the relative orientation determined using the Fourier spectra data.
- 43. A method for generating a composite video frame sequence comprising:
providing a first video frame sequence;
providing a second video frame sequence, the first and second video frame sequences intersecting at an intersection and having a relative orientation, the intersection associated with a first intersection frame from the first video frame sequence and a second intersection frame from the second video frame sequence, wherein the first and second intersection frames are determined to be the frames with the most similar image; and
generating a composite video frame sequence, the composite video frame sequence comprised of a set of frames from the first video frame sequence up to the first intersection frame and a set of frames from the second video frame sequence starting from the second intersection frame, wherein the transition between the two intersection frames is determined using the relative orientation, the intersection and relative orientation determined using only pixel information from the video frame sequences.
- 44. The method of claim 43 wherein the first and second video frame sequences are panoramic video frame sequences.
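The composite-sequence generation of claims 43 and 44 can be sketched as a splice at the intersection frames. For panoramic frames, applying the relative orientation amounts to a circular shift of pixel columns, which is the assumption this sketch makes (the orientation is given here in columns):

```python
import numpy as np

def composite_sequence(seq_a, seq_b, idx_a, idx_b, orientation):
    """Generate a composite sequence (claim 43): the frames of the
    first sequence up to and including its intersection frame, then the
    frames of the second sequence from its intersection frame on, with
    the second sequence's panoramic frames rotated by the relative
    orientation so the view direction is continuous at the transition.

    seq_a, seq_b: lists of 2-D frame arrays; idx_a, idx_b: indices of
    the intersection frames; orientation: relative orientation of the
    second sequence, in pixel columns.
    """
    head = list(seq_a[:idx_a + 1])
    # rotate each remaining frame of the second sequence into the
    # first sequence's orientation (circular shift of columns)
    tail = [np.roll(frame, orientation, axis=1) for frame in seq_b[idx_b:]]
    return head + tail
```

In a virtual-tour application this would let playback turn from one recorded path onto the other without a visible jump in heading.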
REFERENCE TO RELATED APPLICATIONS
[0001] The present application is related to the following United States Patents and Patent Applications, which patents/applications are assigned to the owner of the present invention, and which patents/applications are incorporated by reference herein in their entirety:
[0002] U.S. Patent Application No. 60/325,172, entitled “XXXX”, filed on XXXXXX XX, XXXXX, currently pending.