The present disclosure relates to integrating audio into a multi-view interactive digital media representation.
With modern computing platforms and technologies shifting towards mobile and wearable devices that include camera sensors as native acquisition input streams, the desire to record and preserve moments digitally in forms richer than traditional two-dimensional (2D) flat images and videos has become apparent. Traditional digital media formats typically limit their viewers to a passive experience. For instance, a 2D flat image can be viewed from only one angle and is limited to zooming in and out. Consequently, traditional digital media formats, such as 2D flat images, do not easily lend themselves to reproducing memories and events with high fidelity. In addition, 2D videos are usually limited to a set playback of visual data from a particular viewpoint and a corresponding fixed audio track.
As technology has progressed, various three-dimensional (3D) media formats have developed, such as multi-view interactive digital media representations. Examples of these multi-view interactive digital media representations include surround views, multiview images, and 3D data formats. In these multi-view interactive digital media representations, a user can control how to view the image data. For instance, the user can navigate around various objects and select a viewpoint from which to view the image data.
A problem in the presentation of multi-view interactive digital media representations is how to include audio information in the viewing process. Although image data and audio information may be recorded simultaneously, a user may choose to view the images in a different order than the order in which they were acquired. Because a user may navigate through the images in the captured multi-view interactive digital media representation in any order, the displayed visual representation of the scene may not be synchronized with playback of the recorded audio. Accordingly, there is a need for improved mechanisms and processes for integrating audio into a multi-view interactive digital media representation.
Provided are various mechanisms and processes relating to integrating audio into a multi-view interactive digital media representation.
In one aspect, which may include at least a portion of the subject matter of any of the preceding and/or following examples and aspects, one process includes retrieving a multi-view interactive digital media representation that includes numerous images fused together into content and context models. The process next includes retrieving and processing audio data to be integrated into the multi-view interactive digital media representation. A first segment of audio data may be associated with a first position in the multi-view interactive digital media representation. In other examples, a first segment of audio data may be associated with a visual position or the location of a camera in the multi-view interactive digital media representation. The audio data may be played in coordination with the multi-view interactive digital media representation based on a user's navigation through the multi-view interactive digital media representation, where the first segment is played when the first position or first visual position is reached.
In another aspect, which may include at least a portion of the subject matter of any of the preceding and/or following examples and aspects, a computer readable medium for integrating audio into a multi-view interactive digital media representation includes computer code for retrieving a multi-view interactive digital media representation that includes numerous images fused together into content and context models. The computer readable medium also includes computer code for retrieving and processing audio data to be integrated into the multi-view interactive digital media representation. Computer code for processing the audio data includes segmenting the audio data into a first segment and a second segment and associating the first segment with a first position in the multi-view interactive digital media representation and the second segment with a second position in the multi-view interactive digital media representation. The computer readable medium further includes computer code for playing the audio data in coordination with the multi-view interactive digital media representation based on a user's navigation through the multi-view interactive digital media representation, where the first segment is played when the first position in the multi-view interactive digital media representation is depicted and the second segment is played when the second position in the multi-view interactive digital media representation is depicted.
In yet another aspect, which may include at least a portion of the subject matter of any of the preceding and/or following examples and aspects, a process for integrating audio into a multi-view interactive digital media representation includes retrieving a multi-view interactive digital media representation that includes a plurality of images fused together into a three-dimensional model that is navigable by a user. The process further includes retrieving and processing audio data to be integrated into the multi-view interactive digital media representation. Processing the audio data includes segmenting the audio data into a first segment and a second segment and associating the first segment with a first position in the multi-view interactive digital media representation and the second segment with a second position in the multi-view interactive digital media representation. Next, the process includes playing the audio data in coordination with the multi-view interactive digital media representation based on a user's navigation through the multi-view interactive digital media representation. In particular, the first segment is played when the first position in the multi-view interactive digital media representation is depicted and the second segment is played when the second position in the multi-view interactive digital media representation is depicted.
These and other embodiments are described further below with reference to the figures.
The disclosure may best be understood by reference to the following description taken in conjunction with the accompanying drawings, which illustrate particular embodiments of the present invention.
Reference will now be made in detail to some specific examples of the invention including the best modes contemplated by the inventors for carrying out the invention. Examples of these specific embodiments are illustrated in the accompanying drawings. While the invention is described in conjunction with these specific embodiments, it will be understood that it is not intended to limit the invention to the described embodiments. On the contrary, it is intended to cover alternatives, modifications, and equivalents as may be included within the spirit and scope of the invention as defined by the appended claims.
For example, the techniques of the present invention will be described in the context of particular audio segments and components. However, it should be noted that the techniques of the present invention can apply to one or more of any variety of different audio segments and components. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. Particular example embodiments of the present invention may be implemented without some or all of these specific details. In other instances, well known process operations have not been described in detail in order not to unnecessarily obscure the present invention.
Various techniques and mechanisms of the present invention will sometimes be described in singular form for clarity. However, it should be noted that some embodiments include multiple iterations of a technique or multiple instantiations of a mechanism unless noted otherwise. For example, a system uses a processor in a variety of contexts. However, it will be appreciated that a system can use multiple processors while remaining within the scope of the present invention unless otherwise noted. Furthermore, the techniques and mechanisms of the present invention will sometimes describe a connection between two entities. It should be noted that a connection between two entities does not necessarily mean a direct, unimpeded connection, as a variety of other entities may reside between the two entities. For example, a processor may be connected to memory, but it will be appreciated that a variety of bridges and controllers may reside between the processor and memory. Consequently, a connection does not necessarily mean a direct, unimpeded connection unless otherwise noted.
Various three-dimensional (3D) media formats have developed with advances in technology, such as multi-view interactive digital media representations. These multi-view interactive digital media representations include formats such as surround views, multiview images, and 3D data formats. In these multi-view interactive digital media representations, a user can control how to view the image data. For instance, the user can navigate around various objects and select a viewpoint from which to view the image data.
Because users can navigate around various objects within a multi-view interactive digital media representation, one problem is how to include audio information in this viewing process. Although image data and audio information may be recorded simultaneously, a user may choose to view the images in a different order than the order in which they were acquired. Because a user may navigate through the images in the captured multi-view interactive digital media representation in any order, the displayed visual representation of the scene may not be synchronized with playback of the recorded audio. Various embodiments described herein relate to improved mechanisms and processes for integrating audio into a multi-view interactive digital media representation.
As described above, a multi-view interactive digital media representation can take numerous forms within the scope of this disclosure. For instance, a multi-view interactive digital media representation may include a surround view, multi-view image, or three dimensional model. Surround views are described in more detail with regard to U.S. patent application Ser. No. 14/530,669 by Holzer et al., filed on Oct. 31, 2014, titled “Analysis and Manipulation of Images and Video for Generation of Surround Views,” which is incorporated by reference herein in its entirety and for all purposes. According to various embodiments described therein, a surround view provides a user with the ability to control the viewpoint of the visual information displayed on a screen. In addition, a surround view presents a user with an interactive and immersive active viewing experience.
According to various embodiments, the data used to generate a surround view can come from a variety of sources. In particular, data such as, but not limited to, two-dimensional (2D) images can be used to generate a surround view. These 2D images can include color image data streams such as multiple image sequences, video data, etc., or multiple images in any of various formats for images, depending on the application. Another source of data that can be used to generate a surround view includes location information. This location information can be obtained from sources such as accelerometers, gyroscopes, magnetometers, GPS, WiFi, IMU-like systems (Inertial Measurement Unit systems), and the like. Yet another source of data that can be used to generate a surround view can include depth images. These depth images can include depth, 3D, or disparity image data streams, and the like, and can be captured by devices such as, but not limited to, stereo cameras, time-of-flight cameras, three-dimensional cameras, three-dimensional capture devices, a combination of devices, a combination of multidimensional capture devices, and the like.
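By way of illustration only, the heterogeneous input streams described above might be grouped per capture sample as sketched below; the Python class and field names are assumptions for exposition, not part of any disclosed implementation.

```python
from dataclasses import dataclass, field
from typing import List, Optional

import numpy as np

@dataclass
class CaptureFrame:
    """One sample of the input streams used to build a surround view."""
    image: np.ndarray                         # H x W x 3 color image data
    timestamp: float                          # capture time in seconds
    location: Optional[np.ndarray] = None     # fused GPS/WiFi/IMU position (x, y, z)
    orientation: Optional[np.ndarray] = None  # gyroscope/magnetometer orientation
    depth: Optional[np.ndarray] = None        # H x W depth map from a depth sensor

@dataclass
class CaptureSession:
    """All samples gathered for one surround view."""
    frames: List[CaptureFrame] = field(default_factory=list)
```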
According to one example, gathered data can be fused together. In some embodiments, a surround view can be generated by a combination of data that includes both 2D images and location information, without any depth images provided. In other embodiments, depth images and location information can be used together. Various combinations of image data can be used with location information, depending on the application and available data.
In the present example, the data that has been fused together is then used for content modeling and context modeling. According to various examples, the subject matter featured in the images can be separated into content and context. The content can be delineated as the object of interest and the context can be delineated as the scenery surrounding the object of interest. According to various embodiments, the content can be a three-dimensional model, depicting an object of interest, although the content can be a two-dimensional model in some embodiments. Furthermore, in some embodiments, the context can be a two-dimensional model depicting the scenery surrounding the object of interest. Although in many examples the context can provide two-dimensional views of the scenery surrounding the object of interest, the context can also include three-dimensional aspects in some embodiments. For instance, the context can be depicted as a “flat” image along a cylindrical “canvas,” such that the “flat” image appears on the surface of a cylinder. In addition, some examples may include three-dimensional context models, such as when some objects are identified in the surrounding scenery as three-dimensional objects. In various embodiments, the models provided by content modeling and context modeling can be generated by combining the image and location information data.
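A minimal sketch of the resulting models follows, assuming (as in many of the examples above) a three-dimensional content model and a two-dimensional panoramic context on a cylindrical canvas; the structure shown is illustrative, not prescriptive.

```python
from dataclasses import dataclass

import numpy as np

@dataclass
class ContentModel:
    """Object of interest, here represented as three-dimensional data."""
    points: np.ndarray      # N x 3 point positions
    colors: np.ndarray      # N x 3 per-point colors

@dataclass
class ContextModel:
    """Scenery around the object: a 'flat' image on a cylindrical canvas."""
    panorama: np.ndarray    # H x W x 3 panoramic image
    radius: float           # radius of the cylindrical canvas

@dataclass
class SurroundView:
    """Fused image and location data split into content and context."""
    content: ContentModel
    context: ContextModel
```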
According to various embodiments, context and content of a surround view are determined based on a specified object of interest. In some examples, an object of interest is automatically chosen based on processing of the image and location information data. For instance, if a dominant object is detected in a series of images, this object can be selected as the content. In other examples, a user specified target can be chosen. It should be noted, however, that a surround view can be generated without a user specified target in some applications.
According to various embodiments, one or more enhancement algorithms can be applied. In particular example embodiments, various algorithms can be employed during capture of surround view data, regardless of the type of capture mode employed. These algorithms can be used to enhance the user experience. For instance, automatic frame selection, stabilization, view interpolation, filters, and/or compression can be used during capture of surround view data. In some examples, these enhancement algorithms can be applied to image data after acquisition of the data. In other examples, these enhancement algorithms can be applied to image data during capture of surround view data.
Although various embodiments described herein may include references to surround views, other types of multi-view interactive digital media representations are also intended to be included. For instance, representations such as a multi-view image, three dimensional model, or other formats can be integrated with audio data. For instance, a multi-view image or three dimensional model may include navigation capabilities, views of the subject matter from various viewpoints, etc. In these representations, content and context need not necessarily be separated.
In the present example, various navigations are available to the user. For instance, the user can browse through the multi-view interactive digital media representation by swiping around the trees using navigation 102. This may involve a rotation around the trees to reach a new viewpoint behind the car. The user can also browse through the multi-view interactive digital media representation by dragging the car to the left of the screen using navigation 108. In this case, the car would move to the left and the scenery around the car would also shift relative to the car. Yet another example of browsing includes the user swiping in the direction of navigation 110 to move the viewpoint of the scene. The car would then be viewed at a different angle and the scenery surrounding this viewpoint would also shift. Numerous other ways of navigating through the multi-view interactive digital media representation are also possible.
In the present example, pressing the autoplay button 112 shows the car moving from right to left in the scene and the scenery moving relative to the car.
In other examples, a panoramic surround view includes a car that is driving by. In the background there are trees with chirping birds and the ocean with waves crashing on the shore. The audio data that is recorded is decomposed into the sound of the car, the sound of the birds, and the sound of the waves and the audio files are attached to the locations of those elements in the visual data. In still other examples, a surround view of a person includes a person making a face at a specific camera position within the surround view. An audio file is automatically played when that camera position is reached while navigating through the surround view.
In the present example, a particular multi-view interactive digital media representation is depicted with a car as content 106 and the trees as context 104.
According to various embodiments, the digital visual data included in a scene 200 can be, semantically and/or practically, separated into content 206 and context 210, especially in the implementation of surround views. According to particular embodiments, content 206 can include the object(s), person(s), or scene(s) of interest while the context 210 represents the remaining elements of the scene surrounding the content 206. In the present example, the object 202 is a car. This object 202 constitutes the content 206 of the scene 200. The trees in the scenery 208 constitute the context. In some examples, a surround view may represent the content 206 as three-dimensional data, and the context 210 as a two-dimensional panoramic background. In other examples, a surround view may represent both the content 206 and context 210 as two-dimensional panoramic scenes. In yet other examples, content 206 and context 210 may include three-dimensional components or aspects. In particular embodiments, the way that the surround view depicts content 206 and context 210 depends on the capture mode used to acquire the images.
In some examples, such as but not limited to: recordings of objects, persons, or parts of objects or persons, where only the object, person, or parts of them are visible, recordings of large flat areas, and recordings of scenes where the data captured appears to be at infinity (i.e., there are no subjects close to the camera), the content 206 and the context 210 may be the same. In these examples, the surround view produced may have some characteristics that are similar to other types of digital media such as panoramas. However, according to various embodiments, surround views include additional features that distinguish them from these existing types of digital media. For instance, a surround view can represent moving data. Additionally, a surround view is not limited to a specific cylindrical, spherical or translational movement. Various motions can be used to capture image data with a camera or other capture device. Furthermore, unlike a stitched panorama, a surround view can display different sides of the same object.
In one example, a process 300 for integrating audio into a multi-view interactive digital media representation begins by retrieving a multi-view interactive digital media representation, such as one that includes numerous images fused together into content and context models.
In the present example, the process 300 continues by retrieving audio data to be integrated into the multi-view interactive digital media representation at 303. The audio data can be obtained in a variety of ways depending on the application or desired effect. For instance, an audio stream can be recorded together (i.e., at the same time) with the visual data included in the multi-view interactive digital media representation. In another example, an audio stream can be recorded separately from the visual data. One or more audio recordings can be created to use with the visual data. In other examples, pre-recorded audio files can be used. For instance, the user may have recorded this audio data at an earlier time or may use one or more existing audio files from the Internet or other sources. Some examples of audio files that may be used include musical recordings, sound effects, ambient noise or sounds, voice recordings, etc. A variety of effects, such as changing the pitch or introducing an echo, can be applied during the processing step described below.
In the present example, the process 300 further includes processing the audio data at 305. In particular, after an audio file has been recorded or selected, it has to be processed in order to be integrated into the format of the multi-view interactive digital media representation. Several options for processing are possible, one or more of which can be combined in some examples. In one example, the recorded/selected audio file is used directly without processing. In another example, the recorded/selected audio file is decomposed into different components. For instance, voices are separated from background sounds, and different sound sources are separated (e.g., cars, ocean, birds, talking). This decomposition can be implemented in a variety of ways, one of which is independent component analysis. Once the audio file is decomposed into different components, the separate audio streams are then either presented to the user for further selection and positioning or automatically assigned to locations in the multi-view interactive digital media representation corresponding to where they originated (i.e., the locations of the original audio sources if the audio was recorded with the video). More details relating to the positioning of audio streams within the multi-view interactive digital media representation are discussed below with regard to audio playback at 307. Additionally, a particular example of processing audio data is described below.
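As one concrete possibility, the decomposition step could be prototyped with off-the-shelf independent component analysis. The sketch below assumes a multi-channel recording with at least as many channels as sound sources and uses scikit-learn's FastICA; it is one of several workable implementations, not the method required by this disclosure.

```python
import numpy as np
from sklearn.decomposition import FastICA

def decompose_audio(mixed: np.ndarray, n_sources: int) -> np.ndarray:
    """Separate a multi-channel recording into independent components.

    mixed: array of shape (n_samples, n_channels), e.g. a stereo track.
    Returns an array of shape (n_samples, n_sources) with one column
    per estimated source (e.g. car, birds, waves).
    """
    ica = FastICA(n_components=n_sources, random_state=0)
    return ica.fit_transform(mixed)

# Example: split a synthetic two-channel mix into two components.
t = np.linspace(0.0, 1.0, 8000)
s1 = np.sin(2 * np.pi * 440 * t)            # a tone
s2 = np.sign(np.sin(2 * np.pi * 3 * t))     # a slow square wave
mix = np.c_[s1 + 0.5 * s2, 0.5 * s1 + s2]   # two mixed channels
components = decompose_audio(mix, n_sources=2)
```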
Once the audio data is processed, the audio data is then played in coordination with the multi-view interactive digital media representation at 307. The playback of the audio data can be done in several ways. In one example, the audio data is played once as soon as the multi-view interactive digital media representation is loaded and displayed. In some instances, the audio data is played while the user navigates through the multi-view interactive digital media representation and the audio data is played at original speed during this navigation, independent of navigation direction or speed. In other instances, the audio data is played while the multi-view interactive digital media representation follows a predetermined auto play sequence. In yet other instances, audio and image data are initially played (once or multiple times) without user interaction available. Once this initial play is over, the user can manually navigate through the multi-view interactive digital media representation. In another example, the audio data is played repeatedly as soon as the surround view is loaded and displayed. In some instances, the audio data is played repeatedly whether the user navigates the multi-view interactive digital media representation or the multi-view interactive digital media representation plays through an automatic playback sequence.
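For exposition, these playback behaviors can be summarized as a small set of modes; the enumeration below is an assumed organization, not a required one.

```python
from enum import Enum, auto

class AudioPlaybackMode(Enum):
    PLAY_ONCE_ON_LOAD = auto()           # play once when loaded and displayed
    PLAY_DURING_NAVIGATION = auto()      # original speed, independent of direction/speed
    PLAY_DURING_AUTOPLAY = auto()        # follow a predetermined auto play sequence
    INITIAL_PLAY_THEN_NAVIGATE = auto()  # locked playback first, then manual navigation
    REPEAT_ON_LOAD = auto()              # loop as soon as the representation is loaded
```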
In some examples, the audio data is played in conjunction with navigation through the multi-view interactive digital media representation. For instance, navigating in one direction plays the audio forward and navigating in the other direction plays the audio backwards. In some instances, the speed of audio playback corresponds to navigation speed.
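A minimal sketch of this coupling is shown below, assuming a signed navigation velocity reported by the viewer; the scaling and clamping constants are illustrative assumptions.

```python
def playback_rate(nav_velocity: float, speed_scale: float = 1.0,
                  max_rate: float = 2.0) -> float:
    """Map a signed navigation velocity to an audio playback rate.

    Positive rates play the audio forward, negative rates play it
    backwards, zero pauses playback, and the magnitude tracks the
    navigation speed up to a clamp.
    """
    rate = speed_scale * nav_velocity
    return max(-max_rate, min(max_rate, rate))

# Navigating one way at full speed plays audio forward at 1x;
# navigating the other way at half speed plays it backwards at 0.5x.
assert playback_rate(1.0) == 1.0
assert playback_rate(-0.5) == -0.5
```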
Another option for audio playback includes playing the audio data when a “Play” button is pressed or otherwise selected.
In other embodiments, audio playback is based on navigation through the multi-view interactive digital media representation. In one example, audio is played when a certain position/frame of the multi-view interactive digital media representation is reached during navigation, whether manual or automatic. In some examples, audio data is associated with specific positions or frames in the multi-view interactive digital media representation. The closer the user navigates towards these positions or frames, the louder the corresponding audio plays. In particular examples, audio data can be “attached” to a 3D location in a scene and the volume of the audio playback depends on the distance and orientation of the current view point of the visualization. Specifically, the sound volume of an audio playback increases if the navigation location approaches the 3D location of the audio data. Similarly, the volume increases if the viewing direction of the navigation is oriented towards the location of the audio data and the volume decreases if the viewing direction of the navigation is oriented away from the location of the audio data.
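The distance- and orientation-dependent volume described above might be computed as sketched below; the inverse-distance falloff and the dot-product orientation term are illustrative choices among many that satisfy the described behavior.

```python
import numpy as np

def audio_gain(view_pos: np.ndarray, view_dir: np.ndarray,
               sound_pos: np.ndarray, min_dist: float = 1.0) -> float:
    """Volume for audio 'attached' to a 3D location in the scene."""
    offset = sound_pos - view_pos
    raw_dist = float(np.linalg.norm(offset))
    # Closer navigation locations yield louder playback.
    distance_term = min_dist / max(raw_dist, min_dist)
    if raw_dist > 0.0:
        to_sound = offset / raw_dist
        facing = float(np.dot(view_dir / np.linalg.norm(view_dir), to_sound))
    else:
        facing = 1.0  # at the sound's location, treat as fully facing it
    # 1.0 when oriented towards the sound, 0.0 when oriented away.
    orientation_term = 0.5 * (1.0 + facing)
    return distance_term * orientation_term

# Example: volume drops as the viewpoint turns away from the sound.
gain_facing = audio_gain(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                         np.array([0.0, 0.0, 5.0]))
gain_away = audio_gain(np.zeros(3), np.array([0.0, 0.0, -1.0]),
                       np.array([0.0, 0.0, 5.0]))
assert gain_facing > gain_away
```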
Another option for audio playback is to play background music (e.g. a song) or sound effects while the user navigates through a multi-view interactive digital media representation. In some embodiments, the type or style of the background music can be correlated with the occurrence and strength of visual filters. For instance, detection of a beat in the music can increase the strength of a filter, or a different filter can be applied if the style or type of the music changes. In some examples, the filters can be applied to correspond to the chosen background music. In other examples, the background music can be chosen based on any filters or effects included in the visual file. For instance, certain filters that make the visuals dark and murky may cause selection of darker songs or sound effects.
In some examples, audio playback at 307 can also occur during automatic playback of the multi-view interactive digital media representation. For instance, automatic playback may be initiated by user action, such as selection of the autoplay button 112 described above.
In some embodiments, a switch between multi-view interactive digital media representations, or a switch in the type of visualization or playback effects within a certain multi-view interactive digital media representation, can happen when the style of the music changes or in synchronization with the occurrence of certain instruments or beats. In some examples, a change in the visualization direction within a multi-view interactive digital media representation occurs in synchronization with certain instruments or beats. In other examples, a change in the playback speed of the visual data may occur if the type of music changes or in synchronization with a certain instrument or beat. In yet other examples, the occurrence and strength of visual filters can correspond to the type of music and the occurrence of certain instruments. For instance, a beat can increase the strength of a filter, or a different filter can be applied if the style or type of the music changes. In some embodiments, the user can define where and when the multi-view interactive digital media representations are switched and/or how the playback changes during automatic playback.
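Such beat-synchronized switching could be prototyped with a standard beat tracker. The sketch below assumes the librosa library is available and simply emits a switch timestamp at every Nth detected beat; the audio path and interval are placeholders.

```python
import librosa

def switch_times(audio_path: str, every_n_beats: int = 8) -> list:
    """Timestamps (in seconds) at which to switch representations,
    aligned with detected beats in the background music."""
    y, sr = librosa.load(audio_path)
    tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
    beat_times = librosa.frames_to_time(beat_frames, sr=sr)
    return list(beat_times[::every_n_beats])

# times = switch_times("background_song.wav")  # one entry per 8 beats
```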
In particular embodiments, a combination of automatic playback and interactive navigation is possible. For example, a specific song can be played in the background and the current visible multi-view interactive digital media representation is switched at certain locations within the song, but the user is able to interactively navigate through the currently visible multi-view interactive digital media representation. In a converse example, a song can be played in the background and the user can switch between different multi-view interactive digital media representations, but the multi-view interactive digital media representations are autoplayed.
Although the above example describes various embodiments relating to integrating audio with a multi-view interactive digital media representation, a particular example process is also described here. In this process, a multi-view interactive digital media representation is retrieved, and audio data to be integrated into it is retrieved and processed. In particular, processing the audio data includes segmenting the audio data into a first segment and a second segment.
Next, the first segment is associated with a first position in the multi-view interactive digital media representation at 313 and the second segment is associated with a second position in the multi-view interactive digital media representation at 315. In some instances, the first position is the same as the second position. In these cases, the first segment and second segment will overlap when played. In other instances, the first position and second position are distinct, but at least a portion of the first segment and second segment may still overlap when played.
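One illustrative way to store these associations, including the overlap cases just described, is sketched below; the names are assumptions for exposition.

```python
from dataclasses import dataclass
from typing import List

import numpy as np

@dataclass
class AudioSegment:
    samples: np.ndarray   # PCM samples for this segment
    sample_rate: int

@dataclass
class PositionedSegment:
    segment: AudioSegment
    position: int         # position/frame index in the representation

def segments_at(attachments: List[PositionedSegment],
                position: int) -> List[AudioSegment]:
    """All segments that start when this position is depicted.

    If the first and second segments share a position, both are
    returned and will overlap when played.
    """
    return [a.segment for a in attachments if a.position == position]
```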
The audio data is then played in coordination with a user's navigation through the multi-view interactive digital media representation: the first segment is played when the first position is depicted, and the second segment is played when the second position is depicted.
According to various embodiments, the requests to navigate to the first position and second position can be made in numerous ways. In one example, a user's navigation through the multi-view interactive digital media representation includes selecting automatic playback, where automatic playback triggers play of a predetermined navigation through the multi-view interactive digital media representation. This predetermined navigation includes a set sequence of views and audio data that includes navigation to the first position and the second position. In another example, a user's navigation through the multi-view interactive digital media representation includes navigating to the first position and navigating to the second position. As described above, such navigation can include actions such as swiping around an object, dragging the content across the screen, or otherwise changing the viewpoint.
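Building on the sketch above, the two navigation styles might drive playback as follows; on_position_reached and the timing constant are assumptions, and a real viewer would invoke the same trigger from its UI event handler during manual navigation.

```python
import time
from typing import Iterable

def on_position_reached(position: int) -> None:
    # In a real player this would start the segments attached to this
    # position (see segments_at above); here it just reports the event.
    print(f"playing segments attached to position {position}")

def autoplay(sequence: Iterable[int], seconds_per_view: float = 0.5) -> None:
    """Automatic playback: a predetermined, set sequence of views."""
    for position in sequence:
        on_position_reached(position)
        time.sleep(seconds_per_view)

# Manual navigation calls on_position_reached(...) whenever the user's
# swipe or drag lands on a new position, in whatever order they choose.
autoplay([0, 1, 2, 3])
```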
Various computing devices can implement the methods described herein. For instance, a mobile device, computer system, etc. can be used to display a multi-view interactive digital media representation and the associated audio media. One such system 500 can include a processor and memory 503, along with one or more interfaces for communicating over a network or with other devices.
In addition, various very high-speed interfaces may be provided, such as fast Ethernet interfaces, Gigabit Ethernet interfaces, ATM interfaces, HSSI interfaces, POS interfaces, FDDI interfaces, and the like. Generally, these interfaces may include ports appropriate for communication with the appropriate media. In some cases, they may also include an independent processor and, in some instances, volatile RAM. The independent processors may control such communications-intensive tasks as packet switching, media control, and management.
According to particular example embodiments, the system 500 uses memory 503 to store data and program instructions and maintain a local side cache. The program instructions may control the operation of an operating system and/or one or more applications, for example. The memory or memories may also be configured to store received metadata and batch requested metadata.
Because such information and program instructions may be employed to implement the systems/methods described herein, the present invention relates to tangible, machine-readable media that include program instructions, state information, etc. for performing various operations described herein. Examples of machine-readable media include hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks and DVDs; magneto-optical media such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory devices (ROM) and programmable read-only memory devices (PROMs). Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter.
Although many of the components and processes are described above in the singular for convenience, it will be appreciated by one of skill in the art that multiple components and repeated processes can also be used to practice the techniques of the present disclosure.
While the present disclosure has been particularly shown and described with reference to specific embodiments thereof, it will be understood by those skilled in the art that changes in the form and details of the disclosed embodiments may be made without departing from the spirit or scope of the invention. It is therefore intended that the invention be interpreted to include all variations and equivalents that fall within the true spirit and scope of the present invention.
This patent application is a continuation of and claims priority to U.S. patent application Ser. No. 14/861,019, titled “INTEGRATION OF AUDIO INTO A MULTI-VIEW INTERACTIVE DIGITAL MEDIA REPRESENTATION”, filed Sep. 22, 2015 by Holzer et al., which is hereby incorporated by reference in its entirety and for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
2534821 | Richard | Dec 1950 | A |
5495576 | Ritchey | Feb 1996 | A |
5557684 | Wang | Sep 1996 | A |
5613048 | Chen | Mar 1997 | A |
5613056 | Gasper | Mar 1997 | A |
5694533 | Richards | Dec 1997 | A |
5706417 | Adelson | Jan 1998 | A |
5847714 | Naqvi | Dec 1998 | A |
5850352 | Moezzi | Dec 1998 | A |
5926190 | Turkowski | Jul 1999 | A |
6031564 | Ma | Feb 2000 | A |
6080063 | Khosla | Jun 2000 | A |
6185343 | Ikeda | Feb 2001 | B1 |
6252974 | Martens | Jun 2001 | B1 |
6266068 | Kang | Jul 2001 | B1 |
6281903 | Martin | Aug 2001 | B1 |
6327381 | Rogina | Dec 2001 | B1 |
6385245 | De Haan | May 2002 | B1 |
6504569 | Jasinschi | Jan 2003 | B1 |
6522787 | Kumar | Feb 2003 | B1 |
6778207 | Lee | Aug 2004 | B1 |
6814889 | O'Grady | Nov 2004 | B1 |
6975756 | Slabaugh | Dec 2005 | B1 |
7167180 | Shibolet | Jan 2007 | B1 |
7593000 | Chin | Sep 2009 | B1 |
7631261 | Williams | Dec 2009 | B2 |
7631277 | Nie | Dec 2009 | B1 |
8078004 | Kang | Dec 2011 | B2 |
8094928 | Graepel | Jan 2012 | B2 |
8160391 | Zhu | Apr 2012 | B1 |
8244069 | Bourdev | Aug 2012 | B1 |
8401276 | Choe | Mar 2013 | B1 |
8503826 | Klimenko | Aug 2013 | B2 |
8504842 | Meacham | Aug 2013 | B1 |
8515982 | Hickman | Aug 2013 | B1 |
8589069 | Lehman | Nov 2013 | B1 |
8682097 | Steinberg | Mar 2014 | B2 |
8803912 | Fouts | Aug 2014 | B1 |
8817071 | Wang | Aug 2014 | B2 |
8819525 | Holmer | Aug 2014 | B1 |
8866841 | Distler | Oct 2014 | B1 |
8942917 | Chrysanthakopoulos | Jan 2015 | B2 |
8947452 | Ballagh | Feb 2015 | B1 |
8947455 | Friesen | Feb 2015 | B2 |
8963855 | Chen | Feb 2015 | B2 |
8966356 | Hickman | Feb 2015 | B1 |
9024970 | Lynch | May 2015 | B2 |
9027117 | Wilairat | May 2015 | B2 |
9043222 | Kerr | May 2015 | B1 |
9070402 | Burtnyk | Jun 2015 | B2 |
9094670 | Furio | Jul 2015 | B1 |
9129179 | Wong | Sep 2015 | B1 |
9317881 | Ledterman | Apr 2016 | B1 |
9325899 | Chou | Apr 2016 | B1 |
9367951 | Gray | Jun 2016 | B1 |
9390250 | Kim | Jul 2016 | B2 |
9400595 | Li | Jul 2016 | B2 |
9407816 | Sehn | Aug 2016 | B1 |
9412203 | Garcia, III | Aug 2016 | B1 |
9472161 | Ho | Oct 2016 | B1 |
9621768 | Lyon | Apr 2017 | B1 |
9704257 | Tuzel | Jul 2017 | B1 |
9734586 | Luo | Aug 2017 | B2 |
9865033 | Jafarzadeh | Jan 2018 | B1 |
9865058 | Mullins | Jan 2018 | B2 |
9865069 | Saporta | Jan 2018 | B1 |
9886771 | Chen | Feb 2018 | B1 |
9898742 | Higgins | Feb 2018 | B2 |
9904056 | Raghoebardajal | Feb 2018 | B2 |
9910505 | Park | Mar 2018 | B2 |
9928544 | Hasan | Mar 2018 | B1 |
9940541 | Holzer | Apr 2018 | B2 |
9968257 | Burt | May 2018 | B1 |
9998663 | François | Jun 2018 | B1 |
10008027 | Baker | Jun 2018 | B1 |
10055882 | Marin | Aug 2018 | B2 |
10147211 | Holzer | Dec 2018 | B2 |
10157333 | Wang | Dec 2018 | B1 |
10176592 | Holzer | Jan 2019 | B2 |
10176636 | Neustein | Jan 2019 | B1 |
10204448 | Hazeghi | Feb 2019 | B2 |
10222932 | Holzer | Mar 2019 | B2 |
10242474 | Holzer | Mar 2019 | B2 |
10262426 | Holzer | Apr 2019 | B2 |
10275935 | Holzer | Apr 2019 | B2 |
10284794 | Francois | May 2019 | B1 |
10306203 | Goyal | May 2019 | B1 |
10360601 | Adegan | Jul 2019 | B1 |
10373260 | Haller, Jr. | Aug 2019 | B1 |
10382739 | Rusu | Aug 2019 | B1 |
10430995 | Holzer | Oct 2019 | B2 |
10574974 | Arora | Feb 2020 | B2 |
10592199 | Rakshit | Mar 2020 | B2 |
10657647 | Chen | May 2020 | B1 |
10668965 | Czinger | Jun 2020 | B2 |
10719939 | Holzer | Jul 2020 | B2 |
10725609 | Holzer | Jul 2020 | B2 |
10726560 | Holzer | Jul 2020 | B2 |
10748313 | Holzer | Aug 2020 | B2 |
10750161 | Holzer | Aug 2020 | B2 |
10818029 | Holzer | Oct 2020 | B2 |
10846913 | Holzer | Nov 2020 | B2 |
10852902 | Holzer | Dec 2020 | B2 |
11006095 | Holzer | May 2021 | B2 |
11055534 | Beall | Jul 2021 | B2 |
11138432 | Holzer | Oct 2021 | B2 |
11438565 | Trevor | Sep 2022 | B2 |
20010014172 | Baba | Aug 2001 | A1 |
20010046262 | Freda | Nov 2001 | A1 |
20020024517 | Yamaguchi | Feb 2002 | A1 |
20020094125 | Guo | Jul 2002 | A1 |
20020190991 | Efran | Dec 2002 | A1 |
20030065668 | Sowizral | Apr 2003 | A1 |
20030068053 | Chu | Apr 2003 | A1 |
20030086002 | Cahill | May 2003 | A1 |
20030120472 | Lind | Jun 2003 | A1 |
20030137506 | Efran | Jul 2003 | A1 |
20030137517 | Kondo | Jul 2003 | A1 |
20030185456 | Sato | Oct 2003 | A1 |
20030231179 | Suzuki | Dec 2003 | A1 |
20040085335 | Burlnyk | May 2004 | A1 |
20040104935 | Williamson | Jun 2004 | A1 |
20040141014 | Kamiwada | Jul 2004 | A1 |
20040184013 | Raskar | Sep 2004 | A1 |
20040222987 | Chang | Nov 2004 | A1 |
20040239699 | Uyttendaele | Dec 2004 | A1 |
20050018045 | Thomas | Jan 2005 | A1 |
20050041842 | Frakes | Feb 2005 | A1 |
20050046645 | Breton | Mar 2005 | A1 |
20050119550 | Serra | Jun 2005 | A1 |
20050151759 | Gonzalez-Banos | Jul 2005 | A1 |
20050186548 | Tomlinson | Aug 2005 | A1 |
20050195216 | Kramer | Sep 2005 | A1 |
20050219264 | Shum | Oct 2005 | A1 |
20050226502 | Cohen | Oct 2005 | A1 |
20050232467 | Mohri | Oct 2005 | A1 |
20050232510 | Blake | Oct 2005 | A1 |
20050253877 | Thompson | Nov 2005 | A1 |
20050283075 | Ma | Dec 2005 | A1 |
20050285874 | Zitnick, III | Dec 2005 | A1 |
20060028552 | Aggarwal | Feb 2006 | A1 |
20060087498 | Evemy | Apr 2006 | A1 |
20060188147 | Rai | Aug 2006 | A1 |
20060193535 | Mishima | Aug 2006 | A1 |
20060250505 | Gennetten | Nov 2006 | A1 |
20060256109 | Acker | Nov 2006 | A1 |
20070008312 | Zhou | Jan 2007 | A1 |
20070058880 | Lienard | Mar 2007 | A1 |
20070064802 | Paniconi | Mar 2007 | A1 |
20070070069 | Samarasekera | Mar 2007 | A1 |
20070110338 | Snavely | May 2007 | A1 |
20070118801 | Harshbarger | May 2007 | A1 |
20070126928 | Klompnhouwer | Jun 2007 | A1 |
20070159487 | Felt | Jul 2007 | A1 |
20070237420 | Steedly | Oct 2007 | A1 |
20070237422 | Zhou | Oct 2007 | A1 |
20070252804 | Engel | Nov 2007 | A1 |
20070269054 | Takagi | Nov 2007 | A1 |
20080009734 | Houle | Jan 2008 | A1 |
20080025588 | Zhang | Jan 2008 | A1 |
20080033641 | Medalia | Feb 2008 | A1 |
20080106593 | Arfvidsson | May 2008 | A1 |
20080151106 | Verburgh | Jun 2008 | A1 |
20080152258 | Tulkki | Jun 2008 | A1 |
20080198159 | Liu | Aug 2008 | A1 |
20080201734 | Lyon | Aug 2008 | A1 |
20080225132 | Inaguma | Sep 2008 | A1 |
20080232716 | Plagne | Sep 2008 | A1 |
20080246759 | Summers | Oct 2008 | A1 |
20080266142 | Sula | Oct 2008 | A1 |
20080278569 | Rotem | Nov 2008 | A1 |
20080313014 | Fell | Dec 2008 | A1 |
20090003725 | Merkel | Jan 2009 | A1 |
20090046160 | Hayashi | Feb 2009 | A1 |
20090077161 | Hamilton, II | Mar 2009 | A1 |
20090087029 | Coleman | Apr 2009 | A1 |
20090116732 | Zhou | May 2009 | A1 |
20090141130 | Ortiz | Jun 2009 | A1 |
20090144173 | Mo | Jun 2009 | A1 |
20090153549 | Lynch | Jun 2009 | A1 |
20090160934 | Hendrickson | Jun 2009 | A1 |
20090163185 | Lim | Jun 2009 | A1 |
20090174709 | Kozlak | Jul 2009 | A1 |
20090208062 | Sorek | Aug 2009 | A1 |
20090259946 | Dawson | Oct 2009 | A1 |
20090262074 | Nasiri | Oct 2009 | A1 |
20090263045 | Szeliski | Oct 2009 | A1 |
20090274391 | Arcas | Nov 2009 | A1 |
20090276805 | Andrews, II | Nov 2009 | A1 |
20090282335 | Alexandersson | Nov 2009 | A1 |
20090303343 | Drimbarean | Dec 2009 | A1 |
20100007715 | Lai | Jan 2010 | A1 |
20100017181 | Mouton | Jan 2010 | A1 |
20100026788 | Ishikawa | Feb 2010 | A1 |
20100033553 | Levy | Feb 2010 | A1 |
20100060793 | Oz | Mar 2010 | A1 |
20100079667 | Engin | Apr 2010 | A1 |
20100098258 | Thorn | Apr 2010 | A1 |
20100100492 | Law | Apr 2010 | A1 |
20100110069 | Yuan | May 2010 | A1 |
20100111444 | Coffman | May 2010 | A1 |
20100164990 | Van Doorn | Jul 2010 | A1 |
20100171691 | Cook | Jul 2010 | A1 |
20100188584 | Liu | Jul 2010 | A1 |
20100215251 | Klein Gunnewiek | Aug 2010 | A1 |
20100225743 | Florencio | Sep 2010 | A1 |
20100231593 | Zhou | Sep 2010 | A1 |
20100259595 | Trimeche | Oct 2010 | A1 |
20100265164 | Okuno | Oct 2010 | A1 |
20100266172 | Shlomi | Oct 2010 | A1 |
20100305857 | Byrne | Dec 2010 | A1 |
20100315412 | Sinha | Dec 2010 | A1 |
20100329542 | Ramalingam | Dec 2010 | A1 |
20110007072 | Khan | Jan 2011 | A1 |
20110033170 | Ikeda | Feb 2011 | A1 |
20110034103 | Fong | Feb 2011 | A1 |
20110040539 | Szymczyk | Feb 2011 | A1 |
20110043604 | Peleg | Feb 2011 | A1 |
20110064388 | Brown | Mar 2011 | A1 |
20110074926 | Khan | Mar 2011 | A1 |
20110090344 | Gefen | Apr 2011 | A1 |
20110105192 | Jung | May 2011 | A1 |
20110109618 | Nowak | May 2011 | A1 |
20110109726 | Hwang | May 2011 | A1 |
20110115886 | Nguyen | May 2011 | A1 |
20110141141 | Kankainen | Jun 2011 | A1 |
20110141227 | Bigioi | Jun 2011 | A1 |
20110142289 | Barenbrug | Jun 2011 | A1 |
20110142343 | Kim | Jun 2011 | A1 |
20110170789 | Amon | Jul 2011 | A1 |
20110173565 | Ofek | Jul 2011 | A1 |
20110179373 | Moore | Jul 2011 | A1 |
20110193941 | Inaba | Aug 2011 | A1 |
20110214072 | Lindemann | Sep 2011 | A1 |
20110234750 | Lai | Sep 2011 | A1 |
20110248987 | Mitchell | Oct 2011 | A1 |
20110254835 | Segal | Oct 2011 | A1 |
20110261050 | Smolic | Oct 2011 | A1 |
20110288858 | Gay | Nov 2011 | A1 |
20110313653 | Lindner | Dec 2011 | A1 |
20110316963 | Li | Dec 2011 | A1 |
20120007713 | Nasiri | Jan 2012 | A1 |
20120013711 | Tamir | Jan 2012 | A1 |
20120017147 | Mark | Jan 2012 | A1 |
20120019557 | Aronsson | Jan 2012 | A1 |
20120028706 | Raitt | Feb 2012 | A1 |
20120041722 | Quan | Feb 2012 | A1 |
20120057006 | Joseph | Mar 2012 | A1 |
20120062756 | Tian | Mar 2012 | A1 |
20120075411 | Matsumoto | Mar 2012 | A1 |
20120092348 | Mccutchen | Apr 2012 | A1 |
20120095323 | Eskandari | Apr 2012 | A1 |
20120099804 | Aguilera | Apr 2012 | A1 |
20120127172 | Wu | May 2012 | A1 |
20120127270 | Zhang | May 2012 | A1 |
20120139918 | Michail | Jun 2012 | A1 |
20120147224 | Takayama | Jun 2012 | A1 |
20120148162 | Zhang | Jun 2012 | A1 |
20120162223 | Hirai | Jun 2012 | A1 |
20120162253 | Collins | Jun 2012 | A1 |
20120167146 | Incorvia | Jun 2012 | A1 |
20120198317 | Eppolito | Aug 2012 | A1 |
20120207308 | Sung | Aug 2012 | A1 |
20120212579 | Per | Aug 2012 | A1 |
20120236201 | Larsen | Sep 2012 | A1 |
20120240035 | Gaucas | Sep 2012 | A1 |
20120242798 | Mcardle | Sep 2012 | A1 |
20120257025 | Kim | Oct 2012 | A1 |
20120257065 | Velarde | Oct 2012 | A1 |
20120258436 | Lee | Oct 2012 | A1 |
20120262580 | Huebner | Oct 2012 | A1 |
20120287123 | Starner | Nov 2012 | A1 |
20120293632 | Yukich | Nov 2012 | A1 |
20120294549 | Doepke | Nov 2012 | A1 |
20120300019 | Yang | Nov 2012 | A1 |
20120301044 | Nakada | Nov 2012 | A1 |
20120314027 | Tian | Dec 2012 | A1 |
20120314040 | Kopf | Dec 2012 | A1 |
20120314899 | Cohen | Dec 2012 | A1 |
20120329527 | Kang | Dec 2012 | A1 |
20120330659 | Nakadai | Dec 2012 | A1 |
20130002649 | Wu | Jan 2013 | A1 |
20130016102 | Look | Jan 2013 | A1 |
20130016897 | Cho | Jan 2013 | A1 |
20130018881 | Bhatt | Jan 2013 | A1 |
20130044191 | Matsumoto | Feb 2013 | A1 |
20130050573 | Syed | Feb 2013 | A1 |
20130057644 | Stefanoski | Mar 2013 | A1 |
20130063487 | Spiegel | Mar 2013 | A1 |
20130063549 | Schnyder | Mar 2013 | A1 |
20130071012 | Leichsenring | Mar 2013 | A1 |
20130076619 | Carr | Mar 2013 | A1 |
20130113830 | Suzuki | May 2013 | A1 |
20130120581 | Daniels | May 2013 | A1 |
20130127844 | Koeppel | May 2013 | A1 |
20130127847 | Jin | May 2013 | A1 |
20130129304 | Feinson | May 2013 | A1 |
20130141530 | Zavesky | Jun 2013 | A1 |
20130147795 | Kim | Jun 2013 | A1 |
20130147905 | Janahan | Jun 2013 | A1 |
20130154926 | Kim | Jun 2013 | A1 |
20130155180 | Wantland | Jun 2013 | A1 |
20130162634 | Baik | Jun 2013 | A1 |
20130162787 | Cho | Jun 2013 | A1 |
20130176392 | Carr | Jul 2013 | A1 |
20130195350 | Tanaka | Aug 2013 | A1 |
20130204411 | Clark | Aug 2013 | A1 |
20130208900 | Jon | Aug 2013 | A1 |
20130212538 | Lemire | Aug 2013 | A1 |
20130219357 | Reitan | Aug 2013 | A1 |
20130240628 | Van Der Merwe | Sep 2013 | A1 |
20130250045 | Ki | Sep 2013 | A1 |
20130271566 | Chen | Oct 2013 | A1 |
20130278596 | Wu | Oct 2013 | A1 |
20130314442 | Langlotz | Nov 2013 | A1 |
20140002440 | Lynch | Jan 2014 | A1 |
20140002472 | Sobeski | Jan 2014 | A1 |
20140009462 | McNamer | Jan 2014 | A1 |
20140013414 | Bruck | Jan 2014 | A1 |
20140023341 | Wang | Jan 2014 | A1 |
20140037198 | Larlus-Larrondo | Feb 2014 | A1 |
20140040742 | Hyekyung | Feb 2014 | A1 |
20140049607 | Amon | Feb 2014 | A1 |
20140059674 | Sun | Feb 2014 | A1 |
20140063005 | Ahn | Mar 2014 | A1 |
20140078136 | Sohn | Mar 2014 | A1 |
20140086551 | Kaneko | Mar 2014 | A1 |
20140087877 | Krishnan | Mar 2014 | A1 |
20140092259 | Tsang | Apr 2014 | A1 |
20140100995 | Koshy | Apr 2014 | A1 |
20140107888 | Quast | Apr 2014 | A1 |
20140118479 | Rapoport | May 2014 | A1 |
20140118483 | Rapoport | May 2014 | A1 |
20140118494 | Wu | May 2014 | A1 |
20140125659 | Yoshida | May 2014 | A1 |
20140132594 | Gharpure | May 2014 | A1 |
20140152834 | Kosseifi | Jun 2014 | A1 |
20140153832 | Kwatra | Jun 2014 | A1 |
20140177927 | Shieh | Jun 2014 | A1 |
20140192155 | Choi | Jul 2014 | A1 |
20140198184 | Stein | Jul 2014 | A1 |
20140199050 | Khalsa | Jul 2014 | A1 |
20140211989 | Ding | Jul 2014 | A1 |
20140225930 | Durmek | Aug 2014 | A1 |
20140232634 | Piemonte | Aug 2014 | A1 |
20140253436 | Petersen | Sep 2014 | A1 |
20140253746 | Voss | Sep 2014 | A1 |
20140267616 | Krig | Sep 2014 | A1 |
20140275704 | Zhang | Sep 2014 | A1 |
20140286566 | Rhoads | Sep 2014 | A1 |
20140293004 | Tsubaki | Oct 2014 | A1 |
20140293028 | Nguyen | Oct 2014 | A1 |
20140297798 | Bakalash | Oct 2014 | A1 |
20140307045 | Richardt | Oct 2014 | A1 |
20140340404 | Wang | Nov 2014 | A1 |
20140362198 | Nakayama | Dec 2014 | A1 |
20140365888 | Curzon | Dec 2014 | A1 |
20140375684 | Algreatly | Dec 2014 | A1 |
20150009130 | Motta | Jan 2015 | A1 |
20150010218 | Bayer | Jan 2015 | A1 |
20150016714 | Chui | Jan 2015 | A1 |
20150022518 | Takeshita | Jan 2015 | A1 |
20150022677 | Guo | Jan 2015 | A1 |
20150042812 | Tang | Feb 2015 | A1 |
20150046875 | Xu | Feb 2015 | A1 |
20150073570 | Gonzalez-Mendoza | Mar 2015 | A1 |
20150078449 | Diggins | Mar 2015 | A1 |
20150097961 | Ure | Apr 2015 | A1 |
20150103170 | Nelson | Apr 2015 | A1 |
20150103197 | Djordjevic | Apr 2015 | A1 |
20150130799 | Holzer | May 2015 | A1 |
20150130800 | Holzer | May 2015 | A1 |
20150130894 | Holzer | May 2015 | A1 |
20150134651 | Holzer | May 2015 | A1 |
20150138190 | Holzer | May 2015 | A1 |
20150143239 | Birkbeck | May 2015 | A1 |
20150154442 | Takahashi | Jun 2015 | A1 |
20150156415 | Gallup | Jun 2015 | A1 |
20150188967 | Paulauskas | Jul 2015 | A1 |
20150193863 | Cao | Jul 2015 | A1 |
20150193963 | Chen | Jul 2015 | A1 |
20150198443 | Yi | Jul 2015 | A1 |
20150201176 | Graziosi | Jul 2015 | A1 |
20150206341 | Loper | Jul 2015 | A1 |
20150213784 | Jafarzadeh | Jul 2015 | A1 |
20150222880 | Choi | Aug 2015 | A1 |
20150227285 | Lee | Aug 2015 | A1 |
20150227816 | Du | Aug 2015 | A1 |
20150235408 | Gross | Aug 2015 | A1 |
20150242686 | Lenka | Aug 2015 | A1 |
20150254224 | Kim | Sep 2015 | A1 |
20150269772 | Ha | Sep 2015 | A1 |
20150271356 | Terada | Sep 2015 | A1 |
20150281323 | Gold | Oct 2015 | A1 |
20150294492 | Koch | Oct 2015 | A1 |
20150309695 | Sannandeji | Oct 2015 | A1 |
20150318020 | Pribula | Nov 2015 | A1 |
20150319424 | Haimovitch-Yogev | Nov 2015 | A1 |
20150324649 | Grewe | Nov 2015 | A1 |
20150325044 | Lebovitz | Nov 2015 | A1 |
20150339846 | Holzer | Nov 2015 | A1 |
20150371440 | Pirchheim | Dec 2015 | A1 |
20150379763 | Liktor | Dec 2015 | A1 |
20160001137 | Phillips | Jan 2016 | A1 |
20160012646 | Huang | Jan 2016 | A1 |
20160026253 | Bradski | Jan 2016 | A1 |
20160027209 | Demirli | Jan 2016 | A1 |
20160034459 | Larsen | Feb 2016 | A1 |
20160042251 | Cordova-Diba | Feb 2016 | A1 |
20160044240 | Beers | Feb 2016 | A1 |
20160050368 | Seo | Feb 2016 | A1 |
20160055330 | Morishita | Feb 2016 | A1 |
20160061582 | Lucey | Mar 2016 | A1 |
20160063740 | Sakimoto | Mar 2016 | A1 |
20160066119 | Wu | Mar 2016 | A1 |
20160077422 | Wang | Mar 2016 | A1 |
20160078287 | Auge | Mar 2016 | A1 |
20160080684 | Farrell | Mar 2016 | A1 |
20160080830 | Kim | Mar 2016 | A1 |
20160086381 | Jung | Mar 2016 | A1 |
20160088287 | Sadi | Mar 2016 | A1 |
20160104316 | Shenkar | Apr 2016 | A1 |
20160110913 | Kosoy | Apr 2016 | A1 |
20160139794 | Hammendorp | May 2016 | A1 |
20160140125 | Goyal | May 2016 | A1 |
20160148349 | Cho | May 2016 | A1 |
20160171330 | Mentese | Jun 2016 | A1 |
20160189334 | Mason | Jun 2016 | A1 |
20160191895 | Yun | Jun 2016 | A1 |
20160203586 | Chang | Jul 2016 | A1 |
20160205341 | Hollander | Jul 2016 | A1 |
20160210602 | Siddique | Jul 2016 | A1 |
20160261855 | Park | Sep 2016 | A1 |
20160267676 | Setomoto | Sep 2016 | A1 |
20160275283 | De Leon | Sep 2016 | A1 |
20160275723 | Singh | Sep 2016 | A1 |
20160295127 | Yu | Oct 2016 | A1 |
20160350930 | Lin | Dec 2016 | A1 |
20160350975 | Nakagawa | Dec 2016 | A1 |
20160353089 | Gallup | Dec 2016 | A1 |
20160358337 | Dai | Dec 2016 | A1 |
20160379415 | Espeset | Dec 2016 | A1 |
20170018054 | Holzer | Jan 2017 | A1 |
20170018055 | Holzer | Jan 2017 | A1 |
20170018056 | Holzer | Jan 2017 | A1 |
20170024094 | Gresham | Jan 2017 | A1 |
20170026574 | Kwon | Jan 2017 | A1 |
20170053169 | Cuban | Feb 2017 | A1 |
20170067739 | Siercks | Mar 2017 | A1 |
20170084001 | Holzer | Mar 2017 | A1 |
20170084293 | Holzer | Mar 2017 | A1 |
20170087415 | Nandimandalam | Mar 2017 | A1 |
20170103510 | Wang | Apr 2017 | A1 |
20170103584 | Vats | Apr 2017 | A1 |
20170109930 | Holzer | Apr 2017 | A1 |
20170124769 | Saito | May 2017 | A1 |
20170124770 | Vats | May 2017 | A1 |
20170126988 | Holzer | May 2017 | A1 |
20170140236 | Price | May 2017 | A1 |
20170148179 | Holzer | May 2017 | A1 |
20170148186 | Holzer | May 2017 | A1 |
20170148199 | Holzer | May 2017 | A1 |
20170148222 | Holzer | May 2017 | A1 |
20170148223 | Holzer | May 2017 | A1 |
20170158131 | Friebe | Jun 2017 | A1 |
20170206648 | Marra | Jul 2017 | A1 |
20170213385 | Yu | Jul 2017 | A1 |
20170231550 | Do | Aug 2017 | A1 |
20170236287 | Shen | Aug 2017 | A1 |
20170249339 | Lester | Aug 2017 | A1 |
20170255648 | Dube | Sep 2017 | A1 |
20170256066 | Richard | Sep 2017 | A1 |
20170277363 | Holzer | Sep 2017 | A1 |
20170277952 | Thommes | Sep 2017 | A1 |
20170278544 | Choi | Sep 2017 | A1 |
20170287137 | Lin | Oct 2017 | A1 |
20170293894 | Taliwal | Oct 2017 | A1 |
20170308771 | Shimauchi | Oct 2017 | A1 |
20170316092 | Fichter | Nov 2017 | A1 |
20170330319 | Xu | Nov 2017 | A1 |
20170337693 | Baruch | Nov 2017 | A1 |
20170344223 | Holzer | Nov 2017 | A1 |
20170344808 | El-Khamy | Nov 2017 | A1 |
20170357910 | Sommer | Dec 2017 | A1 |
20170359570 | Holzer | Dec 2017 | A1 |
20170364766 | Das | Dec 2017 | A1 |
20170372523 | Espeset | Dec 2017 | A1 |
20180012330 | Holzer | Jan 2018 | A1 |
20180012529 | Chiba | Jan 2018 | A1 |
20180035105 | Choi | Feb 2018 | A1 |
20180035500 | Song | Feb 2018 | A1 |
20180045592 | Okita | Feb 2018 | A1 |
20180046356 | Holzer | Feb 2018 | A1 |
20180046357 | Holzer | Feb 2018 | A1 |
20180046649 | Dal Mutto | Feb 2018 | A1 |
20180052665 | Kaur | Feb 2018 | A1 |
20180063504 | Haines | Mar 2018 | A1 |
20180082715 | Rymkowski | Mar 2018 | A1 |
20180143023 | Bjorke | May 2018 | A1 |
20180143756 | Mildrew | May 2018 | A1 |
20180144547 | Shakib | May 2018 | A1 |
20180155057 | Irish | Jun 2018 | A1 |
20180158197 | Dasgupta | Jun 2018 | A1 |
20180165875 | Yu | Jun 2018 | A1 |
20180199025 | Holzer | Jul 2018 | A1 |
20180203877 | Holzer | Jul 2018 | A1 |
20180205941 | Kopf | Jul 2018 | A1 |
20180211131 | Holzer | Jul 2018 | A1 |
20180211373 | Stoppa | Jul 2018 | A1 |
20180211404 | Zhu | Jul 2018 | A1 |
20180218235 | Holzer | Aug 2018 | A1 |
20180218236 | Holzer | Aug 2018 | A1 |
20180234671 | Yang | Aug 2018 | A1 |
20180240243 | Kim | Aug 2018 | A1 |
20180255290 | Holzer | Sep 2018 | A1 |
20180268220 | Lee | Sep 2018 | A1 |
20180268256 | Di Febbo | Sep 2018 | A1 |
20180286098 | Lorenzo | Oct 2018 | A1 |
20180293774 | Yu | Oct 2018 | A1 |
20180315200 | Davydov | Nov 2018 | A1 |
20180336724 | Spring | Nov 2018 | A1 |
20180336737 | Varady | Nov 2018 | A1 |
20180338126 | Trevor | Nov 2018 | A1 |
20180338128 | Trevor | Nov 2018 | A1 |
20180357518 | Sekii | Dec 2018 | A1 |
20180374273 | Holzer | Dec 2018 | A1 |
20190019056 | Pierce | Jan 2019 | A1 |
20190025544 | Watanabe | Jan 2019 | A1 |
20190026956 | Gausebeck | Jan 2019 | A1 |
20190026958 | Gausebeck | Jan 2019 | A1 |
20190035149 | Chen | Jan 2019 | A1 |
20190035179 | Bhardwaj | Jan 2019 | A1 |
20190050664 | Yang | Feb 2019 | A1 |
20190080499 | Holzer | Mar 2019 | A1 |
20190094981 | Bradski | Mar 2019 | A1 |
20190132569 | Karpenko | May 2019 | A1 |
20190147221 | Grabner | May 2019 | A1 |
20190209886 | Harlow | Jul 2019 | A1 |
20190213392 | Pan | Jul 2019 | A1 |
20190213406 | Porikli | Jul 2019 | A1 |
20190220991 | Holzer | Jul 2019 | A1 |
20190221021 | Holzer | Jul 2019 | A1 |
20190222776 | Carter | Jul 2019 | A1 |
20190235729 | Day | Aug 2019 | A1 |
20190244372 | Holzer | Aug 2019 | A1 |
20190251738 | Holzer | Aug 2019 | A1 |
20190278434 | Holzer | Sep 2019 | A1 |
20190304064 | Zhang | Oct 2019 | A1 |
20190364265 | Matsunobu | Nov 2019 | A1 |
20200027263 | Holzer | Jan 2020 | A1 |
20200045249 | Francois | Feb 2020 | A1 |
20200125877 | Phan | Apr 2020 | A1 |
20200128060 | Han | Apr 2020 | A1 |
20200137380 | Supikov | Apr 2020 | A1 |
20200167570 | Beall | May 2020 | A1 |
20200207358 | Katz | Jul 2020 | A1 |
20200234397 | Holzer | Jul 2020 | A1 |
20200234451 | Holzer | Jul 2020 | A1 |
Number | Date | Country |
---|---|---|
104462365 | Mar 2015 | CN |
105849781 | Aug 2016 | CN |
105849781 | Aug 2016 | CN |
107466474 | Dec 2017 | CN |
112014005165 | Jul 2016 | DE |
112017004150 | Jun 2019 | DE |
20120110861 | Oct 2012 | KR |
101590256 | Feb 2016 | KR |
2015073570 | May 2015 | WO |
2017053197 | Mar 2017 | WO |
2018035500 | Feb 2018 | WO |
2018052665 | Mar 2018 | WO |
2018154331 | Aug 2018 | WO |
2019209886 | Oct 2019 | WO |
2020092177 | May 2020 | WO |
Entry |
---|
U.S. Appl. No. 14/800,640, Advisory Action mailed Jan. 5, 2018, 3 pgs. |
U.S. Appl. No. 14/800,640, Advisory Action mailed Feb. 8, 2018, 2 pgs. |
U.S. Appl. No. 14/800,640, Examiner Interview Summary mailed Feb. 8, 2018, 1 pg. |
U.S. Appl. No. 14/800,640, Examiner Interview Summary mailed Oct. 23, 2018, 3 pgs. |
U.S. Appl. No. 14/800,640, Final Office Action mailed Oct. 16, 2017, 15 pgs. |
U.S. Appl. No. 14/800,640, Non Final Office Action mailed Jun. 8, 2017, 14 pgs. |
U.S. Appl. No. 14/800,640, Non Final Office Action mailed Jul. 17, 2018, 16 pgs. |
U.S. Appl. No. 14/800,640, Notice of Allowance mailed Nov. 21, 2018, 7 pgs. |
U.S. Appl. No. 14/800,640, Restriction Requirement mailed Mar. 3, 2017, 5 pgs. |
U.S. Appl. No. 14/800,642, Advisory Action mailed Jan. 5, 2018, 3 pgs. |
U.S. Appl. No. 14/800,642, Advisory Action mailed Feb. 8, 2018, 3 pgs. |
U.S. Appl. No. 14/800,642, Examiner Interview Summary mailed Aug. 6, 2018, 1 pg. |
U.S. Appl. No. 14/800,642, Final Office Action mailed Oct. 17, 2017, 18 pgs. |
U.S. Appl. No. 14/800,642, Non Final Office Action mailed May 18, 2017, 17 pgs. |
U.S. Appl. No. 14/800,642, Non-Final Office Action mailed May 18, 2017, 17 pages. |
U.S. Appl. No. 14/800,642, Notice of Allowance mailed Aug. 6, 2018, 12 pgs. |
U.S. Appl. No. 14/819,473, Examiner Interview Summary mailed Jul. 11, 2016, 3 pgs. |
U.S. Appl. No. 14/819,473, Examiner Interview Summary mailed Aug. 17, 2016, 3 pgs. |
U.S. Appl. No. 14/819,473, Examiner Interview Summary mailed Oct. 14, 2016, 3 pages. |
U.S. Appl. No. 14/819,473, Final Office Action mailed Apr. 28, 2016, 45 pgs. |
U.S. Appl. No. 14/819,473, Non Final Office Action mailed Sep. 1, 2016, 36 pgs. |
U.S. Appl. No. 14/819,473, Non Final Office Action mailed Oct. 8, 2015, 44 pgs. |
U.S. Appl. No. 14/860,983, Advisory Action mailed Jan. 23, 2018, 3 pgs. |
U.S. Appl. No. 14/860,983, Advisory Action mailed Mar. 26, 2019, 2 pgs. |
U.S. Appl. No. 14/860,983, Examiner Interview Summary mailed Mar. 26, 2019, 2 pgs. |
U.S. Appl. No. 14/860,983, Examiner Interview Summary mailed Apr. 8, 2019, 3 pgs. |
U.S. Appl. No. 14/860,983, Examiner Interview Summary mailed Nov. 15, 2018, 3 pgs. |
U.S. Appl. No. 14/860,983, Final Office Action mailed Jan. 18, 2019, 19 pgs. |
U.S. Appl. No. 14/860,983, Final Office Action mailed Oct. 18, 2017, 21 pgs. |
U.S. Appl. No. 14/860,983, Non Final Office Action mailed Jun. 8, 2017, 26 pgs. |
U.S. Appl. No. 14/860,983, Non Final Office Action mailed Aug. 7, 2018, 22 pgs. |
U.S. Appl. No. 15/408,211, Advisory Action mailed Mar. 18, 2019, 4 pgs. |
U.S. Appl. No. 15/408,211, Examiner Interview Summary mailed Mar. 4, 2019, 3 pgs. |
U.S. Appl. No. 15/408,211, Examiner Interview Summary mailed Mar. 18, 2019, 2 pgs. |
U.S. Appl. No. 15/408,211, Examiner Interview Summary mailed Apr. 3, 2019, 3 pgs. |
U.S. Appl. No. 15/408,211, Examiner Interview Summary mailed Aug. 5, 2019, 3 pgs. |
U.S. Appl. No. 15/408,211, Examiner Interview Summary mailed Oct. 16, 2019, 2 pgs. |
U.S. Appl. No. 15/408,211, Examiner Interview Summary mailed Dec. 5, 2018, 3 pgs. |
U.S. Appl. No. 15/408,211, Final Office Action mailed Jan. 11, 2019, 23 pgs. |
U.S. Appl. No. 15/408,211, Non Final Office Action mailed Aug. 6, 2018, 22 pgs. |
U.S. Appl. No. 15/408,211, Non Final Office Action mailed May 2, 2019, 20 pgs. |
U.S. Appl. No. 15/425,983, Advisory Action mailed Oct. 12, 2018, 5 pgs. |
U.S. Appl. No. 15/425,983, Examiner Interview Summary mailed May 3, 2018, 3 pgs. |
U.S. Appl. No. 15/425,983, Examiner Interview Summary mailed May 17, 2018, 3 pgs. |
U.S. Appl. No. 15/425,983, Examiner Interview Summary mailed Sep. 28, 2018, 3 pgs. |
U.S. Appl. No. 15/425,983, Examiner Interview Summary mailed Oct. 12, 2018, 2 pgs. |
U.S. Appl. No. 15/425,983, Examiner Interview Summary mailed Dec. 12, 2018, 2 pgs. |
U.S. Appl. No. 15/425,983, Final Office Action mailed Jun. 26, 2018, 29 pgs. |
U.S. Appl. No. 15/425,983, Non Final Office Action mailed Jan. 11, 2018, 29 pgs. |
U.S. Appl. No. 15/425,983, Notice of Allowance mailed Dec. 12, 2018, 14 pgs. |
U.S. Appl. No. 15/409,500, Examiner Interview Summary mailed Mar. 5, 2019, 3 pgs. |
U.S. Appl. No. 15/409,500, Non Final Office Action mailed Dec. 11, 2018, 11 pgs. |
U.S. Appl. No. 15/409,500, Notice of Allowance mailed Jun. 3, 2019, 8 pgs. |
U.S. Appl. No. 15/601,863, Examiner Interview Summary mailed Nov. 21, 2018, 4 pgs. |
U.S. Appl. No. 15/601,863, Non Final Office Action mailed Sep. 20, 2018, 23 pgs. |
U.S. Appl. No. 15/601,863, Notice of Allowance mailed Jan. 24, 2019, 8 pgs. |
Intl Application Serial No. PCT/US16/52192, Intl Preliminary Report on Patentability mailed Apr. 5, 2018, 7 pgs. |
U.S. Appl. No. 14/860,983, Final Rejection, Feb. 12, 2020, 18 pgs. |
U.S. Appl. No. 15/620,506, Notice Of Allowance And Fees Due (Ptol-85), Mar. 2, 2020, 10 pgs. |
U.S. Appl. No. 15/632,709, Non-Final Rejection, May 22, 2020, 10 pgs. |
U.S. Appl. No. 15/673,125, Final Rejection, Feb. 19, 2020, 17 pgs. |
U.S. Appl. No. 15/713,406, Final Rejection, Feb. 19, 2020, 22 pgs. |
U.S. Appl. No. 15/717,889, Non-Final Rejection, Oct. 27, 2020, 40 pgs. |
U.S. Appl. No. 15/911,993, Non-Final Rejection, Aug. 5, 2020, 6 pgs. |
U.S. Appl. No. 15/969,749, Final Rejection, Feb. 26, 2020, 15 pgs. |
U.S. Appl. No. 15/969,749, Non-Final Rejection, Sep. 17, 2020, 15 pgs. |
U.S. Appl. No. 16/179,746, Advisory Action (Ptol-303), Sep. 15, 2020, 2 pgs. |
U.S. Appl. No. 16/179,746, Examiner Interview Summary Record (Ptol-413), Nov. 5, 2020, 2 pgs. |
U.S. Appl. No. 16/362,547, Advisory Action (Ptol-303), Nov. 18, 2020, 2 pgs. |
U.S. Appl. No. 16/362,547, Examiner Interview Summary Record (Ptol-413), Nov. 18, 2020, 1 pg. |
U.S. Appl. No. 16/362,547, Final Rejection, Sep. 24, 2020, 14 pgs. |
U.S. Appl. No. 16/362,547, Examiner Interview Summary Record (Ptol-413), Nov. 5, 2020, 2 pgs. |
U.S. Appl. No. 16/426,323, Notice Of Allowance And Fees Due (Ptol-85), Aug. 5, 2020, 11 pgs. |
U.S. Appl. No. 16/451,371, Notice Of Allowance And Fees Due (Ptol-85), Sep. 17, 2020, 5 pgs. |
U.S. Appl. No. 16/451,371, Non-Final Rejection, Jun. 11, 2020, 9 pgs. |
U.S. Appl. No. 16/586,868, Notice Of Allowance And Fees Due (Ptol-85), Jul. 31, 2020, 13 pgs. |
U.S. Appl. No. 16/586,868, Notice Of Allowance And Fees Due (Ptol-85), Oct. 7, 2020, 2 pgs. |
U.S. Appl. No. 16/586,868, Non-Final Rejection, Dec. 20, 2019, 19 pgs. |
U.S. Appl. No. 16/726,090, Non-Final Rejection, Nov. 19, 2020, 12 pgs. |
U.S. Appl. No. 16/778,981, Non-Final Rejection, Oct. 13, 2020, 7 pgs. |
U.S. Appl. No. 14/800,638, Non Final Office Action mailed Jul. 29, 2016, 11 pgs. |
U.S. Appl. No. 12/101,883, Examiner Interview Summary mailed Sep. 6, 2016, 3 pgs. |
U.S. Appl. No. 12/101,883, Examiner Interview Summary mailed Oct. 18, 2017, 2 pgs. |
U.S. Appl. No. 13/464,588, Non Final Office Action mailed Aug. 2, 2019, 14 pgs. |
U.S. Appl. No. 14/530,669, Advisory Action mailed Aug. 8, 2017, 5 pgs. |
U.S. Appl. No. 14/530,669, Examiner Interview Summary mailed Apr. 14, 2017, 3 pgs. |
U.S. Appl. No. 14/530,669, Examiner Interview Summary mailed Aug. 8, 2017, 2 pgs. |
U.S. Appl. No. 14/530,669, Final Office Action mailed Apr. 20, 2017, 25 pgs. |
U.S. Appl. No. 14/530,669, Non Final Office Action mailed Jan. 3, 2017, 26 pgs. |
U.S. Appl. No. 14/530,671, Non Final Office Action mailed Jan. 3, 2017, 32 pgs. |
U.S. Appl. No. 14/539,814, Non Final Office Action mailed Dec. 30, 2016, 37 pgs. |
U.S. Appl. No. 14/539,889, Non-Final Office Action mailed Oct. 6, 2016, 14 pages. |
U.S. Appl. No. 14/800,638, Advisory Action mailed May 9, 2017, 5 pgs. |
U.S. Appl. No. 14/800,638, Examiner Interview Summary mailed May 9, 2017, 2 pgs. |
U.S. Appl. No. 14/800,638, Examiner Interview Summary mailed Nov. 7, 2016, 3 pgs. |
U.S. Appl. No. 14/800,638, Examiner Interview Summary mailed Dec. 13, 2017, 1 pg. |
U.S. Appl. No. 14/800,638, Final Office Action mailed Jan. 20, 2017, 12 pgs. |
U.S. Appl. No. 14/800,638, Non Final Office Action mailed Jun. 15, 2017, 12 pgs. |
U.S. Appl. No. 14/800,638, Notice of Allowance mailed Dec. 13, 2017, 9 pgs. |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Oct. 15, 2021 for U.S. Appl. No. 16/179,746 (pp. 1-2). |
Office Action dated Feb. 2, 2023 for U.S. Appl. No. 17/814,820 (pp. 1-34). |
Office Action dated Apr. 1, 2021 for U.S. Appl. No. 15/427,030 (pp. 1-18). |
Office Action dated Apr. 1, 2021 for U.S. Appl. No. 16/389,544 (pp. 1-29). |
Office Action dated Apr. 9, 2021 for U.S. Appl. No. 16/554,996 (pp. 1-29). |
Office Action dated Jun. 3, 2021 for U.S. Appl. No. 16/179,746 (pp. 1-26). |
Office Action dated Mar. 12, 2021 for U.S. Appl. No. 16/726,090 (pp. 1-14). |
Office Action dated Mar. 23, 2021 for U.S. Appl. No. 16/362,547 (pp. 1-15). |
Pollard, Stephen et al., “Automatically Synthesising Virtual Viewpoints by Trinocular Image Interpolation—Detailed Report”, HP, Technical Report, HP Laboratories Bristol HPL-97-166, Dec. 1997, 40 pgs. |
Prisacariu, Victor A. et al., “Simultaneous 3D Tracking and Reconstruction on a Mobile Phone”, IEEE International Symposium on Mixed and Augmented Reality, 2013, pp. 89-98. |
Qi Pan et al., “Rapid Scene Reconstruction on Mobile Phones from Panoramic Images”, Oct. 2011, pp. 55-64 (Year: 2011). |
Russell, Bryan C., et al. “LabelMe: a database and web-based tool for image annotation.” International Journal of Computer Vision 77.1-3 (2008): 157-173. (Year: 2008). |
Saito, Hideo et al., “View Interpolation of Multiple Cameras Based on Projective Geometry”, Department of Information and Computer Science, Keio University and PRESTO, Japan Science and Technology Corporation (JST), retrieved from the Internet <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.6.5162&rep=rep1&type=pdf>, 6 pgs. (2002). |
Seitz, Steven M., “Image-Based Transformation of Viewpoint and Scene Appearance”, A Dissertation submitted in partial fulfillment of the requirements for the Degree of Doctor of Philosophy at the University of Wisconsin; retrieved from the Internet <http://homes.cs.washington.edu/~seitz/papers/thesis.pdf>, 1997, 111 pgs. |
Shade, Jonathan et al., “Layered Depth Images”, Proceedings of the 25th Annual Conference on Computer Graphics and Interactive Techniques, ACM, SIGGRAPH, Jul. 24, 1998, pp. 231-242. |
Shin, Hong-Chang et al., “Fast View Synthesis using GPU for 3D Display”, IEEE Transactions on Consumer Electronics, vol. 54, No. 4, Dec. 2008, pp. 2068-2076. |
Snavely, Noah et al., “Photo Tourism: Exploring Photo Collections in 3D”, ACM, ACM Transactions on Graphics (TOG), Proceedings of ACM SIGGRAPH 2006, vol. 25, Issue 3, Jul. 2006, pp. 835-846. |
Steder, Bastian et al., “Robust On-line Model-based Object Detection from Range Images”, International Conference on Intelligent Robots and Systems, pp. 4739-4744, Oct. 15, 2009, 6 pages. |
Supplemental Notice of Allowability dated May 5, 2021 for U.S. Appl. No. 15/969,749 (pp. 1-2). |
Thyssen, Anthony, “ImageMagick v6 Examples—Color Basics and Channels”, Website http://www.imagemagick.org/Usage/color_basics/, Retrieved Dec. 23, 2016, Mar. 9, 2011, 31 pgs. |
United Kingdom Application Serial No. 1609577.0, Office Action mailed Jun. 15, 2016, 1 pg. |
Utasi, Ákos, and Csaba Benedek. “A multi-view annotation tool for people detection evaluation.” Proceedings of the 1st International Workshop on Visual Interfaces for Ground Truth Collection in Computer Vision Applications. ACM, 2012. (Year: 2012) 7 pgs. |
Weigel, Christian, et al., Advanced 3D Video Object Synthesis Based on Trilinear Tensors, IEEE Tenth International Symposium on Consumer Electronics, 2006, 5 pages. |
Xiao, Jiangjian et al., “Tri-view Morphing”, Elsevier, Computer Vision and Image Understanding, vol. 96, Issue 3, Dec. 2004, pp. 345-366. |
Z. Cao et al., “Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields”, In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Apr. 14, 2017, pp. 1-9, sections 2-3 and figure 2. |
Zhang, Zhengyou, et al., A robust technique for matching two uncalibrated images through the recovery of the unknown epipolar geometry, Elsevier, Artificial Intelligence 78, 1995, pp. 87-119. |
U.S. Appl. No. 15/425,988, Examiner Interview Summary mailed Nov. 30, 2018, 3 pages. |
U.S. Appl. No. 15/425,988, Non Final Office Action mailed Aug. 10, 2018, 18 pgs. |
U.S. Appl. No. 15/425,988, Notice of Allowance mailed Dec. 28, 2018, 8 pgs. |
U.S. Appl. No. 15/426,994, Advisory Action mailed Dec. 13, 2018, 3 pgs. |
U.S. Appl. No. 15/426,994, Examiner Interview Summary mailed Jan. 15, 2019, 3 pgs. |
U.S. Appl. No. 15/426,994, Final Office Action mailed Oct. 10, 2018, 21 pgs. |
U.S. Appl. No. 15/426,994, Non Final Office Action mailed Apr. 19, 2018, 22 pgs. |
U.S. Appl. No. 15/426,994, Non Final Office Action mailed Aug. 6, 2019, 22 pgs. |
U.S. Appl. No. 15/427,009, Notice of Allowance mailed Sep. 6, 2018, 9 pgs. |
U.S. Appl. No. 15/428,104, Advisory Action mailed Dec. 13, 2018, 3 pgs. |
U.S. Appl. No. 15/428,104, Examiner Interview Summary mailed Jan. 15, 2019, 3 pgs. |
U.S. Appl. No. 15/428,104, Examiner Interview Summary mailed Dec. 7, 2018, 3 pgs. |
U.S. Appl. No. 15/428,104, Final Office Action mailed Oct. 10, 2018, 23 pgs. |
U.S. Appl. No. 15/428,104, Non Final Office Action mailed Apr. 19, 2018, 21 pgs. |
U.S. Appl. No. 15/428,104, Non Final Office Action mailed Aug. 6, 2019, 24 pgs. |
U.S. Appl. No. 15/620,506, Advisory Action mailed Aug. 26, 2019, 2 pgs. |
U.S. Appl. No. 15/620,506, Examiner Interview Summary mailed Aug. 26, 2019, 1 pg. |
U.S. Appl. No. 15/620,506, Final Office Action mailed Jun. 10, 2019, 17 pgs. |
U.S. Appl. No. 15/620,506, Non-Final Office Action mailed Jan. 23, 2019, 13 pages. |
U.S. Appl. No. 15/632,709, Examiner Interview Summary mailed Apr. 30, 2018, 1 pg. |
U.S. Appl. No. 15/632,709, Final Office Action mailed Jul. 17, 2018, 12 pgs. |
U.S. Appl. No. 15/632,709, Non Final Office Action mailed Apr. 3, 2019, 13 pgs. |
U.S. Appl. No. 15/632,709, Non Final Office Action mailed Apr. 30, 2018, 14 pgs. |
U.S. Appl. No. 15/632,709, Notice of Allowance mailed May 3, 2021, 9 pgs. |
U.S. Appl. No. 15/673,125, Examiner Interview Summary mailed Aug. 1, 2019, 3 pgs. |
U.S. Appl. No. 15/673,125, Final Office Action mailed Jun. 3, 2019, 17 pgs. |
U.S. Appl. No. 15/673,125, Non Final Office Action mailed Feb. 6, 2019, 17 pgs. |
U.S. Appl. No. 15/682,362, Notice of Allowance mailed Oct. 22, 2018, 9 pgs. |
U.S. Appl. No. 15/713,406, Examiner Interview Summary mailed Aug. 2, 2019, 3 pgs. |
U.S. Appl. No. 15/713,406, Final Office Action mailed Jun. 3, 2019, 21 pgs. |
U.S. Appl. No. 15/713,406, Non Final Office Action mailed Jan. 30, 2019, 21 pgs. |
U.S. Appl. No. 15/717,889, Advisory Action mailed Jul. 6, 2021, 3 pgs. |
U.S. Appl. No. 15/717,889, Examiner Interview Summary mailed Jun. 4, 2021, 2 pgs. |
U.S. Appl. No. 15/717,889, Examiner Interview Summary mailed Jul. 6, 2021, 2 pgs. |
U.S. Appl. No. 15/717,889, Final Office Action mailed Mar. 4, 2021, 37 pgs. |
U.S. Appl. No. 15/724,081, Examiner Interview Summary mailed Mar. 4, 2019, 3 pgs. |
U.S. Appl. No. 15/724,081, Examiner Interview Summary mailed Jul. 30, 2019, 3 pgs. |
U.S. Appl. No. 15/724,081, Examiner Interview Summary mailed Aug. 20, 2019, 2 pgs. |
U.S. Appl. No. 15/724,081, Final Office Action mailed May 14, 2019, 14 pgs. |
U.S. Appl. No. 15/724,081, Non Final Office Action mailed Dec. 11, 2018, 12 pgs. |
U.S. Appl. No. 15/724,081, Notice of Allowance mailed Aug. 20, 2019, 12 pgs. |
U.S. Appl. No. 15/724,087, Final Office Action mailed Jul. 1, 2019, 16 pgs. |
U.S. Appl. No. 15/724,087, Non Final Office Action mailed Jan. 31, 2019, 15 pgs. |
U.S. Appl. No. 15/911,993, Notice of Allowance mailed Jan. 12, 2021, 8 pgs. |
U.S. Appl. No. 15/963,896, Non Final Office Action mailed Apr. 18, 2019, 7 pgs. |
U.S. Appl. No. 15/963,896, Notice of Allowance mailed May 22, 2019, 8 pgs. |
U.S. Appl. No. 15/969,749, Examiner Interview Summary mailed Apr. 20, 2021, 1 pg. |
U.S. Appl. No. 16/179,746, Examiner Interview Summary mailed Jun. 3, 2021, 1 pg. |
U.S. Appl. No. 16/179,746, Final Office Action mailed Jun. 3, 2021, 25 pgs. |
U.S. Appl. No. 16/179,746, Non-Final Office Action mailed Feb. 11, 2021, 25 pgs. |
U.S. Appl. No. 16/362,547, Examiner Interview Summary mailed Jul. 12, 2021, 2 pgs. |
U.S. Appl. No. 16/362,547, Non-Final Office Action mailed Mar. 23, 2021, 14 pgs. |
U.S. Appl. No. 16/384,578, Corrected Notice of Allowance mailed Nov. 26, 2019, 2 pgs. |
U.S. Appl. No. 16/384,578, Non Final Office Action mailed May 9, 2019, 9 pgs. |
U.S. Appl. No. 16/726,090, Advisory Action mailed Jun. 21, 2021, 3 pgs. |
U.S. Appl. No. 16/726,090, Examiner Interview Summary mailed Feb. 25, 2021, 3 pgs. |
U.S. Appl. No. 16/726,090, Examiner Interview Summary mailed Jun. 21, 2021, 1 pg. |
U.S. Appl. No. 16/778,981, Corrected Notice of Allowance mailed Mar. 31, 2021, 11 pgs. |
U.S. Appl. No. 16/778,981, Examiner Interview Summary mailed Mar. 31, 2021, 1 pg. |
U.S. Appl. No. 16/778,981, Notice of Allowance mailed Mar. 9, 2021, 9 pgs. |
Ballan, Luca, et al., “Unstructured Video-Based Rendering: Interactive Exploration of Casually Captured Videos”, ACM, ACM Transactions on Graphics (TOG), Proceedings of ACM SIGGRAPH 2010, vol. 29, Issue 4, Article No. 87, Jul. 30, 2010, 11 pgs. |
Belongie, Serge, Jitendra Malik, and Jan Puzicha. “Shape matching and object recognition using shape contexts.” IEEE Transactions on Pattern Analysis & Machine Intelligence 24.4 (2002): 509-522. (Year: 2002). |
Buehler, Chris et al., “Unstructured Lumigraph Rendering”, ACM, ACM SIGGRAPH, 2001, pp. 425-432. |
Bulat et al., “Human pose estimation via convolutional part heatmap regression,” In ECCV, 2016 (Year: 2016), 16 pgs. |
Cao, Xun et al., “Semi-Automatic 2D-to-3D Conversion Using Disparity Propagation”, IEEE, IEEE Transactions on Broadcasting, vol. 57, Issue 2, Apr. 19, 2011, pp. 491-499. |
Chan, Shing-Chow et al., “An Object-Based Approach to Image/Video-Based Synthesis and Processing for 3-D and Multiview Televisions”, IEEE, IEEE Transactions on Circuits and Systems for Video Technology, vol. 19, Issue 6, Mar. 16, 2009, pp. 821-831. |
Chen, Shenchang E. , “QuickTime VR—An Image-Based Approach to Virtual Environment Navigation”, ACM, SIGGRAPH '95 Proceedings of the 22nd annual Conference on Computer graphics and interactive techniques, 1995, 29-38. |
Clare, Adam, “Reality is a Game; What is a Skybox?”, retrieved from the Internet <http://www.realityisagame.com/archives/1776/what-is-a-skybox/>, Mar. 28, 2013, 5 pgs. |
Davis, Abe et al., “Unstructured Light Fields”, Blackwell Publishing, Computer Graphics Forum, vol. 31, Issue 2, Pt. 1, May 2012, pp. 305-314. |
Extract all frames from video files, Video Help Forum, Oct. 2010, 3 pages. |
Figueroa, Nadia, et al., “From Sense to Print: Towards Automatic 3D Printing from 3D Sensing Devices”, IEEE, IEEE International Conference on Systems, Man, and Cybernetics (SMC), Oct. 13, 2013, 8 pgs. |
Fischler, Martin A., et al., Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography, ACM, Communications of the ACM, vol. 24, No. 6, Jun. 1981, pp. 381-395. |
Fitzgibbon, Andrew , “Automatic 3D Model Acquisition and Generation of New Images from Video Sequences”, IEEE, 9th European Signal Processing Conference, Sep. 1998, 8 pgs. |
Fusiello, Andrea et al., “View Synthesis from Uncalibrated Images Using Parallax”, Proceedings of the 12th International Conference on Image Analysis and Processing, 2003, pp. 146-151. |
Fusiello, Andrea, Specifying Virtual Cameras In Uncalibrated View Synthesis, IEEE Transactions On Circuits and Systems for Video Technology, vol. 17, No. 5, May 2007, 8 pages. |
Gatys et al., “A Neural Algorithm of Artistic Style”, Cornell University, arXiv:1508.06576v2, pp. 1-16. (Year: 2015). |
Gibson, Simon, et al., Accurate Camera Calibration for Off-line, Video-Based Augmented Reality, IEEE, Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR'02) 10 pages. |
Golovinskiy, Aleksey et al., “Shape-based Recognition of 3D Point Clouds in Urban Environment”, IEEE, IEEE 12th International Conference on Computer Vision (ICCV), 2009, pp. 2154-2161. |
Gurdan, Tobias et al., “Spatial and Temporal Interpolation of Multi-View Image Sequences”, Department of Computer Science, Technische Universität München and Ascending Technologies GmbH, Krailling, Germany, Section 2.3, Image Warping and Blending; Retrieved from the Internet <https://vision.in.tum.de/_media/spezial/bib/gurdan-et-al-gcpr-2014.pdf>, 12 pgs. |
International Application Serial No. PCT/US19/28807 Preliminary Report on Patentability mailed Nov. 5, 2020, 9 pgs. |
International Application Serial No. PCT/US19/28807 Search Report and Written Opinion mailed Oct. 8, 2019, 12 pgs. |
International Application Serial No. PCT/US19/58204, Preliminary Report on Patentability mailed May 14, 2021, 7 pgs. |
International Application Serial No. PCT/US2016/042355, Search Report and Written Opinion mailed Oct. 19, 2016, 9 pgs. |
International Application Serial No. PCT/US2019/058204, Search Report and Written Opinion mailed Apr. 21, 2020, 10 pages. |
Intl Application Serial No. PCT/US17/47684, Intl Preliminary Report on Patentability mailed Feb. 28, 2019, 7 pgs. |
Intl Application Serial No. PCT/US17/47684, Intl Search Report and Written Opinion mailed Oct. 27, 2017, 8 pgs. |
Intl Application Serial No. PCT/US17/47859, Intl Preliminary Report on Patentability mailed Feb. 28, 2019, 7 pgs. |
Intl Application Serial No. PCT/US17/47859, Intl Search Report and Written Opinion mailed Nov. 2, 2017, 8 pgs. |
Intl Application Serial No. PCT/US19/030395, Intl Search Report and Written Opinion mailed Sep. 2, 2019, 9 pgs. |
Jaesik Choi, Ziyu Wang, Sang-Chul Lee, Won J. Jeon. A spatio-temporal pyramid matching for video retrieval. Computer Vision and Image Understanding. vol. 117, Issue 6. 2013. pp. 660-669 (Year: 2013). |
Keller, Maik et al., “Real-Time 3D Reconstruction in Dynamic Scenes Using Point-Based Fusion”, IEEE, 2013 International Conference on 3DTV, Jul. 1, 2013, 8 pages. |
Klappstein, Jens, et al., Moving Object Segmentation Using Optical Flow and Depth Information, Springer, In: Wada T., Huang F., Lin S. (eds) Advances in Image and Video Technology. PSIVT 2009. Lecture Notes in Computer Science, vol. 5414, pp. 611-623. |
Kottamasu, V. L. P., “User Interaction of One-Dimensional Panoramic Images for iPod Touch”, Thesis, Linköping University Electronic Press, LIU-IDA-LITH-EX-A-12/071-SE, Dec. 4, 2012, 70 pgs. |
Li, Mingyang, Byung Hyung Kim, and Anastasios I. Mourikis. “Real-time motion tracking on a cellphone using inertial sensing and a rolling-shutter camera.” 2013 IEEE International Conference on Robotics and Automation. IEEE, 2013. (Year: 2013) 8 pgs. |
Mian, Ajmal S. et al., “Three-Dimensional Model-Based Object Recognition and Segmentation in Cluttered Scenes”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, No. 10, Oct. 2006, pp. 1584-1601. |
Mikolajczyk, Krystian, et al., A Performance Evaluation of Local Descriptors, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, No. 10, Oct. 2005, 1615-1630. |
Notice of Allowance dated Apr. 20, 2021 for U.S. Appl. No. 15/969,749 (pp. 1-5). |
Notice of Allowance dated Jun. 17, 2021 for U.S. Appl. No. 15/604,938 (pp. 1-12). |
Notice of Allowance dated May 3, 2021 for U.S. Appl. No. 15/632,709 (pp. 1-9). |
Nützi, Gabriel, et al. “Fusion of IMU and vision for absolute scale estimation in monocular SLAM.” Journal of Intelligent & Robotic Systems 61.1-4 (2011): 287-299. (Year: 2011). |
Office Action (Ex Parte Quayle Action) dated Aug. 24, 2023 for U.S. Appl. No. 16/813,506 (pp. 1-5). |
Office Action (Final Rejection) dated Jan. 18, 2023 for U.S. Appl. No. 17/352,654 (pp. 1-21). |
Office Action (Final Rejection) dated Jan. 19, 2022 for U.S. Appl. No. 16/726,090 (pp. 1-16). |
Office Action (Final Rejection) dated Feb. 17, 2023 for U.S. Appl. No. 17/519,452 (pp. 1-18). |
Office Action (Final Rejection) dated Feb. 21, 2023 for U.S. Appl. No. 17/483,573 (pp. 1-23). |
Office Action (Final Rejection) dated Apr. 19, 2023 for U.S. Appl. No. 16/813,506 (pp. 1-14). |
Office Action (Final Rejection) dated Apr. 25, 2022 for U.S. Appl. No. 16/813,506 (pp. 1-17). |
Office Action (Final Rejection) dated Jul. 6, 2022 for U.S. Appl. No. 14/861,019 (pp. 1-36). |
Office Action (Final Rejection) dated Jul. 22, 2022 for U.S. Appl. No. 15/427,030 (pp. 1-18). |
Office Action (Final Rejection) dated Aug. 20, 2021 for U.S. Appl. No. 16/362,547 (pp. 1-15). |
Office Action (Final Rejection) dated Sep. 2, 2022 for U.S. Appl. No. 16/362,547 (pp. 1-15). |
Office Action (Final Rejection) dated Sep. 13, 2023 for U.S. Appl. No. 17/814,821 (pp. 1-10). |
Office Action (Final Rejection) dated Dec. 6, 2023 for U.S. Appl. No. 17/483,573 (pp. 1-18). |
Office Action (Non-Final Rejection) dated Jan. 10, 2023 for U.S. Appl. No. 15/427,030 (pp. 1-5). |
Office Action (Non-Final Rejection) dated Mar. 21, 2022 for U.S. Appl. No. 14/861,019 (pp. 1-32). |
Office Action (Non-Final Rejection) dated Mar. 24, 2022 for U.S. Appl. No. 16/362,547 (pp. 1-14). |
Office Action (Non-Final Rejection) dated Mar. 27, 2023 for U.S. Appl. No. 17/519,457 (pp. 1-12). |
Office Action (Non-Final Rejection) dated Mar. 30, 2023 for U.S. Appl. No. 17/814,821 (pp. 1-10). |
Office Action (Non-Final Rejection) dated Mar. 30, 2023 for U.S. Appl. No. 17/814,823 (pp. 1-17). |
Office Action (Non-Final Rejection) dated Apr. 14, 2022 for U.S. Appl. No. 17/338,217 (pp. 1-10). |
Office Action (Non-Final Rejection) dated Jul. 21, 2023 for U.S. Appl. No. 17/483,573 (pp. 1-18). |
Office Action (Non-Final Rejection) dated Aug. 23, 2023 for U.S. Appl. No. 17/519,457 (pp. 1-15). |
Office Action (Non-Final Rejection) dated Sep. 12, 2022 for U.S. Appl. No. 16/813,506 (pp. 1-15). |
Office Action (Non-Final Rejection) dated Sep. 16, 2022 for U.S. Appl. No. 17/373,737 (pp. 1-8). |
Office Action (Non-Final Rejection) dated Sep. 22, 2021 for U.S. Appl. No. 16/726,090 (pp. 1-15). |
Office Action (Non-Final Rejection) dated Oct. 4, 2022 for U.S. Appl. No. 17/352,654 (pp. 1-19). |
Office Action (Non-Final Rejection) dated Oct. 5, 2022 for U.S. Appl. No. 17/483,573 (pp. 1-20). |
Office Action (Non-Final Rejection) dated Oct. 5, 2022 for U.S. Appl. No. 17/519,452 (pp. 1-17). |
Office Action (Non-Final Rejection) dated Oct. 5, 2023 for U.S. Appl. No. 17/935,239 (pp. 1-13). |
Office Action (Non-Final Rejection) dated Oct. 14, 2021 for U.S. Appl. No. 15/427,030 (pp. 1-17). |
Office Action (Non-Final Rejection) dated Oct. 28, 2021 for U.S. Appl. No. 16/813,506 (pp. 1-19). |
Office Action (Non-Final Rejection) dated Nov. 10, 2021 for U.S. Appl. No. 16/389,544 (pp. 1-28). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Jan. 17, 2023 for U.S. Appl. No. 17/373,737 (pp. 1-5). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Jan. 19, 2023 for U.S. Appl. No. 16/362,547 (pp. 1-5). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Feb. 1, 2023 for U.S. Appl. No. 16/362,547 (pp. 1-2). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated May 18, 2022 for U.S. Appl. No. 16/389,544 (pp. 1-8). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Jun. 2, 2022 for U.S. Appl. No. 16/726,090 (pp. 1-7). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Jun. 22, 2023 for U.S. Appl. No. 14/861,019 (pp. 1-10). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Jun. 29, 2022 for U.S. Appl. No. 17/338,217 (pp. 1-9). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Jul. 28, 2023 for U.S. Appl. No. 17/814,823 (pp. 1-5). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Aug. 2, 2023 for U.S. Appl. No. 17/352,654 (pp. 1-8). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Sep. 13, 2023 for U.S. Appl. No. 17/814,820 (pp. 1-8). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Sep. 16, 2021 for U.S. Appl. No. 16/179,746 (pp. 1-5). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Sep. 29, 2021 for U.S. Appl. No. 15/717,889 (pp. 1-12). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Sep. 30, 2021 for U.S. Appl. No. 16/179,746 (pp. 1-2). |
Office Action (Non-Final Rejection) dated Dec. 7, 2023 for U.S. Appl. No. 18/183,917 (pp. 1-18). |
Office Action (Final Rejection) dated Dec. 7, 2023 for U.S. Appl. No. 15/427,030 (pp. 1-5). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Jan. 2, 2024 for U.S. Appl. No. 17/935,239 (pp. 1-9). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Jan. 8, 2024 for U.S. Appl. No. 16/813,506 (pp. 1-7). |
Office Action (Final Rejection) dated Jan. 8, 2024 for U.S. Appl. No. 17/519,457 (pp. 1-15). |
Communication pursuant to Article 94(3) issued in App. No. EP19877784, dated Jan. 3, 2024, 14 pages. |
Milani, Patrizia, et al. “Mobile smartphone applications for body position measurement in rehabilitation: a review of goniometric tools.” PM&R 6.11 (2014): 1038-1043. |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Jan. 12, 2024 for U.S. Appl. No. 17/814,821 (pp. 1-5). |
Office Action (Notice of Allowance and Fees Due (PTOL-85)) dated Feb. 20, 2024 for U.S. Appl. No. 17/519,452 (pp. 1-5). |
Office Action (Final Rejection) dated Apr. 3, 2024 for U.S. Appl. No. 18/183,917 (pp. 1-16). |
Office Action (Non-Final Rejection) dated Apr. 10, 2024 for U.S. Appl. No. 17/483,573 (pp. 1-15). |
International Application Serial No. PCT/US16/52192, International Search Report and Written Opinion mailed Dec. 12, 2016, 8 pgs. |
Haines, Russell, U.S. Appl. No. 62/380,914, Specification and Drawings, filed Aug. 29, 2016, 31 pgs. |
International Application Serial No. PCT/US2014/065282, Search Report & Written Opinion mailed Feb. 23, 2015, 11 pgs. |
Number | Date | Country | |
---|---|---|---|
20230402067 A1 | Dec 2023 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 14861019 | Sep 2015 | US |
Child | 18458084 | US |