The present disclosure relates to systems and methods for selective capture and presentation of native image portions, particularly for use in broadcast production.
Common image and video formats are typically designated in terms of either vertical or horizontal resolution.
Examples of vertical high resolution designators are 720p (1280×720 pixels), 1080i (utilizing an interlace of two fields of 1920×540 pixels for a total resolution of 1920×1080 pixels) or 1080p (representing a progressive scan of 1920×1080 pixels).
Examples of horizontal high resolution designators, which are more common to digital cinema terminology, include 2K (2048 pixels wide) and 4K (4096 pixels wide). Overall resolution depends on the image aspect ratio, e.g., a 2K image with a Standard or Academy ratio of 4:3 would have an overall resolution of 2048×1536 pixels, whereas an image with a Panavision ratio of 2.39:1 would have an overall resolution of 2048×856 pixels.
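By way of a brief worked illustration of the arithmetic above (the truncation used here is an assumption; production formats often round the line count to an even value):

```python
def overall_resolution(width_px: int, aspect: float) -> tuple[int, int]:
    """Width and implied height for a horizontal designator at a given aspect
    ratio. Heights are truncated here; real formats often round to even values."""
    return width_px, int(width_px / aspect)

print(overall_resolution(2048, 4 / 3))   # (2048, 1536), 2K at Academy 4:3
print(overall_resolution(2048, 2.39))    # (2048, 856), 2K at 2.39:1
```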
Currently, technologies exist for greater than high definition capture for digital cinema, e.g., at 2K, 4K and beyond. However, for consumer home viewing, the captured digital cinema image is compressed at the distributing studio down to a version specific to traditional consumer high definition formats for broadcast or other distribution, e.g., 720p, 1080i or 1080p.
Also, while digital cinema has utilized large resolution capture, traditional broadcast capture has not. Broadcast capture is instead performed at the desired consumer display resolution, e.g., 1080p, due both to limitations of consumer display devices and to bandwidth restrictions of broadcast carriers. Thus, in scenarios calling for magnification of the broadcast image, for example to better show line calls or to follow specific players on the field, the display resolution of the magnified portion is considerably less than that of the native image captured on the field.
Accordingly, there is a need in the art for improved mechanisms for capturing and presenting image material for broadcasts or other image presentation.
The above-described and other problems and disadvantages of the prior art are overcome and alleviated by the present system and method for selective capture and presentation of native image portions. In exemplary embodiments, a first image or video is captured at a first resolution, which resolution is greater than high definition and higher than a predetermined broadcast display resolution. A desired portion of the first image or video is then displayed at a second, lower resolution, which resolution is less than and closer to the predetermined broadcast display resolution. Accordingly, a selected portion of the captured image may be displayed at or near the predetermined broadcast display resolution (i.e., minimizing or eliminating loss of image detail relative to the predetermined broadcast display resolution).
In further exemplary embodiments, native image capture occurs at greater than high definition resolutions, and portions of that greater than high definition image are selected for presentation. In exemplary embodiments, at least one selected portion is a native high definition portion of the greater than high definition image.
In another exemplary embodiment, a first video is captured at a first frame rate, which frame rate is higher than a predetermined broadcast frame rate. A desired portion of the first video is then displayed at a second, lower frame rate, which frame rate is less than and closer to the predetermined broadcast frame rate. The desired portion of the first video is captured by an extraction window that extracts frames across the native captured video. In such a way, the extracted video is smooth and clear, without edgy or blurred frames.
In another exemplary embodiment, a first video is captured at a first resolution and at a first frame rate, which first resolution and first frame rate are higher than a predetermined broadcast display resolution and frame rate. A desired portion of the first video is then selected by an extraction window and is displayed at a second, lower resolution and lower frame rate that is at or near the predetermined broadcast display resolution and frame rate. Accordingly, the captured video can be displayed at or near the predetermined broadcast display resolution and may be displayed with smooth and clear video, without edgy or blurred frames.
In further exemplary embodiments, a graphical user interface (“GUI”) is provided with a selectable extraction window that is configured to allow a user to navigate within a captured image and select portions of the captured image for presentation. In exemplary embodiments, the extraction window is configured to allow the user to adjust the size and position of the extraction window. In other exemplary embodiments, the extraction window is configured to track or scan across moving images, e.g., to follow a play or subject of interest during a sporting event.
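As a non-limiting sketch of such a selectable extraction window (the class, field names and clamping behavior below are illustrative assumptions, not elements of the disclosure), the window can be modeled as a rectangle that the GUI moves and resizes while keeping it inside the captured image:

```python
from dataclasses import dataclass

@dataclass
class ExtractionWindow:
    """Illustrative model of a GUI-driven extraction window; names are assumed."""
    left: int
    top: int
    width: int
    height: int
    frame_w: int
    frame_h: int

    def move(self, dx: int, dy: int) -> None:
        self.left += dx
        self.top += dy
        self._clamp()

    def resize(self, new_w: int, new_h: int) -> None:
        self.width, self.height = new_w, new_h
        self._clamp()

    def _clamp(self) -> None:
        # Keep the window entirely within the captured image.
        self.width = min(self.width, self.frame_w)
        self.height = min(self.height, self.frame_h)
        self.left = max(0, min(self.left, self.frame_w - self.width))
        self.top = max(0, min(self.top, self.frame_h - self.height))

# Usage: pan an HD-sized window across a 4K capture to follow a subject.
win = ExtractionWindow(left=0, top=0, width=1920, height=1080,
                       frame_w=4096, frame_h=2160)
win.move(dx=300, dy=100)
```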
In other exemplary embodiments, multiple cameras are positioned to capture images from different points of view, and extraction windows may be provided relative to the multiple image captures in a system for selectively displaying portions of native images from different points of view.
The above discussed and other features and advantages of the present invention will be appreciated and understood by those skilled in the art from the following detailed description and drawings.
Referring now to the drawings, wherein like elements are numbered alike in the following FIGURES:
As was noted above, the present disclosure relates to a system and method for selective capture and presentation of native image portions.
In exemplary embodiments, a first image or video is captured at a first resolution, which resolution is greater than high definition and higher than a predetermined broadcast display resolution. A desired portion of the first image or video is then displayed at a second, lower resolution, which resolution is less than and closer to the predetermined broadcast display resolution. Accordingly, a selected portion of the captured image may be displayed at or near the predetermined broadcast display resolution (i.e., minimizing or eliminating loss of image detail relative to the predetermined broadcast display resolution).
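A minimal sketch of this selective extraction, assuming numpy-style frame arrays and illustrative 4K/HD dimensions (the function and variable names are not taken from the disclosure), is shown below; note that the selected portion is returned at native pixels, with no scaling:

```python
import numpy as np

UHD_4K = (2160, 4096)   # example greater-than-high-definition capture (h, w)
HD_OUT = (1080, 1920)   # example predetermined broadcast display resolution (h, w)

def extract_window(frame: np.ndarray, top: int, left: int,
                   out_h: int = HD_OUT[0], out_w: int = HD_OUT[1]) -> np.ndarray:
    """Return an out_h x out_w portion of the captured frame at native pixels,
    clamped so the window stays inside the frame."""
    h, w = frame.shape[:2]
    top = max(0, min(top, h - out_h))
    left = max(0, min(left, w - out_w))
    return frame[top:top + out_h, left:left + out_w]

# Usage: a simulated 4K frame; the selected portion is full HD with no upscaling.
frame_4k = np.zeros((*UHD_4K, 3), dtype=np.uint8)
window = extract_window(frame_4k, top=500, left=1200)
assert window.shape[:2] == HD_OUT
```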
An example of this is illustrated at
Also, while one extraction window is illustrated in
In further exemplary embodiments, the selectable extraction window (12 in
Referring now to
An image recorder 24 records the captured images, e.g., as a data stream on a server, and is configured to allow an operator to go back in time relative to the recording and examine selected portions of the captured image as described above. Such control is provided to an operator via the GUI 14 through a processor 26 interfacing with the GUI 14 and recorder 24. In exemplary embodiments, the recorder, processor and GUI are configured to allow the operator to go back instantaneously or near-instantaneously to select portions of the recorded image for presentation.
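As a rough, non-authoritative sketch of such a recorder (the rolling-buffer design, capacity and lookup method here are assumptions for illustration only):

```python
from collections import deque

class ReplayRecorder:
    """Toy stand-in for the image recorder: keeps a rolling buffer of
    (timestamp, frame) pairs so an operator can step back in time and pull a
    frame for window extraction. Assumes frames arrive in time order."""
    def __init__(self, max_frames: int):
        self._buffer = deque(maxlen=max_frames)

    def record(self, timestamp: float, frame) -> None:
        self._buffer.append((timestamp, frame))

    def frame_at(self, timestamp: float):
        """Return the most recent frame at or before the requested time."""
        candidate = None
        for ts, frame in self._buffer:
            if ts <= timestamp:
                candidate = frame
            else:
                break
        return candidate

# Usage: a 180 fps capture retained for 30 minutes is roughly 324,000 frames.
recorder = ReplayRecorder(max_frames=180 * 60 * 30)
```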
For example, with regard to
Referring again to
In another embodiment, at least one GUI is accessed by a tablet controller as a navigation tool for the system. Such a tablet controller may be wireless and portable, allowing flexible use as a primary or supplemental navigation tool.
In other exemplary embodiments, multiple cameras may be positioned to capture images from different points of view, and extraction windows may be provided relative to the multiple image captures in a system for selectively displaying portions of native images from different points of view.
Further exemplary embodiments provide real time or near real time tracking of subjects of interest (e.g., identified, selected or pre-tagged players of interest, or automatic tracking of a ball in a game). Additional exemplary embodiments also provide virtual directing of operated and automatically tracked subjects of interest for cutting into a full live broadcast, utilizing backend software and tracking technology to provide a virtual viewfinder that operates in a manner similar to a human camera operator. Such processes may also use artificial intelligence technology for simple tracking, e.g., of a single identified object, or for more complex operations approximating the motions utilized by human camera operators, e.g., panning, tilting and zooming the extraction window as a human operator would. For examples using 4K (or the like) capture, a specifically designed 4K camera could be utilized. A camera may also use wider lensing to capture more of the subject, with possible reconstituting or flattening in post production. Also, different lensing can be used specific to different applications.
Such processes may use the above-described multiple cameras and/or multiple extraction windows, or may run with specific regard to one camera and/or one extraction window. In such a way, an artificial intelligence can automatically capture, extract and display material for broadcast, utilizing the extraction window(s) as virtual viewfinders.
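One hedged illustration of how a virtual viewfinder might ease the extraction window toward a tracked subject, approximating an operator's pan rather than snapping to the target (the smoothing approach and factor are assumptions, not the disclosed method):

```python
def follow_subject(window_center: tuple[float, float],
                   subject_center: tuple[float, float],
                   smoothing: float = 0.15) -> tuple[float, float]:
    """One update step: move the extraction window center a fraction of the way
    toward the tracked subject, giving a gradual, operator-like pan."""
    wx, wy = window_center
    sx, sy = subject_center
    return (wx + smoothing * (sx - wx), wy + smoothing * (sy - wy))

# Usage: called once per frame with the tracker's latest subject position.
center = (2048.0, 1080.0)
for subject in [(2200.0, 1000.0), (2300.0, 980.0), (2400.0, 990.0)]:
    center = follow_subject(center, subject)
```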
Additional exemplary embodiments also provide for virtual 3D extraction, e.g., via a single camera at 4K or 8K with a two window output.
In other exemplary embodiments, an increased image capture frame rate is utilized relative to a predetermined broadcast frame rate, along with or in lieu of an increased image capture resolution, as has been discussed above.
In such embodiments, a first video is captured at a first frame rate, which frame rate is higher than a predetermined broadcast frame rate. A desired portion of the first video is then displayed at a second, lower frame rate, which frame rate is less than and closer to the predetermined broadcast frame rate. The desired portion of the first video is captured by an extraction window that extracts frames across the native captured video. In such a way, the extracted video is smooth and clear, without edgy or blurred frames. Such captured first video may be at any frame rate that is above the predetermined broadcast frame rate.
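A minimal sketch of this frame-rate extraction, assuming an integer ratio between the capture and broadcast rates (the function name and the simple every-Nth-frame selection are illustrative assumptions):

```python
from typing import Iterable, Iterator, TypeVar

Frame = TypeVar("Frame")

def decimate_frames(frames: Iterable[Frame], capture_fps: int,
                    broadcast_fps: int) -> Iterator[Frame]:
    """Yield a broadcast-rate stream from a higher-rate capture by keeping
    every Nth frame. Non-integer ratios would need timestamp-based selection."""
    if capture_fps % broadcast_fps != 0:
        raise ValueError("sketch assumes an integer frame-rate ratio")
    step = capture_fps // broadcast_fps
    for i, frame in enumerate(frames):
        if i % step == 0:
            yield frame

# Usage: a 180 fps capture reduced to a 60 fps output keeps every 3rd frame.
capture = range(180)   # stand-in for one second of captured frames
broadcast = list(decimate_frames(capture, capture_fps=180, broadcast_fps=60))
assert len(broadcast) == 60
```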
In further exemplary embodiments, the first video is captured at a first frame rate that is in super motion or hyper motion. In traditional video, this equates to approximately 180 frames per second (“super motion”) or above (“hyper motion” or “ultra motion”) at a progressive frame rate. In exemplary embodiments, hyper motion is recorded in discrete intervals sufficient to capture a triggered instance of an action of a camera subject for playback. In other exemplary embodiments, the present system performs a full time record of a camera in hyper motion, e.g., of sufficient length for replay playback archiving, such as more than fifteen minutes, more than thirty minutes, more than an hour, more than an hour and a half, or more than two hours, among others.
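For a sense of scale only (the 240 fps figure is an example rate above super motion, not a value from the disclosure), the frame counts involved in a full time hyper motion record are roughly:

```python
def frames_stored(fps: int, minutes: float) -> int:
    """Back-of-envelope count of frames retained for a full time record."""
    return int(fps * minutes * 60)

# Usage: two hours at 240 fps is on the order of 1.7 million frames.
print(frames_stored(fps=240, minutes=120))   # 1728000
```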
In other exemplary embodiments, raw data from at least one camera is manipulated to adjust the image quality (make it “paintable”) to broadcast specifications. In exemplary embodiments, broadcast “handles” may be integrated into the system to affect the raw data in a manner that is more germane to broadcast color temperatures, hues and gamma variables.
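A hedged sketch of such a “paint” adjustment, using simple per-channel gains and a gamma curve as stand-ins for broadcast handles (the specific operations and values are assumptions, not broadcast specifications):

```python
import numpy as np

def apply_broadcast_handles(raw: np.ndarray,
                            gains=(1.0, 1.0, 1.0),
                            gamma: float = 2.2) -> np.ndarray:
    """Illustrative 'paint' step: per-channel gains (a rough stand-in for color
    temperature / hue trims) followed by a simple gamma encode."""
    frame = raw.astype(np.float32) / 255.0
    frame *= np.asarray(gains, dtype=np.float32)        # white-balance style trim
    frame = np.clip(frame, 0.0, 1.0) ** (1.0 / gamma)   # gamma curve
    return (frame * 255.0).astype(np.uint8)

# Usage: warm the image slightly before handing it to the broadcast chain.
raw_frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
painted = apply_broadcast_handles(raw_frame, gains=(1.05, 1.0, 0.95))
```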
The present disclosure thus advantageously provides systems and methods for selective capture and presentation of native image portions, for broadcast production or other applications. By providing exemplary embodiments using a selectable extraction window through a GUI, an operator has complete control over the portions of the native images that the operator desires to present. Also, by providing exemplary embodiments with image capture greater than high definition (e.g., 4K), desired portions of the image selected by an operator may be presented at or relatively near high definition quality (i.e., without relative degradation of image quality). Further, by providing exemplary embodiments with image capture frame rates greater than a predetermined broadcast frame rate, the extracted video is smooth and clear, without edgy or blurred frames. Finally, various exemplary embodiments utilizing enhanced GUI features, such as automatic tracking of subjects of interest and plural GUIs or extraction windows for one or plural captured images (for different points of view), provide advantageous production flexibility.
It will be apparent to those skilled in the art that, while exemplary embodiments have been shown and described, various modifications and variations can be made to the invention disclosed herein without departing from the spirit or scope of the invention. Also, the exemplary implementations described above should be read in a non-limiting fashion, both with regard to construction and methodology. Accordingly, it is to be understood that the various embodiments have been described by way of illustration and not limitation.
The present application claims priority to U.S. Provisional Patent Application Ser. No. 61/515,549 filed Aug. 5, 2011; and, U.S. Provisional Patent Application Ser. No. 61/563,126 filed Nov. 23, 2011, the entire contents of which are specifically incorporated by reference herein.
Number | Date | Country
---|---|---
61515549 | Aug 2011 | US
61563126 | Nov 2011 | US