Devices and methods for optically detecting and tracking inertially fixed objects and moving objects are provided.
Global positioning systems (GPS) are widely used for navigation, including for ships, aircraft, and missiles. However, these systems are vulnerable to interference and have other shortcomings. Their space components are subject to hostile attack, and the systems may be jammed (i.e., GPS denied). The systems also suffer from reliability failures, and they do not provide the data needed for attitude determination. For centuries, navigators have used the sky, particularly at night, as the most fundamental and accurate inertial system available, in which each star is a benchmark. Cataloged positions and motions of the stars define the celestial reference frame. The problem is that stars are hard to see during the day. Therefore, there remains a need for a reliable and accurate backup to GPS systems.
Today, star tracker devices play a key role in guidance and control systems. In particular, effective methods for day and night tracking of both stars and resident space objects (e.g., satellites) are a key enabler for navigation in GPS-denied environments. A star tracker is fundamentally a camera that images a star field and computes and reports the direction toward each star. From these data, the tracker (or its host) can determine its attitude. Like all components used in space missions or on airborne vehicles, there is continuous pressure to reduce the size, weight, power, and cost (SWAP-C) of these components and to increase their lifetime without compromising performance. A tracker must be rugged enough to survive the stresses of launch or deployment and then function for many years in the extreme temperatures and radiation encountered in harsh environments. Star trackers are typically mounted on the external surface of a spacecraft or vehicle bus and are not shielded from the environment.
In a typical implementation of a star tracker incorporating a digital image sensor, the sensor includes an array of pixels that is used to obtain an image from within a field of view of the device defined by the size of the sensor and the associated imaging optics. Within the image, the location of a tracked satellite relative to identified stars, together with the line of sight of the device, enables a relative location of the platform carrying the star tracker device to be determined. However, even with state-of-the-art digital image sensors, detecting dim objects remains a challenge, particularly during the daytime. In addition, the accurate registration of objects in view is a challenge, particularly in those cases in which the optical image sensor is moving. Similar problems with tracking and registration are present where one or more of the imaged objects is moving relative to the image sensor. In addition to star trackers, image sensors in the form of cameras used to obtain images of various scenes can benefit from an improved signal-to-noise ratio in order to detect dim or poorly illuminated objects. However, as with star trackers, the accurate registration of objects within a field of view is a challenge.
Embodiments of the present disclosure provide systems and methods for day and night detection and tracking of stationary and moving objects in digital video streams. Moreover, embodiments of the present disclosure enable the detection and tracking of objects in a highly effective and computationally efficient manner, via a camera that may itself be moving. This includes day and night tracking of targets, including but not limited to stars and resident space objects (RSOs, e.g., satellites), for navigation in GPS-denied environments, as well as the more general case of terrestrial objects (e.g., tanks), seaborne objects (e.g., surface ships and submarines), and other airborne objects. An optical image sensor in accordance with embodiments of the present disclosure functions to detect objects, including dim objects, within an image area. The optical image sensor operates by taking multiple frames (images) of a scene. A processor included in or associated with the optical image sensor aggregates the images until objects are detected within the scene. Each detected object is separately tracked by processing pixel data near the image location of the detection. In particular, instead of treating each image of the video stream as monolithic and registering it to a previous image in the stream (i.e., estimating and compensating for image-to-image motion), an optical image sensor in accordance with embodiments of the present disclosure separates each image of the video stream into spatially separate and distinct sub-images, also referred to herein as registration frames or subframes, and registers the subframes with respect to the movement of the object of interest within that subframe stream. These subframe streams can then be coherently averaged independently, improving the detectability and tracking of all objects within the video stream, without the compromise required if the image stream were registered as a single stream.
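By way of a non-limiting illustration of the subframe registration and coherent averaging described above, the following Python sketch shows how a sequence of full frames might be reduced to a single coherently averaged subframe for one tracked object, given an externally supplied prediction of the object's pixel location in each frame. The names (e.g., coadd_object_subframes, frames, predicted_centers) and the fixed square subframe size are assumptions for illustration only, not elements of the disclosed implementation.

```python
import numpy as np

def coadd_object_subframes(frames, predicted_centers, half_size=16):
    """Coherently average the subframe stream for one tracked object.

    frames            : iterable of 2-D arrays (the full video frames)
    predicted_centers : (row, col) predicted object position in each frame,
                        supplied by an external motion model
    half_size         : half-width of the square subframe, in pixels
    """
    accum = np.zeros((2 * half_size, 2 * half_size), dtype=np.float64)
    count = 0
    for frame, (row, col) in zip(frames, predicted_centers):
        r, c = int(round(row)), int(round(col))
        sub = frame[r - half_size:r + half_size, c - half_size:c + half_size]
        if sub.shape != accum.shape:          # subframe partially off the detector
            continue
        accum += sub                          # registered to this object's motion
        count += 1
    return accum / max(count, 1)              # coherent average; SNR grows ~sqrt(count)
```

Running the same frame sequence through one such accumulation per tracked object, each with its own predicted motion, corresponds to combining the single video stream differently for each object of interest.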
Embodiments of the present disclosure are directed to an image processing system, referred to herein as an image sensor system. The image sensor system can be used to detect and to track point-sources that move, one relative to another, differently across a field of view. The system includes a camera and associated read-out electronics that produce a time-tagged sequence of digital images. The image sensor system processes the images, each of short enough exposure so that streaking due to apparent motion of each tracked object is not much greater than the width of the point-spread function of the image sensor. The brevity of the exposure can result in a signal associated with a tracked object being undetectable above the noise in a single image. In accordance with embodiments of the present disclosure, the image sensor system can detect faint objects that move differently by combining the same sequence of images differently for each object. In accordance with further embodiments of the present disclosure, sub-areas or subframes of the full frame images are established for each tracked object, and sequences of the subframes are combined for detection and tracking of the associated object.
The apparent motion of a tracked object across the field of view is due in part to the rotation of the camera itself and in part to the rotation of the object about the camera. The image sensor system includes a gyroscope in order to account for the rotation of the camera itself. The image sensor system also includes a command-interface for ingesting both a list of inertially fixed objects to track and a list of non-inertial objects to track; each of the latter requires an estimate of its expected rotation about the camera. The system provides a refined measurement of the apparent relative angular motions of the tracked objects.
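As a purely illustrative sketch of the kind of track lists the command interface might ingest, the inertially fixed entries need only a catalogued direction, while the non-inertial entries additionally carry an estimate of the expected rotation about the camera. The dataclasses, field names, and example values below are assumptions, not the disclosed interface.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class InertialTarget:
    name: str
    right_ascension_deg: float                 # catalogued inertial direction
    declination_deg: float

@dataclass
class NonInertialTarget:
    name: str
    right_ascension_deg: float
    declination_deg: float
    expected_rate_deg_s: Tuple[float, float]   # estimated rotation about the camera

# The command interface ingests both lists (all names and values are illustrative).
track_list = [
    InertialTarget("STAR-001", 219.9, -60.8),
    NonInertialTarget("RSO-001", 101.3, 22.5, (0.05, -0.01)),
]
```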
In accordance with embodiments of the present disclosure, the image sensor system can be operated to provide GPS-like geolocation by having the image-processing system simultaneously track stars and artificial satellites. The position of the image sensor system near Earth can then be calculated from the tracking data. The image sensor system can also be applicable to metrology, whether in the Earth's atmosphere or in outer space, and can be adapted to the tracking of reference-fiducials other than stars.
The design of the system supports daytime and nighttime operation within the Earth's atmosphere. Short exposure times during the day not only limit streaking due to angular motion but also limit saturation of the detector from the bright sky.
In accordance with embodiments of the present disclosure, the objects identified by the optical image sensor include inertially stationary objects (e.g., stars) and moving objects (e.g., resident space objects (RSOs), terrestrial objects (e.g., tanks), seaborne objects (e.g., submarines), and airborne objects). An optical image sensor in accordance with at least some embodiments of the present disclosure can implement a star tracker. By identifying, tracking, and determining a relative angle to stationary or moving objects having known absolute locations, attitude and location information can be derived by the star tracker, which can, for example, enable navigation in GPS-denied environments. Alternatively or in addition, an optical image sensor in accordance with further embodiments of the present disclosure can implement a surveillance camera that is operated to identify and track one or more stationary or moving objects within a scene where the objects being identified or tracked do not have known absolute locations.
Additional features and advantages of embodiments of the present disclosure will become more readily apparent from the following description, particularly when considered together with the accompanying drawings.
The optical image sensor system 106 images a plurality of stationary objects 112, such as stars, and moving objects 114, such as satellites or other resident space objects (RSOs), within a field of view 116 of the image sensor system 106. The field of view 116 is associated with a line of sight or boresight 118. Although depicted with a single field of view 116, an image sensor system 106 can have multiple fields of view 116. Alternatively or in addition, a platform 104 can be associated with multiple image sensor systems 106 having the same or different fields of view 116. As described herein, the image sensor system 106 enables attitude and geolocation determinations with associated time tags, and with registration of pixels within a frame relative to an inertial reference frame (IRF), such as but not limited to an Earth centered inertial (ECI) coordinate frame. Moreover, in accordance with embodiments of the present disclosure, the motion of the image sensor system 106 is sensed to enable stacking of multiple image frames collected by the image sensor system 106 in order to significantly boost the signal-to-noise ratio (SNR) of the object image, allowing the detection of objects 112 and 114, including in daylight or other noisy conditions.
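The benefit of stacking can be illustrated numerically. The short sketch below (with assumed, illustrative values) models a per-frame object signal well below the per-frame background noise; averaging N frames of uncorrelated noise improves the signal-to-noise ratio by roughly the square root of N.

```python
import numpy as np

rng = np.random.default_rng(0)
per_frame_signal = 0.5          # object signal per frame (arbitrary units, assumed)
noise_sigma = 5.0               # per-frame background noise (same units, assumed)
n_frames = 400

# Simulated per-frame measurements of one pixel containing the object.
measurements = per_frame_signal + noise_sigma * rng.standard_normal(n_frames)

single_frame_snr = per_frame_signal / noise_sigma                      # 0.1: undetectable
stacked_snr = measurements.mean() / (noise_sigma / np.sqrt(n_frames))  # ~2: detectable
print(single_frame_snr, stacked_snr)
```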
The lens assembly 232 is oriented along the boresight 118, and collects light from within the field of view 116. The collected light is selectively passed to the array of pixels 244 by the shutter 236, which can be operated to define the exposure time. In particular, the sensor assembly 204 can be operated such that the exposure times are sufficiently short to avoid the smearing of point light sources across the image sensor pixels 244. The amount of collected light passed to the detector 240 during an exposure period can also be controlled by varying the size of the aperture 238. The sensor assembly 204 can include or can be associated with driver and analog to digital conversion (ADC) circuitry 246, enabling the sensor assembly 204 to provide a digital output representative of an amplitude or intensity of light detected at each pixel 244 within the detector 240.
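As a rough, illustrative way to bound the exposure time, the streak produced during one exposure can be kept comparable to the point-spread-function width by limiting the exposure to the streak budget divided by the apparent angular rate. The helper name and the example numbers below are assumptions, not values from the disclosure.

```python
import math

def max_exposure_s(psf_fwhm_px, ifov_urad_per_px, apparent_rate_deg_s):
    """Illustrative upper bound on exposure time so that the streak produced
    by apparent motion stays comparable to the point-spread-function width."""
    rate_urad_s = math.radians(apparent_rate_deg_s) * 1e6   # deg/s -> microradians/s
    streak_budget_urad = psf_fwhm_px * ifov_urad_per_px     # allowed streak length
    return streak_budget_urad / rate_urad_s

# Example (assumed numbers): a 2-pixel PSF, 50 microradians per pixel, and a
# 1 deg/s apparent rate give roughly a 5.7 ms exposure budget.
print(max_exposure_s(2.0, 50.0, 1.0))
```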
The optical image sensor system 106 processor 208 can include one or more general purpose programmable processors, graphics processing units (GPUs), vector processors, array processors, field programmable gate arrays (FPGA), controllers, or other processing device or set of devices capable of executing instructions for operation of the optical image sensor system 106, including operation and control of the sensor assembly 204 and the registration and aggregation of subframe images as described herein. The instructions executed by the processor 208 can be stored as application programming 224 in the memory 212 and/or data storage 216. The memory 212 can include one or more volatile or nonvolatile solid-state memory devices, such as but not limited to RAM, SDRAM, or the like. The data storage 216 can include one or more mass storage devices, such as, but not limited to, a hard disk drive, an optical storage device, a solid-state drive, or the like. In addition to providing storage for the application programming 224, the memory 212 and/or the data storage 216 can store intermediate or final data products or other data or reference information 228. In the case of a star tracker 108 embodiment, the memory 212 and/or the data storage 216 of the optical image sensor system 106 can store reference information 228 in the form of an object catalog database, navigational information, a star database or catalog, RSO ephemeris data, and image data. In addition, the memory 212, data storage 216, and/or memory or data storage included in the sensor assembly 204 can store detector performance parameter data.
As can be appreciated by one of skill in the art after consideration of the present disclosure, smearing of a collected image can be minimized or avoided by using a sufficiently short exposure time. However, a short exposure time can result in an inability to distinguish an object 112, 114 from noise within a given image frame. In order to increase the signal-to-noise ratio, multiple images can be taken and summed or aggregated. However, the collection of multiple frames of image data, even where the frame rate is relatively high, will be accompanied by some movement of the sensor assembly 204 relative to the Earth centered inertial (ECI) coordinate frame, which will in turn result in smearing within the aggregate image. An example motion trajectory 304 of the detector 240 during a period of time is depicted in
With reference now to
As can be appreciated by one of skill in the art after consideration of the present disclosure, an image sensor system 106 implementing a star tracker 108 must track multiple stationary objects 112 in order to provide attitude or location information. In addition, an image sensor system 106 in accordance with embodiments of the present disclosure and implementing a star tracker 108 is capable of tracking multiple stationary objects 112 and multiple moving objects 114 simultaneously. This is at least in part enabled by establishing a plurality of registration frames 504. More particularly, one registration frame 504 can be established for each tracked object 112, 114 around or encompassing an area of the celestial sphere 502 at which the object 112, 114 is expected to be located. Each registration frame 504 within the reference frame 502 is mapped to a subframe 512 within the array of detector pixels 244 using texture mapping. As a result, an image 508 of each object 112, 114 is tracked within a corresponding subframe or subset 512 of pixels 244 of the detector 240. For example, a first object 112a located within an area of the celestial sphere 502 corresponding to a first registration frame 504a appears as an imaged object 508a within a first subframe 512a; a second object 112b located within an area corresponding to a second registration frame 504b appears as an imaged object 508b within a second subframe 512b; and a third object 114 located within an area corresponding to a third registration frame 504c appears as an imaged object 508c within a third subframe 512c. As discussed in greater detail elsewhere herein, the establishment of an individual registration frame 504 for each tracked object 112, 114, and a corresponding subframe or area 512 on the detector 240, can result in a simplified processing procedure or algorithm. In addition, and as also discussed in greater detail elsewhere herein, this enables or facilitates the establishment of multiple parallel processes, with one process established for each tracked object 112, 114. Moreover, the establishment of individual registration frames 504 for each tracked object 112, 114 can facilitate the tracking of objects 112, 114 moving at different rates relative to the image sensor system 106.
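One simple way to picture the mapping from a registration frame 504 on the celestial sphere to a subframe 512 on the detector is a rotation of the catalogued direction into the camera frame followed by a pinhole projection. The sketch below is an illustrative stand-in for the texture mapping described above; the function names, the pinhole model, and the camera-axis conventions are assumptions.

```python
import numpy as np

def eci_to_pixel(unit_vec_eci, R_cam_from_eci, focal_length_px, center_px):
    """Rotate a catalogued inertial direction into the camera frame and project
    it onto the detector with a pinhole model (illustrative only)."""
    v_cam = R_cam_from_eci @ unit_vec_eci          # camera attitude, e.g., from the IMU
    x, y, z = v_cam
    if z <= 0:
        return None                                # direction is outside the field of view
    col = center_px[1] + focal_length_px * x / z
    row = center_px[0] + focal_length_px * y / z
    return row, col

def subframe_bounds(row, col, half_size=16):
    """Detector-pixel bounds of the subframe corresponding to a registration frame."""
    return (int(row) - half_size, int(row) + half_size,
            int(col) - half_size, int(col) + half_size)
```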
For example, and as depicted in
As previously noted, the registration frames 504 within the reference frame in which the locations of the objects 112, 114 are mapped or catalogued can be mapped to corresponding subframe 512 areas on the detector 240. Accordingly, and as depicted in
The locations of the subframes 512 corresponding to the registration frames 504 at least approximately correspond to and encompass the expected locations of objects 112 and 114 catalogued within the star tracker data 228. As previously noted, the approximate locations can be determined from, for example, attitude and location information obtained from the IMU 312 of the star tracker sensor assembly 204, with reference to a star catalog or RSO ephemeris data stored as part of the star tracker data 228. A subframe 512 may have an area that covers the same or a different number of pixels as any other subframe 512. Moreover, a subframe 512 may be approximately centered around an object 112 or 114 being tracked within that subframe 512. Furthermore, less than the entire area of a subframe 512 may be included within any one image frame 404.1, for example as depicted by subframe 512c.1, tracking stationary object 112c. Moreover, different subframes 512 may have different shapes and/or dimensions.
In accordance with embodiments of the present disclosure, the image data encompassing a selected object 112 or 114 within a series of subframes 512 for the respective object 112 or 114 acquired in connection with a number of image frames 404 taken at different times can be aggregated to enable the centroid of the object 112 or 114 to be accurately located, even in daylight or other noisy conditions. Moreover, by utilizing subframes 512, embodiments of the present disclosure enable the detection and identification of an object 112 or 114, and the detection of the centroid of that object 112 or 114, to be performed quickly and efficiently.
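For illustration, the centroid of an object within an aggregated subframe can be estimated with a simple intensity-weighted center of mass after background removal. The sketch below uses illustrative names and a median-based background estimate that are assumptions, not the specific centroiding method of the disclosure.

```python
import numpy as np

def centroid(aggregate_subframe):
    """Intensity-weighted centroid (center of mass) of an aggregated subframe,
    after a crude median background removal (illustrative estimator)."""
    img = aggregate_subframe - np.median(aggregate_subframe)
    img = np.clip(img, 0.0, None)
    total = img.sum()
    if total == 0:
        return None                                # nothing detected above background
    rows, cols = np.indices(img.shape)
    return (rows * img).sum() / total, (cols * img).sum() / total
```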
At step 812, a registration frame 504 is established for each selected object 112, 114. Each registration frame 504 area encompasses the approximately known location of the selected objects 112, 114. At step 816, a processing chain is established for processing data associated with each of the registration frames 504. In each processing chain, an area or subframe 512 of pixels 244 on the detector 240 corresponding to an area encompassed by a respective registration frame 504 is registered and data within that subframe 512 is accumulated (steps 820a-n). A determination is then made as to whether a selected number of frames 404 have been collected (step 824). The selected number of frames can be a predetermined number, a number based on expected ambient light conditions, a number based on actual light conditions, or the like. If additional frames 404 are to be collected, the process returns to step 826, where an additional frame 404 of image data is acquired, and information from the image is provided to the individual processing chains. Once the selected number of images have been collected, a centroid location for the object 112, 114 tracked in each processing chain is provided as an output (step 828). The process can then end.
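The flow of steps 812 through 828 can be summarized in an illustrative outline. In the sketch below, the per-object dictionary, the "predict" callable supplying each object's expected pixel location, and the subframe half-size are assumptions introduced for illustration.

```python
def extract_subframe(frame, row, col, half_size=16):
    """Cut the block of pixels around a predicted object location."""
    r, c = int(round(row)), int(round(col))
    return frame[r - half_size:r + half_size, c - half_size:c + half_size]

def track_objects(frame_source, objects, n_frames_needed=200):
    """One processing chain (a running sum of registered subframes) per object."""
    sums = {obj["name"]: None for obj in objects}
    for i, frame in enumerate(frame_source):
        for obj in objects:
            row, col = obj["predict"](i)            # per-object expected pixel location
            sub = extract_subframe(frame, row, col)
            prev = sums[obj["name"]]
            if prev is not None and sub.shape != prev.shape:
                continue                            # subframe partially off the detector
            sums[obj["name"]] = sub if prev is None else prev + sub
        if i + 1 >= n_frames_needed:                # selected number of frames collected
            break
    return sums                                     # each sum then goes to centroiding
```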
As discussed herein, the coherent summation or aggregation of images allows image features corresponding to objects 112, 114 to be distinguished from noise. In addition, processing select areas or subframes 512 of a full image frame allows processing resources to be concentrated on areas containing objects of interest, and furthermore enhances the ability to accurately track a plurality of objects 112, 114, including objects that are themselves moving within the reference frame 502.
As can be appreciated by one of skill in the art, a digital image is an array of discrete and independent points, where each point or pixel of a detector represents a specific point in a scene or image and has an associated brightness. Taken together, the array of pixels 244 describes the entire scene, which may contain any number of objects 112, 114, such as stars or space objects. More particularly, each pixel or photosensitive site 244 responds to the amount of light it receives by producing an amount of charge, which is converted to a numeric value corresponding to the brightness of the corresponding point in the scene. However, an individual detector 240 may have performance characteristics that differ from other otherwise similar detectors, and individual pixels 244 within a detector 240 can have performance characteristics that differ from other pixels 244 within the detector 240. The performance characteristics of detector 240 pixels 244 can include whether a particular pixel provides reliable and useful data, which is expressed by a “good detector map”; a signal or noise level produced by a detector even when it is not exposed to light, which is expressed by a “pixel dark variance map”, where variance describes the amount of noise produced by the pixel; and the maximum quantity of light that a pixel 244 can accurately measure, expressed by a “detector saturation map”. These attributes can be measured and cataloged for each pixel 244 of a detector 240 prior to deployment of the detector 240 in an operational image sensor system 106. In addition, these attributes can be referenced in association with detector output provided as part of collected image frame data, to improve the quality of the image data related to an object 112, 114. Moreover, in accordance with embodiments of the present disclosure, these attributes can be processed across localized areas of the detector 240 corresponding to the subframes 512 established for imaging a number of tracked objects 112, 114 simultaneously, in a number of parallel processes.
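As an illustrative example of how the three per-pixel maps might be applied to a single subframe (the function and argument names are assumptions), a pixel is used only if the good detector map marks it usable and its raw value is below the corresponding entry of the detector saturation map, while the pixel dark variance map supplies a per-pixel noise estimate.

```python
import numpy as np

def screen_subframe(raw_sub, good_map_sub, dark_variance_sub, saturation_sub):
    """Apply the per-pixel calibration maps to one raw subframe (illustrative).

    good_map_sub      : boolean mask from the good detector map
    dark_variance_sub : per-pixel noise variance from the pixel dark variance map
    saturation_sub    : per-pixel limits from the detector saturation map
    """
    usable = good_map_sub & (raw_sub < saturation_sub)   # drop bad or clipped pixels
    screened = np.where(usable, raw_sub, 0.0)            # zero-fill excluded pixels
    noise_var = np.where(usable, dark_variance_sub, np.inf)
    return screened, usable, noise_var
```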
As previously discussed, registration frames 504 are established around the expected areas of objects 112, 114 selected for detection and tracking. These registration frames 504 are translated to subframes 512 encompassing corresponding areas of an image sensor system 106 detector 240. In accordance with embodiments of the present disclosure, sub-areas 916, 918, and 920 of the respective precomputed data maps 904, 908, and 912 that fall within the established subframe 512 areas of the detector 240 are provided as inputs to parallel data processing streams 924. Thus, for a first RSO 114, the subframe image data 512.1, good pixel map sub-area data 916.1, dark pixel variance map sub-area data 918.1, and pixel saturation map sub-area data 920.1 are all provided to a first registration and summation process 924.1. Similarly, for a first object 112.1, the subframe image data 512.2, good pixel map sub-area data 916.2, dark pixel variance map sub-area data 918.2, and pixel saturation map sub-area data 920.2 are all provided to a second registration and summation process 924.2. In addition, for an nth object 112.n, the subframe image data 512.n, good pixel map sub-area data 916.n, dark pixel variance map sub-area data 918.n, and pixel saturation map sub-area data 920.n are all provided to an nth registration and summation process 924.n. The sub-area 916, 918, and 920 data and the subframe 512 data associated with each object 112, 114 are then combined and summed for detection and tracking. As can be appreciated by one of skill in the art after consideration of the present disclosure, the processing of image and detector attributes for subframe areas can reduce the amount of data required to be processed, and can facilitate the tracking of objects 112, 114 moving relative to one another within larger frames 404 of data. With reference now to
At steps 1018, 1020, and 1024, the subarray data 916.n, 918.n, and 920.n from the maps 904, 908, and 912, and at step 1028 the subframe data 512 from the image subframes 512, are registered. In accordance with embodiments of the present disclosure, the registration can be performed by applying data from the IMU 248. The registered sub-areas are summed at steps 1032, 1036, and 1048, respectively, and the registered subframes 512 are summed at step 1044. At step 1052, the summed image subframe 512 output from the image summation step 1044 is divided by the summed good detector subarea output from the good detector summation step 1036 (i.e., the number of pixel measurements contributing to the aggregate image) to provide a normalized image subframe output. This can be provided to a detection process 1060 and to further processes for detection and tracking operations. The output from the pixel dark variance summation step 1032 can also be normalized by dividing it by the output of the good detector summation step 1036 (step 1056), and the result can be provided for use by the detection process 1060 or other processes. In addition, the output from the detector saturation summation step 1048 can be divided by the output of the good detector summation step 1036 (step 1058), and the result can be provided to a dynamic range control process 1064, which can be used to adjust the gain of the detector 240, the exposure parameters, or the like. Different instances of this process can be performed simultaneously for other objects 112, 114 located within separate or overlapping subframes 512 established across some or all of the same sequence of full frame images.
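The normalization described above can be expressed compactly: each running sum is divided, element-wise, by the summed good detector map, i.e., by the number of valid pixel measurements contributing at each location. The sketch below is illustrative only; the function and variable names are assumptions.

```python
import numpy as np

def normalize_sums(image_sum, good_sum, dark_variance_sum, saturation_sum):
    """Element-wise normalization of the per-object running sums, where good_sum
    counts the valid pixel measurements contributing at each location."""
    denom = np.maximum(good_sum, 1)                        # avoid division by zero
    valid = good_sum > 0
    norm_image = np.where(valid, image_sum / denom, 0.0)             # to detection/tracking
    norm_variance = np.where(valid, dark_variance_sum / denom, np.inf)
    saturation_fraction = np.where(valid, saturation_sum / denom, 0.0)  # to gain control
    return norm_image, norm_variance, saturation_fraction
```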
In accordance with embodiments of the present disclosure, the acquisition of image frames 404 and the processing of image data within subframes 512, including processing that includes the application of sub-areas of mapped detector 240 data can be performed simultaneously in parallel processing streams. Moreover, the different processing streams can use the same IMU 248 attitude and quaternion data. As can be appreciated by one of skill in the art after consideration of the present disclosure, embodiments of the present disclosure establish different sub-image or subframe 512 areas for each object 112 or 114 tracked by the optical image sensor system 106. Moreover, the subframe 512 areas track the different objects 112 and 114 separately and simultaneously. By using subframes 512 to track the different objects 112 and 114, tracking can be performed with greater accuracy and over a wider area within a field of view 116 of the optical image sensor system 106. In addition, tracking of one or more moving objects 114, even in daylight conditions, is possible.
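One possible, purely illustrative arrangement of the parallel processing streams is a thread pool with one worker per tracked object, each worker running an independent registration-and-summation chain over the shared frame and attitude data. The run_chain callable and the object dictionaries below are assumptions, not the disclosed architecture.

```python
from concurrent.futures import ThreadPoolExecutor

def run_all_chains(run_chain, frames, attitude_quaternions, objects):
    """Run one independent registration-and-summation chain per tracked object,
    all consuming the same frame stream and IMU attitude history (illustrative).

    run_chain : callable(frames, attitude_quaternions, obj) -> per-object result
    """
    with ThreadPoolExecutor(max_workers=max(len(objects), 1)) as pool:
        futures = {obj["name"]: pool.submit(run_chain, frames, attitude_quaternions, obj)
                   for obj in objects}
        return {name: future.result() for name, future in futures.items()}
```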
In accordance with further aspects of the present disclosure, multiple stationary objects 112 can be located in order to determine the attitude of the optical image sensor system 106. In addition, objects 114 moving along known paths can be located and tracked in order to geolocate the optical image sensor system 106. More particularly, geolocation information can be determined by determining a location of two or more moving objects 114 traversing known paths, or by determining a location of a single moving object 114 traversing a known path at two or more points in time.
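As a simplified illustration of geolocation from tracked objects on known paths (not the disclosed algorithm), each measured bearing to an object whose position is known defines a line in space that passes through the observer; with two or more such lines, the observer position can be estimated as the least-squares nearest point to all of the lines. The function name and array layout below are assumptions.

```python
import numpy as np

def geolocate(object_positions, bearings):
    """Least-squares observer position from bearings to objects whose positions
    are known (simplified illustration).

    object_positions : (N, 3) known positions (e.g., ECI coordinates)
    bearings         : (N, 3) measured unit vectors from the observer to each object
    """
    # Each bearing defines a line p(t) = position - t * u that passes through the
    # observer; the estimate minimizes the summed squared distance to all lines.
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for pos, u in zip(np.asarray(object_positions), np.asarray(bearings)):
        u = u / np.linalg.norm(u)
        P = np.eye(3) - np.outer(u, u)      # projector orthogonal to the line direction
        A += P
        b += P @ pos
    return np.linalg.solve(A, b)
```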
Embodiments of the present disclosure enable images of dim objects 112, 114 to be obtained. More particularly, an optical image sensor system 106 with an improved SNR can be provided by stacking or aggregating multiple subframe images 512. In accordance with further embodiments of the present disclosure, a location of at least some of the subframe images 512 relative to other subframe images acquired at the same time can be changed in order to track moving objects 114. At least some embodiments of the present disclosure can be operated in conjunction with other instruments, for example in an initial step of determining an area encompassed by a subframe 512.
The foregoing discussion of the disclosed systems and methods has been presented for purposes of illustration and description. Further, the description is not intended to limit the disclosed systems and methods to the forms disclosed herein. Consequently, variations and modifications commensurate with the above teachings, within the skill or knowledge of the relevant art, are within the scope of the present disclosure. The embodiments described herein are further intended to explain the best mode presently known of practicing the disclosed systems and methods, and to enable others skilled in the art to utilize the disclosed systems and methods in such or in other embodiments and with various modifications required by the particular application or use. It is intended that the appended claims be construed to include alternative embodiments to the extent permitted by the prior art.
This application claims the benefit of U.S. Provisional Patent Application Ser. No. 62/892,867, filed Aug. 28, 2019, the entire disclosure of which is hereby incorporated herein by reference.