This application is related to:
U.S. application Ser. No. 15/076,701 filed on Mar. 22, 2016, entitled “Method and system for surveillance camera arbitration of uplink consumption,” now U.S. Patent Publication No.: 2017/0278368 A1;
U.S. application Ser. No. 15/076,703 filed on Mar. 22, 2016, entitled “Method and system for pooled local storage by surveillance cameras,” now U.S. Patent Publication No.: 2017/0280102 A1;
U.S. application Ser. No. 15/076,704 filed on Mar. 22, 2016, entitled “System and method for designating surveillance camera regions of interest,” now U.S. Patent Publication No.: 2017/0277967 A1;
U.S. application Ser. No. 15/076,705 filed on Mar. 22, 2016, entitled “System and method for deadzone detection in surveillance camera network,” now U.S. Patent Publication No.: 2017/0278366 A1;
U.S. application Ser. No. 15/076,708 filed on Mar. 22, 2016, entitled “System and method for retail customer tracking in surveillance camera network,” now U.S. Patent Publication No.: 2017/0278137 A1;
U.S. application Ser. No. 15/076,709 filed on Mar. 22, 2016, entitled “Method and system for modeling image of interest to users,” now U.S. Patent Publication No.: 2017/0277785 A1;
U.S. application Ser. No. 15/076,710 filed on Mar. 22, 2016, entitled “System and method for using mobile device of zone and correlated motion detection,” now U.S. Patent Publication No.: 2017/0280103 A1;
U.S. application Ser. No. 15/076,712 filed on Mar. 22, 2016, entitled “Method and system for conveying data from monitored scene via surveillance cameras,” now U.S. Pat. No. 9,965,680;
U.S. application Ser. No. 15/076,713 filed on Mar. 22, 2016, entitled “System and method for configuring surveillance cameras using mobile computing devices,” now U.S. Patent Publication No.: 2017/0278365 A1;
and
U.S. application Ser. No. 15/076,717 filed on Mar. 22, 2016, entitled “System and method for controlling surveillance cameras,” now U.S. Patent Publication No.: 2017/0280043 A1.
All of the aforementioned applications are incorporated herein by this reference in their entirety.
Surveillance camera systems are often deployed to collect image data within or around premises. Examples of premises include governmental buildings, office buildings, retail establishments, and single and multi-unit residences. The cameras are typically installed to monitor and detect individuals and/or activities at different locations in and around the premises.
A successful installation of surveillance camera systems requires careful consideration of several factors. The designers/installers select the locations in which to install the cameras, select the type of camera that is best suited for each location, and then position the cameras' fields of view to capture scenes at each location. For example, point of sale areas might require one or more ceiling mounted, dome style cameras to capture transaction-related activities within the locations. For monitoring large open areas such as shopping malls, open-floor plan offices, and parking lots, either panoramic view (e.g. “fish eye”) cameras or pan-tilt-zoom (PTZ) cameras are often utilized because of each camera's ability to provide wider fields of view and to scan the areas, respectively. Designers/installers might also position the fields of view of different surveillance cameras to overlap, and also position the field of view of one camera to include another surveillance camera. These actions provide different views or perspectives of the same scene and the ability to capture attempts at tampering with the surveillance cameras.
Analytics systems are often part of surveillance camera systems. At a basic level, the analytics systems provide the ability to detect and track individuals and objects within the image data of the monitored scenes. Other capabilities include the ability to determine motion of objects relative to visual cues superimposed upon the image data and to search for specific behaviors of interest within the image data. The visual cues are often placed near fixed objects in the background scene of the image data to infer motion of objects relative to the visual cues. In one example, virtual tripwire visual cues can be located near entryways within the scene to detect entry or exit of individuals through the entryways and to provide a count of the individuals passing through the entryway over a specific time period. These analytics systems can provide both real-time analysis of live image data and forensic analysis of previously recorded image data.
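For illustration only, the following Python sketch shows one way a virtual tripwire count might be computed from tracked object positions; the tripwire location, the track format, and the function names are assumptions for the example and are not taken from this disclosure.

```python
# Illustrative sketch only: counting objects that cross a virtual tripwire.
# The tripwire column and the track format are assumed for the example.

TRIPWIRE_X = 320  # assumed pixel column of a vertical tripwire near an entryway

def count_tripwire_crossings(tracks):
    """tracks: dict mapping track_id -> list of (frame_index, x, y) centroids."""
    crossings = 0
    for track_id, points in tracks.items():
        for (_, x0, _), (_, x1, _) in zip(points, points[1:]):
            # A crossing occurs when consecutive centroids of the same track
            # fall on opposite sides of the tripwire column.
            if (x0 - TRIPWIRE_X) * (x1 - TRIPWIRE_X) < 0:
                crossings += 1
    return crossings

if __name__ == "__main__":
    example_tracks = {
        1: [(0, 300, 240), (1, 315, 242), (2, 330, 241)],  # crosses left to right
        2: [(0, 400, 200), (1, 395, 205), (2, 390, 210)],  # never crosses
    }
    print(count_tripwire_crossings(example_tracks))  # prints 1
```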
It would be beneficial to determine overlap among fields of view of surveillance cameras during the installation of the surveillance cameras using an analytics system, for example. In contrast, installers of current surveillance camera systems typically use an “educated guess” approach when installing surveillance cameras to provide the desired level of overlap among the fields of view, where the experience of the installer is paramount to achieving this objective.
It would also be beneficial to infer overlap among fields of view of surveillance cameras from image data captured by the surveillance cameras. Such a capability could allow system operators to better interpret image data from different cameras. Moreover, the analytics systems could use this information to present image data to operators in a way that makes its context easier to grasp.
The present invention provides for the ability to analyze image data from multiple surveillance cameras. It does this analysis using a mobile computing device such as a smartphone or tablet computing device or even a laptop computer. These modern devices have excellent image data processing resources and can be used to tap the image data feeds from nearby surveillance cameras and analyze that image data to provide information on the configuration of the system as a whole.
In general, according to one aspect, the invention features a method for determining overlap for a network of surveillance cameras. The method comprises detecting motion within image data from the network of surveillance cameras, correlating detected motion among the image data from different surveillance cameras, and determining overlap of fields of view of the surveillance cameras.
In embodiments, motion is detected within image data from the network of surveillance cameras by tracking a mobile user device and/or an installer carrying the mobile user device as the mobile user device is moved along a critical path within a premises.
For example, a mobile user device might be carried along a critical path within a premises, and the motion detected within the image data correlated to determine whether the mobile user device and/or the installer is included in the image data from at least two of the surveillance cameras.
Objects can be identified within the image data and tracked across the fields of view of the surveillance cameras. Motion detection events can be generated in response to detecting motion within the image data, and the motion detection events can then be correlated.
In some embodiments, a display grid is provided that includes image data from at least two surveillance cameras, where at least one portion of an object within a scene monitored by the surveillance cameras is included within the image data, and the display grid is sent for display on a mobile user device.
In general, according to another aspect, the invention features a method for determining overlap of fields of view for surveillance cameras of a network. The method comprises defining a path within a scene monitored by the surveillance cameras via a mobile user device carried by an installer, the surveillance cameras capturing image data of the scene during the definition of the path and transferring the image data to an analytics system, and the analytics system determining overlap from the image data.
In general, according to another aspect, the invention features a method for determining overlap of fields of view for surveillance cameras. This method comprises receiving time-stamped image data from two or more surveillance cameras, analyzing the image data to determine correlated motion, and determining overlap of the fields of view based on the correlated motion.
In general, according to still another aspect, the invention features a surveillance camera system, which comprises two or more surveillance cameras generating image data of a scene and an analytics system that receives the image data from the two or more surveillance cameras and determines overlap within the scene by determining whether the image data from at least two of the surveillance cameras includes correlated motion.
In general, according to still another aspect, the invention features a surveillance camera system, comprising two or more surveillance cameras generating image data of a scene and an analytics system that receives the image data from the surveillance cameras and determines overlap within the scene by correlating detected motion among the image data from different surveillance cameras, and by determining that the correlated detected motion occurs at substantially the same time in the image data from two or more different surveillance cameras and inferring that the motion is related.
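A rough sketch of this temporal-correlation idea, assuming per-camera lists of motion-event timestamps and an illustrative tolerance for “substantially the same time,” might look as follows; the event format and the threshold are assumptions for the example, not limitations of the invention.

```python
# Illustrative sketch only: motion events from different cameras that occur at
# substantially the same time are treated as related, suggesting overlapping
# fields of view. The event format and tolerance are assumed for the example.

from itertools import combinations

TOLERANCE_S = 1.0  # assumed window for "substantially the same time"

def infer_overlaps(motion_events):
    """motion_events: dict mapping camera_id -> list of event timestamps (seconds)."""
    overlaps = {}
    for cam_a, cam_b in combinations(sorted(motion_events), 2):
        correlated = sum(
            1
            for ta in motion_events[cam_a]
            for tb in motion_events[cam_b]
            if abs(ta - tb) <= TOLERANCE_S
        )
        if correlated:
            overlaps[(cam_a, cam_b)] = correlated
    return overlaps

if __name__ == "__main__":
    events = {
        "camera1": [10.2, 14.8, 30.1],
        "camera2": [10.6, 15.1, 52.0],
        "camera3": [52.3],
    }
    print(infer_overlaps(events))
    # {('camera1', 'camera2'): 2, ('camera2', 'camera3'): 1}
```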
In general, according to still another aspect, the invention features a surveillance camera system. The system comprises a mobile user device for defining a critical path, surveillance cameras capturing image data along the critical path, and an analytics system determining overlap of fields of view of the surveillance cameras from the image data.
In general, according to still another aspect, the invention features a surveillance camera system. This system comprises surveillance cameras for generating time-stamped image data and an analytics system for determining correlated motion in the image data and determining overlap of fields of view of the surveillance cameras based on the correlated motion.
The above and other features of the invention including various novel details of construction and combinations of parts, and other advantages, will now be more particularly described with reference to the accompanying drawings and pointed out in the claims. It will be understood that the particular method and device embodying the invention are shown by way of illustration and not as a limitation of the invention. The principles and features of this invention may be employed in various and numerous embodiments without departing from the scope of the invention.
In the accompanying drawings, reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale; emphasis has instead been placed upon illustrating the principles of the invention. Of the drawings:
The invention now will be described more fully hereinafter with reference to the accompanying drawings, in which illustrative embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. Further, the singular forms including the articles “a”, “an” and “the” are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms: includes, comprises, including and/or comprising, when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Further, it will be understood that when an element, including component or subsystem, is referred to and/or shown as being connected or coupled to another element, it can be directly connected or coupled to the other element or intervening elements may be present.
Surveillance cameras 103-1 through 103-3, labeled camera1 through camera3, have fields of view 105-1 through 105-3, respectively. The cameras 103 are installed to monitor a corridor 70 within a premises 52, in one example. In the illustrated example, a doorway 66 is located at the end of the corridor 70. In another example, the cameras 103 are installed around a premises to monitor an alley 71 adjacent to the premises 52.
The surveillance cameras 103 communicate with each other over a local network 210 and with a local analytics system 222. The analytics system 222 creates an intelligent display grid 418 that includes image data 250 from at least two surveillance cameras 103. The system 10 might also include a network video recorder 228 that stores image data 250 captured by the surveillance cameras 103.
In other examples, the analytics system 222 could be a cloud based system that is accessible through a public network. In still other examples, the analytics system, as illustrated below, is integrated on one or more of the cameras or distributed over the processing resources of the several cameras and/or standalone systems. Such an analytics system could be partially or completely executing on the surveillance cameras 103.
However implemented, the analytics system 222 preferably includes or utilizes a map 180, which is an image representation of the area of the premises 52 (e.g. the corridor 70 and/or alley 71) under surveillance by the cameras 103. The installer 60 typically loads the map 180 onto the analytics system 222 after initial installation of the cameras 103 but prior to analyzing the corridor 70 and/or alley 71 for instances of overlap 90. The map 180 further preferably includes the locations of each of the cameras 103-1 through 103-3 of the network.
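As one hypothetical representation, the map 180 and the camera locations could be stored in a simple structure such as the following Python sketch; the field names, coordinate convention, and file name are assumptions for illustration only.

```python
# Illustrative sketch only: a premises map with camera placements. The field
# names, coordinates (pixels on the map image), and file name are assumed.

from dataclasses import dataclass, field
from typing import List

@dataclass
class CameraPlacement:
    camera_id: str
    map_x: int           # camera position on the map image, in pixels
    map_y: int
    heading_deg: float   # assumed direction the camera faces on the map

@dataclass
class PremisesMap:
    image_path: str
    cameras: List[CameraPlacement] = field(default_factory=list)

corridor_map = PremisesMap(
    image_path="corridor_floorplan.png",  # assumed map image loaded by the installer
    cameras=[
        CameraPlacement("camera1", 40, 120, 90.0),
        CameraPlacement("camera2", 200, 60, 180.0),
        CameraPlacement("camera3", 360, 60, 180.0),
    ],
)
```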
Each of the surveillance cameras 103-1 through 103-3 captures image data 250 of a scene within its respective field of view 105-1 through 105-3. The surveillance cameras 103 transmit their image data 250 over the local network 210 for analysis by the analytics system 222 or retain the image data 250 to determine instances of overlap 90 between the scenes/fields of view 105 of the cameras 103.
A wireless router 244 provides a wireless network 230 such as WiFi or a cellular wide area network that enables exchange of wireless messages 264 between components. The wireless router 244 also has a local network interface that connects the wireless router 244 to the local network 210.
In one implementation, an installer 60-1 holds a user mobile computing device 400, also known as user device, for communicating with the surveillance cameras 103. Examples of user devices 400 include smartphones, tablet computing devices, and laptop computers running operating systems such as Windows, Android, Linux, or IOS, in examples. Each user device 400 includes a display screen or touch screen 410 and one or more applications 412, or “apps.” The apps 412 execute upon the operating systems of the user devices 400.
The user device 400 can exchange wireless messages 264 directly with each surveillance camera 103 and/or the analytics system 222 for this purpose. Exemplary wireless messages 264-1, 264-2 and 264-5 between the user device 400 and surveillance cameras 103-1, 103-2, and 103-3 are shown. The surveillance cameras 103 also transmit their image data 250 over the wireless network 230 to the user device 400 in the wireless messages 264 via the wireless router 244 or directly via peer-to-peer connections. Bluetooth or a similar protocol could even be used. The user device 400 receives the wireless messages 264, extracts the image data 250 therein, and forwards the image data 250 to an analytics system to determine instances of overlap 90 between the scenes/fields of view 105 of the cameras 103.
It is important to note that additional embodiments of the analytics system 222 can exist in the system 10. In other examples, the analytics system 222 can be a remote analytics system maintained by a third party entity and which the surveillance cameras 103 access over a network cloud, a process running on the user devices 400, or a process integrated within one or more of the surveillance cameras 103. For the latter example, one of the cameras 103 typically functions as a master in a master/slave relationship between the cameras 103. The remaining cameras 103 functioning as slaves in this relationship transmit their image data 250 over the network 210 for analysis by the integrated analytics system of the master surveillance camera 103.
Via the wireless messages 264, user device 400 sends instructions to configure the cameras 103 and access the image data 250 on the cameras 103. The wireless messages 264 include both control and data wireless messages. In one example, data wireless messages 264 include frames of image data 250 that the surveillance cameras 103 send to the user mobile computing devices 400.
Specific examples showing how the cameras 103 might be deployed are illustrated. In one example, dome style cameras camera2 and camera3 are mounted overhead within a premises 52 to monitor corridor 70. Camera1 is a PTZ style camera mounted along a wall of corridor 70 such that the field of view 105-1 of camera1 provides a side view of the corridor 70. In another example, similar dome style cameras camera2 and camera3 are mounted overhead outside the premises 52 to monitor alley 71. In this example, camera1 might also be a PTZ style camera mounted along a wall of an adjacent building such that the field of view 105-1 of camera1 provides a side view of the alley 71.
An installer 60-1 might initially position camera1 and camera2 such that their fields of view 105-1 and 105-2 include a common portion of the scene, indicated by overlap 90-1. In a similar fashion, the installer 60-1 positions camera2 and camera3 to include a different portion of the scene in common between the fields of view 105-2 and 105-3 of camera2 and camera3, indicated by overlap 90-2. However, the initial positioning of the cameras 103 to achieve the desired overlap 90-1/90-2 or no overlap is based on an educated guess and requires verification. To determine that the desired amount of overlap 90-1/90-2 is achieved, in embodiments, the installer 60-1 utilizes the user device 400 in conjunction with the cameras 103 and the analytics system 222.
In a preferred embodiment, with respect to the corridor 70 monitored area example, the system 10 enables determination of overlap 90-1/90-2 during the installation of the cameras 103 in response to the installer 60-1 walking a critical path 54 through the monitored scene (e.g. corridor 70) while carrying the user device 400. The cameras 103 capture the installer/user device in the image data 250 of each of the cameras 103 during the traversal of the critical path 54, and send the image data 250 to the analytics system 222 to determine the overlap 90-1/90-2 based on correlating detected motion of the installer/user device among overlapping frames of image data 250.
In another embodiment, with respect to the alley 71 monitored area example, the analytics system 222 of the system 10 determines overlap 90-1/90-2 within the scene by first determining motion of objects in image data 250 of the scene. Unlike the preferred embodiment, where the motion is associated with a known and predetermined object moving within the scene in a specific manner (e.g. the installer/user device moving through the scene along the critical path 54), the objects and their expected manner of movement are not predetermined. In the example, objects such as a dog 88 and individual 60-2 are moving through the alley 71. Then, the analytics system 222 correlates the detected motion among the image data 250 from the surveillance cameras 103, determines that the correlated detected motion occurs at substantially the same time in the image data 250 from two or more different surveillance cameras 103, and infers that the motion is related and thus that the cameras have overlapping fields of view and the degree of that overlap.
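For the motion-detection step itself, a rough Python sketch using background subtraction (here with OpenCV, purely as an example tool) could emit time-stamped motion events per camera for later correlation; the frame source, threshold, and event format are assumptions for the example.

```python
# Illustrative sketch only: per-camera background subtraction over frames of
# image data, emitting time-stamped motion events for later correlation.
# The threshold and event format are assumed for the example.

import cv2

MIN_FOREGROUND_PIXELS = 500  # assumed sensitivity threshold

def detect_motion_events(camera_id, frames):
    """frames: iterable of (timestamp, image) pairs from one camera."""
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
    events = []
    for timestamp, image in frames:
        foreground = subtractor.apply(image)
        if cv2.countNonZero(foreground) > MIN_FOREGROUND_PIXELS:
            events.append({"camera": camera_id, "time": timestamp})
    return events
```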
It is also important to note that the analysis of the image data 250 provided by the analytics system 222 can either be executed in real time, or at a time after the cameras 103 are installed, in a forensics fashion. For the real time analysis, the analytics system 222 preferably receives the image data 250 over the local network 210 from the cameras 103 just after the cameras 103 capture the image data 250 of the scene. For the forensic analysis of the image data 250, the analytics system 222 can analyze previously recorded image data 250 of the scene stored on a network video recorder 228, or image data 250 stored locally within the cameras 103, in examples.
In
The camera 103 includes a processing unit (CPU) 138, an imager 140, a camera image data storage system 174 and a network interface 142. An operating system 136 runs on top of the CPU 138. A number of processes or applications are executed by the operating system 136. One of the processes is a control process 162. In some embodiments, a camera analytics system 176 process is also included within one or more of the surveillance cameras 103 in the network 210. This camera analytics system 176 can also create an intelligent display grid 418.
The camera 103 saves image data 250 captured by the imager 140 to the camera image data storage system 174, locally, and/or to the NVR 228, remotely. Each camera 103 can support one or more streams of image data 250. The control process 162 receives and sends messages 264 via its network interface 142. Each camera 103 also saves metadata 160 for the image data 250, including a timestamp 164 and camera number 166 for each frame of image data 250.
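One hypothetical per-frame metadata record combining the timestamp 164 and camera number 166 is sketched below; the field names and the JSON serialization are assumptions, not the disclosure's storage format.

```python
# Illustrative sketch only: per-frame metadata carrying a timestamp and
# camera number alongside the image data. Field names are assumed.

import json
import time
from dataclasses import dataclass, asdict

@dataclass
class FrameMetadata:
    camera_number: int
    timestamp: float    # capture time, seconds since the epoch
    frame_index: int
    stream_id: int = 0  # a camera may support more than one stream

if __name__ == "__main__":
    record = FrameMetadata(camera_number=2, timestamp=time.time(), frame_index=1042)
    print(json.dumps(asdict(record)))  # e.g. stored alongside the frame
```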
In step 502, an app 412 running on the user device 400 sends a pairing request to one or more cameras 103 to establish a communications session 308 with each of the cameras 103 and to direct the surveillance cameras 103 to enter an overlap detection mode. According to step 504, the cameras 103 send a pairing response message and enter overlap detection mode. As a result of step 504, a communications session 308 is established between each of the cameras 103 currently in overlap detection mode and the app 412, in one example. In other examples, the communication is established with the analytics systems 222 and/or 176.
In step 506, the app 412 then presents user interface buttons to start and stop definition of a critical path 54 within an area being monitored (e.g. corridor 70) by multiple surveillance cameras 103.
According to step 508, in response to selection of a “Start” button on the user interface of the app 412 by the installer 60, the app 412 starts a local timer and sends an instruction to the cameras 103 to indicate start of the critical path definition. In step 510, the installer/user device moves through the area being monitored to define a critical path 54 through the area being monitored.
In step 514, in response to receiving the “start” instruction, each of the surveillance cameras 103 sends, at regular time intervals, time-stamped image data 250 captured during the definition of the critical path 54 to an analytics system 222. According to step 516, in response to selection of a “Stop” button on the user interface of the app 412 by the installer 60, the app 412 stops the local timer and sends an instruction to the cameras 103 to end definition of the critical path 54. In response to receiving the “stop” instruction, the cameras 103 send their remaining image data 250 to the analytics system 222, in step 518.
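A rough sketch of the control messages exchanged during steps 502 through 518, assuming a simple JSON encoding (the message fields and values are illustrative assumptions, not a defined camera API), is shown below.

```python
# Illustrative sketch only: control messages for pairing and for starting and
# stopping critical-path definition. The JSON fields are assumed.

import json
import time

def pairing_request(device_id):
    return json.dumps({"type": "pair", "device": device_id,
                       "mode": "overlap_detection"})

def start_path_definition(session_id):
    return json.dumps({"type": "critical_path", "action": "start",
                       "session": session_id, "time": time.time()})

def stop_path_definition(session_id):
    return json.dumps({"type": "critical_path", "action": "stop",
                       "session": session_id, "time": time.time()})

if __name__ == "__main__":
    print(pairing_request("tablet-01"))
    print(start_path_definition(session_id=308))
    print(stop_path_definition(session_id=308))
```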
According to step 520, the analytics system 222 and/or 176 receives the time stamped image data 250 from the surveillance cameras 103 during definition of the critical path 54. In step 522, the app 412 sends the time interval over which to analyze the image data 250, indicated by the value of the local timer. The analytics system 222 and/or 176 then tracks the installer/user device through the image data 250 from the surveillance cameras 103, in step 524.
In step 526, the analytics system 222 and/or 176 determines overlap 90 among the fields of view 105 of the cameras 103 by correlating the motion detection events, and determining from the correlated detected motion whether the installer/user device is included within at least two different fields of view 105 of the cameras 103 at substantially the same time. Then, in step 528, the analytics system 222 and/or 176 includes image data 250 associated with the determined overlap 90 (e.g. overlapping fields of view 105 of the cameras 103) within an intelligent display grid 418 and sends the intelligent display grid 418 for display on the user device 400. The image data 250 displayed within the display grid 418 is from at least two surveillance cameras 103 and has at least one portion of an object within a scene monitored by the surveillance cameras included within the image data 250.
According to step 530, the app 412 displays the intelligent display grid 418 on the display screen 410 of the user device 400, and the installer 60-1 uses the displayed image data 250 within the intelligent display grid 418, which indicates the overlap 90 between the fields of view 105 of the surveillance cameras 103, to determine whether the cameras 103 require repositioning to achieve the desired amount of overlap 90.
In step 532, the installer 60-1 optionally repeats this process to either verify that repositioning of the cameras 103 and/or changing settings of the cameras 103 (e.g. lens, zoom) achieves the desired overlap 90 or to define additional critical path(s) 54 and detect overlap 90 therein. Changing the lenses of the cameras 103 can cause a corresponding change in the fields of view 105 of the cameras 103 for achieving the desired overlap 90. This change is required, in one example, when the lenses are of a fixed focus type, which are designed to work for a single, specific working distance. Replacing a fixed lens with a varifocal lens, in one example, enables the installer 60-1 to subsequently adjust the focal length, angle of view, and level of zoom of the cameras 103, thereby enabling adjustment of overlap 90 among fields of view 105 of two or more surveillance cameras 103. Additionally, changing a zoom setting of the cameras 103 can cause a corresponding change in the fields of view 105 of the cameras 103 in accordance with the installer's overlap 90 objectives. This is a typical course of action for adjusting overlap 90 when the cameras 103 are PTZ type cameras, in one example. In step 534, the installer 60-1 selects an option within the app 412 to exit overlap detection mode, and the app 412 sends an associated message 264 to the cameras 103 in response. Finally, in step 536, the cameras 103 receive the exit message 264, and end overlap detection mode and terminate the communications session 308 in response.
In step 550, the analytics system 222 and/or 176 receives time-stamped image data 250 from two or more surveillance cameras 103 at the same premises 52, where fields of view 105 of the two or more of the surveillance cameras 103 are positioned to overlap a monitored area within the premises 52.
In step 552, the analytics system 222 and/or 176 analyzes the image data 250 to determine correlated motion detection events, where each correlated motion detection event is associated with motion occurring at substantially the same time in image data 250 from two or more different surveillance cameras 103, from which it is inferred that the motion is related.
Then, in step 554, from the determined correlated motion detection events, the analytics system 222 builds an intelligent display grid 418 including the overlapping frames of image data 250 and sends the intelligent display grid 418 for display on the user device 400, thereby providing a visual indication of the degree of overlap 90 for the installer or operator 60-1. Upon conclusion of step 554, the method transitions back to step 550 to receive the next frames of time-stamped image data 250 from the surveillance cameras 103.
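One way the grid assembly might look, assuming correlated events are expressed as sets of camera identifiers and the most recent frame per camera is available (both assumptions for the example), is sketched below.

```python
# Illustrative sketch only: assembling a display grid so that frames from
# cameras with correlated motion appear in adjacent panes. The inputs and
# pane layout are assumed for the example.

def build_display_grid(correlated_events, latest_frames, columns=2):
    """correlated_events: list of sets of camera ids with correlated motion.
    latest_frames: dict mapping camera_id -> most recent frame (any object)."""
    panes = []
    for cameras in correlated_events:
        for camera_id in sorted(cameras):
            panes.append({"camera": camera_id,
                          "frame": latest_frames.get(camera_id)})
    # Arrange the panes row by row.
    return [panes[i:i + columns] for i in range(0, len(panes), columns)]

if __name__ == "__main__":
    grid = build_display_grid(
        correlated_events=[{"camera1", "camera2"}],
        latest_frames={"camera1": "frame_a", "camera2": "frame_b"},
    )
    for row in grid:
        print(row)
```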
The intelligent display grid 418 includes panes 289. Each of the panes 289 can include image data 250, where the image data 250 is provided by the analytics system 222 and/or 176 in accordance with the overlap detection methods of
With reference to the method of
With reference to the method of
While this invention has been particularly shown and described with references to preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the invention encompassed by the appended claims.