The disclosed implementations relate generally to controlling a camera with Day and Night modes, including, but not limited to, integrating electrochromic filtering into the optical apparatus of the camera and automatically removing or passing an infrared light component of ambient light incident on the camera.
Some security cameras operate in one of two modes (i.e., Day mode and Night mode) depending on the ambient lighting conditions. Day mode is used when there is sufficient ambient light to adequately illuminate the scene. Night mode (also called infrared mode) is used when there is not enough ambient light to adequately illuminate the scene, in which case the camera relies on additional infrared illumination (e.g., using onboard infrared light emitting diodes). A security camera configured to operate in both Day mode and Night mode often includes an infrared (IR) filter that is disposed at two distinct locations associated with Day and Night modes, respectively. Specifically, in Day mode, the IR filter is disposed at a first position in which it is interposed between a lens assembly and a sensor array of the camera, while in Night mode, the IR filter is disposed at a second position in which it is not interposed between the lens assembly and the sensor array. As part of initiating a change of the camera mode to Night mode, the IR filter has to be mechanically moved from the first position to the second position, and as part of initiating a change of the camera mode to Day mode, the IR filter has to be mechanically moved from the second position to the first position.
One challenge for such security cameras is mechanical failure of the IR filter due to constant switching of the security cameras between Day mode and Night mode. These security cameras often have to operate constantly over days, months and even years (e.g., switching between the Day and Night modes at least twice every day), and a mechanical motor that drives the IR filter could fail in the long term due to such constant operation. Thus, it would be beneficial to use a more reliable filtering mechanism in a security camera than the current mechanically driven IR filter.
Accordingly, there is a need for a security camera that implements more effective methods for controlling IR filtering when the camera switches between a Night mode and a Day mode.
In accordance with one aspect of the application, a method for controlling a camera system is performed at a camera including a controller, a sensor array including a plurality of sensor elements, and a lens assembly that is configured to focus light on the sensor array. The lens assembly includes an electrochromic glass layer disposed in front of the sensor array and having optical transmission properties that are responsive to voltage applied to the electrochromic glass layer. The lens assembly further includes a first transmission state in which the electrochromic glass layer is substantially opaque to a predefined band of IR wavelengths, and a second transmission state in which the electrochromic glass layer is substantially transparent to the predefined band of IR wavelengths and visible wavelengths.
The method for controlling the camera mode includes, in accordance with a determination to transition the camera mode to a Day mode, generating by the controller a first voltage, and applying the first voltage to the electrochromic glass layer to cause the lens assembly to enter the first transmission state. Prior to the transition of the camera mode to the Day mode, the lens assembly was in the second transmission state. The method for controlling the camera mode further includes in response to the first voltage, removing by the electrochromic glass layer a substantial portion of the predefined band of IR wavelengths in ambient light incident on the camera, and simultaneously passing by the electrochromic glass layer a substantial portion of visible wavelengths in the ambient light, thereby exposing the sensor array to the substantial portion of the visible wavelengths of the ambient light via the lens assembly.
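As a rough illustration of the control flow just described, the following Python sketch models the two transmission states and the Day-mode transition. The class names, the driver callable, and the 3 V value are hypothetical placeholders introduced for illustration only, not part of the disclosure.

```python
from enum import Enum

class TransmissionState(Enum):
    IR_BLOCKING = 1   # first transmission state: substantially opaque to the predefined IR band
    IR_PASSING = 2    # second transmission state: transparent to the IR band and visible light

class ElectrochromicLayer:
    """Hypothetical wrapper around the voltage driver of the electrochromic glass layer."""

    def __init__(self, set_bias_volts):
        self._set_bias_volts = set_bias_volts   # callable that drives the bias voltage, in volts
        self.state = TransmissionState.IR_PASSING

    def apply_voltage(self, volts: float) -> None:
        self._set_bias_volts(volts)
        # In this sketch a nonzero bias darkens the layer in the IR band,
        # while 0 V leaves it transparent to both IR and visible wavelengths.
        self.state = (TransmissionState.IR_BLOCKING if volts > 0.0
                      else TransmissionState.IR_PASSING)

FIRST_VOLTAGE = 3.0   # illustrative magnitude; the description only bounds it at no more than 5 V

def enter_day_mode(layer: ElectrochromicLayer) -> None:
    """Transition the camera mode to Day mode by enabling electrochromic IR filtering."""
    layer.apply_voltage(FIRST_VOLTAGE)
```

For example, `enter_day_mode(ElectrochromicLayer(print))` would simply print the applied bias, with the `print` function standing in for a real voltage driver.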
In accordance with another aspect of the application, some implementations include a camera for controlling a camera system. The camera includes: a controller, a sensor array comprising a plurality of sensor elements, and a lens assembly that is configured to focus light on the sensor array. The lens assembly includes an electrochromic glass layer disposed in front of the sensor array and having optical transmission properties that are responsive to voltage applied to the electrochromic glass layer. The camera is configured to perform any of the methods described herein (e.g., any of the methods described above).
In accordance with one aspect of the application, a method for controlling a sensor is performed at an image sensor device including an electrochromic glass layer and an image sensor array. The image sensor array further includes a plurality of sensor elements. The electrochromic glass layer is disposed in front of the sensor array and has optical transmission properties that are responsive to voltage applied to the glass. The electrochromic glass layer includes a first transmission state in which the electrochromic glass layer is substantially opaque to a predefined band of IR wavelengths, and a second transmission state in which the electrochromic glass layer is substantially transparent to the predefined band of IR wavelengths and visible wavelengths. The method for controlling the sensor mode includes, in accordance with a determination to transition the sensor mode to a Day mode, generating a first voltage, and applying the first voltage to cause the electrochromic glass layer to enter the first transmission state. Prior to the transition of the sensor mode to the Day mode, the electrochromic glass layer was in the second transmission state. The method for controlling the sensor mode further includes, in response to the first voltage, removing by the electrochromic glass layer a substantial portion of the predefined band of IR wavelengths in ambient light incident on the image sensor device and simultaneously passing by the electrochromic glass layer a substantial portion of visible wavelengths in the ambient light, thereby exposing the image sensor array to the substantial portion of the visible wavelengths of the ambient light.
In accordance with another aspect of the application, some implementations include an image sensor device for controlling a sensor mode. The image sensor device includes an electrochromic glass layer and an image sensor array having a plurality of sensor elements. The electrochromic glass layer is disposed in front of the sensor array and has optical transmission properties that are responsive to voltage applied to the glass. The electrochromic glass layer includes a first transmission state in which transmission of a predefined band of IR wavelengths is substantially reduced and a second transmission state in which the electrochromic glass layer is substantially transparent to the predefined band of IR wavelengths and visible wavelengths. The image sensor device is configured to perform the method described above.
Thus, a camera and an image sensor device are provided to implement more effective methods for controlling IR filtering when the camera and the image sensor device switch between Night mode and Day mode. Such methods may complement or replace conventional methods for controlling IR filtering associated with various operation modes.
Further, in accordance with another aspect of the application, a method is implemented at a camera for controlling a lens assembly. The camera includes a controller, an image sensor array including a plurality of sensor elements, and the lens assembly configured to focus light on the sensor array, and the lens assembly further includes an electrochromic glass lens disposed in front of the sensor array and having an index of refraction that is variable and responsive to voltage applied on the electrochromic glass lens. The method for controlling the lens assembly includes determining that the camera mode of the camera is a first mode, and in the first mode, the index of refraction of the electrochromic glass lens has a first index value associated with a first focal length. The method for controlling the lens assembly further includes, in accordance with a determination that the camera mode is the first mode, generating by the controller a first voltage and applying the first voltage on the electrochromic glass lens, thereby changing the index of refraction of the electrochromic glass lens to a second index value associated with a second focal length that is distinct from the first focal length.
In accordance with another aspect of the application, a method is implemented at a camera for controlling a filtering effect. The camera includes a controller, a sensor array comprising a plurality of sensor elements, and a lens assembly configured to focus light on the sensor array. The lens assembly includes an electrochromic glass layer disposed in front of the sensor array and having optical transmission properties that are responsive to voltage applied to the electrochromic glass layer. The lens assembly includes a first filtering mode in which the electrochromic glass layer is configured to band-transmit a first predefined band of wavelengths. The method for controlling the filtering effect of the camera includes, in accordance with a determination to transition the camera mode to an operation mode, determining a magnitude of a first voltage and generating by the controller the first voltage. In the operation mode, the camera is configured to capture media content in a field of view illuminated with light having the first predefined band of wavelengths. The method for controlling the filtering effect of the camera further includes applying the first voltage to cause the lens assembly to enter the first filtering mode, and in response to the first voltage, passing by the electrochromic glass layer a substantial portion of the first predefined band of wavelengths in the ambient light, thereby exposing the sensor array to the substantial portion of the first predefined band of wavelengths of the ambient light via the lens assembly.
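The voltage-determination step above can be sketched as a simple lookup from the intended operation mode to a drive-voltage magnitude and the band of wavelengths the layer is meant to band-transmit. The mode names, voltages, and band labels in the sketch below are illustrative assumptions rather than values from the disclosure.

```python
# Hypothetical table mapping an operation mode to the drive voltage that puts the
# electrochromic glass layer into the corresponding filtering mode, together with
# the band of wavelengths that mode is intended to pass.
FILTERING_MODES = {
    "day":   {"voltage": 3.0, "passes": "visible wavelengths only"},
    "night": {"voltage": 0.0, "passes": "visible and predefined IR wavelengths"},
}

def voltage_for_operation_mode(mode: str) -> float:
    """Determine the magnitude of the voltage to generate for an operation mode."""
    try:
        return FILTERING_MODES[mode]["voltage"]
    except KeyError as err:
        raise ValueError(f"unknown operation mode: {mode}") from err
```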
For a better understanding of the various described implementations, reference should be made to the Description of Implementations below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
Like reference numerals refer to corresponding parts throughout the several views of the drawings.
Reference will now be made in detail to implementations, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the various described implementations. However, it will be apparent to one of ordinary skill in the art that the various described implementations may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the implementations.
One or more network-connected cameras could be set up in a smart home environment to provide video monitoring and security therein. In some implementations, the cameras operate in two modes, a Day mode in which there is enough ambient light to capture color video of a scene, and a Night mode in which the camera captures video of a scene using onboard LED illumination when there is not enough ambient light. A program module of the camera may decide when to switch between Night mode and Day mode using one or more of: illuminant detection, lux detection, and tiling. When the camera is in Day mode, IR filtering is enabled to block a substantial portion of the IR components of the incident light. When the camera is in Night mode, IR filtering is disabled so an image sensor array of the camera can receive incident IR light from a scene illuminated by the camera's onboard IR illuminators or external IR illuminators.
To overcome mechanical failure issues associated with conventional mechanically driven IR filters, an electrochromic glass layer is applied in a camera to control filtering of the IR components of the incident light. Specifically, in accordance with a determination to transition a camera mode to a Day mode, the camera generates a first voltage which is applied to the electrochromic glass layer to enable electrochromic filtering. In response to the first voltage, the electrochromic glass layer removes a substantial portion of a predefined band of IR wavelengths in ambient light incident on the camera, and simultaneously passes a substantial portion of visible wavelengths of the ambient light, thereby exposing the sensor array to the substantial portion of the visible wavelengths of the ambient light via a lens assembly of the camera. Alternatively, in accordance with a determination to transition the camera mode to a Night mode, the controller of the camera generates a second voltage that is distinct from the first voltage (in some implementations, the camera disables the first voltage, i.e., sets the second voltage to 0V), and applies the second voltage to the electrochromic glass layer to disable electrochromic filtering. In response to the second voltage, the electrochromic glass layer passes a substantial portion of the predefined band of IR wavelengths and a substantial portion of visible wavelengths in the ambient light incident on the camera, thereby exposing the sensor array to the ambient light via the lens assembly without interference by electrochromic filtering of the electrochromic glass layer.
Further, in some implementations, use of the electrochromic glass layer may introduce additional benefits, e.g., enabling a high dynamic range (HDR) mode. When two distinct voltages are applied sequentially on the electrochromic glass layer, distinct filtering effects associated with visible wavelengths of the incident light are enabled, resulting in two images captured under two different exposure conditions. Image information of these two images can be combined to provide a high quality image that has an enhanced dynamic range.
In some implementations, the integrated devices of the smart home environment 100 include intelligent, multi-sensing, network-connected devices that integrate seamlessly with each other in a smart home network and/or with a central server or a cloud-computing system (e.g., a smart home provider server system 190) to provide a variety of useful smart home functions. The smart home environment 100 may include one or more intelligent, multi-sensing, network-connected thermostats 102 (hereinafter referred to as “smart thermostats 102”), one or more intelligent, network-connected, multi-sensing hazard detection units 104 (hereinafter referred to as “smart hazard detectors 104”), one or more intelligent, multi-sensing, network-connected entryway interface devices 106 and 120 (hereinafter referred to as “smart doorbells 106” and “smart door locks 120”), one or more intelligent, multi-sensing, network-connected alarm systems 122 (hereinafter referred to as “smart alarm systems 122”), one or more intelligent, multi-sensing, network-connected wall switches 108 (hereinafter referred to as “smart wall switches 108”), and one or more intelligent, multi-sensing, network-connected wall plug interfaces 110 (hereinafter referred to as “smart wall plugs 110”). In some implementations, the smart home environment 100 includes a plurality of intelligent, multi-sensing, network-connected appliances 112 (hereinafter referred to as “smart appliances 112”), such as refrigerators, stoves, ovens, televisions, washers, dryers, lights, stereos, intercom systems, garage-door openers, floor fans, ceiling fans, wall air conditioners, pool heaters, irrigation systems, security systems, space heaters, window AC units, motorized duct vents, and so forth. The smart home may also include a variety of non-communicating legacy appliances 140, such as old conventional washer/dryers, refrigerators, and the like, which may be controlled by smart wall plugs 110. The smart home environment 100 may further include a variety of partially communicating legacy appliances 142, such as infrared (“IR”) controlled wall air conditioners or other IR-controlled devices, which may be controlled by IR signals provided by the smart hazard detectors 104 or the smart wall switches 108. The smart home environment 100 may also include communication with devices outside of the physical home but within a proximate geographical range of the home. For example, the smart home environment 100 may include a pool heater monitor 114 and/or an irrigation monitor 116.
In some implementations, the smart home environment 100 includes one or more network-connected cameras 118 that are configured to provide video monitoring and security in the smart home environment 100. In some implementations, cameras 118 also capture video when other conditions or hazards are detected, in order to provide visual monitoring of the smart home environment 100 when those conditions or hazards occur. The cameras 118 may be used to determine occupancy of the structure 150 and/or particular rooms 152 in the structure 150, and thus may act as occupancy sensors. For example, video captured by the cameras 118 may be processed to identify the presence of an occupant in the structure 150 (e.g., in a particular room 152). Specific individuals may be identified based, for example, on their appearance (e.g., height, face) and/or movement (e.g., their walk/gait). Additionally, cameras 118 may include one or more sensors (e.g., IR sensors, motion detectors), input devices (e.g., microphone for capturing audio), and output devices (e.g., speaker for outputting audio).
The smart home environment 100 may additionally or alternatively include one or more other occupancy sensors (e.g., the smart doorbell 106, smart door locks 120, touch screens, IR sensors, microphones, ambient light sensors, motion detectors, smart nightlights 170, etc.). In some implementations, the smart home environment 100 includes radio-frequency identification (RFID) readers (e.g., in each room 152 or a portion thereof) that determine occupancy based on RFID tags located on or embedded in occupants. For example, RFID readers may be integrated into the smart hazard detectors 104. The smart home environment 100 may include one or more sound and/or vibration sensors for detecting abnormal sounds and/or vibrations. These sensors may be integrated with any of the devices described above. The sound sensors detect sound above a decibel threshold. The vibration sensors detect vibration above a threshold directed at a particular area (e.g., vibration on a particular window when a force is applied to break the window).
By virtue of network connectivity, one or more of the smart home devices of
As discussed above, users may control smart devices in the smart home environment 100 using a network-connected computer or portable electronic device 130. In some examples, some or all of the occupants (e.g., individuals who live in the home) may register their device 130 with the smart home environment 100. Such registration may be made at a central server (e.g., a smart home provider server system 190) to authenticate the occupant and/or the device as being associated with the home and to give permission to the occupant to use the device to control the smart devices in the home. An occupant may use their registered device 130 to remotely control the smart devices of the home, such as when the occupant is at work or on vacation. The occupant may also use their registered device to control the smart devices when the occupant is actually located inside the home, such as when the occupant is sitting on a couch inside the home. It should be appreciated that instead of or in addition to registering devices 130, the smart home environment 100 may make inferences about which individuals live in the home and are therefore occupants and which devices 130 are associated with those individuals. As such, the smart home environment may “learn” who is an occupant and permit the devices 130 associated with those individuals to control the smart devices of the home.
In some implementations, in addition to containing processing and sensing capabilities, devices 102, 104, 106, 108, 110, 112, 114, 116, 118, 120, and/or 122 (collectively referred to as “the smart devices”) are capable of data communications and information sharing with other smart devices, a central server or cloud-computing system, and/or other devices that are network-connected. Data communications may be carried out using any of a variety of custom or standard wireless protocols (e.g., IEEE 802.15.4, Wi-Fi, ZigBee, 6LoWPAN, Thread, Z-Wave, Bluetooth Smart, ISA100.11a, WirelessHART, MiWi, etc.) and/or any of a variety of custom or standard wired protocols (e.g., Ethernet, HomePlug, etc.), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
In some implementations, the smart devices serve as wireless or wired repeaters. In some implementations, a first one of the smart devices communicates with a second one of the smart devices via a wireless router. The smart devices may further communicate with each other via a connection (e.g., network interface 160) to a network, such as the Internet 162. Through the Internet 162, the smart devices may communicate with a smart home provider server system 190 (also called a central server system and/or a cloud-computing system herein). The smart home provider server system 190 may be associated with a manufacturer, support entity, or service provider associated with the smart device(s). In some implementations, a user is able to contact customer support using a smart device itself rather than needing to use other communication means, such as a telephone or Internet-connected computer. In some implementations, software updates are automatically sent from the smart home provider server system 190 to smart devices (e.g., when available, when purchased, or at routine intervals).
In some implementations, the network interface 160 includes a conventional network device (e.g., a router), and the smart home environment 100 of
It is to be appreciated that “smart home environments” may refer to smart environments for homes such as a single-family house, but the scope of the present teachings is not so limited. The present teachings are also applicable, without limitation, to duplexes, townhomes, multi-unit apartment buildings, hotels, retail stores, office buildings, industrial buildings, and more generally any living space or work space.
In some implementations, the smart home provider server system 190 or a component thereof serves as the video server system 208; that is, the video server system 208 is a part or component of the smart home provider server system 190. In some implementations, the video server system 208 is a dedicated video processing server that provides video processing services to video sources and client devices 204 independent of other services provided by the smart home provider server system 190.
In some implementations, each of the video sources 222 includes one or more video cameras 118 that capture video and send the captured video to the video server system 208 substantially in real-time. In some implementations, each of the video sources 222 optionally includes a controller device (not shown) that serves as an intermediary between the one or more cameras 118 and the video server system 208. The controller device receives the video data from the one or more cameras 118, optionally performs some preliminary processing on the video data, and sends the video data to the video server system 208 on behalf of the one or more cameras 118 substantially in real-time. In some implementations, each camera has its own on-board processing capabilities to perform some preliminary processing on the captured video data before sending the processed video data (along with metadata obtained through the preliminary processing) to the controller device and/or the video server system 208.
In some implementations, a camera 118 of a video source 222 captures video at a first resolution (e.g., 720P and/or 1080P) and/or a first frame rate (24 frames per second), and sends the captured video to the video server system 208 at both the first resolution (e.g., the original capture resolution(s), the high-quality resolution(s) such as 1080P and/or 720P) and the first frame rate, and at a second, different resolution (e.g., 180P) and/or a second frame rate (e.g., 5 frames per second or 10 frames per second). For example, the camera 118 captures a video 223-1 at 720P and/or 1080P resolution (the camera 118 may capture a video at 1080P and create a downscaled 720P version, or capture at both 720P and 1080P). The video source 222 creates a second (or third), rescaled (and optionally at a different frame rate than the version 223-1) version 225-1 of the captured video at 180P resolution, and transmits both the original captured version 223-1 (i.e., 1080P and/or 720P) and the rescaled version 225-1 (i.e., the 180P version) to the video server system 208 for storage. In some implementations, the rescaled version has a lower resolution, and optionally a lower frame rate, than the original captured video. The video server system 208 transmits the original captured version or the rescaled version to a client 204, depending on the context. For example, the video server system 208 transmits the rescaled version when transmitting multiple videos to the same client device 204 for concurrent monitoring by the user, and transmits the original captured version in other contexts. In some implementations, the video server system 208 downscales the original captured version to a lower resolution, and transmits the downscaled version.
In some implementations, a camera 118 of a video source 222 captures video at a first resolution (e.g., 720P and/or 1080P) and/or a first frame rate, and sends the captured video to the video server system 208 at the first resolution (e.g., the original capture resolution(s); the high-quality resolution(s) such as 1080P and/or 720P) and first frame rate for storage. When the video server system 208 transmits the video to a client device 204, the video server system 208 may downscale the video to a second, lower resolution (e.g., 180P) and/or second, lower frame rate for the transmission, depending on the context. For example, the video server system 208 transmits the downscaled version when transmitting multiple videos to the same client device 204 for concurrent monitoring by the user, and transmits the original captured version in other contexts.
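A minimal sketch of the context-dependent version selection described above is given below, assuming the only signal is how many streams the client is viewing concurrently; the resolution labels, frame rates, and the concurrency threshold are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class VideoVersion:
    label: str        # e.g., "1080P" or "180P"
    frame_rate: int   # frames per second

def select_version_for_transmission(concurrent_streams: int,
                                    original: VideoVersion,
                                    rescaled: VideoVersion) -> VideoVersion:
    """Pick which stored version the video server transmits to a client device."""
    # When the client monitors several cameras at once, send the rescaled
    # (lower-resolution, optionally lower-frame-rate) version; otherwise send
    # the original captured version.
    return rescaled if concurrent_streams > 1 else original

# Example: a client watching four cameras concurrently receives the 180P version.
chosen = select_version_for_transmission(4, VideoVersion("1080P", 24), VideoVersion("180P", 5))
```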
In some implementations, the camera 118 operates in two modes, a Day mode in which there is enough ambient light to capture color video of a scene, and a Night mode in which the camera captures video of a scene using onboard LED illumination when there is not enough ambient light (e.g., as described in the cross-referenced U.S. patent application Ser. No. 14/723,276, filed on May 27, 2015, entitled, “Multi-mode LED Illumination System.”). As described herein, in some implementations, the camera 118 includes a program module that decides when to switch from Night mode to Day mode using one or more of: illuminant detection (detecting the type of ambient light based on R/G and B/G component ratios of the ambient light), lux detection (detecting the ambient light level), and tiling (performing illuminant detection and/or lux detection for sub-regions of an image sensor array so as to detect localized/point light sources that only impact a portion of the image sensor array).
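The three detection techniques can be approximated with a short sketch like the one below; the ratio tolerance, the tile count, and the numpy-based implementation are assumptions made for illustration, not the camera's actual algorithm.

```python
import numpy as np

def illuminant_ratios(rgb: np.ndarray) -> tuple:
    """Return the mean R/G and B/G ratios of an RGB frame (illuminant detection)."""
    r, g, b = rgb[..., 0].mean(), rgb[..., 1].mean(), rgb[..., 2].mean()
    return r / g, b / g

def looks_like_ir_source(rgb: np.ndarray, tol: float = 0.05) -> bool:
    # Under IR-only illumination all three channels respond about equally, so
    # both ratios sit near 1.0 (the tolerance is an assumed value).
    rg, bg = illuminant_ratios(rgb)
    return abs(rg - 1.0) < tol and abs(bg - 1.0) < tol

def tile_lux_levels(luma: np.ndarray, tiles: int = 4) -> list:
    """Mean brightness of each sub-region (tiling), so a localized/point light
    source that only lights up part of the sensor array can still be detected."""
    h, w = luma.shape
    return [float(luma[i * h // tiles:(i + 1) * h // tiles,
                       j * w // tiles:(j + 1) * w // tiles].mean())
            for i in range(tiles) for j in range(tiles)]
```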
Referring to
In some implementations, the server-side module 206 includes one or more processors 212, a video storage database 214, device and account databases 216, an I/O interface to one or more client devices 218, and an I/O interface to one or more video sources 220. The I/O interface to one or more clients 218 facilitates the client-facing input and output processing for the server-side module 206. In some implementations, the I/O interface to clients 218 or a transcoding proxy computer (not shown) rescales (e.g., downscales) and/or changes the frame rate of video for transmission to a client 204. The databases 216 store a plurality of profiles for reviewer accounts registered with the video processing server, where a respective user profile includes account credentials for a respective reviewer account, and one or more video sources linked to the respective reviewer account. The I/O interface to one or more video sources 220 facilitates communications with one or more video sources 222 (e.g., groups of one or more cameras 118 and associated controller devices). The video storage database 214 stores raw video data received from the video sources 222, as well as various types of metadata, such as motion events, event categories, event category models, event filters, and event masks, for use in data processing for event monitoring and review for each reviewer account.
In some implementations, the server-side module 206 receives information regarding alert events detected by other smart devices 204 (e.g., hazards, sound, vibration, motion). In accordance with the alert event information, the server-side module 206 instructs one or more video sources 222 in the smart home environment 100 where the alert event is detected to capture video and/or associate with the alert event video, received from the video sources 222 in the same smart home environment 100, that is contemporaneous or proximate in time with the alert event.
Examples of a representative client device 204 include, but are not limited to, a handheld computer, a wearable computing device, a personal digital assistant (PDA), a tablet computer, a laptop computer, a desktop computer, a cellular telephone, a smart phone, an enhanced general packet radio service (EGPRS) mobile phone, a media player, a navigation device, a game console, a television, a remote control, a point-of-sale (POS) terminal, vehicle-mounted computer, an ebook reader, or a combination of any two or more of these data processing devices or other data processing devices. For example, client devices 204-1, 204-2, and 204-m are a smart phone, a tablet computer, and a laptop computer, respectively.
Examples of the one or more networks 162 include local area networks (LAN) and wide area networks (WAN) such as the Internet. The one or more networks 162 are, optionally, implemented using any known network protocol, including various wired or wireless protocols, such as Ethernet, Universal Serial Bus (USB), FIREWIRE, Long Term Evolution (LTE), Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wi-Fi, voice over Internet Protocol (VoIP), Wi-MAX, or any other suitable communication protocol.
In some implementations, the video server system 208 is implemented on one or more standalone data processing apparatuses or a distributed network of computers. In some implementations, the video server system 208 also employs various virtual devices and/or services of third party service providers (e.g., third-party cloud service providers) to provide the underlying computing resources and/or infrastructure resources of the video server system 208. In some implementations, the video server system 208 includes, but is not limited to, a handheld computer, a tablet computer, a laptop computer, a desktop computer, or a combination of any two or more of these data processing devices or other data processing devices.
The server-client environment 200 shown in
The electronic devices, the client devices or the server system communicate with each other using the one or more communication networks 162. In an example smart home environment, two or more devices (e.g., the network interface device 160, the hub device 180, and the client devices 204-m) are located in close proximity to each other, such that they could be communicatively coupled in the same sub-network 162A via wired connections, a WLAN or a Bluetooth Personal Area Network (PAN). The Bluetooth PAN is optionally established based on classical Bluetooth technology or Bluetooth Low Energy (BLE) technology. This smart home environment further includes one or more other radio communication networks 162B through which at least some of the electronic devices of the video sources 222-n exchange data with the hub device 180. Alternatively, in some situations, some of the electronic devices of the video sources 222-n communicate with the network interface device 160 directly via the same sub-network 162A that couples devices 160, 180 and 204-m. In some implementations (e.g., in the network 162C), both the client device 204-m and the electronic devices of the video sources 222-n communicate directly via the network(s) 162 without passing the network interface device 160 or the hub device 180.
In some implementations, during normal operation, the network interface device 160 and the hub device 180 communicate with each other to form a network gateway through which data are exchanged with the electronic device of the video sources 222-n. As explained above, the network interface device 160 and the hub device 180 optionally communicate with each other via a sub-network 162A. In some implementations, the hub device 180 is omitted, and the functionality of the hub device 180 is performed by the video server system 208, video server system 252, or smart home provider server system 190.
In some implementations, the video server system 208 is, or includes, a dedicated video processing server configured to provide data processing for monitoring and facilitating review of alert events (e.g., motion events) in video streams captured by video cameras 118. In this situation, the video server system 208 receives video data from video sources 222 (including cameras 118) located at various physical locations (e.g., inside homes, restaurants, stores, streets, parking lots, and/or the smart home environments 100 of
Communication interfaces 304 include, for example, hardware capable of data communications using any of a variety of custom or standard wireless protocols (e.g., IEEE 802.15.4, Wi-Fi, ZigBee, 6LoWPAN, Thread, Z-Wave, Bluetooth Smart, ISA100.11a, WirelessHART, MiWi, etc.) and/or any of a variety of custom or standard wired protocols (e.g., Ethernet, HomePlug, etc.), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Memory 306 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally, includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other non-volatile solid state storage devices. Memory 306, or alternatively the non-volatile memory within memory 306, includes a non-transitory computer readable storage medium. In some implementations, memory 306, or the non-transitory computer readable storage medium of memory 306, stores the following programs, modules, and data structures, or a subset or superset thereof:
Each of the above identified elements may be stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various implementations. In some implementations, memory 306, optionally, stores a subset of the modules and data structures identified above. Furthermore, memory 306, optionally, stores additional modules and data structures not described above.
In some implementations, the camera 118 captures surveillance video using a digital imaging system. Digital images (frames) are captured as a sequence at a particular frame rate 342a, compressed, and then sent to the “cloud” (e.g., the video server system 208) for storage and retrieval. In some implementations, each frame (e.g., the raw sensor data 3460) is composed of 1280 by 720 pixels (1280×720) and each pixel location has 3 color components, red, green and blue. The camera 118 operates in one of two modes (e.g., indicated by the Day/Night mode value 342c) depending on the ambient lighting conditions. Day mode is used when there is sufficient ambient light to adequately illuminate the scene. Night mode is used when there is not enough light to adequately illuminate the scene.
In some implementations, when operating in Day mode, the camera 118 uses the ambient lighting sources to illuminate the scene and capture surveillance video. In some implementations, the minimum lux level at which the camera 118 captures video in Day mode is between 0.1 and 1 lux depending on the color temperature of the dominant illuminant. Once the minimum lux level is reached, the camera automatically switches to Night mode. Switching to Night mode includes disabling electrochromic filtering of the electrochromic glass layer 332 and enabling a set of IR LEDs 336 to provide illumination for the scene. Night mode is maintained until the camera 118 detects an external illuminant.
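The automatic switch to Night mode described above might look like the following sketch; the 0.5 lux threshold and the `ec_layer`/`ir_leds` driver objects are assumed placeholders rather than part of the camera's firmware interface.

```python
MIN_DAY_MODE_LUX = 0.5   # illustrative; the description gives a range of roughly 0.1 to 1 lux

def maybe_switch_to_night_mode(ambient_lux: float, ec_layer, ir_leds) -> bool:
    """Switch to Night mode once ambient light falls below the Day-mode minimum."""
    if ambient_lux >= MIN_DAY_MODE_LUX:
        return False
    ec_layer.apply_voltage(0.0)   # disable electrochromic filtering of the glass layer 332
    ir_leds.enable()              # enable the set of IR LEDs 336 to illuminate the scene
    return True
```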
As explained above, one challenge for using a mechanically driven IR filter is a risk of mechanical failure of the IR filter caused by constant switching of the camera between Day mode and Night mode. There has been a need for a camera that applies a more reliable filtering mechanism than the mechanically driven IR filter when the camera constantly switches between a Night mode and a Day mode.
In some implementations, the lens assembly 330 includes an electrochromic glass layer 332 disposed on the optical path of the incident light and at a location in front of the sensor array 334. The electrochromic glass layer 332 has optical transmission properties that are responsive to voltage applied to the electrochromic glass layer 332. As such, the lens assembly 330 includes a first transmission state in which the electrochromic glass layer 332A is substantially opaque to a predefined band of IR wavelengths, and a second transmission state in which the electrochromic glass layer is substantially transparent to the predefined band of IR wavelengths and visible wavelengths.
The camera 118 operates in two modes, a Day mode in which there is enough ambient light to capture color video of a scene, and a Night mode in which the camera captures video of a scene using onboard LED illumination when there is not enough ambient light. When the camera is in Day mode, IR filtering is enabled to block a substantial portion of the IR components of the incident light. More specifically, when the camera 118 determines to transition the camera mode to a Day mode, the controller 333 generates a first voltage. The first voltage is applied to the electrochromic glass layer 332 to cause the lens assembly 330 to enter the first transmission state. Prior to this transition of the camera mode to the Day mode, the camera operates in a Night mode, and the lens assembly 330 was in the second transmission state. Then, in response to the first voltage, the electrochromic glass layer 332 removes a substantial portion of the predefined band of IR wavelengths in ambient light incident on the camera, and simultaneously passes a substantial portion of visible wavelengths in the ambient light, thereby exposing the sensor array 334 to the substantial portion of the visible wavelengths of the ambient light via the lens assembly 330.
In some implementations, the magnitude of the first voltage is not greater than 5V, and induces a limited amount of current consumption. In some implementations, the first voltage is sustained on the electrochromic glass layer at the first transmission state. In some implementations, when the camera transitions the camera mode to the Day mode, the first voltage is applied on the electrochromic glass layer for a predetermined duration of time and is removed after the predetermined duration of time.
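Both drive strategies mentioned in this paragraph, holding the voltage continuously or applying it only for a predetermined duration, could be expressed as in the sketch below; the one-second duration, the hypothetical driver calls, and the assumption that the layer retains its transmission state after the pulse follow from the description rather than from measured device properties.

```python
import time

def drive_sustained(ec_layer, volts: float) -> None:
    """Hold the bias on the electrochromic glass layer for as long as the
    first transmission state is needed."""
    ec_layer.apply_voltage(volts)

def drive_for_duration(ec_layer, volts: float, duration_s: float = 1.0) -> None:
    """Apply the bias only for a predetermined duration, then remove it,
    assuming (per the description) the layer keeps its transmission state."""
    ec_layer.apply_voltage(volts)
    time.sleep(duration_s)
    ec_layer.remove_voltage()   # hypothetical driver call that stops driving the layer
```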
Alternatively, when the camera is in Night mode, IR filtering is disabled so the sensor array 334 of the camera can receive incident IR light from a scene illuminated by the camera's onboard IR illuminators or external IR illuminators. More specifically, in some implementations, when the camera 118 determines to transition the camera mode to the Night mode, the controller 333 generates a second voltage that is distinct from the first voltage. The second voltage is applied on the electrochromic glass layer 332 to cause the lens assembly 330 to enter the second transmission state. In response to the second voltage, the electrochromic glass layer 332 passes a substantial portion of the predefined band of IR wavelengths and a substantial portion of visible wavelengths in ambient light incident on the camera, thereby exposing the sensor array to the ambient light via the lens assembly without interference by electrochromic filtering of the electrochromic glass layer. In some implementations, the second voltage is substantially equal to zero, i.e., the first voltage is removed or disabled from the electrochromic glass layer 332.
In some implementations, the onboard IR illuminators 336 include one or more infrared light emitting diodes (e.g., as described in the cross-referenced U.S. patent application Ser. No. 14/723,276, filed on May 27, 2015, entitled, “Multi-mode LED Illumination System.”). In accordance with the determination to transition the camera mode to the Night mode, the infrared light emitting diodes are powered on to illuminate a field of view while electrochromic filtering of the electrochromic glass layer 332 is disabled.
In some implementations, the lens assembly 330 includes one or more optical lenses 452 in addition to the electrochromic glass layer 332, and the one or more optical lenses 452 are configured to focus incident light of the camera 118 onto the sensor array 334. Optionally, the electrochromic glass layer 332 is a standalone optical filter disposed between the one or more optical lenses 452 and the sensor array 334 (as shown in
As described herein, in some implementations, the camera 118 includes a mode control module 320b that decides when to switch from Night mode to Day mode using one or more of: illuminant detection (detecting the type of ambient light based on R/G and B/G component ratios of the ambient light), lux detection (detecting the ambient light level), and tiling (performing illuminant detection and/or lux detection for sub-regions of an image sensor array so as to detect localized/point light sources that only impact a portion of the image sensor array). Specifically, when the camera mode is a Night mode, the mode control module 320b determines whether the ambient light is due to other than an IR light source, detects an ambient light level, and initiates a change of the camera mode to the Day mode when it is determined that the ambient light that is not filtered by the electrochromic glass layer 332 is due to other than an IR light source and that the ambient light level exceeds a first lux threshold. In contrast, the mode control module 320b maintains the camera 118 in the Night mode if it is determined that the ambient light is due to other than an IR light source and that the ambient light level does not exceed the first lux threshold.
Further, in some implementations, the sensor elements of the sensor array 334 include first, second and third pixels, each of which has a respective peak response at a different respective visible light frequency (e.g., for green, red and blue colors, respectively). The mode control module 320b detects a first light component, a second light component and a third light component of the ambient light by averaging output signals from the first, second and third pixels, respectively. Respective values of the first, second and third light components are further used to determine whether the ambient light is due to other than an IR light source. Additionally, in some implementations, the mode control module 320b is also configured to determine whether the ambient light is due to sunlight or an incandescent light source based on values of the first, second and third light components. After it is determined that the ambient light is due to sunlight or an incandescent light source, the camera 118 initiates a change of the camera mode to the Day mode only when the ambient light level exceeds a second lux threshold higher than the first lux threshold. More details on detection and switching of Day mode and Night mode are described in the cross-referenced U.S. patent application Ser. No. 14/738,225, filed on Jun. 12, 2015, entitled, “Day and Night Detection Based on One or More of Illuminant Detection, Lux Level Detection, and Tiling.”
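The two-threshold decision described above could be sketched as follows; the channel-ratio heuristic used to classify sunlight or incandescent light and both lux-threshold values are illustrative assumptions, not values from the disclosure or the cross-referenced application.

```python
FIRST_LUX_THRESHOLD = 0.5    # illustrative
SECOND_LUX_THRESHOLD = 2.0   # illustrative higher threshold for sunlight / incandescent light

def should_switch_to_day_mode(r_avg: float, g_avg: float, b_avg: float, lux: float) -> bool:
    """Decide whether to leave Night mode, given the pixel-averaged first, second and
    third light components of the unfiltered ambient light and its lux level."""
    rg, bg = r_avg / g_avg, b_avg / g_avg
    if abs(rg - 1.0) < 0.05 and abs(bg - 1.0) < 0.05:
        return False                      # ambient light appears to be due to an IR source
    # Assumed placeholder for the sunlight/incandescent classification that the
    # cross-referenced application describes in detail.
    sunlight_or_incandescent = rg > 1.2 and bg < 0.9
    threshold = SECOND_LUX_THRESHOLD if sunlight_or_incandescent else FIRST_LUX_THRESHOLD
    return lux > threshold
```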
Referring to
In accordance with a determination to transition the sensor mode to a Day mode, the image sensor device 500 generates a first voltage, and applies the first voltage to cause the electrochromic glass layer 502 to enter the first transmission state. Prior to the transition of the sensor mode to the Day mode, the electrochromic glass layer 502 was in the second transmission state. In response to the first voltage, the image sensor device 500 removes by the electrochromic glass layer 502 a substantial portion of the predefined band of IR wavelengths in ambient light incident on the image sensor device, and simultaneously passes by the electrochromic glass layer a substantial portion of visible wavelengths in the ambient light, thereby exposing the image sensor array 504 to the substantial portion of the visible wavelengths of the ambient light.
In some implementations, the electrochromic glass layer 502A is monolithically coated on the sensor elements of the image sensor array directly. In some implementations, the electrochromic glass layer (not shown in
The image sensor device 500 can be part of a camera (e.g., the camera 118). The image sensor device 500 can also be used as a motion sensor that detects motion of an object in a field of view associated with the image sensor device 500.
In some implementations, transitions of the sensor mode to Day mode or Night mode can be similarly decided by one or more programs associated with the image sensor device 500. In some implementations, the image sensor device 500 can also be configured to implement an HDR mode. More details on determination of the camera mode and the HDR mode are described above with reference to
The electrochromic glass layer 332 includes an electrolyte layer 602, a first electrode 604 and a counter electrode 606 that are configured to act as an electrochemical cell. The first and second voltages are applied on the first electrode 604 in the Day mode and in the Night mode, respectively, while the counter electrode 606 is optionally grounded. Nanoparticle-in-glass composite material is deposited on a first side of the electrolyte layer 602 and forms the first electrode 604. For example, the first electrode 604 may include a layer of indium tin oxide (ITO) nanocrystals in glass made out of niobium oxide. The ITO nanocrystals are combined with niobium-containing ions (also called polyoxometalate (POM) clusters) in solution, and the first electrode is formed when the solution covers a surface of the electrochromic glass layer 332. In some implementations, the ITO nanocrystal solution is optionally evaporated or spray coated onto the first side of the electrolyte layer 602 to form the first electrode 604.
In some implementations, when no electrical voltage is applied between the first electrode 604 and the counter electrode 606, the electrochromic glass layer 332 operates at a second transmission state in which it is transparent to a predefined band of IR wavelengths (e.g., near-infrared wavelengths) and visible wavelengths of incident light. This second transmission state could be activated in accordance with a determination to transition a camera mode to a Night mode, when IR illumination is necessary for capturing images.
Referring to
Referring to
In some implementations, two images could be captured for a scene when the first and third voltages are applied on the electrochromic glass layer 332 to enable two different exposure conditions for the camera 118. Specifically, the third mode includes a high dynamic range (HDR) mode. The camera 118 captures a first image of a scene when the first voltage is applied on the electrochromic glass layer 332, and captures a second image of a substantially identical scene when the third voltage is applied on the electrochromic glass layer. For a recorded video clip, the camera 118 captures the first and second images for each frame of the recorded video clip. The camera 118 or a remote server then combines at least the first and second images to obtain a HDR image (or a HDR frame of the recorded video clip). The HDR image would have a higher dynamic range, and thereby enable better image quality, than either one of the first and second images, because the HDR image combines the dynamic ranges associated with the exposure conditions of the first and second images.
In some implementations, when the third mode includes the HDR mode, the first voltage does not have to be one of the at least two voltages used to create the different exposure conditions for the camera 118. The controller 333 of the camera 118 generates a fourth voltage in addition to the third voltage as described above. The fourth voltage is distinct from the third voltage. The fourth voltage is applied on the electrochromic glass layer 332. In response to the fourth voltage, the electrochromic glass layer 332 removes a substantial portion of the predefined band of IR wavelengths while reducing the intensity of the visible wavelengths in the ambient light to a second visible light intensity. The second visible light intensity is distinct from the first visible light intensity that results when the third voltage is applied to the electrochromic glass layer, thereby enabling two distinct exposure conditions for the same scene.
In this HDR mode, the camera 118 captures a first image of a scene when the third voltage is applied on the electrochromic glass layer 332, and a second image of the substantially identical scene when the fourth voltage is applied on the electrochromic glass layer 332. For a recorded video clip, the camera 118 captures the first and second images for each frame of the recorded video clip. The camera 118 or a remote server then combines at least the first and second images to obtain a HDR image (or a HDR frame of the recorded video clip). In some implementations, the camera 118 or the remote server combines more than two images of the scene to obtain the HDR image.
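A compact sketch of the per-frame HDR capture and combination follows: the same scene is imaged under the third and fourth voltages, and the two exposures are merged with a simple weighted blend. The `camera`/`ec_layer` interfaces and the blending rule are assumptions made for illustration; the disclosure only requires that the images be combined.

```python
import numpy as np

def capture_hdr_frame(camera, ec_layer, third_voltage: float, fourth_voltage: float) -> np.ndarray:
    """Capture one HDR frame by imaging the same scene under two bias voltages."""
    ec_layer.apply_voltage(third_voltage)
    first_image = camera.capture().astype(np.float32)    # one visible-light intensity
    ec_layer.apply_voltage(fourth_voltage)
    second_image = camera.capture().astype(np.float32)   # a distinct visible-light intensity

    # Illustrative merge for 8-bit frames: weight toward the darker exposure in
    # highlights and toward the brighter exposure in shadows.
    brighter = np.maximum(first_image, second_image)
    darker = np.minimum(first_image, second_image)
    weight = np.clip(brighter / 255.0, 0.0, 1.0)
    merged = weight * darker + (1.0 - weight) * brighter
    return np.clip(merged, 0, 255).astype(np.uint8)
```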
In some implementations, the first voltage is sustained on the electrochromic glass layer 332 at the first transmission state. In some implementations, when the camera transitions the camera mode to the Day mode, the first voltage is applied on the electrochromic glass layer 332 for a predetermined duration of time, and removed after the predetermined duration of time. Similarly, each of the other voltages (e.g., the second, third or fourth voltage) is optionally sustained on the electrochromic glass layer 332 or applied on the electrochromic glass layer 332 for a respective duration of time, in accordance with a determination to transition the camera mode to a respective camera mode (e.g., the Night mode and the HDR mode).
In accordance with a determination (706) to transition the camera mode to a Day mode, the controller 333 generates (708) a first voltage, and the first voltage is applied (710) to the electrochromic glass layer 332 to cause the lens assembly 330 to enter the first transmission state. Prior to the transition of the camera mode to the Day mode, the lens assembly 330 was (712) in the second transmission state. In response to the first voltage, the electrochromic glass layer 332 removes (714) a substantial portion of the predefined band of IR wavelengths in ambient light incident on the camera, and simultaneously passes by the electrochromic glass layer a substantial portion of visible wavelengths in the ambient light, thereby exposing the sensor array to the substantial portion of the visible wavelengths of the ambient light via the lens assembly. In some implementations, the mode control module 320b includes one or more programs having instructions for controlling the controller 333 to generate the first voltage in accordance with a determination to transition the camera mode to the Day mode.
More details on a Night mode, a HDR mode and characteristics of the electrochromic layer 332 are discussed above with reference to
In accordance with a determination to transition (806) the sensor mode to a Day mode, the image sensor device 500 generates (808) a first voltage, and applies (810) the first voltage to cause the electrochromic glass layer to enter the first transmission state. Prior to the transition of the sensor mode to the Day mode, the electrochromic glass layer was (812) in the second transmission state. In response to the first voltage, the electrochromic glass layer 502 removes (814) a substantial portion of the predefined band of IR wavelengths in ambient light incident on the image sensor device, and simultaneously passes by the electrochromic glass layer a substantial portion of visible wavelengths in the ambient light, thereby exposing the image sensor array to the substantial portion of the visible wavelengths of the ambient light. Likewise, more details on a Night mode, a HDR mode and characteristics of the electrochromic layer 332 are discussed above with reference to
The camera 118 determines (904) that the camera mode is a first mode in which the index of refraction of the electrochromic glass lens has a first index value associated with a first focal length. In accordance with a determination (906) that the camera mode is the first mode, the controller 333 generates (908) a first voltage, and applies (910) the first voltage on the electrochromic glass lens, thereby changing the index of refraction of the electrochromic glass lens to a second index value associated with a second focal length that is distinct from the first focal length.
In some implementations, it is determined that the camera mode is the first mode when the ambient temperature of the camera exceeds a thermal threshold, and the first voltage is applied to compensate for a variation of a focal length of either the electrochromic glass lens or one or more optical lenses 452 of the lens assembly 330 that is caused by the corresponding temperature increase. Such a temperature increase often occurs in an outdoor surveillance camera.
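The thermal-compensation use of the electrochromic glass lens might be sketched as below; the threshold temperature, the drift rate, and the voltage sensitivity are illustrative assumptions, since the description only states that the first voltage compensates for thermally induced focal-length variation.

```python
THERMAL_THRESHOLD_C = 45.0         # assumed ambient-temperature threshold
FOCAL_DRIFT_UM_PER_C = 0.8         # assumed focal-length drift of the fixed optics
VOLTS_PER_UM_OF_CORRECTION = 0.05  # assumed tuning sensitivity of the electrochromic lens

def compensation_voltage(ambient_temp_c: float) -> float:
    """Return the bias to apply to the electrochromic glass lens so that its
    index of refraction shifts enough to cancel the thermally induced drift."""
    if ambient_temp_c <= THERMAL_THRESHOLD_C:
        return 0.0
    drift_um = (ambient_temp_c - THERMAL_THRESHOLD_C) * FOCAL_DRIFT_UM_PER_C
    return drift_um * VOLTS_PER_UM_OF_CORRECTION
```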
It should be understood that the particular order in which the operations in
In summary, in accordance with various implementations in this application, nanoparticles are suspended in a crystal matrix to form an electrochromic glass layer that can switch between an IR blocking/filtering state and an IR transmission state without any moving parts. A thin coating of nanocrystals embedded in glass can provide selective control over which wavelengths of ambient light pass through the glass. A small jolt of electricity is needed to switch the electrochromic glass layer between the IR blocking/filtering state and the IR transmission state. Further, due to the use of the electrochromic glass layer in the optical path of a camera, the camera is configured for switching between a traditional visible light image capture mode and an IR image capture mode, and this mode switching does not involve any moving parts (such as a filter motor).
When the nanocrystal-based electrochromic glass layer is used as an IR filter for a camera operating in Night and Day modes, it is compatible with many sensors that have greater dynamic range and sensitivity than an RGB-IR sensor.
More importantly, the electrochromic glass layer does not involve any moving parts and reduces the space required by a camera module relative to one that relies on a mechanically driven filter. This helps improve the form factor of the camera module and eliminates a risk of creating particle contamination in the camera module.
In some implementations, electrochromic filtering actually increases dynamic range in Day mode by darkening transmission in the visible range. This feature can be useful in outdoor imaging, where direct sunlight can otherwise saturate the sensor.
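One way such Day-mode darkening could be driven is sketched below, where the visible transmission is reduced as the measured scene brightness approaches the sensor's saturation point; the lux figures, the linear mapping, and the driver stubs are illustrative assumptions.

```c
/* Illustrative sketch only: in Day mode the electrochromic layer is used as a
 * variable neutral-density element, darkening the visible band as the scene
 * approaches the sensor's saturation point. The lux figures, the linear
 * mapping, and the driver stubs are assumptions made for the example. */
#include <stdio.h>

#define LUX_SATURATION  60000.0f   /* assumed brightness at which the sensor saturates */

static float read_scene_lux(void) { return 80000.0f; }      /* light-meter stub */

static void ec_set_visible_transmission(float fraction) {   /* driver stub */
    printf("visible transmission set to %.0f%%\n", fraction * 100.0f);
}

/* Reduce visible transmission just enough to keep the brightest scenes below
 * saturation, extending usable dynamic range in direct sunlight. */
static void day_mode_auto_darken(void) {
    float lux = read_scene_lux();
    ec_set_visible_transmission(lux > LUX_SATURATION ? LUX_SATURATION / lux : 1.0f);
}

int main(void) {
    day_mode_auto_darken();
    return 0;
}
```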
It will be understood that, although the terms first, second, etc. are, in some instances, used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first voltage could be termed a second voltage, and, similarly, a second voltage could be termed a first voltage, without departing from the scope of the various described implementations. The first voltage and the second voltage are both voltage levels, but they are not the same voltage level.
The terminology used in the description of the various described implementations herein is for the purpose of describing particular implementations only and is not intended to be limiting. As used in the description of the various described implementations and the appended claims, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term “if” is, optionally, construed to mean “when” or “upon” or “in response to determining” or “in response to detecting” or “in accordance with a determination that,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event]” or “in accordance with a determination that [a stated condition or event] is detected,” depending on the context.
Although some of the various drawings illustrate a number of logical stages in a particular order, stages that are not order dependent may be reordered and other stages may be combined or broken out. While some reordering or other groupings are specifically mentioned, others will be obvious to those of ordinary skill in the art, so the ordering and groupings presented herein are not an exhaustive list of alternatives. Moreover, it should be recognized that the stages could be implemented in hardware, firmware, software or any combination thereof.
The foregoing description, for purpose of explanation, has been described with reference to specific implementations. However, the illustrative discussions above are not intended to be exhaustive or to limit the scope of the claims to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The implementations were chosen in order to best explain the principles underlying the claims and their practical applications, to thereby enable others skilled in the art to best use the implementations with various modifications as are suited to the particular uses contemplated.
This application is a continuation of U.S. patent application Ser. No. 15/339,839, filed Oct. 31, 2016, entitled “Electrochromic Filtering in a Camera,” which is hereby incorporated by reference in its entirety. This application is related to the following applications, each of which is hereby incorporated by reference in its entirety: U.S. patent application Ser. No. 14/723,276, filed on May 27, 2015, entitled, “Multi-mode LED Illumination System,” and U.S. patent application Ser. No. 14/738,225, filed on Jun. 12, 2015, entitled, “Day and Night Detection Based on One or More of Illuminant Detection, Lux Level Detection, and Tilting.”