Imaging systems may be configured to detect electromagnetic radiation of a variety of different wavelengths. For example, some imaging systems may be configured to detect visible light for grayscale or color imaging. An image sensor used for color imaging may include an array of color filters configured to selectively filter light of various colors prior to the light reaching an image sensor. Other imaging systems may be configured to detect infrared light. Infrared imaging systems may utilize an infrared band pass filter to pass a desired wavelength band of infrared light to an image sensor while blocking other wavelengths.
Embodiments are disclosed that relate to image sensing systems configured to sense visible and infrared light. For example, one disclosed embodiment provides an image sensing system comprising an image sensor comprising a plurality of pixels, an optical path extending from an exterior of the image sensing system to the image sensor, and an infrared filter array positioned along the optical path. The infrared filter array is configured to transmit the infrared light to a first subset of pixels of the image sensor, and to filter at least a portion of the infrared light to reduce an amount of infrared light reaching a second subset of pixels of the image sensor relative to an amount of infrared light reaching the first subset of pixels.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Computing device 104 also comprises an image sensing system 106 that, for example, may collect image data regarding human subjects (e.g., user 102) within its field of view. Optical data including visible and infrared light collected by image sensing system 106 may be processed by computing device 104 to determine an identity of user 102. Upon identification of the user, a notification 108 conveying the determined identification of user 102 may be displayed on a display 110 of computing device 104. In other embodiments, the result of analysis performed on data captured by image sensing system 106 may be conveyed in non-visual manners (e.g., via audio, tactile feedback, etc.). Further, the image sensing system 106 also may be configured to capture data regarding inanimate objects and/or other features of its surrounding environment (e.g., environmental surfaces). In some embodiments, image sensing system 106 may be at least partially housed in an enclosure separate from that of computing device 104, and may be operatively coupled to the computing device via any suitable communication link. It will be understood that the identification of a user is but one of any number of uses for a computing device imaging system.
In some embodiments, image sensing system 106 may be configured to collect data regarding the depth of surfaces within its field of view in cooperation with other suitable components. For example, image sensing system 106 may include two image sensors arranged in a stereo configuration to obtain depth data. As another example, image sensing system 106 may include a depth camera system, such as a time-of-flight depth sensor and/or a structured light depth sensor.
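The stereo depth configuration mentioned above can be illustrated with a brief sketch. This is not part of the disclosed embodiments: the function name, the standard pinhole-camera relation Z = f·B/d, and the example numbers are assumptions added purely for illustration.

```python
# Illustrative sketch (not from the disclosure): recovering depth from two
# image sensors in a stereo configuration via the standard pinhole relation
# Z = f * B / d, where f is the focal length in pixels, B the baseline
# between the two sensors, and d the measured disparity in pixels.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth (meters) of a surface point from its stereo disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: f = 600 px, baseline = 0.06 m, disparity = 12 px -> depth of 3 m.
```

Time-of-flight and structured light sensors recover depth by different principles (round-trip timing and pattern deformation, respectively), so this relation applies only to the stereo example.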
Image sensing system 106 may collect visible and infrared light, which may be analyzed separately or together for the purposes described above. For example, data derived from infrared light captured by image sensing system 106 may be used to identify human subjects within its field of view, while data derived from visible light captured by the image sensing system may be provided to participants in a videoconferencing application.
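Separate analysis of the visible and infrared data implies splitting a raw mosaic into per-channel planes. The following is a hypothetical sketch of such a split; it assumes a two-by-two tile holding red, green, blue, and infrared pixels, which is merely one possible arrangement and not mandated by the disclosure.

```python
# Hypothetical sketch: separating a raw RGB-IR mosaic into visible and
# infrared planes so each can be analyzed independently (e.g., IR for
# subject identification, RGB for video). Assumes each 2x2 sensor tile is
# laid out as [R, G; B, IR] -- an illustrative choice, not the disclosure's.

def split_channels(raw):
    """raw: 2D list (H x W values, H and W even) -> quarter-resolution planes."""
    h, w = len(raw), len(raw[0])
    planes = {"R": [], "G": [], "B": [], "IR": []}
    for y in range(0, h, 2):
        r_row, g_row, b_row, ir_row = [], [], [], []
        for x in range(0, w, 2):
            r_row.append(raw[y][x])           # top-left: red pixel
            g_row.append(raw[y][x + 1])       # top-right: green pixel
            b_row.append(raw[y + 1][x])       # bottom-left: blue pixel
            ir_row.append(raw[y + 1][x + 1])  # bottom-right: infrared pixel
        planes["R"].append(r_row)
        planes["G"].append(g_row)
        planes["B"].append(b_row)
        planes["IR"].append(ir_row)
    return planes
```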
Image sensing system 200 includes an optical path 202 extending from an exterior of the image sensing system to an image sensor 206. Image sensor 206, positioned at an end of optical path 202, is configured to convert certain wavelengths of incident light to electrical output. The electrical output may then be digitized for processing, analysis, and other tasks, including the formation of two-dimensional images. Image sensor 206 may form part of a charge-coupled device (CCD) sensor or a complementary metal-oxide semiconductor (CMOS) sensor, for example.
As shown, image sensor 206 includes a plurality of pixels 210 each being photosensitive to selected wavelengths of incident light. In the depicted example, the plurality of pixels 210 is arranged in a plurality of sensor tiles, one of which (sensor tile 212) is shown in
Also positioned along optical path 202 is a color filter array 214 that includes a plurality of color filters 216, each configured to transmit certain wavelengths of light (e.g., colors) while preventing the transmission of other wavelengths. As with image sensor 206, the plurality of color filters 216 is arranged in a plurality of color filter tiles (e.g., color filter tile 217) such that each color filter tile of the color filter array aligns with a corresponding sensor tile (e.g., sensor tile 212) of the image sensor. In this example, each color filter tile 217 of color filter array 214 includes three color filters, each comprising one or more colorants (e.g., pigments and/or dyes) that facilitate the transmission of certain wavelengths of visible light.
Each of the four pixels in each structure depicted in
Also positioned along optical path 202 is an infrared filter array 234 configured to transmit infrared light to a first subset of pixels of the image sensor, and to filter at least a portion of the infrared light to reduce an amount of infrared light reaching a second subset of pixels of the image sensor relative to an amount reaching the first subset. The first subset of pixels may correspond to infrared pixels 232 in image sensor 206, and the second subset of pixels may correspond to red pixels 226, green pixels 228, and blue pixels 230 ("the color pixels") in the image sensor. In some embodiments, infrared filter array 234 also may filter at least a portion of visible light reaching the first subset of pixels, relative to an amount of visible light reaching the second subset of pixels.
Thus, color filter array 214 and infrared filter array 234 divide image sensor 206 into a plurality of image sensor tiles. In this configuration, each image sensor tile 212 includes three color pixels 226, 228, and 230, and an infrared pixel 232 arranged in a two by two grid.
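The division of the sensor into repeating two-by-two tiles can be sketched as follows. The specific positions assigned to each color are hypothetical; the disclosure fixes only that each tile contains three color pixels and one infrared pixel.

```python
# Illustrative sketch: the 2x2 tile repeated across the full sensor.
# The placement within the tile is an assumption made for illustration.

TILE = (("R", "G"),
        ("B", "IR"))

def filter_pattern(height, width):
    """Label every pixel of an H x W sensor with the filter element above it."""
    return [[TILE[y % 2][x % 2] for x in range(width)] for y in range(height)]

# The "first subset" of pixels (infrared) is every position labeled "IR";
# the "second subset" (color pixels) is everything else.
```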
Infrared filter array 234 may include one or more materials that absorb or otherwise block the transmission of infrared light, such as infrared absorbing dyes, pigments, interference filters, etc. Such materials may be disposed in locations that correspond to and align with the color pixels of image sensor 206, and omitted from (or used in lesser quantities in) locations that correspond to infrared pixels 232 of the image sensor. Infrared filter array 234 may comprise filter cells 236 arranged as infrared filter tiles (e.g., infrared filter tile 238) such that each infrared filter tile aligns with a corresponding color filter tile of color filter array 214 and a corresponding image sensor tile of image sensor 206. Thus, in this example, cells 236a in infrared filter tile 238 include the one or more infrared-blocking materials (represented in
Also arranged along optical path 202 is a microlens array 240 comprising a plurality of microlenses 242. Each microlens (e.g., microlens 244) in the plurality of microlenses 242 is configured to gather light incident on its surface and focus the incident light onto an associated pixel of image sensor 206. For example, microlenses 244a, 244b, 244c, and 244d respectively focus light onto pixels 226, 228, 230, and 232 of image sensor 206. In the depicted example, the plurality of microlenses 242 is optically transparent to at least a portion of visible and infrared wavelengths, but in other embodiments the microlens array may have one or more filter materials incorporated into a bulk material of the microlenses. The plurality of microlenses 242 may be formed from any suitable material, including but not limited to various polymers, such as poly(methyl methacrylate) (PMMA). It will be appreciated that in some embodiments microlens array 240 may be omitted from image sensing system 200 without departing from the scope of this disclosure.
As depicted, light L upstream of the microlens array 240 is focused by the microlens array 240 into light L′, wherein each microlens focuses light onto a corresponding image sensor pixel. Light L′ from the microlens array passes through the infrared filter array 234, where cells 236a remove at least a portion of the infrared component from light intended for the color pixels of the image sensor system (indicated as visible light V), while more infrared light is transmitted through cells that do not include infrared-blocking materials (e.g., cell 236b).
Visible light V then passes through the color filter array 214, while light L′ passes through infrared band pass filter 224 (which may be in any other suitable layer than that shown in
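The stacked filtering described above can be modeled, very roughly, as a chain of per-pixel transmittances applied to the visible and infrared components of the incident light. The numbers below are invented for illustration only; real filter responses are wavelength-dependent curves, not scalars.

```python
# Hypothetical sketch of the filter stack: each element along the optical
# path is reduced to a scalar transmittance for the visible (V) and
# infrared (I) components. All values are illustrative assumptions.

FILTERS = {
    # pixel type: (visible transmittance, infrared transmittance)
    "R":  (0.33, 0.05),   # color filter passes part of the visible band;
    "G":  (0.33, 0.05),   # the IR filter cell above blocks most infrared
    "B":  (0.33, 0.05),
    "IR": (0.05, 0.90),   # band pass filter blocks most visible light
}

def pixel_signal(pixel, visible, infrared):
    """Signal reaching one pixel after traversing the filter stack."""
    t_v, t_i = FILTERS[pixel]
    return t_v * visible + t_i * infrared
```

Under this toy model, a color pixel's reading is dominated by visible light while an infrared pixel's reading is dominated by infrared light, which is the separation the stacked arrays are intended to achieve.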
The filtering of infrared light prior to the infrared light reaching the color pixels of the image sensor may offer advantages over exposing the color pixels to both color and infrared light and then electronically correcting for the infrared exposure of the color pixels. For example, a higher signal-to-noise ratio may be achieved via filtering of infrared light as compared to the computational correction of a color pixel value that was also exposed to infrared light.
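The signal-to-noise advantage claimed above can be made concrete with a back-of-envelope model. This sketch is an assumption layered on top of the disclosure: it adopts simple shot-noise statistics (noise standard deviation equal to the square root of the signal) and an independent infrared estimate, neither of which is specified in the text.

```python
import math

# Hypothetical shot-noise model of why optically filtering infrared before
# a color pixel can beat subtracting it electronically: the variances of
# the mixed (V + I) reading and of the IR estimate add, so the corrected
# value carries more noise than a reading that never saw the IR component.

def snr_filtered(visible):
    """SNR when IR is removed optically: signal V, shot noise sqrt(V)."""
    return visible / math.sqrt(visible)

def snr_subtracted(visible, infrared):
    """SNR when the pixel reads V + I and an independent IR estimate is subtracted."""
    noise = math.sqrt((visible + infrared) + infrared)  # variances add
    return visible / noise
```

For any nonzero infrared level, the subtracted value's SNR falls below the optically filtered one in this model, consistent with the advantage described above.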
As mentioned above, the optical elements depicted in
First,
In some embodiments, two or more of the optical components of
Next, at 1410, method 1400 comprises forming an infrared filter array as a layer separate from the microlens array. Forming the infrared filter array may include, at 1412, forming the infrared filter array on a surface of the microlens array. Alternatively, formation of the infrared filter array may include, at 1414, forming the infrared filter array on a structure separate from that of the microlens array.
Next, at 1504, method 1500 comprises forming a second portion of the microlens array from material that does not comprise the infrared blocking material, or comprises a lesser concentration of infrared blocking material than the first portion. Forming the second portion may include, at 1506, forming the microlens array on a surface of a color filter array. Alternatively, formation of the second portion may include, at 1508, forming the microlens array on a structure separate from the color filter array.
In some embodiments, an RGB image sensing system configured to sense visible light may comprise an infrared filter layer (e.g. an infrared cut filter) positioned along the optical path to filter at least a portion of the infrared light. Such a filter may be provided as a separate structure, and thus at an incremental cost, in such image sensors. To reduce this cost, the IR cut filter material may instead be included in another layer of the image sensing system, such as a color filter layer or microlens array, e.g. by mixing the infrared blocking materials of the infrared filter layer with materials used to form the color filter array and/or the microlens array, as described above.
The image sensing system embodiments disclosed herein, and the embodiments of methods of making image sensing systems, may be used in, and/or to form, any suitable type of device.
Computing system 1800 includes a logic subsystem 1802 and a storage subsystem 1804. Computing system 1800 may optionally include a display subsystem 1806, input subsystem 1808, communication subsystem 1810, and/or other components not shown in
Logic subsystem 1802 includes one or more physical devices configured to execute instructions. For example, the logic machine may be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, achieve a technical effect, or otherwise arrive at a desired result.
The logic machine may include one or more processors configured to execute software instructions. Additionally or alternatively, the logic machine may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. Processors of the logic machine may be single-core or multi-core, and the instructions executed thereon may be configured for sequential, parallel, and/or distributed processing. Individual components of the logic machine optionally may be distributed among two or more separate devices, which may be remotely located and/or configured for coordinated processing. Aspects of the logic machine may be virtualized and executed by remotely accessible, networked computing devices configured in a cloud-computing configuration.
Storage subsystem 1804 includes one or more physical devices configured to hold instructions executable by the logic machine to implement the methods and processes described herein. When such methods and processes are implemented, the state of storage subsystem 1804 may be transformed—e.g., to hold different data.
Storage subsystem 1804 may include removable and/or built-in devices. Storage subsystem 1804 may include optical memory (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory (e.g., RAM, EPROM, EEPROM, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage subsystem 1804 may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
It will be appreciated that storage subsystem 1804 includes one or more physical devices. However, aspects of the instructions described herein alternatively may be propagated by a communication medium (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for a finite duration.
Aspects of logic subsystem 1802 and storage subsystem 1804 may be integrated together into one or more hardware-logic components. Such hardware-logic components may include field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), systems-on-a-chip (SOCs), and complex programmable logic devices (CPLDs), for example.
When included, display subsystem 1806 may be used to present a visual representation of data held by storage subsystem 1804. This visual representation may take the form of a graphical user interface (GUI). As the herein described methods and processes change the data held by the storage machine, and thus transform the state of the storage machine, the state of display subsystem 1806 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 1806 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 1802 and/or storage subsystem 1804 in a shared enclosure, or such display devices may be peripheral display devices.
When included, input subsystem 1808 may comprise or interface with one or more user-input devices such as a keyboard, mouse, touch screen, or game controller. In some embodiments, the input subsystem may comprise or interface with selected natural user input (NUI) componentry. Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on- or off-board. Example NUI componentry may include a microphone for speech and/or voice recognition; an infrared, color, stereoscopic, and/or depth camera for machine vision and/or gesture recognition (including but not limited to the image sensing system embodiments described herein), a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; as well as electric-field sensing componentry for assessing brain activity.
When included, communication subsystem 1810 may be configured to communicatively couple computing system 1800 with one or more other computing devices. Communication subsystem 1810 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network. In some embodiments, the communication subsystem may allow computing system 1800 to send and/or receive messages to and/or from other devices via a network such as the Internet.
It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.