End users appreciate quality images and video. They also appreciate the ability to easily use devices that create such images and video. Designers and manufacturers may, therefore, endeavor to create and provide technology directed toward at least some of these objectives.
The following detailed description references the drawings, wherein:
Sensor modules that record still images or video may have unit-by-unit variations in their individual components. For example, the illumination sources of sensor modules may vary in the intensity of their light output or the wavelength of the light they emit. As another example, the photosensitive members of sensor modules may differ in their responsiveness to different wavelengths of light. These sensor module unit-by-unit variations can result in visually perceptible differences in the images and/or video they produce. For example, the colors in these images and/or video may differ enough so as to be noticeable by and objectionable to end users of the sensor modules or the devices in which they are used. As another example, the contrast of images and/or video of the same subject taken by different sensor modules may vary enough so as to be perceptible by and a concern to end users of the sensor modules or the devices in which they are utilized.
Addressing these technical challenges caused by such unit-by-unit variations may assist such end users by providing them with more consistent and visually pleasing images and/or video between different sensor modules. This may be achieved by creating a set of default calibrated sensor module settings that are utilized with different sensor modules, as shown, for example, in
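By way of a non-limiting illustration of this approach, the following sketch shows how per-unit calibration measurements might be folded into a set of default calibrated sensor module settings. This is a minimal sketch under stated assumptions, not the claimed implementation: the neutral reference target, the per-channel gain model, and names such as DefaultCalibratedSettings and calibrate_unit are hypothetical.

```python
# Hypothetical sketch: derive default calibrated settings for one sensor
# module unit by comparing its measured response against a reference
# target, so different units produce visually consistent output.
from dataclasses import dataclass

@dataclass
class DefaultCalibratedSettings:
    """Default settings derived from factory calibration of one unit."""
    red_gain: float
    green_gain: float
    blue_gain: float
    illumination_level: float

# Response expected from an ideal unit imaging a neutral gray target
# (assumed values for illustration only).
REFERENCE_RGB = (128.0, 128.0, 128.0)

def calibrate_unit(measured_rgb, measured_illumination):
    """Compute per-unit gains so this unit's output matches the reference."""
    r, g, b = measured_rgb
    return DefaultCalibratedSettings(
        red_gain=REFERENCE_RGB[0] / r,
        green_gain=REFERENCE_RGB[1] / g,
        blue_gain=REFERENCE_RGB[2] / b,
        illumination_level=1.0 / measured_illumination,
    )

# Example: a unit whose red channel reads hot and whose LED is dim.
defaults = calibrate_unit(measured_rgb=(140.0, 126.0, 120.0),
                          measured_illumination=0.9)
print(defaults)  # gains < 1 attenuate a channel, gains > 1 amplify it
```

Gains computed this way would be stored with the unit as its default calibrated sensor module settings, so that two units imaging the same subject yield comparable color and contrast.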
Developers that utilize sensor modules in devices, as well as end users of such sensor modules and devices, may appreciate the ability to create and utilize at least one user-defined sensor module setting that differs from one of a set of default calibrated sensor module settings. This provides flexibility to such developers and end users. Retaining this at least one user-defined sensor module setting for subsequent use, until it is no longer desired, along with any remaining default calibrated sensor module settings, saves developer and end user time because they do not have to repeatedly recreate the user-defined sensor module setting each time they want to use a sensor module in a customized manner. Examples directed to addressing these technical challenges are shown in
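The following sketch illustrates, under similarly hypothetical assumptions, how a user-defined setting might override one default calibrated setting and be retained in persistent storage until no longer desired. The flat key/value settings model, the JSON overrides file, and names such as set_user_setting are illustrative only, not the claimed implementation.

```python
# Hypothetical sketch: user-defined settings are layered over default
# calibrated settings and retained on persistent storage, so they survive
# until the developer or end user discards them.
import json
from pathlib import Path

DEFAULTS = {"red_gain": 0.91, "green_gain": 1.02,
            "blue_gain": 1.07, "illumination_level": 1.11}
OVERRIDES_PATH = Path("user_overrides.json")  # assumed location

def _read_overrides():
    if OVERRIDES_PATH.exists():
        return json.loads(OVERRIDES_PATH.read_text())
    return {}

def load_effective_settings():
    """Defaults, with any retained user-defined settings layered on top."""
    return {**DEFAULTS, **_read_overrides()}

def set_user_setting(name, value):
    """Create or update a user-defined setting; it persists across sessions."""
    overrides = _read_overrides()
    overrides[name] = value
    OVERRIDES_PATH.write_text(json.dumps(overrides))

def clear_user_setting(name):
    """Discard a user-defined setting once it is no longer desired."""
    overrides = _read_overrides()
    overrides.pop(name, None)
    OVERRIDES_PATH.write_text(json.dumps(overrides))

# A developer overrides one default; the rest remain calibrated defaults.
set_user_setting("illumination_level", 0.8)
print(load_effective_settings())
```

Because the overrides file survives the process that wrote it, the user-defined setting is available the next time the sensor module is used, without being recreated, while any settings the user has not touched continue to come from the default calibrated values.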
As used herein, the term “sensor module” represents, but is not necessarily limited to, a photosensitive member and an illumination source that are utilized to record still images and/or video. Examples of a photosensitive member include, but are not limited to, a charge-coupled device (CCD), a complementary metal-oxide semiconductor (CMOS), a camera, film, a light-sensitive plate, light-sensitive paper, or any combination of the foregoing. Examples of an illumination source include, but are not limited to, a light-emitting diode (LED), a bulb, a tube, a laser, a reflector, a lens, ambient lighting, or any combination of the foregoing.
As used herein, the term “processor” represents, but is not necessarily limited to, an instruction execution system such as a computer-based system, an Application Specific Integrated Circuit (ASIC), a computing device, a hardware and/or machine-readable instruction system, or any combination thereof, that can fetch or obtain the logic from a machine-readable non-transitory storage medium and execute the instructions contained thereon. “Processor” can also include any controller, state-machine, microprocessor, logic control circuitry, cloud-based utility, service or feature, any other analogue, digital and/or mechanical implementation thereof, or any combination of the foregoing. A processor may be a component of a distributed system.
As used herein, the term “distributed system” represents, but is not necessarily limited to, multiple processors and machine-readable non-transitory storage media in different locations or systems that communicate via a network, such as the cloud. As used herein, the term “cloud” represents, but is not necessarily limited to, computing resources (hardware and/or machine-readable instructions) that are delivered as a service over a network (such as the internet).
As used herein, the term “machine-readable non-transitory storage medium” represents, but is not necessarily limited to, any medium that can contain, store, retain, or maintain programs, code, scripts, information, and/or data. A machine-readable non-transitory storage medium may include any one of many physical media such as, for example, electronic, magnetic, optical, electromagnetic, or semiconductor media. A machine-readable non-transitory storage medium may be a component of a distributed system. More specific examples of suitable machine-readable non-transitory storage media include, but are not limited to, magnetic computer media such as floppy diskettes or hard drives, magnetic tape, a read-only memory (ROM), an erasable programmable read-only memory (EPROM), a flash drive or memory, a compact disc (CD), a digital video disk (DVD), or a memristor.
As used herein, the term “persistent memory” represents, but is not necessarily limited to, any structure, apparatus, memory, method and/or machine-readable non-transitory storage medium for storing data and information such that it can be continually accessed using instructions and/or application programming interfaces (APIs) even after the end of the process that created or last modified them. As used herein, the term “memory” represents, but is not necessarily limited to, a device and/or process that allows data and information to be stored thereon for subsequent retrieval by, for example, a processor.
As used herein, the term “circuitry” represents, but is not necessarily limited to, an interconnection of elements such as a resistor, inductor, capacitor, voltage source, current source, transistor, diode, application specific integrated circuit (ASIC), processor, controller, switch, transformer, gate, timer, relay, multiplexor, connector, comparator, amplifier, filter, and/or module having these elements that allow operations to be performed alone or in combination with other elements or components. As used herein, the terms “include”, “includes”, “including”, “have”, “has”, “having,” and variations thereof, mean the same as the terms “comprise”, “comprises”, and “comprising” or appropriate variations thereof.
An example of a system 10 in accordance with an implementation is shown in
As can also be seen in
As can additionally be seen in
An example of additional elements of system 10 in accordance with an implementation is shown in
As can also be seen in
As can additionally be seen in
As can further be seen in
An example of a method 48 in accordance with an implementation is shown in
An example of additional elements of method 48 in accordance with an implementation is shown in
As can additionally be seen in
An example of a machine-readable non-transitory storage medium 70 including instructions executable by a processor 72, as generally indicated by double-headed arrow 74, in accordance with an implementation is shown in
An example of additional instructions in accordance with an implementation that are executable by processor 72, as generally indicated by double-headed arrow 74, that may be included on machine-readable non-transitory storage medium 70 are shown in
Although several drawings have been described and illustrated in detail, it is to be understood that the same are intended by way of illustration and example. These examples are not intended to be exhaustive or to be limited to the precise form disclosed. Modifications, additions, and variations may well be apparent. For example, although not shown in
Additionally, reference to an element in the singular is not intended to mean one, unless explicitly so stated, but rather means at least one. Furthermore, unless specifically stated, any method elements are not limited to the sequence or order described and illustrated. Moreover, no element or component is intended to be dedicated to the public regardless of whether the element or component is explicitly recited in the following claims.