A wide variety of devices utilize sensors to detect signals from one or more sources. Sensors which are configured to image a source often use an array of discrete elements sensitive to the signals. For example, an optical imaging sensor may contain a two-dimensional array of imaging elements, also known as pixels. However, sampling from discrete pixels can result in undesirable artifacts such as moiré effects and may require various anti-aliasing techniques. These anti-aliasing techniques introduce complexity, cost, and noise into sensor systems.
The detailed description is set forth with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items.
Overview
Sensors designed to convert an incoming signal into a form suitable for processing are used in a variety of applications ranging from digital pictures to geophysical research. The incoming or impinging signal may comprise mechanical energy, electromagnetic energy, or other forms of impinging energy. Traditional sensors such as those used for imaging have used arrays of discrete or pixellated sensing elements. For example, a conventional imaging sensor in a digital camera may have millions of discrete imaging elements, each of which generates a pixel in the picture.
Pixellated sensors suffer several drawbacks. They are typically expensive to manufacture, in part due to quality control efforts to minimize the number of “dead” or inactive pixels. Furthermore, pixellated sensors are subject to moiré patterns and other effects which result from attempting to sample a signal with a higher frequency than is directly resolvable by the sensor. Additionally, interpolation, or the process of “filling in” a gap between pixels, may be used to smooth an image, to simulate a higher resolution image, to simulate a zoom or enlargement of a particular area of an image, and so forth. However, interpolation from conventional pixellated sensors is computationally intensive and may introduce unwanted artifacts which distort the image.
Disclosed herein are devices and techniques for a hardware enabled interpolating sensor, as well as an interpolating display. The hardware enabled interpolating sensor (or “HEIS”) comprises an addressable array. This addressable array may contain one layer with conductors disposed in rows and another layer with conductors disposed in columns. Each layer of conductors couples to a resistive material. The layers are disposed on opposing sides of a transducer layer, such that the conductors therein form a plurality of addressable junctions. When an impinging signal interacts with the transducer layer, a change may be read out by the addressable array and used to generate an image.
The HEIS may also be used for reconstruction of impinging wavefronts, such as in holographic imaging. The time of arrival of the impinging signal at points within the sensor may be used in conjunction with the location and amplitude information to reconstruct the wavefront.
Additionally, by operating the HEIS in an alternative emissive mode, an interpolating display is possible. An interpolating display is configured similarly to the HEIS described above, but when in the emissive mode a signal is applied to the transducer layer such that an output signal is emitted from an emissive region. Unlike the fixed size emissive pixels in a conventional display, the interpolating display may vary the size and intensity of the emission region.
Hardware Enabled Interpolating Sensor
An enlarged plan view 106 of the HEIS 102 is depicted, showing the rows 108(1)-(R) and columns 110(1)-(N) of an addressable array. For ease of illustration and not by way of limitation, eight rows 108 are labeled 1-8 and eight columns 110 are labeled A-H. Also for illustrative purposes, a given junction is designated herein by the row and column, such as “1-A.” The rows 108 and columns 110 may comprise transmissive elements such as electrical conductors, electromagnetic waveguides, optical fibers, and so forth. The rows 108 and columns 110 are disposed in a first and a second layer, respectively. Disposed amongst the rows 108 and columns 110 may be drone lines or wires, which have been omitted from this disclosure for clarity. These drone wires aid in shaping the field emanating from an energized transmissive element and also provide a linear drop-off in signal from the energized transmissive element.
Between these first and second layers is a transducer layer 112. The transducer layer 112 is configured such that the impinging signal 104 will modify one or more characteristics within a signal affected region 114. The boundary of the signal affected region 114 may vary depending upon how the sensor is driven. In some implementations, the boundary may exhibit a sharp cutoff with non-signal affected regions, while in others the boundary may exhibit a gradual falloff. Some materials used in the transducer layer 112 may alter the phase of a readout signal passing through them, while in other materials the electrical conductivity, capacitance, inductance, and so forth may vary. For example, where the impinging signal 104 is mechanical such as sound, the transducer layer 112 may comprise a piezoelectric material. Where the impinging signal 104 is visible light, a photodiode material may be used. Where the impinging signal 104 is heat, a Peltier material may be used.
The impinging signal 104 varies one or more characteristics within the signal affected region 114, such as the electrical resistivity. As described above, the rows 108 and columns 110 intersect to form addressable intersections or junctions 116. The variation in electrical characteristic of the signal affected region 114 is read out at one or more of these junctions 116. Because the rows 108 and columns 110 are disposed within a resistive material, current may flow between neighboring rows 108 and columns 110, which results in non-aliased output.
Because the characteristics of the resistive material are known, and the junctions 116 may be read out individually, interpolation of the impinging signal 104 at locations between the junctions 116 is readily accomplished. For example, the resistive material may have a linear response such that resistance varies by a known amount given the thickness of the material. Given an impinging signal at a midpoint between four junctions 116, each of the four junctions 116 will show substantially the same resistance. As a result, a sensor controller may determine that the signal impinged at the midpoint, as well as the amplitude of the signal.
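To illustrate, the following sketch estimates the location and amplitude of an impinging signal from the readings at the four surrounding junctions 116. It is a minimal illustration only: the weighted-centroid calculation, the function names, and the unit grid geometry are assumptions for the example, not features prescribed by this disclosure.

```python
# Minimal sketch: estimate where a signal landed between four junctions,
# assuming the resistive material yields a reading at each junction that
# falls off linearly with distance from the point of impingement.

def interpolate_impingement(junctions):
    """junctions: list of ((x, y), reading) pairs for the four junctions
    surrounding the signal, where `reading` is the measured change."""
    total = sum(reading for _, reading in junctions)
    if total == 0:
        return None, 0.0  # no detectable signal near these junctions
    # Readings act as weights: equal readings at all four junctions place
    # the estimate at the midpoint, as in the example above.
    x = sum(pos[0] * r for pos, r in junctions) / total
    y = sum(pos[1] * r for pos, r in junctions) / total
    return (x, y), total

# Signal at the exact midpoint of a unit cell: all four readings equal.
cell = [((0, 0), 0.25), ((1, 0), 0.25), ((0, 1), 0.25), ((1, 1), 0.25)]
print(interpolate_impingement(cell))  # ((0.5, 0.5), 1.0)
```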
While the addressable array is shown here as a grid having rows 108 and columns 110 which are generally perpendicular to one another, other configurations are possible. In other implementations, other addressable configurations may be used, such as polygonal arrays, polar arrays, and so forth. Additionally, in some implementations, each of the rows 108 and columns 110 may comprise a mesh of transmissive elements rather than a single transmissive element. The HEIS 102 and one implementation of its construction are discussed in more detail below with regards to
In some implementations, multiple transducer layers 112 may be used to allow for the imaging of different signals, such as different wavelengths. These transducer layers 112 may be disposed between layers containing the transmissive elements, allowing for readout of each respective transducer layer. In some implementations, the layers containing the transmissive elements may be shared with two transducer layers.
An image processing unit 206 is shown coupled to one or more display components 208 (or “displays”). In some implementations, multiple displays may be present and coupled to the image processing unit 206. These multiple displays may be located in the same or different enclosures or panels. Furthermore, one or more image processing units 206 may couple to the multiple displays.
The display 208 may present content in a human-readable format to a user. The display 208 may be reflective, emissive, or a combination of both. Reflective displays utilize incident light and include electrophoretic displays, interferometric modulator displays, cholesteric displays, pre-printed materials, and so forth. Emissive displays do not rely on incident light and, instead, emit light. Emissive displays include backlit liquid crystal displays, time multiplexed optical shutter displays, light emitting diode displays, backlit pre-printed materials, and so forth. When multiple displays are present, these displays may be of the same or different types. For example, one display may be an electrophoretic display while another may be a liquid crystal display. The display 208 may be implemented in any shape, and may have any ratio of height to width. Also, for stylistic or design purposes, the display 208 may be curved or otherwise non-linearly shaped. Furthermore, the display 208 may be flexible and configured to fold or roll. In some implementations the display 208 may comprise the interpolating display 800 as described below with regards to
The content presented on the display 208 may take the form of electronic books or “eBooks.” For example, the display 208 may depict the text of the eBooks and also any illustrations, tables, or graphic elements that might be contained in the eBooks. The terms “book” and/or “eBook”, as used herein, include electronic or digital representations of printed works, as well as digital content that may include text, multimedia, hypertext, and/or hypermedia. Examples of printed and/or digital works include, but are not limited to, books, magazines, newspapers, periodicals, journals, reference materials, telephone books, textbooks, anthologies, instruction manuals, proceedings of meetings, forms, directories, maps, web pages, and so forth. Accordingly, the terms “book” and/or “eBook” may include any readable or viewable content that is in electronic or digital form.
The device 200 includes a hardware enabled interpolating sensor controller 210, or “sensor controller.” The sensor controller 210 couples to the processor 202, such as via a universal serial bus host controller, inter-integrated circuit (“I2C”), universal asynchronous receiver/transmitter (“UART”), serial peripheral interface bus (“SPI”), or other interface. The sensor controller 210 couples to the HEIS 102. In some implementations multiple sensors 102 may be present.
The sensor controller 210 is configured to scan the HEIS 102 and determine effects which correspond to the impinging signal 104. The sensor controller 210 may also interpolate the location and amplitude of impinging signals 104 which fall between junctions 116.
The device 200 may also include an external memory interface (“EMI”) 212 coupled to external memory 214. The EMI 212 manages access to data stored in external memory 214. The external memory 214 may comprise Static Random Access Memory (“SRAM”), Pseudostatic Random Access Memory (“PSRAM”), Synchronous Dynamic Random Access Memory (“SDRAM”), Double Data Rate SDRAM (“DDR”), Phase-Change RAM (“PCRAM”), or other computer-readable storage media.
The external memory 214 may store an operating system 216 comprising a kernel 218 operatively coupled to one or more device drivers 220. The device drivers 220 are also operatively coupled to peripherals 204, such as the sensor controller 210. The external memory 214 may also store data 222, which may comprise content objects for consumption on the device 200, executable programs, databases, user settings, configuration files, device status, and so forth. A signal analysis module 224 may also be stored in memory 214. The signal analysis module 224 may be configured to analyze data generated by the sensor controller 210 to characterize and image the impinging signal 104.
A power supply 226 provides operational electrical power to components of the device 200. The device 200 may also include one or more other, non-illustrated peripherals, such as input controls, a hard drive using magnetic, optical, or solid state storage to store information, a firewire bus, a Bluetooth™ wireless network interface, camera, global positioning system, PC Card component, and so forth.
Couplings, such as that between the kernel 218 and the device drivers 220, are shown for emphasis. There are couplings between many of the components illustrated in
The transmissive elements 306 are configured to couple to one another. For example, where the transmissive elements 306 are electrical conductors, this coupling may comprise a connection of the transmissive elements 306 with a resistive material 308. In the implementation depicted here, the first layer 302 comprises transmissive elements 306 embedded within the resistive material 308. Drone wires (omitted here for clarity) may also be present, disposed generally parallel to the transmissive elements 306. In other implementations the transmissive elements 306 may be disposed adjacent to and in contact with the resistive material 308. In yet another implementation, the resistive material 308 may be disposed around the edges of the array, thus interconnecting transmissive elements 306 without obscuring the transducer layer 112.
The resistive material 308 is configured such that charge (or electromagnetic signals such as light) may flow between neighboring rows and columns, which results in non-aliased output. Because the characteristics of the resistive material 308 are known, and the junctions may be read out individually by the sensor controller 210, interpolation of the incoming signal at locations between the junctions is readily accomplished.
An active row 310 coupled to the sensor controller 210 is shown with a charge applied, such as may be used to scan the junction formed by the transmissive elements 306 of the first layer 302 and the second layer 304.
Within the transducer layer 112, the signal affected region 114 is depicted. The signal affected region 114 is a region within the transducer layer 112 material which has one or more characteristics which are altered as a result of the impinging signal 104. For example, in one implementation, the impinging signal 104 may alter the resistivity of that region. As a result, the sensor controller 210 may scan and detect that this region has received a signal based on the change in resistivity, and characterize the impinging signal 104.
Adjacent to and coupled with the transducer layer 112 is the second layer 304. The second layer 304 comprises an active column 312 and transmissive elements 306 (hidden in this view). In this implementation, the readout signal in the form of the charge on the active row 310 has propagated across the signal affected region 114 and is read out by the sensor controller 210 as a signal 314 at the active column 312.
As a result, at the junction 116 described by the active row 310 and the active column 312, the state of the transducer layer 112 may be characterized. This characterization results in a signal which corresponds, at least in part, to the impinging signal 104. The sensor controller 210 may scan the junctions 116(1)-(X) and generate a representation of an image corresponding to the impinging signal 104.
At 402, the sensor controller 210 scans the hardware enabled interpolating sensor (HEIS) 102 which comprises a transducer layer 112. As described above, the transducer layer 112 is configured to change in response to an impinging signal 104. The change may then be read out as a signal discernable by the sensor controller 210.
At 404, the sensor controller 210 detects signals at a plurality of intersections, wherein the signal is proportional to the impinging signal 104 on the transducer layer 112 proximate to the plurality of intersections. Signals may include voltage, current, transmissivity, inductance, capacitance, phase, and so forth, individually or in combination.
At 406, the sensor controller 210 generates interpolated output based at least in part upon a position and signal measured at the plurality of intersections. This interpolated output considers the characteristics of the resistive material 308 which the transmissive elements 306 are coupled to. Because of the known characteristics, such as resistance per distance, signals from multiple junctions 116 may be combined to generate interpolated data points. In effect, the sensor 102 is able to resolve input which occurs in between junctions 116, freeing the sensor 102 from pixellation constraints and aliasing issues.
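A compact sketch of blocks 402-406 appears below. The controller interface (`energize_row`, `read_column`) is hypothetical, standing in for whatever row and column access a particular sensor controller 210 provides; the threshold value is likewise illustrative.

```python
# Hypothetical sketch of the scan/detect/interpolate sequence (402-406).

def scan_heis(energize_row, read_column, n_rows, n_cols, threshold=0.05):
    readings = {}
    for r in range(n_rows):             # 402: scan the addressable array
        energize_row(r)
        for c in range(n_cols):
            value = read_column(c)      # 404: detect signal at junction r,c
            if value > threshold:
                readings[(r, c)] = value
    return readings                     # 406: feed these readings to an
                                        # interpolator, e.g. the centroid
                                        # sketch shown earlier
```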
As shown in the cross sectional view 504, the wavefront 502 impinges upon the sensor 102 at time T1 at junction 1-C. As the wavefront 502 moves, it contacts the sensor 102 at time T2 at junction 3-E. Finally, the wavefront 502 is shown contacting junction 5-G at time T3. As shown in the plan view 506, the wavefront 502 moves generally diagonally across the sensor 102.
By monitoring the position of the signal within the array and a time of arrival, the sensor controller 210 may determine a direction of origin of the impinging signal 104, reconstruct the wavefront 502 to generate a holographic image, and so forth. Where the impinging signal 104 has a duration shorter than the scan interval of the sensor 102, timing information about the impinging signal 104 may be derived based upon known persistence effects within the transducer material. For example, consider where the transducer layer 112 comprises a phosphorescent material and the impinging signal 104 comprises a photon having a requisite energy to generate phosphorescence. The impinging signal 104 will trigger the phosphorescence, which will persist for a particular amount of time and decay at a known rate. In combination with other factors known about the sensor 102 and which may be derived about the impinging signal 104, the time of arrival may be determined.
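As a worked illustration of this persistence technique, the sketch below assumes a simple exponential decay A(t) = A0·exp(-(t - t_arrival)/τ), with the initial brightness A0 and decay constant τ taken as known properties of the transducer material; both the decay model and the numbers are assumptions for the example.

```python
import math

# Sketch only: recover time of arrival from phosphorescent persistence,
# assuming exponential decay A(t) = A0 * exp(-(t - t_arrival) / tau).

def time_of_arrival(a_measured, a_initial, tau, t_scan):
    """Solve a_measured = a_initial * exp(-(t_scan - t) / tau) for t."""
    elapsed = -tau * math.log(a_measured / a_initial)  # time since arrival
    return t_scan - elapsed

# A junction scanned at t = 10 ms reads 60% of the known initial
# brightness, with tau = 5 ms: the photon arrived ~2.55 ms earlier.
print(time_of_arrival(0.6, 1.0, 5.0, 10.0))  # ~7.45 (ms)
```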
At 604, the sensor controller 210 determines a temporal-spatial distribution of the impinging signal 104 relative to the plurality of junctions. For example, as described above with regards to
At 606, the sensor controller 210 reconstructs a wavefront of the signal based at least in part upon the temporal-spatial distribution. This reconstruction may be used to determine direction, intensity, and other aspects of a source of the impinging signal 104. In some implementations, this wavefront reconstruction may be used to generate a holographic image of the signal source. In some implementations, the signal analysis module 224 may perform all or a portion of this reconstruction.
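One way such a reconstruction might proceed, sketched below under the assumption of a locally planar wavefront, is to fit a plane t = ax + by + c to the per-junction arrival times; the gradient (a, b) then gives the apparent direction of travel across the array. This least-squares approach is illustrative only, not a method mandated by the disclosure.

```python
import numpy as np

# Sketch: least-squares fit of arrival time t = a*x + b*y + c over junction
# positions, assuming the wavefront is approximately planar at the sensor.

def wavefront_direction(samples):
    """samples: list of (x, y, t_arrival) tuples from scanned junctions.
    Returns the apparent direction of travel across the array, in degrees."""
    pts = np.asarray(samples, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    (a, b, _), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return np.degrees(np.arctan2(b, a))

# Arrivals at junctions 1-C, 3-E, 5-G at times T1 < T2 < T3, as in the
# diagonal example above (columns C, E, G mapped to x = 3, 5, 7).
print(wavefront_direction([(3, 1, 1.0), (5, 3, 2.0), (7, 5, 3.0)]))  # ~45.0
```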
At 704, transmissive elements are emplaced in a regular arrangement on a second substrate and configured such that a coupling exists between them. The arrangement of the transmissive elements on the first and second substrates is configured such that, when assembled, an addressable array is formed. In one arrangement, the transmissive elements in the two layers are oriented perpendicularly to form a grid pattern.
At 706, the first substrate is joined to a first side of a transducer layer 112. Joining may include adhesion, lamination, contact, and so forth. At 708, the second substrate is joined to a second side of the transducer layer 112 such that the transmissive elements of the first substrate and the second substrate form an addressable array. Once joined, the completed HEIS 102 may be coupled to the sensor controller 210.
As described herein, the fabrication process for the sensor is relatively inexpensive because the transmissive elements may be printed directly onto the substrate. Furthermore, because of the interpolating nature of the sensor, the assembly may tolerate a higher level of defects. As a result, quality controls may be relaxed, reducing costs due to wastage, additional testing, and so forth.
Interpolating Display
Similar to
As shown here, the active row 310 provides a signal 804 which traverses the emissive transducer layer 802 to an active column 312. This signal traversal results in formation of an emission region 806, which in turn generates an emitted signal 808. For example, the emitted signal 808 may comprise a photon or a pressure wave. In some implementations, the emission region 806 may not manifest a sharp boundary with adjacent non-emission regions. For example, in some instances a gradual falloff may be present.
The material within the emissive transducer layer 802 may be configured to vary the size of the emission region 806 and the intensity or flux of the emitted signal 808 in response to variations in voltage, current, and so forth. Thus, each emission region may be more finely controlled than a conventional display pixel.
In this illustration, the image processing unit 206 is configured to drive the interpolating display 800. For illustrative purposes, the image processing unit 206 is shown generating three emission points 808 with the following parameters:
Continuing the example of
Because the effects of the emissive transducer layer 802 are not confined to particular pixels, it is possible to also generate images or portions of an image utilizing union areas. A union area 906 occurs where two or more junctions 116 drive a particular region of the emissive transducer layer 802. This driving may occur sequentially or contemporaneously. This union area 906 may result in variations to the intensity, boundaries of the emissive region, and so forth. As shown in this illustration, a union area 906(1) results from an overlap between the emissions centered at junctions 4-D and 6-E. The union area 906(1) may experience a greater intensity due to the combined effects of both active junctions.
Likewise a second union area 906(2) results from overlap of the emission regions 806 from junctions 6-E and 7-D. In some implementations, portions of the images may be formed by one or more union areas 906. In some implementations the union areas 906 may result in a signal emission, while the emission regions 806 at an individual junction 116 may have insufficient power to emit a signal. For example, the current may be reduced to provide a minimal or no intensity 904, except in the union area 906. Thus, an image may be formed not from discrete emission regions, but rather from emitting union areas 906.
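The behavior described here can be modeled, purely for illustration, by giving each driven junction a radially decaying contribution and emitting only where the summed drive crosses a material threshold. The Gaussian profile, its width, and the threshold value below are assumptions for the example, not parameters taken from the disclosure.

```python
import math

# Illustrative model of union areas 906: each active junction contributes
# a radially decaying drive; emission occurs only where the sum of the
# contributions exceeds the material's emission threshold.

def drive_level(point, active_junctions, sigma=1.0):
    x, y = point
    return sum(a * math.exp(-((x - jx) ** 2 + (y - jy) ** 2) / (2 * sigma ** 2))
               for (jx, jy), a in active_junctions)

def emits(point, active_junctions, threshold=0.5):
    return drive_level(point, active_junctions) >= threshold

# Two junctions each driven below the threshold individually (cf. the
# reduced-current example): only their union area emits.
junctions = [((4.0, 4.0), 0.35), ((5.0, 5.0), 0.35)]
print(emits((4.0, 4.0), junctions))  # False: single junction, sub-threshold
print(emits((4.5, 4.5), junctions))  # True: union area between junctions
```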
While the emission regions 806 and corresponding illuminated areas 902 are depicted as generally circular, it is understood that other geometries may result from interactions between transmissive elements, anisotropy in the emissive transducer layer 802, and so forth. For example, adjacent transmissive elements may be placed into a high impedance mode, have a differential voltage applied, be partially grounded, or otherwise modified to shape the emissive region. Thus, in some implementations the emission regions 806 may appear generally rectangular, elliptical, triangular, and so forth when seen in plan view. Furthermore, sensing regions within the HEIS 102 may also be shaped in similar fashion.
At 1004, a signal is applied to the connected row and column, generating an emission within the emissive transducer layer 802 proximate to a junction of the row and the column, such as the emission points 808 described above. A plurality of emission points 808 may be combined to generate an image.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as illustrative forms of implementing the claims. For example, the methodological acts need not be performed in the order or combinations described herein, and may be performed in any combination of one or more acts.
The present application claims priority to U.S. Provisional Application Ser. No. 61/230,592, filed on Jul. 31, 2009, entitled “Inventions Related to Touch Screen Technology.” That provisional application is hereby incorporated by reference in its entirety, and the benefit of its filing date is claimed to the fullest extent permitted.
Number | Date | Country
---|---|---
61230592 | Jul 2009 | US

Relation | Number | Date | Country
---|---|---|---
Parent | 12380350 | Feb 2009 | US
Child | 12846428 | | US