This disclosure relates to efficiently focusing light.
By tracking the motion of a pen, for example, as it is used to write or draw on paper, it is possible to capture and reproduce electronically what is being written or drawn. Motion of a stylus that does not leave a mark on a writing surface can also be tracked. Systems for accomplishing this were described in U.S. patent application Ser. No. 10/623,284, filed Jul. 17, 2003 (now U.S. Pat. No. 7,268,774), which is incorporated here by reference.
In some proposed approaches, the surface on which the pen is moving may have an array of pixels or other sensing locations, such as touch sensors, each of which responds when the pen is at such a location. In other techniques, triangulation algorithms are used to track the motion.
As described in the above-referenced application, the pen may emit light that is detected by sensors mounted on the writing surface, and its position may be determined based on the readings of those sensors.
In general, in one aspect, an array of sensitive pixel elements is configured to generate signals representative of sensed light that is associated with writing being done by a user, and an optical system concentrates light from a light source across a section of the array of sensitive pixel elements, the signals being useful to compute a subpixel measurement of a position of the light source.
Implementations may include one or more of the following features. The array includes a row of pixels along a length of the array. The sensitive pixel elements are sensitive to infrared light. The array of sensitive pixel elements includes a lateral position measurement detector. The lateral position measurement detector is a CMOS linear array. The lateral position measurement detector is a CMOS 2D array. The lateral position measurement detector is a CCD linear array. The lateral position measurement detector is a CCD 2D array. The lateral position measurement detector is a position sensing detector (PSD). The light includes infrared light. The optical system includes a lens. The lens includes a single spherical lens. The lens has a cross-section that is constant along a length of the lens. The lens includes a single cylindrical lens. The lens includes a single aspheric lens. The optical system is configured to concentrate the light into an elongated shape on the array. The shape is a line. Concentrating includes focusing the light onto the array. The optical system includes an opening in an optically opaque material. The optical system includes a combination of a transparent refractive lens and an opening in an optically opaque material. The optical system includes a series of two or more lenses. The optical system includes a pinhole aperture. The optical system includes a slit aperture. The optical system includes a single Fresnel lens. The optical system includes a series of two or more Fresnel lenses. The optical system includes a single binary lens. The optical system includes a series of two or more binary lenses. The optical system includes a reflective surface positioned to reflect light from the light source onto a lens. The optical system includes a material to transmit infrared light and not transmit other light. The computed position of the light source is a vertical projection of a position of the light source onto a plane defined by the writing surface.
A second array of sensitive pixel elements is configured to generate signals representative of the sensed light, and a second lens concentrates light from the light source across a section of the second array of sensitive pixel elements, the signals from the first and second arrays being useful to compute a subpixel measurement in two dimensions of the position of the light source. The first and second lenses each include an opening in an optically opaque material. The first and second lenses each include a combination of a transparent refractive lens and an opening in an optically opaque material. The first array and first lens together include a first sensor, and the second array and second lens together include a second sensor, in which the first and second sensors are each rotated such that the first and second arrays are not in a common plane. The first and second sensors are positioned such that the sensors can detect the light source as it is moved over a writing surface. The writing surface includes an area corresponding to a standard size of paper. The writing surface includes a whiteboard. The writing surface includes a drawing pad. The first and second sensors are located about 120 mm apart and are each rotated about twelve degrees towards each other, as measured from parallel lines through the two sensors. The first and second sensors are located at adjacent corners of a writing surface, and each rotated forty-five degrees relative to an adjacent edge of the writing surface. A third and fourth sensor are located at four corners of a writing surface and are each rotated forty-five degrees relative to an adjacent edge of the writing surface. A third and fourth sensor are located along four sides of a writing surface. A structure is configured to hold the arrays and lenses in positions where they can detect the position of the light source as it is moved over a writing surface. The structure includes a pen holder. The structure includes a pen cap. The light source is associated with a writing instrument, and the pen holder is configured to accommodate the writing instrument. The pen holder is configured to be attached to an edge of the writing surface. The structure includes a body of a computer, and the writing surface includes a screen of the computer.
In general, in one aspect, an array of sensitive pixel elements is configured to generate signals representative of sensed light that is associated with writing being done by a user, the array being characterized by a plane defined by the sensitive pixel elements, an axis through the sensitive pixel elements, and a center point between equal sets of the sensitive pixel elements, and a lens concentrates light from a light source onto the array of sensitive pixel elements, the lens being characterized by an axis through its center and positioned with its axis offset from the center point of the array, the signals being useful to compute a subpixel measurement of a position of the light source. Implementations may include the lens being characterized by a curved surface and a flat surface, and positioned such that the flat surface is not parallel to the plane of the array.
In general, in one aspect, an array of sensitive pixel elements is configured to generate signals representative of sensed light that is associated with writing being done by a user, the array being characterized by a plane defined by the sensitive pixel elements, an axis through the sensitive pixel elements, and a center point between equal sets of the sensitive pixel elements, and a lens concentrates light from a light source onto the array of sensitive pixel elements, the lens having a curved surface and a flat surface, and positioned such that the flat surface is not parallel to the plane of the array, the signals being useful to compute a subpixel measurement of a position of the light source.
Implementations may include one or more of the following features. The lens is cylindrical. An axis through the cylindrical shape of the lens is in a plane perpendicular to the axis of the sensitive pixel elements. An opening in an optically opaque material controls an amount of light reaching the lens. A second array of sensitive pixel elements is configured to generate signals representative of the sensed light, the second array being characterized by a plane defined by the sensitive pixel elements, an axis through the sensitive pixel elements, and a center point between equal sets of the sensitive pixel elements, and a second lens concentrates light from the light source onto the second array of sensitive pixel elements, the second lens having a curved surface and a flat surface, and positioned such that the flat surface is not parallel to the plane of the second array, the signals from the first and second arrays being useful to compute a subpixel measurement in two dimensions of the position of the light source.
In general, in one aspect, near a writing surface, an array of sensitive pixel elements is configured to generate signals representative of sensed light that is associated with writing being done by a user, a lens concentrates light from a light source in a shape other than a spot onto the array of sensitive pixel elements, and a reflective surface is positioned to reflect light from the light source onto the lens, when the light source is near the writing surface, the signals being useful to compute a subpixel measurement of a position of the light source. An optically opaque barrier may be positioned between the writing surface and the reflective surface, a portion of the barrier defining an opening positioned to admit light from the light source when the light source is near the writing surface.
In general, in one aspect, two optical sensors separated by a lateral distance are used to generate a set of signals representing subpixel values and usable to reconstruct a position of a light source that is associated with a writing instrument being used by a user, and the signals are used to compute a measurement of a position in two dimensions of the light source near a writing surface, in which using the optical sensors includes using an optical system to concentrate light from the light source in a shape other than a spot onto an array of sensitive pixel elements.
Implementations may include one or more of the following features. Computing the measurement includes using a quasi-triangulation algorithm. Computing the measurement includes using calibrated parameters. Computing the measurement includes using a lookup table. Computing the measurement includes using a polynomial approximation. The two optical sensors are used to generate a second set of signals representing subpixel values and usable to reconstruct a position of a second light source that is associated with a second position along a length of the writing instrument, and the first and second sets of signals are used to compute a measurement of an angle between the writing instrument and the writing surface. The first and second sets of signals are used to compute a position in two dimensions of a tip of the writing instrument on the writing surface.
In general, in one aspect, a device holds paper in a location, and at least two sensor assemblies are located near the location to generate signals representative of sensed light that is associated with writing being done on the paper. The sensor assemblies each include an array of sensitive pixel elements and a lens to concentrate light from a light source in a shape other than a spot onto the array of sensitive pixel elements.
In general, in one aspect, in or near a housing for a computer display, at least two sensor assemblies generate signals representative of sensed light that is associated with writing being done on the display. The sensor assemblies each include an aperture in the housing, an array of sensitive pixel elements, a lens to concentrate light from a light source in a shape other than a spot onto the array of sensitive pixel elements, and a reflective surface to direct light entering through the aperture onto the lens.
Implementations may include one or more of the following features. The computer display is a monitor of a tablet computer. The computer display is a monitor of a laptop computer. The computer display is a screen of a portable communications device.
Advantages include an improved vertical field of view to detect light reaching the sensors at an angle. Two light sources can be detected, allowing computation of the angle between the pen and the writing surface. The sensors can be positioned above or below the level of the writing surface, and at a variety of angles relative to the writing surface. Sensors can be positioned to sense writing over the entirety of a writing surface. The position of the pen can be determined with subpixel accuracy. A single lens can be used with each sensor to produce accurate measurements. Calibration can be performed during manufacturing. The sensors and/or lenses can be independently positioned to adjust the field of view and reduce blind spots.
Other features and advantages will become apparent from the description and from the claims.
The above-referenced application describes an electronic wireless pen that emits light that is collected by external sensors to measure the pen's position with respect to the sensors. The pen may also be equipped with traditional writing components to leave a visible mark as it is drawn across a page. The sensors are CMOS or CCD linear or 2D arrays, Position Sensitive Detectors (PSDs), or other light-sensitive detectors. The sensors can be clipped to the edge of a writing surface, allowing reconstruction of writing on that surface. The position of the pen is determined by mapping the sensor readings to the actual XY position of the pen on paper. In some examples, infrared (IR) light is used and the sensors are configured to detect IR light. Other light sources, including ultraviolet and visible light, may be used.
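One common way to obtain a subpixel estimate of where concentrated light falls on a linear array (not necessarily the method used in the systems described here) is an intensity-weighted centroid of the pixel readings. The following is a minimal sketch under that assumption; the pixel values, background threshold, and function name are illustrative only.

```python
# Minimal sketch: subpixel peak location on a linear sensor array using an
# intensity-weighted centroid. The pixel list and the fixed background
# threshold are illustrative assumptions, not a description of any
# particular product's algorithm.

def subpixel_centroid(pixels, background=0.0):
    weights = [max(p - background, 0.0) for p in pixels]
    total = sum(weights)
    if total == 0:
        return None  # no light detected above the background level
    # A weighted average of pixel indices gives a fractional (subpixel) position.
    return sum(i * w for i, w in enumerate(weights)) / total

# Example: a concentrated line of light centered between pixels 3 and 4.
print(subpixel_centroid([0, 0, 1, 8, 8, 1, 0, 0]))  # prints 3.5
```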
The electronic input device looks like a regular pen with a holder that contains the sensors. The user writes with it just as with any ordinary pen on paper, a notebook, or another flat surface. The input device is used to capture handwritten text or drawings. The pen's movements are detected by the sensors and communicated to the processor of the input device, which stores all of the pen's movements during its use by recording the sensor measurements into its memory. The input device then downloads its movements to a computer, personal digital assistant, handheld computer, cellular phone, or other device. The handwriting, as it appears on a page, is then automatically reconstructed from the sensor information. In some examples, the sensors' readings may be recorded by a processor positioned in the pen itself rather than one directly connected to the sensors. In some examples, the handwriting is reconstructed on the pen or pen cap/holder directly.
As shown in
The sensors 20, 22 capture and deliver sequences of signals that can be interpreted by a processor (not shown) to represent, for each of a succession of measurement times, the position of the pen on the writing surface 14 at which the light is received from the pen (e.g., as the angle 24 between a line connecting the light source 16 to the sensor 22 and a normal vector 27 from the center of the sensor 22). To calculate the actual position of the pen 10 on the writing surface 14, two angles, one from each sensor 20, 22, can be used in a triangulation calculation. Circuitry associated with the processor uses an algorithm to process the signals from the sensors 20, 22 (and the known distance 26 between the sensors) to determine a succession of positions of the pen 10 as it is moved across the writing surface 14. The algorithm can use a mathematical model that translates pixel signals of the sensors 20, 22 into positions on the writing surface 14. The algorithm could be, for example, a quasi-triangulation algorithm using calibrated parameters (e.g., the distance from the lens to the sensor, the horizontal offset between them, and others), or it could be a polynomial approximation or a lookup table, or any combination of such techniques.
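As a rough illustration of the kind of quasi-triangulation described above, the sketch below converts a subpixel reading from each linear array into an angle using two of the calibrated parameters mentioned in the text (the lens-to-sensor distance and the horizontal offset) and then intersects the two rays to estimate the XY position. The coordinate conventions, example values, and the assumption that both sensors face straight across the writing surface are simplifications for illustration, not the algorithm of any particular implementation.

```python
import math

# Illustrative quasi-triangulation sketch. Assumes two sensors on the x axis,
# separated by `baseline` (the known distance 26), each looking in the +y
# direction across the writing surface. `lens_to_sensor` and `offset` stand in
# for the calibrated parameters mentioned above; the geometry is simplified.

def pixel_to_angle(subpixel_pos, lens_to_sensor, offset):
    # Angle between the incoming ray and the sensor's normal.
    return math.atan2(subpixel_pos - offset, lens_to_sensor)

def triangulate(angle_left, angle_right, baseline):
    # Left sensor at (0, 0), right sensor at (baseline, 0), both facing +y.
    # Each angle is measured from the sensor's normal toward +x, so the rays are
    #   left:  x = y * tan(angle_left)
    #   right: x = baseline + y * tan(angle_right)
    tl, tr = math.tan(angle_left), math.tan(angle_right)
    if tl == tr:
        return None  # parallel rays; no usable intersection
    y = baseline / (tl - tr)
    return y * tl, y

# Example with made-up subpixel readings and parameter values.
a_left = pixel_to_angle(412.7, lens_to_sensor=300.0, offset=256.0)
a_right = pixel_to_angle(99.3, lens_to_sensor=300.0, offset=256.0)
print(triangulate(a_left, a_right, baseline=120.0))  # roughly (60.0, 114.9)
```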
In some examples, the sensors are fixed in the writing surface, for example, in a tablet PC using optical sensors rather than a digitizing surface. More than two sensors may be used, for example, four sensors, one near each corner. This may be advantageous in applications where the writing surface is large enough that there is a risk of the user inadvertently placing a hand or other object between the pen and one of the sensors. The particular configuration of writing surface and sensor position in a given application dictates, to at least some extent, the necessary field of view of each sensor. This in turn can be used to determine the design of the sensor, including the relationship between any lenses used and the optically sensitive elements of the sensor.
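As a simple illustration of how this geometry drives the required field of view, the sketch below computes the angular span a single sensor must cover to see all four corners of a rectangular writing surface from a given mounting point. The surface dimensions and the sensor position are arbitrary example values.

```python
import math

# Sketch: horizontal field of view required for a sensor at (sx, sy) to see all
# four corners of a rectangular writing surface with one corner at the origin.
# The dimensions and the sensor position are example values only, and the
# surface is assumed to lie entirely on one side of the sensor so the corner
# angles do not wrap around +/-180 degrees.

def required_fov_degrees(sx, sy, width, height):
    corners = [(0.0, 0.0), (width, 0.0), (0.0, height), (width, height)]
    angles = [math.atan2(cy - sy, cx - sx) for cx, cy in corners]
    return math.degrees(max(angles) - min(angles))

# A sensor clipped just outside one corner of a letter-size page (216 x 279 mm).
print(required_fov_degrees(sx=-10.0, sy=-10.0, width=216.0, height=279.0))
```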
When linear sensor arrays are used to detect the position of the light source, it is useful to control the way in which the light is focused on the sensor arrays. In some examples, a spherical half-ball lens is used. As shown in
In an example shown in
The enlarged vertical field of view of a cylindrical lens also allows the use of two or more light sources, located at different positions along the length of the writing instrument, as shown in
The distance a′ from the tip of the pen to the point A′ is given by Eq. (1). The value of cos(α) is found from the ratio of sides OB and OB′ (Eq. (2)). Length b′ is found from the measured locations of points A′ and B′, from which cos²(α) is found in terms of known values (Eq. (3)). Beginning with Eq. (4), the distances Δx01, Δy01 can be found based on the similarity of the triangles formed by the projected line segments and their corresponding coordinate components. That is, the ratio between distances Δx01 and Δx12 is the same as the ratio between Δy01 and Δy12, giving the relationship in Eq. (5). Substituting that back into Eq. (4), as in Eqs. (6a) and (6b), Δy01 is found (Eq. (7a)). Again using Eq. (5), Δx01 is likewise found in terms of known quantities (Eq. (7b)).
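For reference, the similar-triangles proportionality described in this paragraph (which the text says gives the relationship in Eq. (5)) can be written out as follows; the full forms of Eqs. (1) through (7b) appear in the figures and are not reproduced here.

```latex
% Restatement of the similar-triangles proportionality described above; the
% full forms of Eqs. (1) through (7b) appear in the figures.
\[
  \frac{\Delta x_{01}}{\Delta x_{12}} = \frac{\Delta y_{01}}{\Delta y_{12}}
  \qquad\Longrightarrow\qquad
  \Delta x_{01} = \Delta x_{12}\,\frac{\Delta y_{01}}{\Delta y_{12}} .
\]
```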
Tilt information may also be used for additional features, for example, allowing a user to vary the thickness of a line by varying the tilt of the pen. To differentiate the two light sources, they could be configured to illuminate in an alternating pattern, with the processor synchronized to identify which is which. Other methods of differentiating the light sources could also be used, such as illuminating at different frequencies or flashing at different rates. The ability of two light sources to allow calculation of the tip position and angle may also allow the light sources to be separate from the pen itself, for example, in an attachment that can be used with any pen provided by a user.
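As a toy sketch of the alternating-illumination idea, the following assigns successive, already-synchronized frames to the two light sources. Which source owns the even-numbered frames, and the frame values themselves, are arbitrary assumptions for illustration.

```python
# Toy sketch of time-multiplexing two light sources. Assumes the sources flash
# on alternating frames that are already synchronized with the sensor readout;
# which source owns the even-numbered frames is an arbitrary choice here.

def split_frames(readings, tip_on_even=True):
    tip, upper = [], []
    for frame_index, reading in enumerate(readings):
        if (frame_index % 2 == 0) == tip_on_even:
            tip.append(reading)    # reading attributed to the tip light source
        else:
            upper.append(reading)  # reading attributed to the upper light source
    return tip, upper

# Example with made-up frame readings.
tip_frames, upper_frames = split_frames([10.0, 3.2, 9.8, 3.4])
print(tip_frames, upper_frames)  # [10.0, 9.8] [3.2, 3.4]
```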
In a system with two sensors, a “dead zone” exists outside the two sensors' respective fields of view, for example, regions 400a, b, and c in
In some examples, a single printed circuit board is used for all of the electronics including the sensors. As shown in
Returning to
Moving the sensors to the side of the writing surface, as shown in
In another example, as shown in
In another example, as shown in
Two example sensor-lens assemblies 402 are shown in
In some examples, details of which components can be rotated and to what extent depend on the construction of the sensor assemblies. If both sensor assemblies 402 are directly attached to a single circuit board 804, as shown in
When the frame 806 is used to control the position of the lens 302 relative to the sensor array 210, manufacturing tolerances of the frame 806 need to be such that the lens 302 and sensor array 210 can be reliably positioned within whatever tolerance is required by their sensitivity, focus, and other parameters. Using the frame 806 to position the sensor array 210, as shown in
In some examples, the sensors may be placed in-plane with the writing surface, as shown in
As shown in
Once the sensor assemblies are assembled, either before or after being mounted in a device, such as a self-contained pen cap or the frame of a tablet computer, it may be necessary to further adjust their alignment. In some examples, as shown in
A similar process can be used to compute calibration parameters after the parts are assembled. As the pen is placed in each of a series of pre-determined locations, the response of the sensor arrays 210 is measured. These measurements are used to compute calibration parameters that can then be stored in a memory, for example, a ROM, of the device and used to adjust the measurements of the sensor arrays when the device is in use.
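As a rough sketch of this kind of calibration, the example below fits a per-sensor polynomial by least squares from readings taken at known pen positions and then applies it to correct a new reading. The cubic degree, the use of NumPy, and the toy data are assumptions made for illustration; as noted elsewhere, calibrated quasi-triangulation parameters or a lookup table could be used instead.

```python
import numpy as np

# Sketch: fit a per-sensor polynomial that maps raw subpixel readings taken at
# known pen positions to corrected coordinates. The cubic degree and the toy
# data are illustrative; the fitted coefficients would be stored (for example,
# in a ROM) and applied to readings when the device is in use.

def fit_calibration(raw_readings, known_positions, degree=3):
    # Least-squares polynomial fit: corrected = poly(raw).
    return np.polyfit(raw_readings, known_positions, degree)

def apply_calibration(coefficients, raw_reading):
    return np.polyval(coefficients, raw_reading)

# Toy calibration data: the pen placed at a series of pre-determined positions.
raw = np.array([102.4, 188.9, 256.0, 331.7, 409.2])
true_positions = np.array([20.0, 60.0, 100.0, 140.0, 180.0])

coeffs = fit_calibration(raw, true_positions)
print(apply_calibration(coeffs, 300.0))  # corrected estimate for a new reading
```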
Although we have referred to spherical and cylindrical lenses above, other shapes of lenses could be used. Depending on the lens used, the focused spot or line could have characteristics (such as being slightly out of focus or having a useful profile of intensity from one side to the other) that would be useful in connection with sensing light at the sensor array. In addition, a cylindrical lens need not have a constant cross-sectional shape and size along its length. Aspherical lenses could also be used, for example, to flatten the focal plane of the lens so that more of the sensor array is in the region where the light is tightly focused. Fresnel lenses may be used in place of other types of lenses, for example, to decrease the space required for the lens. In some examples, pinholes or slits may be used in place of any lenses at all, or in addition to lenses. Although we refer to the lens focusing the light in our discussion above, the lens could also be arranged to concentrate the light without fully focusing it on the array.
Although we have described an example of an array that is a single row of pixel sensors of the same size and shape, the array could be more complex and could include pixels of different sizes and shapes, multiple rows of pixels, or other patterns of pixels arranged along the length of the sensor.
Other variations are within the scope of the following claims. For example, the writing surface could be a whiteboard used with markers or a drawing pad sensitive to pressure or to a magnetic stylus.