Interactive input system and method

Information

  • Patent Grant
  • Patent Number
    9,442,607
  • Date Filed
    Monday, December 4, 2006
  • Date Issued
    Tuesday, September 13, 2016
Abstract
An interactive input system comprises imaging devices with different viewpoints and having at least partially overlapping fields of view encompassing a region of interest. At least two of the imaging devices have different focal lengths. Processing structure processes image data acquired by the imaging devices to detect the existence of a pointer and determine the location of the pointer within the region of interest.
Description
FIELD OF THE INVENTION

The present invention relates to an interactive input or touch system and method.


BACKGROUND OF THE INVENTION

Touch systems are well known in the art and typically include a touch screen or panel having a touch surface on which contacts are made using a pointer in order to generate user input. Pointer contacts with the touch surface are detected and are used to generate corresponding output depending on areas of the touch surface where the contacts are made. Common touch systems utilize analog resistive, electromagnetic, capacitive, acoustic or machine vision to identify pointer interactions with the touch surface.


For example, International PCT Application No. PCT/CA01/00980 filed on Jul. 5, 2001 and published under No. WO 02/03316 on Jan. 10, 2002, assigned to SMART Technologies Inc., assignee of the present application, discloses a camera-based touch system comprising a touch screen that defines a touch surface on which a computer-generated image is presented. Depending on the application, a front or rear projection device may be used to project the image that is visible on the touch surface. A rectangular bezel or frame surrounds the touch surface and supports wide-angle digital cameras at its corners. The digital cameras have overlapping fields of view that encompass and look generally across the entire touch surface from different viewpoints. The digital cameras acquire images looking across the touch surface and generate image data. Image data acquired by the digital cameras is processed by digital signal processors to determine if a pointer exists in the captured image data. When it is determined that a pointer exists in the captured image data, the digital signal processors convey pointer characteristic data to a master controller, which in turn processes the pointer characteristic data to determine the location of the pointer in (x,y)-coordinates relative to the touch surface using triangulation. The pointer coordinate data is conveyed to a computer executing one or more application programs. The computer uses the pointer coordinate data to update the computer-generated image that is presented on the touch surface. Pointer contacts on the touch surface can therefore be recorded as writing or drawing or used to control execution of application programs executed by the computer.


In many environments such as in teaching institutions, large scale touch systems are desired so that visible presentations can be made to large groups. To satisfy this need, a large scale touch system as disclosed in U.S. patent application Ser. No. 10/750,219 to Hill et al. and assigned to SMART Technologies Inc., assignee of the subject application, has been developed. This large scale touch system includes a touch panel having a plurality of input sub-regions. The input sub-regions overlap to define a generally contiguous input surface. Each coordinate input sub-region comprises a set of wide-angle digital cameras having different viewpoints that look across an associated portion of the input surface. Each input sub-region processes image data captured by the digital cameras and generates pointer coordinate data in response to pointer contacts on the associated portion of the input surface. The pointer coordinate data is processed to update image data presented on the input surface. When a pointer contact is made on a coordinate input sub-region that does not overlap with an adjacent coordinate input sub-region, the coordinate input sub-region processes acquired images to derive pointer data and triangulates the position of the pointer using the derived pointer data thereby to determine the position of the pointer contact relative to the input surface. When a pointer contact is made on a coordinate input sub-region that overlaps with an adjacent coordinate input sub-region, each overlapping coordinate input sub-region processes acquired images to derive pointer data and triangulates the position of the pointer using the derived pointer data. Thereafter, the triangulated positions generated by the overlapping coordinate input sub-regions are processed in accordance with defined logic thereby to determine the position of the pointer contact relative to the input surface.
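When a contact lands where two sub-regions overlap, each sub-region triangulates its own position estimate and the results are merged. The "defined logic" is not specified above; the simple averaging merge sketched below is an editor's assumption chosen only to illustrate the idea:

```python
def merge_positions(estimates):
    """Merge pointer positions triangulated by overlapping sub-regions.

    estimates: list of (x, y) tuples, one per overlapping sub-region.
    Averaging is one plausible merge rule; the disclosure does not
    prescribe a specific one.
    """
    n = len(estimates)
    x = sum(e[0] for e in estimates) / n
    y = sum(e[1] for e in estimates) / n
    return x, y
```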


Although the above touch systems work extremely well, resolution issues arise as the size of the touch surface increases. Typically, cameras with very wide fields of view are employed so that each camera sees the entire touch surface. However, when a pointer is brought into contact with the touch surface at a location that is far from one or more of the cameras, the pointer may appear very small to those cameras. In fact, to the cameras, the pointer may appear to be only one (1) or two (2) pixels wide, making pointer detection difficult and unreliable. As will be appreciated, there is a need to improve pointer detection especially in touch systems having very large touch surfaces.
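The resolution loss described above follows from simple geometry. The numbers below (a 10 mm pointer, 3 m from the camera, a 640-pixel sensor) are illustrative assumptions, not values from the disclosure, but they reproduce the one-to-two-pixel figure for a wide-angle camera and show how a narrower field of view recovers resolution:

```python
import math

def pointer_width_px(pointer_w_m, distance_m, fov_deg, sensor_px):
    """Approximate apparent width of a pointer, in pixels, for a camera
    looking across the touch surface: the angle the pointer subtends,
    times the sensor's pixels-per-degree."""
    angle_deg = math.degrees(math.atan(pointer_w_m / distance_m))
    return angle_deg * (sensor_px / fov_deg)

# A 10 mm pen tip seen from 3 m away:
wide = pointer_width_px(0.010, 3.0, fov_deg=95.0, sensor_px=640)    # ~1.3 px
narrow = pointer_width_px(0.010, 3.0, fov_deg=30.0, sensor_px=640)  # ~4.1 px
```

For the same sensor, the gain in apparent size is exactly the ratio of the fields of view (95/30 here), which is why pairing a narrow-angle sensor with the wide-angle one makes distant pointers detectable.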


It is therefore an object of the present invention to provide a novel interactive input system and method.


SUMMARY OF THE INVENTION

Accordingly, in one aspect there is provided an interactive input system comprising:


imaging devices with different viewpoints and having at least partially overlapping fields of view encompassing a region of interest, at least two of said imaging devices having different focal lengths; and


processing structure processing image data acquired by the imaging devices to detect the existence of a pointer and determine the location of the pointer within the region of interest.


In one embodiment, at least some of the imaging devices are arranged in pairs. One imaging device of each pair has a wide field of view and the other imaging device of each pair has a narrow field of view. The wide field of view fully encompasses the narrow field of view. The imaging devices of each pair may be stacked vertically or arranged side-by-side.


In one embodiment, a touch surface is associated with the region of interest. Pairs of imaging devices are positioned adjacent corners of the touch surface. The imaging devices of each pair look generally across the touch surface. For each imaging device pair, the processing structure processes image data acquired by each imaging device of the pair to determine if a pointer is believed to exist in the image data with a desired level of confidence and further processes that image data to determine the location of the pointer. The desired level of confidence is existence of a pointer beyond a threshold size. The processing structure processes the image data acquired by the imaging devices of each pair to verify at least one of pointer existence and pointer location.


According to another aspect there is provided a touch system comprising:


a touch surface on which an image is visible;


imaging assemblies about the periphery of said touch surface, said imaging assemblies having at least partially overlapping fields of view encompassing said touch surface, each imaging assembly comprising at least two imaging devices with each imaging device having a different focal length; and


processing structure processing data generated by the imaging assemblies to determine the location of at least one pointer relative to the touch surface.


According to yet another aspect there is provided an interactive input system comprising:


camera assemblies with different viewpoints and having fields of view encompassing a region of interest, each camera assembly comprising at least two image sensors with the image sensors having different focal lengths; and


processing structure processing image data acquired by said camera assemblies to detect one or more pointers in said region of interest.


The interactive input system and method provides advantages in that reliable pointer detection can be achieved even in instances where the pointer is remote from one or more of the imaging devices. In addition, as in some instances imaging devices of different focal lengths see the same object, data extracted from the images acquired by the imaging devices can be used to calibrate the imaging devices and verify the pointer location.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments will now be described more fully with reference to the accompanying drawings in which:



FIG. 1 is a front plan view of a camera-based interactive input system;



FIG. 2 is a schematic diagram of the interactive input system of FIG. 1;



FIG. 3 is an enlarged front plan view of a corner of the touch panel of FIG. 2;



FIG. 4 is a front plan view of a camera assembly forming part of the touch panel of FIG. 2; and



FIG. 5 is a front plan view of the touch panel of FIG. 2 showing the fields of view of the camera assemblies.





DETAILED DESCRIPTION OF THE EMBODIMENTS

Referring now to FIGS. 1 and 2, a camera-based touch system is shown and is generally identified by reference numeral 50. Camera-based touch system 50 is similar to that disclosed in previously referenced International PCT Application Publication No. WO 02/03316, assigned to SMART Technologies Inc., assignee of the subject application, the content of which is incorporated herein by reference.


As can be seen, touch system 50 includes a touch panel 52 coupled to a digital signal processor (DSP)-based master controller 54. Master controller 54 is also coupled to a computer 56. Computer 56 executes one or more application programs and provides computer-generated image output to a projection device 58. Projection device 58 in turn presents a computer-generated image that is visible on the surface 60 of the touch panel 52. The touch panel 52, master controller 54, computer 56 and projection device 58 form a closed loop so that pointer contacts on the touch surface 60 can be recorded as writing or drawing or used to control execution of application programs executed by the computer 56.


The touch surface 60 is bordered by a bezel or frame 62 similar to that disclosed in U.S. Pat. No. 6,972,401 to Akitt et al. issued on Dec. 6, 2005, assigned to SMART Technologies Inc., assignee of the subject application, the content of which is incorporated herein by reference. A DSP-based digital camera assembly 70 having on-board processing capabilities, best seen in FIGS. 3 and 4, is positioned adjacent each top corner of the touch surface 60 and is accommodated by the bezel 62. In this embodiment, each digital camera assembly 70 comprises a pair of camera sensors 72 and 74 that look across the touch surface 60 and a processing unit (not shown) communicating with the camera sensors. The focal lengths of the camera sensors 72 and 74 are different as will be described. The camera sensors 72 and 74 of each digital camera assembly 70 are vertically stacked on top of one another. The optical axes of the camera sensors 72 and 74 are in line with the diagonals of the touch surface 60 and thus, the optical axes bisect the diagonally opposite corners of the touch surface 60.


The lower camera sensor 72 of each digital camera assembly 70 has a wide angle lens giving the camera sensor 72 a wide field of view so that the lower camera sensor 72 sees the entire touch surface 60. The upper camera sensor 74 of each digital camera assembly 70 has a narrow angle lens giving the camera sensor 74 a long and narrow field of view so that the upper camera sensor 74 sees only a portion of the touch surface 60. In this embodiment, the lower camera sensor 72 has a field of view equal to about 95°. The upper camera sensor 74 has a field of view in the range of from about 30° to 60°. Those of skill in the art will however appreciate that other combinations of fields of view for the lower and upper camera sensors 72 and 74 can be selected. As the camera sensors 72 and 74 are stacked on top of one another, the field of view (FOV1) of the lower camera sensor 72 fully encompasses the field of view (FOV2) of the upper camera sensor 74 as shown in FIG. 5. In this manner, the upper camera sensors 74 are best suited to accurately detect pointers that are distant from the digital camera assemblies 70 while the lower camera sensors 72 are best suited to accurately detect pointers that are proximate to the digital camera assemblies 70.


During operation of the touch system 50, the camera sensors 72 and 74 of each digital camera assembly 70 look across the touch surface 60 and acquire images. For each digital camera assembly 70, image data acquired by each camera sensor 72 and 74 thereof is processed by the processing unit to determine if a pointer is believed to exist in each captured image with a desired level of confidence (i.e. the pointer is above a threshold size in the captured image). As will be appreciated, when the pointer is remote from a digital camera assembly 70, only its upper camera sensor 74 will detect the existence of a pointer with the desired level of confidence and when the pointer is near the digital camera assembly 70, only its lower camera sensor 72 will detect the existence of the pointer with the desired level of confidence. When a pointer is determined to exist in one of the captured images with the desired level of confidence, pointer characteristic data is derived from that captured image identifying the pointer position in the captured image. If the pointer is determined to exist in both captured images with the desired level of confidence, the pointer characteristic data is derived from the captured image in which the pointer appears the largest.
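The per-assembly selection rule described above can be sketched as follows. The function name and the threshold value are illustrative assumptions; the disclosure specifies only that the pointer must exceed a threshold size and that, when both sensors see it confidently, the image in which it appears largest is used:

```python
def select_image(wide_px, narrow_px, threshold_px=3.0):
    """Choose which sensor's image to derive pointer data from.

    wide_px / narrow_px: apparent pointer width in pixels in the wide-
    and narrow-field images, or None if no candidate was found at all.
    Returns 'wide', 'narrow', or None (no confident detection).
    """
    wide_ok = wide_px is not None and wide_px >= threshold_px
    narrow_ok = narrow_px is not None and narrow_px >= threshold_px
    if wide_ok and narrow_ok:
        # Seen confidently by both sensors: use the image in which
        # the pointer appears the largest.
        return "wide" if wide_px >= narrow_px else "narrow"
    if wide_ok:
        return "wide"    # pointer near the assembly
    if narrow_ok:
        return "narrow"  # pointer far from the assembly
    return None
```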


The pointer characteristic data derived by each digital camera assembly 70 is then conveyed to the master controller 54, which in turn processes the pointer characteristic data in a manner similar to that described in U.S. Pat. No. 6,954,197 to Morrison et al. issued on Oct. 4, 2005, assigned to SMART Technologies Inc., assignee of the subject application, the content of which is incorporated by reference, so that a bounding box surrounding the pointer contact on the touch surface 60 is determined allowing the location of the pointer in (x,y)-coordinates to be calculated.
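The triangulation step reduces to intersecting two rays looking across the surface. A minimal sketch, under the assumption that the two assemblies sit at the top corners of a surface of known width and that each reports the angle to the pointer measured from the line joining the cameras (the coordinate convention here is the editor's, not the patent's):

```python
import math

def triangulate(theta_left, theta_right, baseline):
    """Intersect rays from cameras at (0, 0) and (baseline, 0).

    theta_left / theta_right: angle in radians between the camera
    baseline and the line of sight to the pointer, measured toward
    the touch surface. Returns the pointer's (x, y) position in the
    same units as baseline."""
    t_l, t_r = math.tan(theta_left), math.tan(theta_right)
    x = baseline * t_r / (t_l + t_r)  # where the two rays cross
    y = x * t_l
    return x, y
```

For example, a pointer at (1, 1) on a surface with a 3-unit baseline is seen at 45° by the left camera and at atan(1/2) by the right one, and the ray intersection recovers (1, 1).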


The pointer coordinate data is then reported to the computer 56, which in turn records the pointer coordinate data as writing or drawing if the pointer contact is a write event or injects the pointer coordinate data into the active application program being run by the computer 56 if the pointer contact is a mouse event. As mentioned above, the computer 56 also updates the image data conveyed to the projection device 58 so that the image presented on the touch surface 60 reflects the pointer activity.


If desired, the image processing results during pointer existence determination for both the upper and lower camera sensors 72 and 74 of each digital camera assembly 70 can be compared to verify the existence of the pointer. Pointer characteristic data for each captured image can also be generated and compared to verify the location of the pointer within the captured images. Also, as the camera sensors 72 and 74 of each digital camera assembly 70 both see the same pointer when the pointer is brought towards the touch surface 60, pointer data derived from acquired images can be used to calibrate the camera sensors 72 and 74 of the digital camera assemblies 70.
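The cross-check described above amounts to comparing the pointer position derived from the wide-field image with that derived from the narrow-field image (after mapping both into a common coordinate frame) and accepting it only when the two agree. The tolerance below is an illustrative assumption:

```python
def positions_agree(pos_a, pos_b, tol_px=2.0):
    """True if two independently derived pointer positions (x, y),
    one per camera sensor of an assembly, agree within tol_px pixels."""
    dx = pos_a[0] - pos_b[0]
    dy = pos_a[1] - pos_b[1]
    return (dx * dx + dy * dy) ** 0.5 <= tol_px
```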


Although the digital camera assemblies 70 are described as having vertically stacked camera sensors 72 and 74 with the field of view of the wide angle camera sensor 72 fully encompassing the field of view of the narrow angle camera sensor 74, those of skill in the art will appreciate that other camera assembly arrangements are possible. For example, the camera sensors 72 and 74 of the digital camera assemblies 70 may be arranged side-by-side with the field of view of the wide angle camera sensors 72 still fully encompassing the field of view of the narrow angle camera sensors 74. Of course, other camera sensor orientations are possible. The field of view of the wide angle camera sensor 72 need not fully encompass the field of view of the narrow angle camera sensor 74; the fields of view of the wide angle and narrow angle camera sensors may instead overlap only partially. As will be appreciated, in this arrangement there is less redundancy.


In the embodiment described above, although each camera assembly 70 is described as comprising two camera sensors 72 and 74 communicating with a single processing unit, each camera sensor may communicate with an associated processing unit. In this case, the processing units of each camera assembly 70 communicate to determine which processing unit is to provide pointer data to the master controller 54. In situations where a pointer is seen best by one camera sensor but the pointer is moving in a direction that is better viewed by the other camera sensor, the processing units can communicate pointer data between one another to ensure accurate pointer tracking as responsibility for tracking the pointer is handed from one processing unit to the other.


The touch system 50 as described above comprises a pair of digital camera assemblies 70 positioned adjacent the top corners of the touch surface 60. Those of skill in the art will appreciate that additional camera assemblies 70 may be disposed about the periphery of the touch surface 60, especially when the touch surface is very large as described in aforementioned U.S. patent application Ser. No. 10/750,219 to Hill et al.


As will be appreciated by those of skill in the art, the pointer may be a finger, a passive or active stylus or other object, a spot of light or other radiation or other indicator that can be seen by the cameras. Although the touch system is described as including digital cameras, other imaging devices such as for example linear optical sensors that are capable of generating an image may be employed.


In the embodiments described above, pointer contacts made on a touch surface are detected and tracked. Those of skill in the art will appreciate that a touch surface is not required and that pointers intersecting a two-dimensional plane or within a three-dimensional volume that is viewed by the imaging devices may be detected and tracked.


Although embodiments have been described above, those of skill in the art will also appreciate that variations and modifications may be made without departing from the spirit and scope thereof as defined by the appended claims.

Claims
  • 1. An interactive input system comprising: imaging devices with different viewpoints and having at least partially overlapping fields of view of a touch surface, at least one of the imaging devices comprising first and second adjacent image sensors with different focal lengths, the first image sensor having a field of view encompassing generally the entirety of the touch surface and the second image sensor having a field of view encompassing only a portion of the touch surface; and processing structure configured to process image data acquired by the imaging devices to determine if a pointer image exists in the image data with a desired level of confidence and generate pointer data, wherein for each of said at least one imaging devices having said first and second adjacent image sensors with different focal lengths, said processing structure configured to generate the pointer data using image data acquired by only one of said first and second adjacent image sensors and to determine the location of a pointer on the touch surface based on the pointer data, wherein for each pointer contact with said touch surface at a location that is proximate the at least one imaging device, the pointer image is determined to exist in the image data with the desired level of confidence in only image data acquired by the first image sensor, and wherein for each pointer contact with said touch surface that is distant from the at least one imaging device, the pointer image is determined to exist in the image data with the desired level of confidence in only image data acquired by the second image sensor.
  • 2. The interactive input system according to claim 1 wherein all of said imaging devices comprise first and second adjacent image sensors with different focal lengths.
  • 3. The interactive input system according to claim 2 wherein the first and second adjacent image sensors are vertically stacked so that their optical axes are in line.
  • 4. The interactive input system according to claim 2 wherein the first and second adjacent image sensors are side-by-side.
  • 5. The interactive input system according to claim 2 wherein the imaging devices are positioned adjacent respective corners of the touch surface, the imaging devices looking generally across said touch surface.
  • 6. The interactive input system according to claim 1 wherein the imaging devices are positioned adjacent respective corners of the touch surface, the imaging devices looking generally across said touch surface.
  • 7. The interactive input system according to claim 1 wherein said desired level of confidence is existence of a pointer image having a size greater than a threshold size.
  • 8. A touch system comprising: a touch surface on which an image is visible; imaging assemblies about the periphery of said touch surface, said imaging assemblies having at least partially overlapping fields of view encompassing said touch surface, each imaging assembly comprising at least two proximate imaging devices with each imaging device having a different focal length, a first of said two proximate imaging devices having a field of view encompassing the entirety of said touch surface and a second of said two proximate imaging devices having a field of view encompassing only a portion of said touch surface; and processing structure configured to process image data generated by the imaging assemblies to determine the location of at least one pointer relative to the touch surface based on image data acquired by the imaging assemblies, wherein each imaging assembly generates said pointer data using image data acquired by only one of the first and second imaging devices thereof, wherein said processing structure is configured to process image data acquired by each imaging device to determine if a pointer image exists in the image data with a desired level of confidence, wherein when the pointer contacts the touch surface proximate a particular imaging assembly, the pointer image is detected with the desired level of confidence in only image data acquired by the first imaging device of the particular imaging assembly, and wherein when the pointer contacts the touch surface distant the particular imaging assembly, the pointer image is detected with the desired level of confidence in only image data acquired by the second imaging device.
  • 9. The touch system according to claim 8 wherein the first and second imaging devices of each imaging assembly are vertically stacked so that their optical axes are in line.
  • 10. The touch system according to claim 8 wherein the first and second imaging devices of each imaging assembly are side-by-side.
  • 11. The touch system according to claim 8 wherein said touch surface is rectangular and wherein imaging assemblies are positioned at least adjacent two opposite corners thereof.
  • 12. An interactive input system comprising: camera assemblies with different viewpoints and having fields of view encompassing a touch surface, each camera assembly comprising at least first and second adjacent image sensors with the image sensors having different focal lengths, the first image sensor having a field of view encompassing generally the entirety of the touch surface and the second image sensor having a field of view encompassing only a portion of the touch surface; and processing structure configured to process image data received from said camera assemblies to determine the position of at least one pointer relative to said touch surface based on image data acquired by the camera assemblies, wherein the image data provided to the processing structure by each camera assembly is based only on image data acquired by one of the first and second adjacent image sensors thereof, wherein said processing structure processes image data acquired by each image sensor to determine if a pointer image is believed to exist in the image data with a desired level of confidence, wherein when the pointer contacts the touch surface proximate a particular camera assembly, the pointer image is detected with the desired level of confidence in only image data acquired by the first image sensor of the particular camera assembly, and wherein when the pointer contacts the touch surface distant the particular camera assembly, the pointer image is detected with the desired level of confidence in only image data acquired by the second image sensor of the particular camera assembly.
  • 13. The interactive input system according to claim 12 wherein said desired level of confidence is existence of a pointer image having a size greater than a threshold size.
  • 14. The interactive input system according to claim 12 wherein the first and second adjacent image sensors of each camera assembly are vertically stacked so that their optical axes are in line.
  • 15. The interactive input system according to claim 12 wherein the first and second adjacent image sensors of each camera assembly are side-by-side.
  • 16. The interactive input system according to claim 12 wherein the camera assemblies are positioned adjacent corners of the touch surface, the first and second adjacent image sensors of each camera assembly looking generally across said touch surface.
  • 17. An interactive input system comprising: imaging assemblies at spaced locations about the periphery of a touch surface, said imaging assemblies having at least partially overlapping fields of view encompassing said touch surface and acquiring images looking generally across said touch surface, each imaging assembly comprising at least two proximate imaging devices with each imaging device having a different focal length, a first of said two proximate imaging devices having a field of view encompassing the entirety of said touch surface and a second of said two proximate imaging devices having a field of view encompassing only a portion of said touch surface, image data acquired by said imaging assemblies being processed to determine the location of at least one pointer relative to the touch surface based on the acquired image data, wherein image data acquired by only one of said first and second imaging devices of each imaging assembly is used to determine the location of the pointer, wherein image data acquired by each imaging device is used to determine if a pointer image is believed to exist in the image data with a desired level of confidence, wherein when the pointer contacts the touch surface proximate a particular imaging assembly, the pointer image is detected with the desired level of confidence in only image data acquired by the first imaging device of the particular imaging assembly, and wherein when the pointer contacts the touch surface distant the particular imaging assembly, the pointer image is detected with the desired level of confidence in only image data acquired by the second imaging device of the particular imaging assembly.
  • 18. The interactive input system according to claim 17 wherein the first and second imaging devices of each imaging assembly are vertically stacked so that their optical axes are in line.
  • 19. The interactive input system according to claim 17 wherein the first and second imaging devices of each imaging assembly are side-by-side.
  • 20. The interactive input system according to claim 17 wherein said touch surface is generally rectangular and wherein imaging assemblies are positioned at least adjacent two corners thereof.
  • 21. The interactive input system according to claim 17 wherein said touch surface is generally rectangular and wherein imaging assemblies are positioned at least adjacent two corners thereof.
  • 22. The interactive input system according to claim 17 wherein the imaging device that best sees the pointer is the imaging device that sees a largest pointer image having a size greater than a threshold size.
  • 23. An interactive input system comprising: camera assemblies with different viewpoints and having fields of view encompassing a touch surface, each camera assembly comprising at least first and second closely positioned image sensors with the image sensors having different focal lengths and acquiring images of said touch surface, the first image sensor having a field of view encompassing the entirety of the touch surface and the second image sensor having a field of view encompassing only a portion of the touch surface, image data acquired by said camera assemblies being processed to determine the location of at least one pointer relative to the touch surface based on the acquired image data, wherein image data acquired by only one of said first and second image sensors of each camera assembly is used to determine the location of the pointer, wherein image data acquired by each image sensor is used to determine if a pointer image is believed to exist in the image data with a desired level of confidence, wherein when the pointer contacts the touch surface proximate a particular camera assembly, the pointer image is detected with the desired level of confidence in only image data acquired by the first image sensor of the particular camera assembly, and wherein when the pointer contacts the touch surface distant the particular camera assembly, the pointer image is detected with the desired level of confidence in only image data acquired by the second image sensor of the particular camera assembly.
  • 24. The interactive input system according to claim 23 wherein the first and second image sensors of each camera assembly are vertically stacked so that their optical axes are in line.
  • 25. The interactive input system according to claim 23 wherein the first and second image sensors of each camera assembly are side-by-side.
  • 26. The interactive input system according to claim 23 wherein the camera assemblies are positioned adjacent corners of the touch surface, the first and second image sensors of each camera assembly looking generally across said touch surface.
  • 27. An interactive input system comprising: imaging assemblies with different viewpoints and having fields of view encompassing a touch surface, each imaging assembly comprising at least first and second imaging devices with each imaging device having a different focal length, the first imaging device of each imaging assembly having a focal length encompassing the entirety of the touch surface and the second imaging device of each imaging assembly having a focal length encompassing only a portion of the touch surface, wherein image data acquired only by either the first imaging device or by the second imaging device of each imaging assembly is processed to determine the location of at least one pointer within the region of interest, wherein image data acquired by each imaging device is used to determine if a pointer image is believed to exist in the image data with a desired level of confidence, wherein when a pointer contacts the touch surface proximate a particular imaging assembly, the pointer image is detected with the desired level of confidence in only image data acquired by the first imaging device of the particular imaging assembly, and wherein when the pointer contacts the touch surface distant the particular imaging assembly, the pointer image is detected with the desired level of confidence in only image data acquired by the second imaging device of the particular imaging assembly.
  • 28. The interactive input system according to claim 27 wherein the first and second imaging devices of each imaging assembly are vertically stacked so that their optical axes are in line.
  • 29. The interactive input system according to claim 27 wherein the first and second imaging devices of each imaging assembly are side-by-side.
  • 30. The interactive input system according to claim 27 wherein said region of interest is generally rectangular and wherein said imaging assemblies are positioned at least adjacent two corners thereof.
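The claims above describe a selection rule: each camera assembly pairs a wide-angle sensor (seeing the whole touch surface) with a longer-focal-length sensor (seeing only a distant portion), and per assembly only the one sensor whose pointer detection meets the desired confidence level supplies image data for locating the pointer. The following Python sketch illustrates that selection logic only; the names, the 0.5 threshold, the focal-length values, and the confidence model are assumptions for illustration, not taken from the patent.

```python
# Illustrative sketch of per-assembly sensor selection by detection
# confidence. All names, values, and the threshold are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class SensorObservation:
    focal_length_mm: float            # short = wide field of view; long = narrow
    pointer_angle: Optional[float]    # bearing to the pointer, None if not seen
    confidence: float                 # detection confidence in [0, 1]


def select_observation(wide: SensorObservation,
                       narrow: SensorObservation,
                       threshold: float = 0.5) -> Optional[SensorObservation]:
    """Return the single observation whose pointer detection meets the
    desired confidence level; per the claims, only one sensor's image
    data per assembly is used to locate the pointer."""
    candidates = [obs for obs in (wide, narrow)
                  if obs.pointer_angle is not None and obs.confidence >= threshold]
    if not candidates:
        return None
    # A pointer near the assembly registers confidently only in the wide
    # sensor; a distant pointer only in the narrow sensor. If both were
    # ever to qualify, prefer the higher-confidence detection.
    return max(candidates, key=lambda obs: obs.confidence)


# Example: a distant pointer resolves confidently only in the
# long-focal-length (narrow) sensor, so its data is selected.
wide = SensorObservation(focal_length_mm=2.2, pointer_angle=0.31, confidence=0.30)
narrow = SensorObservation(focal_length_mm=6.0, pointer_angle=0.32, confidence=0.92)
chosen = select_observation(wide, narrow)
```

The bearing from the chosen observation of each assembly would then feed the triangulation step described in the background to yield (x,y) pointer coordinates.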
US Referenced Citations (419)
Number Name Date Kind
2769374 Sick Nov 1956 A
3025406 Stewart et al. Mar 1962 A
3128340 Harmon Apr 1964 A
3187185 Milnes Jun 1965 A
3360654 Muller Dec 1967 A
3478220 Milroy Nov 1969 A
3613066 Cooreman Oct 1971 A
3764813 Clement et al. Oct 1973 A
3775560 Ebeling et al. Nov 1973 A
3857022 Rebane et al. Dec 1974 A
3860754 Johnson et al. Jan 1975 A
4107522 Walter Aug 1978 A
4144449 Funk et al. Mar 1979 A
4243879 Carroll et al. Jan 1981 A
4247767 O'Brien et al. Jan 1981 A
4420261 Barlow et al. Dec 1983 A
4459476 Weissmueller et al. Jul 1984 A
4468694 Edgar Aug 1984 A
4507557 Tsikos Mar 1985 A
4550250 Mueller et al. Oct 1985 A
4553842 Griffin Nov 1985 A
4558313 Garwin et al. Dec 1985 A
4639720 Rympalski et al. Jan 1987 A
4672364 Lucas Jun 1987 A
4673918 Adler et al. Jun 1987 A
4703316 Sherbeck Oct 1987 A
4710760 Kasday Dec 1987 A
4737631 Sasaki et al. Apr 1988 A
4742221 Sasaki et al. May 1988 A
4746770 McAvinney May 1988 A
4762990 Caswell et al. Aug 1988 A
4766424 Adler et al. Aug 1988 A
4782328 Denlinger Nov 1988 A
4811004 Person et al. Mar 1989 A
4818826 Kimura Apr 1989 A
4820050 Griffin Apr 1989 A
4822145 Staelin Apr 1989 A
4831455 Ishikawa et al. May 1989 A
4851664 Rieger Jul 1989 A
4868551 Arditty et al. Sep 1989 A
4868912 Doering Sep 1989 A
4888479 Tamaru Dec 1989 A
4893120 Doering et al. Jan 1990 A
4916308 Meadows Apr 1990 A
4928094 Smith May 1990 A
4943806 Masters et al. Jul 1990 A
4980547 Griffin Dec 1990 A
4990901 Beiswenger Feb 1991 A
5025314 Tang et al. Jun 1991 A
5025411 Tallman et al. Jun 1991 A
5097516 Amir Mar 1992 A
5103085 Zimmerman Apr 1992 A
5105186 May Apr 1992 A
5109435 Lo et al. Apr 1992 A
5130794 Ritchey Jul 1992 A
5140647 Ise et al. Aug 1992 A
5148015 Dolan Sep 1992 A
5162618 Knowles Nov 1992 A
5162783 Moreno Nov 1992 A
5164714 Wehrer Nov 1992 A
5168531 Sigel Dec 1992 A
5179369 Person et al. Jan 1993 A
5196835 Blue et al. Mar 1993 A
5196836 Williams Mar 1993 A
5239152 Caldwell et al. Aug 1993 A
5239373 Tang et al. Aug 1993 A
5272470 Zetts Dec 1993 A
5317140 Dunthorn May 1994 A
5359155 Heiser Oct 1994 A
5374971 Clapp et al. Dec 1994 A
5414413 Tamaru et al. May 1995 A
5422494 West et al. Jun 1995 A
5448263 Martin Sep 1995 A
5457289 Huang et al. Oct 1995 A
5483261 Yasutake Jan 1996 A
5483603 Luke et al. Jan 1996 A
5484966 Segan Jan 1996 A
5490655 Bates Feb 1996 A
5502568 Ogawa et al. Mar 1996 A
5525764 Junkins et al. Jun 1996 A
5528263 Platzker et al. Jun 1996 A
5528290 Saund Jun 1996 A
5537107 Funado Jul 1996 A
5554828 Primm Sep 1996 A
5581276 Cipolla et al. Dec 1996 A
5581637 Cass et al. Dec 1996 A
5591945 Kent Jan 1997 A
5594469 Freeman et al. Jan 1997 A
5594502 Bito et al. Jan 1997 A
5617312 Iura et al. Apr 1997 A
5638092 Eng et al. Jun 1997 A
5670755 Kwon Sep 1997 A
5686942 Ball Nov 1997 A
5698845 Kodama et al. Dec 1997 A
5729704 Stone et al. Mar 1998 A
5734375 Knox et al. Mar 1998 A
5736686 Perret, Jr. et al. Apr 1998 A
5737740 Henderson et al. Apr 1998 A
5739479 Davis-Cannon et al. Apr 1998 A
5745116 Pisutha-Arnond Apr 1998 A
5764223 Chang et al. Jun 1998 A
5771039 Ditzik Jun 1998 A
5784054 Armstrong et al. Jul 1998 A
5785439 Bowen Jul 1998 A
5786810 Knox et al. Jul 1998 A
5790910 Haskin Aug 1998 A
5801704 Oohara et al. Sep 1998 A
5804773 Wilson et al. Sep 1998 A
5818421 Ogino et al. Oct 1998 A
5818424 Korth Oct 1998 A
5819201 DeGraaf Oct 1998 A
5825352 Bisset et al. Oct 1998 A
5831602 Sato et al. Nov 1998 A
5854491 Pryor et al. Dec 1998 A
5909210 Knox et al. Jun 1999 A
5911004 Ohuchi et al. Jun 1999 A
5914709 Graham et al. Jun 1999 A
5920342 Umeda et al. Jul 1999 A
5936615 Waters Aug 1999 A
5940065 Babb et al. Aug 1999 A
5943783 Jackson Aug 1999 A
5963199 Kato et al. Oct 1999 A
5982352 Pryor Nov 1999 A
5988645 Downing Nov 1999 A
5990874 Tsumura et al. Nov 1999 A
6002808 Freeman Dec 1999 A
6008798 Mato, Jr. et al. Dec 1999 A
6031531 Kimble Feb 2000 A
6061177 Fujimoto May 2000 A
6075905 Herman et al. Jun 2000 A
6076041 Watanabe Jun 2000 A
6091406 Kambara et al. Jul 2000 A
6100538 Ogawa Aug 2000 A
6104387 Chery et al. Aug 2000 A
6118433 Jenkin et al. Sep 2000 A
6122865 Branc et al. Sep 2000 A
6128003 Smith et al. Oct 2000 A
6141000 Martin Oct 2000 A
6144366 Numazaki et al. Nov 2000 A
6147678 Kumar et al. Nov 2000 A
6153836 Goszyk Nov 2000 A
6161066 Wright et al. Dec 2000 A
6179426 Rodriguez, Jr. et al. Jan 2001 B1
6188388 Arita et al. Feb 2001 B1
6191773 Maruno et al. Feb 2001 B1
6208329 Ballare Mar 2001 B1
6208330 Hasegawa et al. Mar 2001 B1
6209266 Branc et al. Apr 2001 B1
6215477 Morrison et al. Apr 2001 B1
6222175 Krymski Apr 2001 B1
6226035 Korein et al. May 2001 B1
6232962 Davis et al. May 2001 B1
6252989 Geisler et al. Jun 2001 B1
6256033 Nguyen Jul 2001 B1
6262718 Findlay et al. Jul 2001 B1
6310610 Beaton et al. Oct 2001 B1
6320597 Ieperen Nov 2001 B1
6323846 Westerman Nov 2001 B1
6326954 Van Ieperen Dec 2001 B1
6328270 Elberbaum Dec 2001 B1
6335724 Takekawa et al. Jan 2002 B1
6337681 Martin Jan 2002 B1
6339748 Hiramatsu Jan 2002 B1
6346966 Toh Feb 2002 B1
6352351 Ogasahara et al. Mar 2002 B1
6353434 Akebi Mar 2002 B1
6359612 Peter et al. Mar 2002 B1
6362468 Murakami et al. Mar 2002 B1
6377228 Jenkin et al. Apr 2002 B1
6384743 Vanderheiden May 2002 B1
6414671 Gillespie et al. Jul 2002 B1
6414673 Wood et al. Jul 2002 B1
6421042 Omura et al. Jul 2002 B1
6427389 Branc et al. Aug 2002 B1
6429856 Omura et al. Aug 2002 B1
6429857 Masters et al. Aug 2002 B1
6480187 Sano et al. Nov 2002 B1
6496122 Sampsell Dec 2002 B2
6497608 Ho et al. Dec 2002 B2
6498602 Ogawa Dec 2002 B1
6504532 Ogasahara et al. Jan 2003 B1
6507339 Tanaka Jan 2003 B1
6512513 Fleck et al. Jan 2003 B2
6512838 Rafii et al. Jan 2003 B1
6517266 Saund Feb 2003 B2
6518600 Shaddock Feb 2003 B1
6522830 Yamagami Feb 2003 B2
6529189 Colgan et al. Mar 2003 B1
6530664 Vanderwerf et al. Mar 2003 B2
6531999 Trajkovic Mar 2003 B1
6532006 Takekawa et al. Mar 2003 B1
6540366 Keenan et al. Apr 2003 B2
6540679 Slayton et al. Apr 2003 B2
6545669 Kinawi et al. Apr 2003 B1
6559813 DeLuca et al. May 2003 B1
6563491 Omura May 2003 B1
6567078 Ogawa May 2003 B2
6567121 Kuno May 2003 B1
6570103 Saka et al. May 2003 B1
6570612 Saund et al. May 2003 B1
6577299 Schiller et al. Jun 2003 B1
6587099 Takekawa Jul 2003 B2
6590568 Astala et al. Jul 2003 B1
6594023 Omura et al. Jul 2003 B1
6597508 Seino et al. Jul 2003 B2
6603867 Sugino et al. Aug 2003 B1
6608619 Omura et al. Aug 2003 B2
6614422 Rafii et al. Sep 2003 B1
6624833 Kumar et al. Sep 2003 B1
6626718 Hiroki Sep 2003 B2
6630922 Fishkin et al. Oct 2003 B2
6633328 Byrd et al. Oct 2003 B1
6650318 Arnon Nov 2003 B1
6650822 Zhou Nov 2003 B1
6674424 Fujioka Jan 2004 B1
6683584 Ronzani et al. Jan 2004 B2
6690357 Dunton et al. Feb 2004 B1
6690363 Newton Feb 2004 B2
6690397 Daignault, Jr. Feb 2004 B1
6710770 Tomasi et al. Mar 2004 B2
6714311 Hashimoto Mar 2004 B2
6720949 Pryor et al. Apr 2004 B1
6736321 Tsikos et al. May 2004 B2
6738051 Boyd et al. May 2004 B2
6741250 Furlan et al. May 2004 B1
6747636 Martin Jun 2004 B2
6756910 Ohba et al. Jun 2004 B2
6760009 Omura et al. Jul 2004 B2
6760999 Branc et al. Jul 2004 B2
6774889 Zhang et al. Aug 2004 B1
6778207 Lee et al. Aug 2004 B1
6803906 Morrison et al. Oct 2004 B1
6828959 Takekawa et al. Dec 2004 B2
6829372 Fujioka Dec 2004 B2
6864882 Newton Mar 2005 B2
6911972 Brinjes Jun 2005 B2
6919880 Morrison et al. Jul 2005 B2
6927384 Reime et al. Aug 2005 B2
6933981 Kishida et al. Aug 2005 B1
6947032 Morrison et al. Sep 2005 B2
6954197 Morrison et al. Oct 2005 B2
6972401 Akitt et al. Dec 2005 B2
6972753 Kimura et al. Dec 2005 B1
6985937 Keshav et al. Jan 2006 B1
7002555 Jacobsen et al. Feb 2006 B1
7007236 Dempski et al. Feb 2006 B2
7015418 Cahill et al. Mar 2006 B2
7030861 Westerman et al. Apr 2006 B1
7057647 Monroe Jun 2006 B1
7058204 Hildreth et al. Jun 2006 B2
7075054 Iwamoto et al. Jul 2006 B2
7084857 Lieberman et al. Aug 2006 B2
7084859 Pryor Aug 2006 B1
7084868 Farag et al. Aug 2006 B2
7113174 Takekawa et al. Sep 2006 B1
7151533 Van Ieperen Dec 2006 B2
7176904 Satoh Feb 2007 B2
7184030 McCharles et al. Feb 2007 B2
7187489 Miles Mar 2007 B2
7190348 Kennedy et al. Mar 2007 B2
7190496 Klug et al. Mar 2007 B2
7202860 Ogawa Apr 2007 B2
7227526 Hildreth et al. Jun 2007 B2
7232986 Worthington et al. Jun 2007 B2
7236162 Morrison et al. Jun 2007 B2
7237937 Kawashima et al. Jul 2007 B2
7242388 Lieberman et al. Jul 2007 B2
7265748 Ryynanen Sep 2007 B2
7268692 Lieberman et al. Sep 2007 B1
7274356 Ung et al. Sep 2007 B2
7283126 Leung Oct 2007 B2
7283128 Sato Oct 2007 B2
7289113 Martin Oct 2007 B2
7302156 Lieberman et al. Nov 2007 B1
7305368 Lieberman et al. Dec 2007 B2
7330184 Leung Feb 2008 B2
7333094 Lieberman et al. Feb 2008 B2
7333095 Lieberman et al. Feb 2008 B1
7355593 Hill et al. Apr 2008 B2
7372456 McLintock May 2008 B2
7375720 Tanaka May 2008 B2
RE40368 Arnon Jun 2008 E
7411575 Hill et al. Aug 2008 B2
7414617 Ogawa Aug 2008 B2
7479949 Jobs et al. Jan 2009 B2
7492357 Morrison et al. Feb 2009 B2
7499037 Lube Mar 2009 B2
7538759 Newton May 2009 B2
7559664 Walleman et al. Jul 2009 B1
7619617 Morrison et al. Nov 2009 B2
7692625 Morrison et al. Apr 2010 B2
8456451 Morrison Jun 2013 B2
20010019325 Takekawa Sep 2001 A1
20010022579 Hirabayashi Sep 2001 A1
20010026268 Ito Oct 2001 A1
20010033274 Ong Oct 2001 A1
20010050677 Tosaya Dec 2001 A1
20010055006 Sano et al. Dec 2001 A1
20020008692 Omura et al. Jan 2002 A1
20020015159 Hashimoto Feb 2002 A1
20020036617 Pryor Mar 2002 A1
20020041327 Hildreth et al. Apr 2002 A1
20020050979 Oberoi et al. May 2002 A1
20020064382 Hildreth et al. May 2002 A1
20020067922 Harris Jun 2002 A1
20020075243 Newton Jun 2002 A1
20020080123 Kennedy et al. Jun 2002 A1
20020118177 Newton Aug 2002 A1
20020145595 Satoh Oct 2002 A1
20020163530 Takakura et al. Nov 2002 A1
20030001825 Omura et al. Jan 2003 A1
20030025951 Pollard et al. Feb 2003 A1
20030043116 Morrison et al. Mar 2003 A1
20030063073 Geaghan et al. Apr 2003 A1
20030071858 Morohoshi Apr 2003 A1
20030085871 Ogawa May 2003 A1
20030095112 Kawano et al. May 2003 A1
20030137494 Tulbert Jul 2003 A1
20030142880 Hyodo Jul 2003 A1
20030151532 Chen et al. Aug 2003 A1
20030151562 Kulas Aug 2003 A1
20030156118 Ayinde Aug 2003 A1
20030161524 King Aug 2003 A1
20030210803 Kaneda et al. Nov 2003 A1
20030227492 Wilde et al. Dec 2003 A1
20040001144 McCharles et al. Jan 2004 A1
20040012573 Morrison et al. Jan 2004 A1
20040021633 Rajkowski Feb 2004 A1
20040031779 Cahill et al. Feb 2004 A1
20040032401 Nakazawa et al. Feb 2004 A1
20040046749 Ikeda Mar 2004 A1
20040051709 Ogawa et al. Mar 2004 A1
20040071363 Kouri et al. Apr 2004 A1
20040108990 Lieberman et al. Jun 2004 A1
20040125086 Hagermoser et al. Jul 2004 A1
20040149892 Akitt et al. Aug 2004 A1
20040150630 Hinckley et al. Aug 2004 A1
20040169639 Pate et al. Sep 2004 A1
20040178993 Morrison et al. Sep 2004 A1
20040178997 Gillespie et al. Sep 2004 A1
20040179001 Morrison et al. Sep 2004 A1
20040189720 Wilson et al. Sep 2004 A1
20040201575 Morrison Oct 2004 A1
20040204129 Payne et al. Oct 2004 A1
20040218479 Iwamoto et al. Nov 2004 A1
20040221265 Leung et al. Nov 2004 A1
20040252091 Ma et al. Dec 2004 A1
20050012573 Grandchamp Jan 2005 A1
20050052427 Wu et al. Mar 2005 A1
20050057524 Hill et al. Mar 2005 A1
20050077452 Morrison et al. Apr 2005 A1
20050083308 Homer et al. Apr 2005 A1
20050104860 McCreary et al. May 2005 A1
20050128190 Ryynanen Jun 2005 A1
20050151733 Sander et al. Jul 2005 A1
20050156900 Hill et al. Jul 2005 A1
20050190162 Newton Sep 2005 A1
20050241929 Auger et al. Nov 2005 A1
20050243070 Ung et al. Nov 2005 A1
20050243411 Cook Nov 2005 A1
20050248539 Morrison et al. Nov 2005 A1
20050248540 Newton Nov 2005 A1
20050270781 Marks Dec 2005 A1
20050276448 Pryor Dec 2005 A1
20060012579 Sato Jan 2006 A1
20060022962 Morrison et al. Feb 2006 A1
20060028456 Kang Feb 2006 A1
20060034486 Morrison et al. Feb 2006 A1
20060152500 Weng Jul 2006 A1
20060158437 Blythe et al. Jul 2006 A1
20060163446 Guyer et al. Jul 2006 A1
20060170658 Nakamura et al. Aug 2006 A1
20060192799 Vega et al. Aug 2006 A1
20060197749 Popovich Sep 2006 A1
20060202953 Pryor et al. Sep 2006 A1
20060227120 Eikman Oct 2006 A1
20060244734 Hill et al. Nov 2006 A1
20060274067 Hidai Dec 2006 A1
20060279558 Van Delden et al. Dec 2006 A1
20070002028 Morrison et al. Jan 2007 A1
20070019103 Lieberman et al. Jan 2007 A1
20070075648 Blythe et al. Apr 2007 A1
20070075982 Morrison et al. Apr 2007 A1
20070089915 Ogawa et al. Apr 2007 A1
20070116333 Dempski et al. May 2007 A1
20070126755 Zhang et al. Jun 2007 A1
20070139932 Sun et al. Jun 2007 A1
20070152984 Ording et al. Jul 2007 A1
20070152986 Ogawa Jul 2007 A1
20070165007 Morrison et al. Jul 2007 A1
20070167709 Slayton et al. Jul 2007 A1
20070205994 van Ieperen Sep 2007 A1
20070236454 Ung et al. Oct 2007 A1
20070269107 Iwai et al. Nov 2007 A1
20070273842 Morrison et al. Nov 2007 A1
20070285805 Lundgren Dec 2007 A1
20070290996 Ting Dec 2007 A1
20070291125 Marquet Dec 2007 A1
20080029691 Han Feb 2008 A1
20080042999 Martin Feb 2008 A1
20080055262 Wu et al. Mar 2008 A1
20080055267 Wu et al. Mar 2008 A1
20080062140 Hotelling et al. Mar 2008 A1
20080062149 Baruk Mar 2008 A1
20080068352 Worthington et al. Mar 2008 A1
20080083602 Auger et al. Apr 2008 A1
20080106706 Holmgren et al. May 2008 A1
20080122803 Izadi et al. May 2008 A1
20080129707 Pryor Jun 2008 A1
20080259050 Lin et al. Oct 2008 A1
20080259052 Lin et al. Oct 2008 A1
20080292196 Jain et al. Nov 2008 A1
20090058832 Newton Mar 2009 A1
20090058833 Newton Mar 2009 A1
20090146972 Morrison et al. Jun 2009 A1
20100193259 Wassvik Aug 2010 A1
20100309169 Lieberman Dec 2010 A1
20110006981 Chtchetinine Jan 2011 A1
20110316814 Kao Dec 2011 A1
Foreign Referenced Citations (144)
Number Date Country
2003233728 Dec 2003 AU
2006243730 Nov 2006 AU
2058219 Apr 1993 CA
2367864 Apr 1993 CA
2219886 Apr 1999 CA
2251221 Apr 1999 CA
2267733 Oct 1999 CA
2268208 Oct 1999 CA
2252302 Apr 2000 CA
2350152 Jun 2001 CA
2412878 Jan 2002 CA
2341918 Sep 2002 CA
2386094 Dec 2002 CA
2372868 Aug 2003 CA
2390503 Dec 2003 CA
2390506 Dec 2003 CA
2432770 Dec 2003 CA
2493236 Dec 2003 CA
2448603 May 2004 CA
2453873 Jul 2004 CA
2460449 Sep 2004 CA
2521418 Oct 2004 CA
2481396 Mar 2005 CA
2491582 Jul 2005 CA
2563566 Nov 2005 CA
2564262 Nov 2005 CA
2501214 Sep 2006 CA
2606863 Nov 2006 CA
2580046 Sep 2007 CA
1310126 Aug 2001 CN
1784649 Jun 2006 CN
101019096 Aug 2007 CN
101023582 Aug 2007 CN
1440539 Sep 2009 CN
3836429 May 1990 DE
198 10 452 Dec 1998 DE
60124549 Sep 2007 DE
0125068 Nov 1984 EP
0279652 Aug 1988 EP
0347725 Dec 1989 EP
0420335 Apr 1991 EP
0 657 841 Jun 1995 EP
0762319 Mar 1997 EP
0829798 Mar 1998 EP
0897161 Feb 1999 EP
0911721 Apr 1999 EP
1059605 Dec 2000 EP
1262909 Dec 2002 EP
1739528 Jan 2003 EP
1739529 Jan 2003 EP
1420335 May 2004 EP
1 450 243 Aug 2004 EP
1457870 Sep 2004 EP
1471459 Oct 2004 EP
1517228 Mar 2005 EP
1550940 Jun 2005 EP
1611503 Jan 2006 EP
1674977 Jun 2006 EP
1741186 Jan 2007 EP
1766501 Mar 2007 EP
1830248 Sep 2007 EP
1877893 Jan 2008 EP
2279823 Sep 2007 ES
2176282 May 1986 GB
1575420 Sep 1980 GB
2204126 Nov 1988 GB
2263765 Aug 1993 GB
57-211637 Dec 1982 JP
61-196317 Aug 1986 JP
62-005428 Jan 1987 JP
63-223819 Sep 1988 JP
3-054618 Mar 1991 JP
03-244017 Oct 1991 JP
4-355815 Dec 1992 JP
5-181605 Jul 1993 JP
5-189137 Jul 1993 JP
5-197810 Aug 1993 JP
06-110608 Apr 1994 JP
7-110733 Apr 1995 JP
7-230352 Aug 1995 JP
8-016931 Feb 1996 JP
8-108689 Apr 1996 JP
8-240407 Sep 1996 JP
8-315152 Nov 1996 JP
9-091094 Apr 1997 JP
9-224111 Aug 1997 JP
9-319501 Dec 1997 JP
10-105324 Apr 1998 JP
10-222646 Aug 1998 JP
11-064026 Mar 1999 JP
11-085376 Mar 1999 JP
11-110116 Apr 1999 JP
11-203042 Jul 1999 JP
11-212692 Aug 1999 JP
2000-105671 Apr 2000 JP
2000-132340 May 2000 JP
2001-142642 May 2001 JP
2001-282456 Oct 2001 JP
2002-055770 Feb 2002 JP
2002-236547 Aug 2002 JP
2003-65716 Mar 2003 JP
2003-167669 Jun 2003 JP
2003-173237 Jun 2003 JP
2005-108211 Apr 2005 JP
2005-182423 Jul 2005 JP
2005-202950 Jul 2005 JP
9807112 Feb 1998 WO
9908897 Feb 1999 WO
9921122 Apr 1999 WO
9928812 Jun 1999 WO
9940562 Aug 1999 WO
0124157 Apr 2001 WO
0131570 May 2001 WO
0163550 Aug 2001 WO
0191043 Nov 2001 WO
0203316 Jan 2002 WO
0207073 Jan 2002 WO
0227461 Apr 2002 WO
03104887 Dec 2003 WO
03105074 Dec 2003 WO
2004072843 Aug 2004 WO
2004090706 Oct 2004 WO
2004102523 Nov 2004 WO
2004104810 Dec 2004 WO
2005031554 Apr 2005 WO
2005034027 Apr 2005 WO
2005107072 Nov 2005 WO
2006002544 Jan 2006 WO
2006092058 Sep 2006 WO
2006095320 Sep 2006 WO
2006096962 Sep 2006 WO
2006116869 Nov 2006 WO
2007003196 Jan 2007 WO
2007019600 Feb 2007 WO
2007037809 Apr 2007 WO
2007064804 Jun 2007 WO
2007079590 Jul 2007 WO
2007132033 Nov 2007 WO
2007134456 Nov 2007 WO
2008128096 Oct 2008 WO
2009029764 Mar 2009 WO
2009029767 Mar 2009 WO
2009146544 Dec 2009 WO
2010051633 May 2010 WO
Non-Patent Literature Citations (67)
Entry
Partial European Search Report for EP 03 25 7166 which was completed on May 19, 2006.
International Search Report with a date of mailing of Oct. 22, 2001 for PCT/CA 01/00980 with an International Filing Date of Jul. 5, 2001.
Bud K. Funk, CCDs in optical panels deliver high resolution, Electronic Design, Sep. 27, 1980, pp. 139-143.
Bernhard P. Wrobel, “Minimum Solutions for Orientation”, Calibration and Orientation of Cameras in Computer Vision, Springer Series in Information Sciences, vol. 34, 2001, pp. 28-33.
Kenichi Kanatani, “Camera Calibration”, Geometric Computation for Machine Vision, Oxford Engineering Science Series, vol. 37, 1993, pp. 56-63.
Richard Hartley and Andrew Zisserman, “Multiple View Geometry in Computer Vision”, Cambridge University Press, First published 2000, Reprinted (with corrections) 2001, pp. 70-73, 92-93 and 98-99.
Wolfgang Förstner, “On Estimating Rotations”, Festschrift für Prof. Dr.-Ing. Heinrich Ebner zum 60. Geburtstag, Herausg.: C. Heipke und H. Mayer, Lehrstuhl für Photogrammetrie und Fernerkundung, TU München, 1999, 12 pages. (http://www.ipb.uni-bonn.de/papers/#1999).
European Search Report for EP 04 25 1392 for a search that was completed on Jan. 11, 2007.
European Search Report for EP 06 01 9269 for a search that was completed on Nov. 9, 2006.
European Search Report for EP 06 01 9268 for a search that was completed on Nov. 9, 2006.
European Search Report for EP 02 25 3594 for a search that was completed on Dec. 14, 2005.
Fei-Yue Wang, et al., “Stereo camera calibration without absolute world coordinate information”, SPIE, vol. 2620, pp. 655-662, Jun. 14, 1995.
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority regarding International Application No. PCT/CA2007/002184, as mailed on Mar. 13, 2008.
International Search Report for PCT/CA2008/001350 mailed Oct. 17, 2008 (5 Pages).
International Search Report and Written Opinion for PCT/CA2004/001759 mailed Feb. 21, 2005 (7 Pages).
International Search Report and Written Opinion for PCT/CA2009/000773 mailed Aug. 12, 2009 (11 Pages).
European Search Opinion for EP 07 25 0888 dated Jun. 22, 2007 (2 pages).
European Search Report for EP 07 25 0888 dated Jun. 22, 2007 (2 pages).
May 12, 2009 Office Action for Canadian Patent Application No. 2,412,878 (4 pages).
Tappert, C.C., et al., “On-Line Handwriting Recognition—A Survey”, Proceedings of the International Conference on Pattern Recognition (ICPR), Rome, Nov. 14-17, 1988, Washington, IEEE Comp. Soc. Press, US, vol. 2 Conf. 9, Nov. 14, 1988, pp. 1123-1132.
Press Release, “IntuiLab introduces IntuiFace, An interactive table and its application platform” Nov. 30, 2007.
Overview page for IntuiFace by IntuiLab, Copyright 2008.
NASA Small Business Innovation Research Program: Composite List of Projects 1983-1989, Aug. 1990.
Touch Panel, vol. 1 No. 1 (2005).
Touch Panel, vol. 1 No. 2 (2005).
Touch Panel, vol. 1 No. 3 (2006).
Touch Panel, vol. 1 No. 4 (2006).
Touch Panel, vol. 1 No. 5 (2006).
Touch Panel, vol. 1 No. 6 (2006).
Touch Panel, vol. 1 No. 7 (2006).
Touch Panel, vol. 1 No. 8 (2006).
Touch Panel, vol. 1 No. 9 (2006).
Touch Panel, vol. 1 No. 10 (2006).
Touch Panel, vol. 2 No. 1 (2006).
Touch Panel, vol. 2 No. 2 (2007).
Touch Panel, vol. 2 No. 3 (2007).
Touch Panel, vol. 2 No. 4 (2007).
Touch Panel, vol. 2 No. 5 (2007).
Touch Panel, vol. 2 No. 6 (2007).
Touch Panel, vol. 2 No. 7-8 (2008).
Touch Panel, vol. 2 No. 9-10 (2008).
Touch Panel, vol. 3 No. 1-2 (2008).
Touch Panel, vol. 3 No. 3-4 (2008).
Touch Panel, vol. 3 No. 5-6 (2009).
Touch Panel, vol. 3 No. 7-8 (2009).
Touch Panel, vol. 3 No. 9 (2009).
Touch Panel, vol. 4 No. 2-3 (2009).
English Translation of Decision of Rejection for Japanese Patent Application No. 2002-507309, date of Decision: Aug. 18, 2011, 9 pages.
International Preliminary Report on Patentability, PCT/NZ2004/000029, May 20, 2005 (21 pages).
“International Preliminary Report on Patentability”, PCT/US2008/060102, Oct. 22, 2009 (9 pages).
International Search Report for PCT/CA2010/001085 mailed Oct. 12, 2010 (5 pages).
“International Application Serial No. PCT/US2008/060102, Search Report & Written opinion mailed Feb. 12, 2009” (14 pages).
International Application Serial No. PCT/US2008/074749, Search Report & Written Opinion mailed Feb. 11, 2009 (10 pages).
“International Application Serial No. PCT/US2008/074755, International Search Report and Written Opinion mailed Jan. 29 2009” (14 pages).
International Search Report for PCT/NZ05/00092 Sep. 27, 2006 (4 pages).
Loinaz et al., “A 200-mW, 3.3-V, CMOS Color Camera IC Producing 352×288 24-B Video at 30 Frames/s,” IEEE Journal of Solid-State Circuits, vol. 31, No. 12, Dec. 1998, pp. 2092-2103.
Yawcheng Lo, “Solid-state image sensor: technologies and applications,” Input/Output and Imaging Technologies, Y.T. Tsai, T-M. Kung, and J. Larsen, eds. SPIE Proceedings vol. 3422, pp. 70-80 (1998).
Touch Panel, vol. 5 No. 2-3 (Sep. 2010).
Touch Panel, vol. 5 No. 4 (Nov. 2010).
“Store Window Presentations”, Heddier Electronic.
“ThruGlass”, Projected Capacitive Touchscreen Specifications, MicroTouch.
Benko, et al., “Precise Selection Techniques for Multi-Touch Screens”, Proc. ACM CHI 2006: Human Factors in Computer Systems, pp. 1263-1272.
Buxton, W., “Issues and Techniques in Touch-Sensitive Tablet Input,” Computer Graphics, 19(3), Proceedings of SIGGRAPH '85, 1985, pp. 215-223.
VGA-format CMOS Camera-on-a-Chip for Multimedia Applications, Photobit Corporation, 1999 (2 pages).
“White Paper”, Digital Vision Touch Technology Feb. 2003.
Aug. 24, 2011 letter from Olivares & CIA to David A. Ruston summarizing an Office Action for Mexican Patent Application No. MX/a/2009/005943.
Feb. 23, 2012 letter from Olivares & CIA to David A. Ruston summarizing an Office Action for Mexican Patent Application No. MX/a/2009/005943.
Related Publications (1)
Number Date Country
20080129700 A1 Jun 2008 US