This application claims priority from Canadian Patent Application No. 2,370,752 filed Feb. 5, 2002, the disclosure of which is incorporated herein by reference.
The invention relates to the field of computer graphics processing and more specifically to a method and system for the fast rendering of pyramid lens distorted raster images in detail-in-context presentations.
Display screens are the primary interface for displaying information from a computer. Display screens are limited in size, thus presenting a challenge to graphical user interface design, particularly when large amounts of information are to be displayed. This problem is normally referred to as the “screen real estate problem”.
Well-known solutions to this problem include panning, zooming, scrolling or combinations thereof. While these solutions are suitable for a large number of visual display applications, they become less effective where sections of the visual information are spatially related, such as maps, three-dimensional representations, newspapers, and the like. In this type of information display, panning, zooming and/or scrolling is less effective because much of the context of the panned, zoomed or scrolled display is hidden.
A recent solution to this problem is the application of “detail-in-context” presentation techniques. Detail-in-context is the magnification of a particular region of interest (the “focal region” or “detail”) in a data presentation while preserving visibility of the surrounding information (the “context”). This technique has applicability to the display of large surface area media, such as maps, on computer screens of variable size including graphics workstations, laptop computers, personal digital assistants (“PDAs”), and cell phones.
In the detail-in-context discourse, differentiation is often made between the terms “representation” and “presentation”. A representation is a formal system, or mapping, for specifying raw information or data that is stored in a computer or data processing system. For example, a digital map of a city is a representation of raw data including street names and the relative geographic location of streets and utilities. Such a representation may be displayed visually on a computer screen or printed on paper. On the other hand, a presentation is a spatial organization of a given representation that is appropriate for the task at hand. Thus, a presentation of a representation organizes such things as the point of view and the relative emphasis of different parts or regions of the representation. For example, a digital map of a city may be presented with a region magnified to reveal street names.
In general, a detail-in-context presentation may be considered as a distorted view (or distortion) of a portion of the original representation where the distortion is the result of the application of a “lens” like distortion function to the original representation. A detailed review of various detail-in-context presentation techniques such as Elastic Presentation Space may be found in a publication by Marianne S. T. Carpendale, entitled “A Framework for Elastic Presentation Space” (Carpendale, Marianne S. T., A Framework for Elastic Presentation Space (Burnaby, British Columbia: Simon Fraser University, 1999)), and incorporated herein by reference.
In general, detail-in-context data presentations are characterized by magnification of areas of an image where detail is desired, in combination with compression of a restricted range of areas of the remaining information (i.e. the context), the result typically giving the appearance of a lens having been applied to the display surface. Using the techniques described by Carpendale, points in a representation are displaced in three dimensions and a perspective projection is used to display the points on a two-dimensional presentation display. Thus, when a lens is applied to a two-dimensional continuous surface representation, for example, the resulting presentation appears to be three-dimensional. In other words, the lens transformation appears to have stretched the continuous surface in a third dimension.
One shortcoming of these detail-in-context presentation systems is that the rendering of a raster image which has undergone a distortion through the application of one or more lenses (e.g. pyramid lenses) can be time consuming. Both the CPU and input/output systems of the detail-in-context presentation system can be taxed when performing such a rendering. Past techniques for rendering these images have usually involved texturing the raster image upon a three-dimensional triangulated surface. This technique can provide fast results, especially when combined with the use of accelerated three-dimensional rendering hardware, but rendering can be slow when no such hardware is present. It would be desirable to have a method that allows for the fast rendering of pyramid lenses without the requirement of hardware acceleration.
A need therefore exists for the fast and effective display of pyramid lens presentations in detail-in-context presentation systems. Consequently, it is an object of the present invention to obviate or mitigate at least some of the above mentioned disadvantages.
In general, the present invention provides for the effective presentation of pyramid lens type presentations in detail-in-context presentation systems through the use of improved rendering techniques.
According to one aspect of this invention, there is provided a method for generating a presentation of a region-of-interest in an information representation including the steps of: selecting a viewpoint for the region-of-interest; creating a lens surface for the region-of-interest; the lens surface having a focal region and a shoulder region surrounding the focal region; creating a transformed presentation by: determining boundaries in the representation for the focal region and the shoulder region; determining boundaries on the lens surface corresponding to the boundaries in the representation by applying a distortion function defining the lens surface to the boundaries in the representation; perspectively projecting the boundaries on the lens surface onto a plane spaced from the viewpoint; and, copying information in the representation lying within the boundaries in the representation onto the focal region and the shoulder region of the lens surface using respective focal region and shoulder region stretch bit-block transfer operations; and, displaying the transformed presentation on a display screen to generate the presentation of the region-of-interest.
Preferably, the lens surface shape is a pyramidal frustum shape having a top and a plurality of sides. The pyramidal frustum shape can be a right square with a square shaped top and with first, second, third, and fourth quadrilateral shaped sides. The focal region is generally the top of the pyramidal frustum shape and the shoulder region includes the first, second, third, and fourth sides. The sides can be subdivided into a plurality of slices which can be, for example, one pixel high or wide.
Preferably, the steps of determining boundaries in the representation for the shoulder region and determining boundaries on the lens surface corresponding to the boundaries in the representation include: determining boundaries in the representation for each of the plurality of slices and determining boundaries on the lens surface corresponding to the boundaries in the representation for each of the plurality of slices, respectively. The boundaries can be determined by interpolation. And, the shoulder region stretch bit-block transfer operation can include respective shoulder region stretch bit-block transfer operations for each of the plurality of slices.
Preferably, the sides of the lens surface are subdivided into a plurality of quadrilaterals by a plurality of horizontal or vertical ribs. The plurality of quadrilaterals and ribs can include a plurality of slices. The steps of determining boundaries in the representation for the shoulder region and determining boundaries on the lens surface corresponding to the boundaries in the representation can include: determining boundaries in the representation for each of the plurality of ribs and determining boundaries on the lens surface corresponding to the boundaries in the representation for each of the plurality of ribs, respectively. And, the shoulder region stretch bit-block transfer operation can include respective shoulder region stretch bit-block transfer operations for each of the plurality of ribs or slices. Information for the lens surface lying between the plurality of ribs or slices can be determined by interpolation.
Preferably, the step of copying includes scaling the information. Scaling can include antialiasing and interpolation can include linear interpolation and perspective correct interpolation. In addition, the distortion function can include shoulder functions defining the shoulder regions.
Advantageously, the pyramid lens rendering method of the present invention may be applied to a wide variety of detail-in-context presentations where fast rendering is required and where hardware based accelerating devices are not available. In addition, the present invention does not require time consuming rasterizing of polygonal surfaces.
Embodiments of the invention may best be understood by referring to the following description and accompanying drawings. In the description and drawings, like numerals refer to like structures or processes. In the drawings:
In the following description, numerous specific details are set forth to provide a thorough understanding of the invention. However, it is understood that the invention may be practiced without these specific details. In other instances, well-known software, circuits, structures and techniques have not been described or shown in detail in order not to obscure the invention. In the drawings, like numerals refer to like structures or processes.
The term “data processing system” is used herein to refer to any machine for processing data, including the computer systems and network arrangements described herein. The term “Elastic Presentation Space” (“EPS”) (or “Pliable Display Technology” (“PDT”)) is used herein to refer to techniques that allow for the adjustment of a visual presentation without interfering with the information content of the representation. The adjective “elastic” is included in the term as it implies the capability of stretching and deformation and subsequent return to an original shape. EPS graphics technology is described by Carpendale in “A Framework for Elastic Presentation Space” (Carpendale, Marianne S. T., A Framework for Elastic Presentation Space (Burnaby, British Columbia: Simon Fraser University, 1999)), which is incorporated herein by reference. In EPS graphics technology, a two-dimensional visual representation is placed onto a surface; this surface is placed in three-dimensional space; the surface, containing the representation, is viewed through perspective projection; and the surface is manipulated to effect the reorganization of image details. The presentation transformation is separated into two steps: surface manipulation or distortion, and perspective projection.
EPS is applicable to multidimensional data and is well suited to implementation on a computer for dynamic detail-in-context display on an electronic display surface such as a monitor. In the case of two dimensional data, EPS is typically characterized by magnification of areas of an image where detail is desired 233, in combination with compression of a restricted range of areas of the remaining information (i.e. the context) 234, the end result typically giving the appearance of a lens 230 having been applied to the display surface. The areas of the lens 230 where compression occurs may be referred to as the “shoulder” 234 of the lens 230. The area of the representation transformed by the lens may be referred to as the “lensed area”. The lensed area thus includes the focal region and the shoulder. To reiterate, the source image or representation to be viewed is located in the basal plane 210. Magnification 233 and compression 234 are achieved through elevating elements of the source image relative to the basal plane 210, and then projecting the resultant distorted surface onto the reference view plane 201. EPS performs detail-in-context presentation of n-dimensional data through the use of a procedure wherein the data is mapped into a region in an (n+1) dimensional space, manipulated through perspective projections in the (n+1) dimensional space, and then finally transformed back into n-dimensional space for presentation. EPS has numerous advantages over conventional zoom, pan, and scroll technologies, including the capability of preserving the visibility of information outside 234 the local region of interest 233.
For example, and referring to
System.
Method. The problem of rendering a raster image distorted through the use of PDT can be considered to be a problem of transforming texels between two image “spaces.” In general, a texel (i.e. texture element) may be defined as a texture map pixel or pixels. The texels from the undistorted image can be said to belong to the “source” image space (i.e. representation), and the texels from the distorted image belong to the “target” image space (i.e. presentation). In general, all rendering approaches must somehow determine color values for each target texel using the known source texels and the known PDT transformation. The present invention takes advantage of symmetry in pyramid lenses across the x and y axes in order to perform fast copies of entire rows or columns of pixels. Improvement in rendering speed is achieved because copying large blocks of pixels is significantly faster than performing multiple copies of individual pixels.
In accordance with the invention, the rendering of the focal region 433 of the lens 430 is performed by a single stretch bit block transfer (“stretch BLT” or “stretch blitting”) operation. In general, a stretch BLT operation can stretch source data in the x and y axis directions to a target destination larger or smaller than the source. First, the undisplaced rectangular bounds of the focal region are determined or calculated in the source space (i.e. representation). These bounds specify the region in the source space from which information for the focal region 433 will be drawn. By definition, the distortion function defining the lens specifies the extent of the undisplaced focal and shoulder regions for the lens in source space. Second, the displaced rectangular bounds 440 of the focal region 433 are calculated in the target space (i.e. presentation) from application of the distortion function defining the lens to the undisplaced rectangular bounds of the focal region. These bounds 440 specify the region in the target space in which the focal region 433 will be rendered. Third, a single stretch BLT operation from the source raster image to the target raster image is used to fill the focal region 433. To summarize, by definition, lenses define the bounds of their undisplaced focal and shoulder regions in source space and hence this information is known. To find displaced coordinates, the known undisplaced focal and shoulder region bounds are transformed using the “displace” functionality in PDT. Scaling between the bounds in source (i.e. x, y coordinates) and target (i.e. u, v coordinates) spaces is then used to find pixel coordinates for the displaced regions in the target space.
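The focal-region fill described above can be sketched as a single nearest-neighbour stretch BLT. This is an illustrative sketch only, not the patented implementation; the rectangle representation (x, y, width, height), the function name, and the image sizes are assumptions.

```python
import numpy as np

def stretch_blt(src, src_rect, dst, dst_rect):
    """Copy a rectangular region of src into a (possibly differently
    sized) rectangular region of dst, scaling along both axes by
    nearest-neighbour sampling."""
    sx, sy, sw, sh = src_rect   # x, y, width, height in source space
    dx, dy, dw, dh = dst_rect   # x, y, width, height in target space
    # Map each target pixel back to its nearest source pixel.
    xs = sx + (np.arange(dw) * sw) // dw
    ys = sy + (np.arange(dh) * sh) // dh
    dst[dy:dy + dh, dx:dx + dw] = src[np.ix_(ys, xs)]

# Fill a 4x4 displaced focal region in the target from the 2x2
# undisplaced focal bounds in the source -- one block copy.
src = np.arange(16, dtype=np.uint8).reshape(4, 4)
dst = np.zeros((8, 8), dtype=np.uint8)
stretch_blt(src, (1, 1, 2, 2), dst, (2, 2, 4, 4))
```

Because the focal region is axis-aligned and uniformly magnified, one such block copy fills it, whereas per-pixel copying would cost one operation per target pixel.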
With respect to the rendering of the shoulder regions 434, 435, 436, 437 of the pyramid lens 430, each of the four shoulder regions 434, 435, 436, 437 is rendered individually as follows. The top and bottom shoulder regions 435, 437 are rendered in the same fashion, by copying individual texel rows from source raster to target raster. The left and right shoulder regions 434, 436 are rendered in the same fashion, by copying individual texel columns from source raster to target raster.
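The per-row shoulder copy just described can be sketched as follows. The bounds, dimensions, and interpolation endpoints are illustrative assumptions, not values from the patent; each scanline is a one-row stretch BLT whose source bounds are interpolated between the undisplaced lens bounds and the undisplaced focal bounds.

```python
def lerp(a, b, f):
    """Linear interpolation between a and b by fraction f."""
    return a + f * (b - a)

def copy_row(src_row, x0, x1, dst_row, u0, u1):
    """Stretch-copy source pixels [x0, x1) onto target pixels [u0, u1)
    by nearest-neighbour sampling -- a one-row stretch BLT."""
    w_src, w_dst = x1 - x0, u1 - u0
    for i in range(w_dst):
        dst_row[u0 + i] = src_row[x0 + (i * w_src) // w_dst]

# Hypothetical top shoulder: for each target scanline between the lens
# boundary and the focal region, interpolate the source-row bounds and
# the source scanline, then stretch-copy that row onto the target.
src = [[c for c in range(10)] for _ in range(10)]
dst = [[0] * 10 for _ in range(10)]
rows = 4                        # shoulder height in target scanlines
for r in range(rows):
    f = r / rows                # 0 at the lens edge, toward 1 at the focus
    sx0 = int(lerp(0, 3, f))    # interpolated source-row bounds
    sx1 = int(lerp(10, 7, f))
    sy = int(lerp(0, 3, f))     # interpolated source scanline
    copy_row(src[sy], sx0, sx1, dst[r], 2, 8)
```

The left and right shoulders follow the same pattern with rows and columns exchanged.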
Note that source slice 503 coordinates 530, 531, 532, 533 can be determined using interpolation. Interpolation may be either standard linear interpolation or perspective correct interpolation. With respect to linear interpolation of texture values, the following procedure can be used. First, assume that p1 and p2 are known positions in space. Also, assume that t1 and t2 are known texture coordinates. A corresponding texture coordinate t3 for a given point p3 that lies between p1 and p2 is found by calculating the fraction of the distance f that p3 lies along the path from p1 to p2 as follows: f=(p3−p1)/(p2−p1). The texture coordinate t3 is then calculated as follows: t3=t1+f*(t2−t1). For perspective correct interpolation, one must also know the z values, z1 and z2, corresponding to the depths of p1 and p2 from the viewpoint 240. The interpolation method described is used to interpolate between 1/z1 and 1/z2 to find 1/z3 as follows: 1/z3=1/z1+f*(1/z2−1/z1). Then the same method is used to interpolate between t1/z1 and t2/z2 to find t3/z3. Next, the value t3/z3 is multiplied by the inverse of 1/z3 to determine t3. The t3 value arrived at with this method is the “correct” texture value to be used in cases where textures are mapped onto 3D objects.
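The two interpolation schemes just described can be written directly from the formulas above; the function names are assumptions.

```python
def linear_tex(p1, p2, p3, t1, t2):
    """Standard linear interpolation of a texture coordinate t3 at a
    point p3 lying between known positions p1 and p2."""
    f = (p3 - p1) / (p2 - p1)   # fraction of the distance along p1->p2
    return t1 + f * (t2 - t1)

def perspective_tex(p1, p2, p3, t1, t2, z1, z2):
    """Perspective correct interpolation: interpolate 1/z and t/z
    linearly, then recover t3 as (t3/z3) divided by (1/z3)."""
    f = (p3 - p1) / (p2 - p1)
    inv_z3 = (1 / z1) + f * ((1 / z2) - (1 / z1))
    t3_over_z3 = (t1 / z1) + f * ((t2 / z2) - (t1 / z1))
    return t3_over_z3 / inv_z3
```

At the midpoint of a span whose far end is twice as deep (z1 = 1, z2 = 2), linear interpolation gives t3 = 0.5 while the perspective correct value is 1/3, which is why the correction matters when textures are mapped onto 3D surfaces.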
Once the source coordinates 530, 531, 532, 533 defining the slice 503 in source raster coordinates 501 and the coordinates 520, 521, 522, 523 defining the slice 504 in target raster coordinates 502 are determined, copying can be performed. A stretch BLT operation 540 is performed to copy the slice 503 from source raster space 501 to target raster space 502 as shown in
It is known that linearly interpolating 2D textures in 3D space (as described above) can result in artificial curvature of the textures. Methods for overcoming this artificial curvature are also known. These methods may be easily integrated into the rendering method of the present invention, if desired. In accordance with an embodiment of the invention, one method of overcoming artificial curvature to produce perspective correct texturing is as follows. When texture coordinates are interpolated, the value 1/z for each calculated point is interpolated along with the values of u/z and v/z, where z is the distance from the viewpoint 240 to a point on the lens 230, 430. The interpolated 1/z value at any point is used to find a z value. This z value is then multiplied by the interpolated u/z or v/z values to find the correct u or v values for rendering of the texture. Thus, rather than simply interpolating u and v values, the value 1/z, u/z, and v/z are interpolated to determine the correct values for u and v for each point.
In accordance with an embodiment of the invention, optional antialiasing is supported. If antialiasing is desired when copying slices, the coverage of the target slice 504 on the source image space 501 can be calculated. This might result in, for example, a 1-pixel wide target slice 504 mapping to a 3-pixel wide source slice 503. The 3-pixel wide source slice 503 can then be compressed to a 1-pixel wide slice when it is copied. If antialiasing is not desired, the source slice 503 can be assumed to be 1 pixel wide.
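Compressing a wider source slice down to the target slice can be sketched as a box filter that averages the source pixels each target pixel covers. The filter choice is an assumption; the patent does not prescribe one.

```python
def compress_slice(src_slice, target_width):
    """Box-filter a source slice down to target_width pixels by
    averaging the group of source pixels each target pixel covers."""
    n = len(src_slice)
    out = []
    for i in range(target_width):
        lo = i * n // target_width          # first covered source pixel
        hi = (i + 1) * n // target_width    # one past the last
        group = src_slice[lo:hi]
        out.append(sum(group) / len(group))
    return out

# A 3-pixel-wide source slice compressed to a 1-pixel-wide target
# slice: the single target value averages all three source pixels.
result = compress_slice([10, 20, 60], 1)
```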
The lens surface shape is generally a pyramidal frustum shape 430 having a top and a plurality of sides. The pyramidal frustum shape can be a right square with a square shaped top and with first, second, third, and fourth quadrilateral shaped sides. The focal region 433 is generally the top of the pyramidal frustum shape and the shoulder region includes the first, second, third, and fourth sides 434, 435, 436, 437. The sides can be subdivided into a plurality of horizontal or vertical slices 504 which can be one pixel high or wide.
The steps of determining boundaries 704 in the representation for the shoulder region and determining boundaries on the lens surface corresponding to the boundaries in the representation include: determining boundaries in the representation 530, 531, 532, 533 for each of the plurality of slices 503 and determining boundaries on the lens surface 520, 521, 522, 523 corresponding to the boundaries in the representation for each of the plurality of slices 504, respectively. The boundaries can be determined by interpolation. And, the shoulder region stretch bit-block transfer operation can include respective shoulder region stretch bit-block transfer operations for each of the plurality of slices 503, 504.
The sides 434, 435, 436, 437 of the pyramidal surface can be subdivided into a plurality of quadrilaterals 603, 604, 605 by a plurality of horizontal or vertical ribs 606, 607 (i.e. parallel to the basal plane 210). The plurality of quadrilaterals and ribs can include a plurality of slices. The steps of determining boundaries 704 in the representation for the shoulder region and determining boundaries on the lens surface corresponding to the boundaries in the representation can include: determining boundaries in the representation for each of the plurality of ribs and determining boundaries on the lens surface corresponding to the boundaries in the representation for each of the plurality of ribs, respectively. The shoulder region stretch bit-block transfer operation can include respective shoulder region stretch bit-block transfer operations for each of the plurality of ribs or slices. And, information for the lens surface lying between the plurality of ribs or slices can be determined by interpolation.
The step of copying 704 can include scaling the information. This scaling can include anti-aliasing. Interpolation can include linear interpolation and perspective correct interpolation. In addition, the distortion function can include shoulder functions defining the shoulder regions.
The lens parameters used to describe a lens according to the present invention may include such parameters as lens position, focal region shape, shoulder function, number of ribs, location of ribs, and a folding vector. These parameters can be formatted into a format suitable for transmission over a network (e.g. compressed or encrypted) and sent from the data processing system 300 to a remote machine. This is advantageous in situations where remote machines need to synchronize their presentations.
In addition, lens parameters such as lens position, focal region shape, shoulder function, number of ribs, location of ribs, and folding vector can be stored in a file which may be compressed or encrypted. This is advantageous as it allows a lens surface to be archived for later retrieval or sharing with other systems and users.
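A minimal sketch of archiving and restoring such lens parameters, assuming JSON as the serialization format and zlib for compression; the parameter names and values are illustrative, as the patent does not prescribe a format.

```python
import json
import zlib

# Illustrative lens parameter set; names and values are assumptions
# based on the parameters listed above.
lens = {
    "position": [120, 80],
    "focal_shape": "square",
    "shoulder_function": "linear",
    "rib_count": 4,
    "rib_locations": [0.2, 0.4, 0.6, 0.8],
    "folding_vector": [0.0, 0.0],
}

# Pack for transmission or archival, then restore on the remote system.
packed = zlib.compress(json.dumps(lens).encode("utf-8"))
restored = json.loads(zlib.decompress(packed).decode("utf-8"))
```

Encryption could be layered on the packed bytes in the same way when the parameters must be protected in transit or at rest.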
Data Carrier Product. The sequences of instructions which when executed cause the method described herein to be performed by the exemplary data processing system of
Computer Software Product. The sequences of instructions which when executed cause the method described herein to be performed by the exemplary data processing system of
Integrated Circuit Product. The sequences of instructions which when executed cause the method described herein to be performed by the exemplary data processing system of
Although preferred embodiments of the invention have been described herein, it will be understood by those skilled in the art that variations may be made thereto without departing from the spirit of the invention or the scope of the appended claims.
Number | Date | Country | Kind |
---|---|---|---|
2370752 | Feb 2002 | CA | national |
Number | Name | Date | Kind |
---|---|---|---|
3201546 | Richardson | Aug 1965 | A |
4581647 | Vye | Apr 1986 | A |
4630110 | Cotton et al. | Dec 1986 | A |
4688181 | Cottrell et al. | Aug 1987 | A |
4790028 | Ramage | Dec 1988 | A |
4800379 | Yeomans | Jan 1989 | A |
4885702 | Ohba | Dec 1989 | A |
4888713 | Falk | Dec 1989 | A |
4985849 | Hideaki | Jan 1991 | A |
4992866 | Morgan | Feb 1991 | A |
5048077 | Wells et al. | Sep 1991 | A |
5175808 | Sayre | Dec 1992 | A |
5185599 | Doornink et al. | Feb 1993 | A |
5185667 | Zimmermann | Feb 1993 | A |
5200818 | Neta et al. | Apr 1993 | A |
5206721 | Ashida et al. | Apr 1993 | A |
5227771 | Kerr et al. | Jul 1993 | A |
5250934 | Denber et al. | Oct 1993 | A |
5258837 | Gormley | Nov 1993 | A |
5321807 | Mumford | Jun 1994 | A |
5329310 | Liljegren et al. | Jul 1994 | A |
5341466 | Perlin et al. | Aug 1994 | A |
5416900 | Blanchard et al. | May 1995 | A |
5432895 | Myers | Jul 1995 | A |
5451998 | Hamrick | Sep 1995 | A |
5459488 | Geiser | Oct 1995 | A |
5473740 | Kasson | Dec 1995 | A |
5521634 | McGary | May 1996 | A |
5523783 | Cho | Jun 1996 | A |
5528289 | Cortjens et al. | Jun 1996 | A |
5539534 | Hino et al. | Jul 1996 | A |
5581670 | Bier et al. | Dec 1996 | A |
5583977 | Seidl | Dec 1996 | A |
5588098 | Chen et al. | Dec 1996 | A |
5594859 | Palmer et al. | Jan 1997 | A |
5596690 | Stone et al. | Jan 1997 | A |
5598297 | Yamanaka et al. | Jan 1997 | A |
5610653 | Abecassis | Mar 1997 | A |
5613032 | Cruz et al. | Mar 1997 | A |
5638523 | Mullet et al. | Jun 1997 | A |
5644758 | Patrick et al. | Jul 1997 | A |
5651107 | Frank et al. | Jul 1997 | A |
5652851 | Stone et al. | Jul 1997 | A |
5657246 | Hogan et al. | Aug 1997 | A |
5670984 | Robertson et al. | Sep 1997 | A |
5680524 | Maples et al. | Oct 1997 | A |
5682489 | Harrow et al. | Oct 1997 | A |
5689287 | Mackinlay et al. | Nov 1997 | A |
5689628 | Robertson | Nov 1997 | A |
5721853 | Smith | Feb 1998 | A |
5729673 | Cooper et al. | Mar 1998 | A |
5731805 | Tognazzini et al. | Mar 1998 | A |
5742272 | Kitamura et al. | Apr 1998 | A |
5745166 | Rhodes et al. | Apr 1998 | A |
5751289 | Myers | May 1998 | A |
5754348 | Soohoo | May 1998 | A |
5764139 | Nojima et al. | Jun 1998 | A |
5786814 | Moran et al. | Jul 1998 | A |
5798752 | Buxton et al. | Aug 1998 | A |
5808670 | Oyashiki et al. | Sep 1998 | A |
5812111 | Fuji et al. | Sep 1998 | A |
5818455 | Stone et al. | Oct 1998 | A |
5848231 | Teitelbaum et al. | Dec 1998 | A |
5852440 | Grossman et al. | Dec 1998 | A |
5872922 | Hogan et al. | Feb 1999 | A |
5909219 | Dye | Jun 1999 | A |
5923364 | Rhodes et al. | Jul 1999 | A |
5926209 | Glatt | Jul 1999 | A |
5949430 | Robertson et al. | Sep 1999 | A |
5950216 | Amro et al. | Sep 1999 | A |
5969706 | Tanimoto et al. | Oct 1999 | A |
5973694 | Steele et al. | Oct 1999 | A |
5991877 | Luckenbaugh | Nov 1999 | A |
5999879 | Yano | Dec 1999 | A |
6005611 | Gullichsen et al. | Dec 1999 | A |
6037939 | Kashiwagi et al. | Mar 2000 | A |
6052110 | Sciammarella et al. | Apr 2000 | A |
6057844 | Strauss | May 2000 | A |
6064401 | Holzman et al. | May 2000 | A |
6067372 | Gur et al. | May 2000 | A |
6073036 | Heikkinen et al. | Jun 2000 | A |
6075531 | DeStefano | Jun 2000 | A |
6081277 | Kojima | Jun 2000 | A |
6084598 | Chekerylla | Jul 2000 | A |
6091771 | Seeley et al. | Jul 2000 | A |
6108005 | Starks et al. | Aug 2000 | A |
6128024 | Carver et al. | Oct 2000 | A |
6133914 | Rogers et al. | Oct 2000 | A |
6154840 | Pebley et al. | Nov 2000 | A |
6160553 | Robertson et al. | Dec 2000 | A |
6184859 | Kojima | Feb 2001 | B1 |
6198484 | Kameyama | Mar 2001 | B1 |
6201546 | Bodor et al. | Mar 2001 | B1 |
6201548 | Cariffe et al. | Mar 2001 | B1 |
6204845 | Bates et al. | Mar 2001 | B1 |
6204850 | Green | Mar 2001 | B1 |
6215491 | Gould | Apr 2001 | B1 |
6219052 | Gould | Apr 2001 | B1 |
6241609 | Rutgers | Jun 2001 | B1 |
6246411 | Strauss | Jun 2001 | B1 |
6249281 | Chen et al. | Jun 2001 | B1 |
6256043 | Aho et al. | Jul 2001 | B1 |
6256115 | Adler et al. | Jul 2001 | B1 |
6256737 | Bianco et al. | Jul 2001 | B1 |
6266082 | Yonezawa et al. | Jul 2001 | B1 |
6271854 | Light | Aug 2001 | B1 |
6278443 | Amro et al. | Aug 2001 | B1 |
6278450 | Arcuri et al. | Aug 2001 | B1 |
6288702 | Tachibana et al. | Sep 2001 | B1 |
6304271 | Nehme | Oct 2001 | B1 |
6307612 | Smith et al. | Oct 2001 | B1 |
6320599 | Sciammarella et al. | Nov 2001 | B1 |
6337709 | Yamaashi et al. | Jan 2002 | B1 |
6346938 | Chan et al. | Feb 2002 | B1 |
6346962 | Goodridge | Feb 2002 | B1 |
6359615 | Singh | Mar 2002 | B1 |
6381583 | Kenney | Apr 2002 | B1 |
6384849 | Morcos et al. | May 2002 | B1 |
6396648 | Yamamoto et al. | May 2002 | B1 |
6396962 | Haffey et al. | May 2002 | B1 |
6400848 | Gallagher | Jun 2002 | B1 |
6407747 | Chui et al. | Jun 2002 | B1 |
6411274 | Watanabe et al. | Jun 2002 | B2 |
6416186 | Nakamura | Jul 2002 | B1 |
6417867 | Hallberg | Jul 2002 | B1 |
6438576 | Huang et al. | Aug 2002 | B1 |
6487497 | Khavakh et al. | Nov 2002 | B2 |
6491585 | Miyamoto et al. | Dec 2002 | B1 |
6504535 | Edmark | Jan 2003 | B1 |
6515678 | Boger | Feb 2003 | B1 |
6522341 | Nagata | Feb 2003 | B1 |
6542191 | Yonezawa | Apr 2003 | B1 |
6552737 | Tanaka et al. | Apr 2003 | B1 |
6559813 | DeLuca et al. | May 2003 | B1 |
6577311 | Crosby et al. | Jun 2003 | B1 |
6577319 | Kashiwagi et al. | Jun 2003 | B1 |
6584237 | Abe | Jun 2003 | B1 |
6590568 | Astala et al. | Jul 2003 | B1 |
6590583 | Soohoo | Jul 2003 | B2 |
6608631 | Milliron | Aug 2003 | B1 |
6612930 | Kawagoe et al. | Sep 2003 | B2 |
6631205 | Melen et al. | Oct 2003 | B1 |
6633305 | Sarfield | Oct 2003 | B1 |
6690387 | Zimmerman et al. | Feb 2004 | B2 |
6720971 | Yamamoto et al. | Apr 2004 | B1 |
6727910 | Tigges | Apr 2004 | B2 |
6731315 | Ma et al. | May 2004 | B1 |
6744430 | Shimizu | Jun 2004 | B1 |
6747610 | Taima et al. | Jun 2004 | B1 |
6747611 | Budd et al. | Jun 2004 | B1 |
6760020 | Uchiyama et al. | Jul 2004 | B1 |
6768497 | Baar et al. | Jul 2004 | B2 |
6798412 | Cowperthwaite | Sep 2004 | B2 |
6833843 | Mojaver et al. | Dec 2004 | B2 |
6842175 | Schmalstieg et al. | Jan 2005 | B1 |
6874126 | Lapidous | Mar 2005 | B1 |
6882755 | Silverstein et al. | Apr 2005 | B2 |
6906643 | Samadani et al. | Jun 2005 | B2 |
6911975 | Iizuka et al. | Jun 2005 | B2 |
6919921 | Morota et al. | Jul 2005 | B1 |
6924822 | Card et al. | Aug 2005 | B2 |
6938218 | Rosen | Aug 2005 | B1 |
6956590 | Barton et al. | Oct 2005 | B1 |
6961071 | Montagnese et al. | Nov 2005 | B2 |
6975335 | Watanabe | Dec 2005 | B2 |
6985865 | Packingham et al. | Jan 2006 | B1 |
7038680 | Pitkow | May 2006 | B2 |
7055095 | Anwar | May 2006 | B1 |
7071971 | Elberbaum | Jul 2006 | B2 |
7084886 | Jetha et al. | Aug 2006 | B2 |
7088364 | Lantin | Aug 2006 | B2 |
7106349 | Baar et al. | Sep 2006 | B2 |
7133054 | Aguera y Arcas | Nov 2006 | B2 |
7134092 | Fung et al. | Nov 2006 | B2 |
7158878 | Rasmussen et al. | Jan 2007 | B2 |
7173633 | Tigges | Feb 2007 | B2 |
7173636 | Montagnese | Feb 2007 | B2 |
7197719 | Doyle et al. | Mar 2007 | B2 |
7213214 | Baar et al | May 2007 | B2 |
7233942 | Nye | Jun 2007 | B2 |
7246109 | Ramaswamy | Jul 2007 | B1 |
7256801 | Baar et al. | Aug 2007 | B2 |
7274381 | Mojaver et al. | Sep 2007 | B2 |
7275219 | Shoemaker | Sep 2007 | B2 |
7280105 | Cowperthwaite | Oct 2007 | B2 |
7283141 | Baar et al. | Oct 2007 | B2 |
7310619 | Baar et al. | Dec 2007 | B2 |
7312806 | Tigges | Dec 2007 | B2 |
7321824 | Nesbitt | Jan 2008 | B1 |
7411610 | Doyle | Aug 2008 | B2 |
7423660 | Ouchi et al. | Sep 2008 | B2 |
7450114 | Anwar | Nov 2008 | B2 |
7472354 | Jetha et al. | Dec 2008 | B2 |
7486302 | Shoemaker | Feb 2009 | B2 |
7489321 | Jetha et al. | Feb 2009 | B2 |
7495678 | Doyle et al. | Feb 2009 | B2 |
7580036 | Montagnese | Aug 2009 | B2 |
20010040585 | Hartford et al. | Nov 2001 | A1 |
20010040636 | Kato et al. | Nov 2001 | A1 |
20010048447 | Jogo | Dec 2001 | A1 |
20010055030 | Han | Dec 2001 | A1 |
20020033837 | Munro | Mar 2002 | A1 |
20020038257 | Joseph et al. | Mar 2002 | A1 |
20020044154 | Baar et al. | Apr 2002 | A1 |
20020062245 | Niu et al. | May 2002 | A1 |
20020075280 | Tigges | Jun 2002 | A1 |
20020087894 | Foley et al. | Jul 2002 | A1 |
20020089520 | Baar et al. | Jul 2002 | A1 |
20020093567 | Cromer et al. | Jul 2002 | A1 |
20020101396 | Huston et al. | Aug 2002 | A1 |
20020122038 | Cowperthwaite | Sep 2002 | A1 |
20020135601 | Watanabe et al. | Sep 2002 | A1 |
20020143826 | Day et al. | Oct 2002 | A1 |
20020171644 | Reshetov et al. | Nov 2002 | A1 |
20020180801 | Doyle et al. | Dec 2002 | A1 |
20030006995 | Smith et al. | Jan 2003 | A1 |
20030007006 | Baar et al. | Jan 2003 | A1 |
20030048447 | Harju et al. | Mar 2003 | A1 |
20030052896 | Higgins et al. | Mar 2003 | A1 |
20030061211 | Shultz et al. | Mar 2003 | A1 |
20030100326 | Grube et al. | May 2003 | A1 |
20030105795 | Anderson et al. | Jun 2003 | A1 |
20030112503 | Lantin | Jun 2003 | A1 |
20030118223 | Rahn et al. | Jun 2003 | A1 |
20030137525 | Smith | Jul 2003 | A1 |
20030151625 | Shoemaker | Aug 2003 | A1 |
20030151626 | Komar et al. | Aug 2003 | A1 |
20030174146 | Kenoyer | Sep 2003 | A1 |
20030179198 | Uchiyama | Sep 2003 | A1 |
20030179219 | Nakano et al. | Sep 2003 | A1 |
20030179237 | Nelson et al. | Sep 2003 | A1 |
20030196114 | Brew et al. | Oct 2003 | A1 |
20030210281 | Ellis et al. | Nov 2003 | A1 |
20030227556 | Doyle | Dec 2003 | A1 |
20030231177 | Montagnese et al. | Dec 2003 | A1 |
20040026521 | Colas et al. | Feb 2004 | A1 |
20040056869 | Jetha et al. | Mar 2004 | A1 |
20040056898 | Jetha et al. | Mar 2004 | A1 |
20040111332 | Baar et al. | Jun 2004 | A1 |
20040125138 | Jetha et al. | Jul 2004 | A1 |
20040150664 | Baudisch | Aug 2004 | A1 |
20040194014 | Anwar | Sep 2004 | A1 |
20040217979 | Baar et al. | Nov 2004 | A1 |
20040240709 | Shoemaker | Dec 2004 | A1 |
20040257375 | Cowperthwaite | Dec 2004 | A1 |
20040257380 | Herbert et al. | Dec 2004 | A1 |
20050041046 | Baar et al. | Feb 2005 | A1 |
20050134610 | Doyle et al. | Jun 2005 | A1 |
20050259118 | Mojaver et al. | Nov 2005 | A1 |
20050278378 | Frank | Dec 2005 | A1 |
20050285861 | Fraser | Dec 2005 | A1 |
20060022955 | Kennedy | Feb 2006 | A1 |
20060026521 | Hotelling et al. | Feb 2006 | A1 |
20060033762 | Card et al. | Feb 2006 | A1 |
20060036629 | Gray | Feb 2006 | A1 |
20060082901 | Shoemaker | Apr 2006 | A1 |
20060098028 | Baar | May 2006 | A1 |
20060139375 | Rasmussen et al. | Jun 2006 | A1 |
20060192780 | Lantin | Aug 2006 | A1 |
20060214951 | Baar et al. | Sep 2006 | A1 |
20070033543 | Ngari et al. | Feb 2007 | A1 |
20070064018 | Shoemaker et al. | Mar 2007 | A1 |
20070097109 | Shoemaker et al. | May 2007 | A1 |
20090141044 | Shoemaker | Jun 2009 | A1 |
20090147023 | Jetha et al. | Jun 2009 | A1 |
20090172587 | Carlisle | Jul 2009 | A1 |
20090265656 | Zeenat | Oct 2009 | A1 |
20090284542 | Baar | Nov 2009 | A1 |
Number | Date | Country |
---|---|---|
2350342 | Nov 2002 | CA |
2386560 | Nov 2003 | CA |
2393708 | Jan 2004 | CA |
2394119 | Jan 2004 | CA |
0635779 | Jan 1995 | EP |
0650144 | Apr 1995 | EP |
0816983 | Jan 1998 | EP |
0816983 | Jul 1998 | EP |
Number | Date | Country | |
---|---|---|---|
20030151626 A1 | Aug 2003 | US |