Claims
- 1. An image processing method which comprises acquiring an image of a plurality of areas of which two adjacent areas have different image characteristics from each other; and analyzing said image using the difference between image characteristics of said two adjacent areas to obtain information about a boundary between said two adjacent areas, wherein
said image includes first and second areas which have intrinsic image patterns different from each other and between which the boundary cannot be detected as a continuous line based on the differences between individual pixel data values, and said analyzing said image comprises:
calculating a texture characteristic's value in each position of a texture analysis window of a predetermined size based on pixel data in said texture analysis window, while moving said texture analysis window; and estimating a boundary between said first and second areas based on a distribution of the texture characteristic's values calculated in said calculating a texture characteristic's value, and when it is known that a specific area is a part of said first area in said image, said calculating a texture characteristic's value comprises:
calculating said texture characteristic's value while changing a position of said texture analysis window in said specific area and examining how said texture characteristic's value in said specific area varies according to the position of said texture analysis window; and calculating said texture characteristic's value while changing a position of said texture analysis window outside said specific area.
- 2. The image processing method according to claim 1, wherein at least one of intrinsic patterns of said first and second areas is known.
- 3. The image processing method according to claim 2, wherein the size of said texture analysis window is determined according to said known intrinsic pattern.
- 4. The image processing method according to claim 1, wherein said texture characteristic's value is at least one of mean and variance of pixel data in said texture analysis window.
- 5. A detecting method with which to detect characteristic information of an object based on a distribution of light through said object when illuminating said object, said detecting method comprising:
processing an image formed by said light through said object with the image processing method according to claim 1; and detecting characteristic information of said object based on the processing result of said processing an image.
- 6. The detecting method according to claim 5, wherein the characteristic information of said object is shape information of said object.
- 7. The detecting method according to claim 5, wherein the characteristic information of said object is position information of said object.
- 8. An exposure method with which to transfer a given pattern onto a substrate, said exposure method comprising:
detecting position information of said substrate with the detecting method according to claim 7; and transferring said given pattern onto said substrate while controlling a position of said substrate based on the position information of said substrate detected in said detecting position information of said substrate.
- 9. The detecting method according to claim 5, wherein said object is at least one optical element, and the characteristic information of said object is optical characteristic information of said at least one optical element.
- 10. An exposure method with which to transfer a given pattern onto a substrate by illuminating with an exposure beam via an optical system, said exposure method comprising:
detecting optical characteristic information of said optical system with the detecting method according to claim 9; and transferring said given pattern onto said substrate based on the detecting result of said detecting optical characteristic information.
- 11. An image processing method which comprises acquiring an image of a plurality of areas of which two adjacent areas have different image characteristics from each other; and analyzing said image using the difference between image characteristics of said two adjacent areas to obtain information about a boundary between said two adjacent areas, wherein
said image includes first and second areas which have intrinsic image patterns different from each other and between which the boundary cannot be detected as a continuous line based on the differences between individual pixel data, and said analyzing said image comprises:
determining weight information which is assigned to each of pixels in a square texture analysis window, and which is defined by a ratio of an inscribed circle area of said texture analysis window to a whole area of a rectangular sub-area, for each of said rectangular sub-areas into which said texture analysis window is divided according to each pixel; calculating a texture characteristic's value in each position of said texture analysis window based on said weight information and said each pixel data in said texture analysis window, while moving said texture analysis window; and estimating a boundary between said first and second areas based on a distribution of the texture characteristic's values calculated in said calculating a texture characteristic's value.
- 12. The image processing method according to claim 11, wherein said weight information further includes additional weight information according to the type of texture analysis.
- 13. The image processing method according to claim 11, wherein said texture characteristic's value is at least one of weighted mean and weighted variance of pixel data in said texture analysis window.
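Claims 11-13 weight each pixel of a square analysis window by its relation to the window's inscribed circle. The sketch below assumes one plausible reading of claim 11's ratio, namely the fraction of each pixel's rectangular sub-area covered by the inscribed circle, estimated by supersampling; both that reading and the function names are assumptions, not the patent's definitions.

```python
import numpy as np

def inscribed_circle_weights(win, oversample=8):
    # Fraction of each pixel's rectangular sub-area covered by the
    # window's inscribed circle (an assumed reading of claim 11's
    # ratio), estimated by supersampling each sub-area.
    r = c = win / 2.0            # circle radius and center, in cell units
    weights = np.zeros((win, win))
    step = 1.0 / oversample
    for i in range(win):
        for j in range(win):
            ys = i + (np.arange(oversample) + 0.5) * step
            xs = j + (np.arange(oversample) + 0.5) * step
            yy, xx = np.meshgrid(ys, xs, indexing="ij")
            inside = (yy - c) ** 2 + (xx - c) ** 2 <= r ** 2
            weights[i, j] = inside.mean()
    return weights

def weighted_mean_var(patch, weights):
    # Weighted mean and weighted variance of pixel data in the window,
    # the characteristic values named in claim 13.
    wn = weights / weights.sum()
    m = float((wn * patch).sum())
    v = float((wn * (patch - m) ** 2).sum())
    return m, v

w = inscribed_circle_weights(5)   # center cells weigh 1.0, corner cells less
```

The effect is an isotropic window: corner pixels, which lie farther from the window center than edge pixels, contribute less, so the characteristic value responds more evenly in all directions as the window moves.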
- 14. A detecting method with which to detect characteristic information of an object based on a distribution of light through said object when illuminating said object, said detecting method comprising:
processing an image formed by said light through said object with the image processing method according to claim 11; and detecting characteristic information of said object based on the processing result of said processing an image.
- 15. The detecting method according to claim 14, wherein the characteristic information of said object is shape information of said object.
- 16. The detecting method according to claim 14, wherein the characteristic information of said object is position information of said object.
- 17. An exposure method with which to transfer a given pattern onto a substrate, said exposure method comprising:
detecting position information of said substrate with the detecting method according to claim 16; and transferring said given pattern onto said substrate while controlling a position of said substrate based on the position information of said substrate detected in said detecting position information of said substrate.
- 18. The detecting method according to claim 14, wherein said object is at least one optical element, and the characteristic information of said object is optical characteristic information of said at least one optical element.
- 19. An exposure method with which to transfer a given pattern onto a substrate by illuminating with an exposure beam via an optical system, said exposure method comprising:
detecting optical characteristic information of said optical system with the detecting method according to claim 18; and transferring said given pattern onto said substrate based on the detecting result of said detecting optical characteristic information.
- 20. An image processing method which comprises acquiring an image of a plurality of areas of which two adjacent areas have different image characteristics from each other; and analyzing said image using the difference between image characteristics of said two adjacent areas to obtain information about a boundary between said two adjacent areas, wherein
said image is an image having no fewer than three tones that includes first and second areas which are different from each other in brightness of pixels in the vicinity of the boundary, and said analyzing said image comprises:
calculating a threshold of brightness information to discriminate said first and second areas in said image based on a distribution of brightness of said image; and obtaining a position in said image at which the brightness is estimated to be equal to said threshold, based on said brightness distribution of said image with accuracy higher than accuracy on the pixel scale, and estimating the obtained position to be a boundary position between said first and second areas.
- 21. The image processing method according to claim 20, wherein
said image is a set of brightness of a plurality of pixels arranged two-dimensionally along first and second directions, and said estimating a boundary position comprises:
estimating a first estimated boundary position in said first direction based on brightness of first and second pixels that have a first magnitude relation and are adjacent to each other in said first direction in said image, and said threshold.
- 22. The image processing method according to claim 21, wherein said first magnitude relation is a relation where one of a first condition and a second condition is fulfilled, in said first condition brightness of said first pixel being greater than said threshold and brightness of said second pixel being not greater than said threshold, and in said second condition brightness of said first pixel being not less than said threshold and brightness of said second pixel being less than said threshold.
- 23. The image processing method according to claim 22, wherein said first estimated boundary position is at a position which divides internally a line segment joining the centers of said first and second pixels in proportion to an absolute value of difference between brightness of said first pixel and said threshold, and an absolute value of difference between brightness of said second pixel and said threshold.
- 24. The image processing method according to claim 21, wherein said estimating a boundary position further comprises:
estimating a second estimated boundary position in said second direction based on brightness of third and fourth pixels that have a second magnitude relation and are adjacent to each other in said second direction in said image, and said threshold.
- 25. The image processing method according to claim 24, wherein said second magnitude relation is a relation where one of a third condition and a fourth condition is fulfilled, in said third condition brightness of said third pixel being greater than said threshold and brightness of said fourth pixel being not greater than said threshold, and in said fourth condition brightness of said third pixel being not less than said threshold and brightness of said fourth pixel being less than said threshold.
- 26. The image processing method according to claim 25, wherein said second estimated boundary position is at a position which divides internally a line segment joining the centers of said third and fourth pixels in proportion to an absolute value of difference between brightness of said third pixel and said threshold, and an absolute value of difference between brightness of said fourth pixel and said threshold.
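Claims 20-26 estimate the boundary at sub-pixel accuracy by locating the threshold crossing between two adjacent pixels. The sketch below pairs the internal-division rule of claims 23 and 26 (which reduces to linear interpolation) with Otsu's method as the brightness-distribution threshold of claim 20; Otsu is an assumption here, since the claims name no particular thresholding algorithm.

```python
import numpy as np

def otsu_threshold(pixels, bins=256):
    # Claim 20's threshold "based on a distribution of brightness",
    # sketched with Otsu's method (an assumed choice of algorithm):
    # pick the bin center that maximizes between-class variance.
    hist, edges = np.histogram(pixels, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2.0
    total = hist.sum()
    s_all = (hist * centers).sum()
    best_t, best_sb, w0, s0 = centers[0], -1.0, 0.0, 0.0
    for k in range(bins - 1):
        w0 += hist[k]
        s0 += hist[k] * centers[k]
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        m0, m1 = s0 / w0, (s_all - s0) / w1
        sb = w0 * w1 * (m0 - m1) ** 2
        if sb > best_sb:
            best_sb, best_t = sb, centers[k]
    return float(best_t)

def subpixel_crossing(x1, b1, x2, b2, t):
    # Claims 23 and 26: the boundary divides the segment joining the
    # two pixel centers internally in the ratio |b1 - t| : |b2 - t|,
    # which is ordinary linear interpolation of the threshold crossing.
    d1, d2 = abs(b1 - t), abs(b2 - t)
    return x1 + d1 / (d1 + d2) * (x2 - x1)
```

For example, with adjacent pixel centers at 0 and 1, brightnesses 10 and 30, and threshold 15, the estimated boundary falls a quarter of the way along the segment, i.e. finer than the pixel pitch, as the claims require.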
- 27. A detecting method with which to detect characteristic information of an object based on a distribution of light through said object when illuminating said object, said detecting method comprising:
processing an image formed by said light through said object with the image processing method according to claim 20; and detecting characteristic information of said object based on the processing result of said processing an image.
- 28. The detecting method according to claim 27, wherein the characteristic information of said object is shape information of said object.
- 29. The detecting method according to claim 27, wherein the characteristic information of said object is position information of said object.
- 30. An exposure method with which to transfer a given pattern onto a substrate, said exposure method comprising:
detecting position information of said substrate with the detecting method according to claim 29; and transferring said given pattern onto said substrate while controlling a position of said substrate based on the position information of said substrate detected in said detecting position information of said substrate.
- 31. The detecting method according to claim 27, wherein said object is at least one optical element, and the characteristic information of said object is optical characteristic information of said at least one optical element.
- 32. An exposure method with which to transfer a given pattern onto a substrate by illuminating with an exposure beam via an optical system, said exposure method comprising:
detecting optical characteristic information of said optical system with the detecting method according to claim 31; and transferring said given pattern onto said substrate based on the detecting result of said detecting optical characteristic information.
- 33. An image processing method which comprises acquiring an image of a plurality of areas of which two adjacent areas have different image characteristics from each other; and analyzing said image using the difference between image characteristics of said two adjacent areas to obtain information about a boundary between said two adjacent areas, wherein
said image has no fewer than three areas divided by no fewer than three boundary lines that extend radially from a specific point, and said analyzing said image comprises:
preparing a template pattern that includes at least three line pattern elements extending from a reference point, and when said reference point coincides with said specific point, said at least three line pattern elements extend through respective areas of said no fewer than three areas and have level values corresponding to predicted level values of said respective areas; and calculating a correlation value between said image and said template pattern in each position of said image, while moving said template pattern in said image.
- 34. The image processing method according to claim 33, wherein each said line pattern element extends along a bisector of an angle predicted to be made by the boundary lines of said respective areas in said image.
- 35. The image processing method according to claim 33, wherein the numbers of said no fewer than three boundary lines and said no fewer than three areas are four, and out of said four boundary lines, two boundary lines are substantially on a first straight line, and the other two boundary lines are substantially on a second straight line.
- 36. The image processing method according to claim 35, wherein said first and second straight lines are perpendicular to each other.
- 37. The image processing method according to claim 35, wherein the number of said line pattern elements is four.
- 38. The image processing method according to claim 37, wherein
among said four areas in said image, adjacent two areas are different from each other in level value, and two areas diagonal across said specific point are substantially the same in level value.
- 39. The image processing method according to claim 33, wherein level values of said line pattern elements have a same magnitude relation as a magnitude relation of level values that said respective areas in said image are predicted to have.
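Claims 33-39 match a template of radial line pattern elements against the image while moving it. The sketch below builds the four-quadrant mark of claims 35-38, a template of four diagonal bisector elements per claim 34, and scores every position; negative squared error stands in for the claimed correlation value, and all names are invented for illustration.

```python
import numpy as np

def quadrant_mark(size, cx, cy, lo=0.0, hi=1.0):
    # Synthetic mark of claims 35-38: four areas divided by two
    # perpendicular boundary lines through (cx, cy); areas diagonal
    # across the center share a level value, adjacent areas differ.
    y, x = np.mgrid[0:size, 0:size]
    return np.where((x >= cx) ^ (y >= cy), hi, lo)

def bisector_template(half=6, lo=0.0, hi=1.0):
    # Claim 34: one line pattern element per quadrant, each running
    # along the quadrant's bisector (the 45-degree diagonals) and
    # carrying the level value predicted for that quadrant.  Returned
    # as (dy, dx, level) samples relative to the reference point.
    pts = []
    for d in range(1, half + 1):
        pts += [(-d, -d, lo), (-d, d, hi), (d, -d, hi), (d, d, lo)]
    return pts

def best_match(image, pts):
    # Claim 33: evaluate a match score at every position while moving
    # the template over the image (negative squared error here, standing
    # in for the claimed correlation value) and return the best position.
    h, w = image.shape
    m = max(max(abs(dy), abs(dx)) for dy, dx, _ in pts)
    best, best_score = None, -np.inf
    for cy in range(m, h - m):
        for cx in range(m, w - m):
            score = -sum((image[cy + dy, cx + dx] - lv) ** 2
                         for dy, dx, lv in pts)
            if score > best_score:
                best_score, best = score, (cy, cx)
    return best

mark = quadrant_mark(31, 15, 15)
center = best_match(mark, bisector_template())   # near (15, 15)
```

Sampling along the bisectors keeps the template away from the boundary lines themselves, so the score degrades gracefully when the mark edges are blurred or noisy.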
- 40. A detecting method with which to detect position information of a mark that has no fewer than three areas divided by no fewer than three boundary lines extending radially from a specific point, said detecting method comprising:
acquiring an image formed by light through said mark, and processing said image with the image processing method according to claim 33; and detecting position information of said mark based on the processing result of said processing said image.
- 41. An exposure method with which to transfer a given pattern onto a substrate, said exposure method comprising:
detecting position information of a mark formed on at least one of said substrate and a measurement substrate with the detecting method according to claim 40; and transferring said given pattern onto said substrate while controlling a position of said substrate based on the position information of said mark detected in said detecting position information of a mark.
- 42. An image processing unit which comprises an image acquiring unit which acquires an image of a plurality of areas of which two adjacent areas have different image characteristics from each other; and an image analyzing unit which analyzes said image using the difference between image characteristics of said two adjacent areas to obtain information about a boundary between said two adjacent areas, wherein
said image includes first and second areas which have intrinsic image patterns different from each other and between which the boundary cannot be detected as a continuous line based on the differences between individual pixel data values, and said image analyzing unit comprises:
a characteristic value calculating unit that calculates a texture characteristic's value in each position of a texture analysis window of a predetermined size based on pixel data in said texture analysis window, while moving said texture analysis window; and a boundary estimating unit that estimates the boundary between said first and second areas based on a distribution of the texture characteristic's values calculated by said characteristic value calculating unit, and when it is known that a specific area is a part of said first area in said image, said characteristic value calculating unit:
calculates said texture characteristic's value while changing a position of said texture analysis window in said specific area, and examines how said texture characteristic's value in said specific area varies according to the position of said texture analysis window; and calculates said texture characteristic's value while changing a position of said texture analysis window outside said specific area.
- 43. The image processing unit according to claim 42, wherein
at least one of intrinsic patterns of said first and second areas is known, and said characteristic value calculating unit calculates said texture characteristic's value while moving said texture analysis window whose size has been determined according to said known intrinsic pattern.
- 44. The image processing unit according to claim 42, wherein
it is known that a specific area is a part of said first area in said image, and said characteristic value calculating unit obtains a size of said texture analysis window with which the texture characteristic's value is almost constant even when changing a position of said texture analysis window in said specific area, and calculates said texture characteristic's value while moving said texture analysis window of the obtained size.
- 45. The image processing unit according to claim 42, wherein said image acquiring unit is an image picking up unit.
- 46. A detecting unit which detects characteristic information of an object based on a distribution of light through said object when illuminating said object, said detecting unit comprising:
an image processing unit according to claim 42, which processes an image formed by said light through said object; and a characteristic detecting unit that detects characteristic information of said object based on the processing result of said image processing unit.
- 47. The detecting unit according to claim 46, wherein the characteristic information of said object is shape information of said object.
- 48. The detecting unit according to claim 46, wherein the characteristic information of said object is position information of said object.
- 49. An exposure apparatus which transfers a given pattern onto a substrate, said exposure apparatus comprising:
a detecting unit according to claim 48, which detects position information of said substrate; and a stage unit that has a stage on which said substrate is mounted, the position information of said substrate being detected by said detecting unit.
- 50. The detecting unit according to claim 46, wherein said object is at least one optical element, and the characteristic information of said object is optical characteristic information of said at least one optical element.
- 51. An exposure apparatus which transfers a given pattern onto a substrate by illuminating with an exposure beam, said exposure apparatus comprising:
an optical system that guides said exposure beam to said substrate; and a detecting unit according to claim 50, which detects characteristic information of said optical system.
- 52. An image processing unit which comprises an image acquiring unit which acquires an image of a plurality of areas of which two adjacent areas have different image characteristics from each other; and an image analyzing unit which analyzes said image using the difference between image characteristics of said two adjacent areas to obtain information about a boundary between said two adjacent areas, wherein
said image has first and second areas which have intrinsic image patterns different from each other and between which the boundary cannot be detected as a continuous line based on the differences between individual pixel data values, and said image analyzing unit comprises:
a weight determining unit that determines weight information which is assigned to each pixel in a square texture analysis window, and which is defined by a ratio of an inscribed circle area of said texture analysis window to a whole area of a rectangular sub-area, for each of said rectangular sub-areas into which said texture analysis window is divided according to each pixel; a characteristic value calculating unit that calculates a texture characteristic's value in each position of said texture analysis window based on said weight information and each pixel data in said texture analysis window, while moving said texture analysis window; and a boundary estimating unit that estimates a boundary between said first and second areas based on a distribution of the texture characteristic's values calculated by said characteristic value calculating unit.
- 53. The image processing unit according to claim 52, wherein said image acquiring unit is an image picking up unit.
- 54. A detecting unit which detects characteristic information of an object based on a distribution of light through said object when illuminating said object, said detecting unit comprising:
an image processing unit according to claim 52, which processes an image formed by said light through said object; and a characteristic detecting unit that detects characteristic information of said object based on the processing result of said image processing unit.
- 55. The detecting unit according to claim 54, wherein the characteristic information of said object is shape information of said object.
- 56. The detecting unit according to claim 54, wherein the characteristic information of said object is position information of said object.
- 57. An exposure apparatus which transfers a given pattern onto a substrate, said exposure apparatus comprising:
a detecting unit according to claim 56, which detects position information of said substrate; and a stage unit that has a stage on which said substrate is mounted, the position information of said substrate being detected by said detecting unit.
- 58. The detecting unit according to claim 54, wherein said object is at least one optical element, and the characteristic information of said object is optical characteristic information of said at least one optical element.
- 59. An exposure apparatus which transfers a given pattern onto a substrate by illuminating with an exposure beam, said exposure apparatus comprising:
an optical system that guides said exposure beam to said substrate; and a detecting unit according to claim 58, which detects characteristic information of said optical system.
- 60. An image processing unit which comprises an image acquiring unit which acquires an image of a plurality of areas of which two adjacent areas have different image characteristics from each other; and an image analyzing unit which analyzes said image using the difference between image characteristics of said two adjacent areas to obtain information about a boundary between said two adjacent areas, wherein
said image is an image having no fewer than three tones that includes first and second areas which are different from each other in brightness of pixels in the vicinity of the boundary, and said image analyzing unit comprises:
a threshold calculating unit that calculates a threshold to discriminate said first and second areas in said image based on a distribution of brightness of said image; and a boundary position estimating unit that obtains a position in said image at which the brightness is estimated to be equal to said threshold based on said brightness distribution of said image with accuracy higher than accuracy on the pixel scale, and estimates the obtained position to be a boundary position between said first and second areas.
- 61. The image processing unit according to claim 60, wherein said image acquiring unit is an image picking up unit.
- 62. A detecting unit which detects characteristic information of an object based on a distribution of light through said object when illuminating said object, said detecting unit comprising:
an image processing unit according to claim 60, which processes an image formed by said light through said object; and a characteristic detecting unit that detects characteristic information of said object based on the processing result of said image processing unit.
- 63. The detecting unit according to claim 62, wherein the characteristic information of said object is shape information of said object.
- 64. The detecting unit according to claim 62, wherein the characteristic information of said object is position information of said object.
- 65. An exposure apparatus which transfers a given pattern onto a substrate, said exposure apparatus comprising:
a detecting unit according to claim 64, which detects position information of said substrate; and a stage unit that has a stage on which said substrate is mounted, the position information of said substrate being detected by said detecting unit.
- 66. The detecting unit according to claim 62, wherein said object is at least one optical element, and the characteristic information of said object is optical characteristic information of said at least one optical element.
- 67. An exposure apparatus which transfers a given pattern onto a substrate by illuminating with an exposure beam, said exposure apparatus comprising:
an optical system that guides said exposure beam to said substrate; and a detecting unit according to claim 66, which detects characteristic information of said optical system.
- 68. An image processing unit which comprises an image acquiring unit which acquires an image of a plurality of areas of which two adjacent areas have different image characteristics from each other; and an image analyzing unit which analyzes said image using the difference between image characteristics of said two adjacent areas to obtain information about a boundary between said two adjacent areas, wherein
said image has no fewer than three areas divided by no fewer than three boundary lines that extend radially from a specific point, and said image analyzing unit comprises:
a template preparing unit that prepares a template pattern that includes at least three line pattern elements extending from a reference point, and when said reference point coincides with said specific point, said at least three line pattern elements extend through respective areas of said no fewer than three areas and have level values corresponding to predicted level values of said respective areas; and a correlation value calculating unit that calculates a correlation value between said image and said template pattern in each position of said image, while moving said template pattern in said image.
- 69. The image processing unit according to claim 68, wherein said image acquiring unit is an image picking up unit.
- 70. A detecting unit which detects characteristic information of an object based on a distribution of light through said object when illuminating said object, said detecting unit comprising:
an image processing unit according to claim 68, which processes an image formed by said light through said object; and a characteristic detecting unit that detects characteristic information of said object based on the processing result of said image processing unit.
- 71. The detecting unit according to claim 70, wherein the characteristic information of said object is shape information of said object.
- 72. The detecting unit according to claim 70, wherein the characteristic information of said object is position information of said object.
- 73. An exposure apparatus which transfers a given pattern onto a substrate, said exposure apparatus comprising:
a detecting unit according to claim 72, which detects position information of said substrate; and a stage unit that has a stage on which said substrate is mounted, the position information of said substrate being detected by said detecting unit.
- 74. The detecting unit according to claim 70, wherein said object is at least one optical element, and the characteristic information of said object is optical characteristic information of said at least one optical element.
- 75. An exposure apparatus which transfers a given pattern onto a substrate by illuminating with an exposure beam, said exposure apparatus comprising:
an optical system that guides said exposure beam to said substrate; and a detecting unit according to claim 74, which detects characteristic information of said optical system.
- 76. A detecting unit which detects position information of a mark that has no fewer than three areas divided by no fewer than three boundary lines extending radially from a specific point, said detecting unit comprising:
an image processing unit according to claim 68 that acquires an image formed by light through said mark and processes said image; and a mark position detecting unit that detects position information of said mark based on the processing result of said image processing unit.
- 77. An exposure apparatus which transfers a given pattern onto a substrate, said exposure apparatus comprising:
a substrate supporting apparatus that supports at least one of said substrate and a measurement substrate; and a detecting unit according to claim 76 which detects position information of a mark formed on at least one of said substrate and said measurement substrate supported by said substrate supporting apparatus.
- 78. An image processing method which comprises acquiring an image of a plurality of areas of which two adjacent areas have different image characteristics from each other; and analyzing said image using the difference between image characteristics of said two adjacent areas to obtain information about a boundary between said two adjacent areas, wherein
said image includes first and second areas which have intrinsic image patterns different from each other and between which the boundary cannot be detected as a continuous line based on the differences between individual pixel data values, and said analyzing said image comprises:
determining the size of a texture analysis window with which to perform texture analysis on said image; calculating a texture characteristic's value in each position of a texture analysis window of said determined size based on pixel data in said texture analysis window, while moving said texture analysis window; and estimating a boundary between said first and second areas based on a distribution of the texture characteristic's values calculated in said calculating a texture characteristic's value, and when it is known that a specific area is a part of said first area in said image, said determining comprises:
calculating said texture characteristic's value, while changing the position and size of said texture analysis window in said specific area; and obtaining such a size of said texture analysis window that the texture characteristic's value is almost constant even when changing the position of said texture analysis window in said specific area.
- 79. The image processing method according to claim 78, wherein said texture characteristic's value is at least one of mean and variance of pixel data in said texture analysis window.
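Claims 78-79 (like claim 44) pick the analysis-window size by requiring the texture characteristic value to stay almost constant as the window moves inside a region known to belong to the first area. The sketch below is one assumed realization, with invented names: it grows an odd-sized window until the coefficient of variation of the local variance (one of claim 79's characteristic values) falls under a tolerance.

```python
import numpy as np

def stable_window_size(image, region, max_win=15, tol=0.05):
    # Grow the window over odd sizes until the texture characteristic
    # value, local variance per claim 79, is almost constant over all
    # window positions inside the region known to lie in the first
    # area.  "Almost constant" is judged by the coefficient of
    # variation; on fine-grained noise the loop may reach max_win.
    y0, x0, y1, x1 = region
    for win in range(3, max_win + 1, 2):
        vals = []
        for y in range(y0, y1 - win + 1):
            for x in range(x0, x1 - win + 1):
                vals.append(image[y:y + win, x:x + win].var())
        vals = np.asarray(vals)
        if vals.size and vals.mean() > 0 and vals.std() / vals.mean() <= tol:
            return win
    return max_win

rng = np.random.default_rng(1)
image = rng.normal(100, 10, size=(40, 40))
win = stable_window_size(image, (0, 0, 24, 24))
```

A window sized this way is large enough to average out the first area's intrinsic pattern, so variations in the characteristic value during the later full-image scan can be attributed to the boundary rather than to the pattern itself.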
- 80. An image processing unit which comprises an image acquiring unit which acquires an image of a plurality of areas of which two adjacent areas have different image characteristics from each other; and an image analyzing unit which analyzes said image using the difference between image characteristics of said two adjacent areas to obtain information about a boundary between said two adjacent areas, wherein
said image has first and second areas which have intrinsic image patterns different from each other and between which the boundary cannot be detected as a continuous line based on the differences between individual pixel data values, and said image analyzing unit comprises:
a determining unit that determines the size of a texture analysis window with which to perform texture analysis on said image; a characteristic value calculating unit that calculates a texture characteristic's value in each position of a texture analysis window of said determined size based on pixel data in said texture analysis window, while moving said texture analysis window; and a boundary estimating unit that estimates the boundary between said first and second areas based on a distribution of the texture characteristic's values calculated by said characteristic value calculating unit, and when it is known that a specific area is a part of said first area in said image, said determining unit obtains such a size of said texture analysis window that the texture characteristic's value is almost constant even when changing the position of said texture analysis window in said specific area.
Priority Claims (4)

| Number | Date | Country | Kind |
| --- | --- | --- | --- |
| 2000-362,758 | Nov 2000 | JP | |
| 2000-362,659 | Nov 2000 | JP | |
| 2001-144,984 | May 2001 | JP | |
| 2001-170,365 | Jun 2001 | JP | |
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This is a continuation of International Application PCT/JP01/10394, with an international filing date of Nov. 28, 2001, which was not published in English, the entire content of which is hereby incorporated herein by reference.
Continuations (1)

| | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | PCT/JP01/10394 | Nov 2001 | US |
| Child | 10447230 | May 2003 | US |