The present invention relates to an optical device having an automatic focusing function, such as an automatic focusing camera or a monitor camera.
Various techniques are known for using an optical device to measure distances to objects located in different directions. One of these techniques is disclosed in Japanese Patent Publication No. 4-67607. Further, a plurality of techniques have been disclosed for estimating an area, in the object space, which includes a main object on the basis of distance distribution information obtained with such distance measuring techniques.
Below, a typical conventional method of estimating an area which includes a main object is explained with reference to
First, a scene, such as the one shown in
Next, grouping of blocks is performed in order to separate the objects in the object space on the sensed images. After grouping, the M×n blocks are combined into areas, each of which includes one object, as shown in
As for a method of grouping, there is a method of determining similarity of adjoining blocks by, e.g., comparing the values of the adjoining blocks shown in
For example, a distance value of a given block as shown in
Thereafter, characteristics of each group in the image are evaluated, and the group which includes the main object is determined out of all the groups.
In a case of the groups as shown in
(probability) = W1 × (width) × (height) + W2 / (distance from the center of frame) + W3 / (average distance) (1)
In equation (1), W1, W2, and W3 are weighting constants, “distance from the center of frame” is the distance between the center of the frame and the center of mass of the group, and “average distance” is the average of the distances from the camera to the object over all the blocks of the group. The probability is calculated for every group, and the group having the largest probability is determined to include the main object.
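Purely for illustration, the following sketch (in Python, not part of the camera firmware) shows how a weighted score of the form of equation (1) could be evaluated for candidate groups; the weight values, the Group fields, and the example numbers are hypothetical placeholders rather than values taken from the publication.

```python
from dataclasses import dataclass

# Hypothetical weight values; the text only states that W1, W2 and W3 are constants.
W1, W2, W3 = 1.0, 50.0, 30.0

@dataclass
class Group:
    width: float             # width of the group, in blocks
    height: float            # height of the group, in blocks
    center_offset: float     # distance from the frame center to the group's center of mass
    average_distance: float  # average distance from the camera over all blocks of the group

def probability(g: Group) -> float:
    """Evaluate equation (1) for one candidate group; small floors avoid division by zero."""
    return (W1 * g.width * g.height
            + W2 / max(g.center_offset, 1e-6)
            + W3 / max(g.average_distance, 1e-6))

# The group with the largest value is taken to contain the main object.
groups = [Group(4, 6, 12.0, 2.5), Group(10, 3, 40.0, 8.0)]
main_group = max(groups, key=probability)
```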
Thereafter, a focal length is determined on the basis of the distance information of the group, determined as including the main object, so as to focus on an object in the group, then the lens is actually moved to focus on the object.
In the conventional focusing function as described above, an area including the main object is automatically determined on the basis of the aforesaid information, for instance, and focus control is performed on the basis of the determined result.
However, there are a wide variety of scenes to be sensed and of objects that may be the main object; therefore, the object which the user actually intends to focus on is not always correctly determined as the main object by the aforesaid method.
Furthermore, an increase in the number of distance measuring points and areas, caused by an increase in the number of pixels of the CCDs, complicates the determination of the main object, which makes it more difficult to focus on the main object fully automatically. However, since fully manual selection of the main object also complicates operation of a camera, it is desirable to unify automatic control and manual control to a high degree.
The present invention has been made in consideration of the above situation, and has as its object to provide an optical device and method for selecting a distance measurement point capable of correctly selecting an object which an operator intends to focus on, and focusing on the selected object.
According to the present invention, the foregoing object is attained by providing an optical device comprising: area discrimination means for discriminating a plurality of areas in a sensed image on the basis of a predetermined condition; main object area determination means for determining a main object area out of the plurality of areas discriminated by the area discrimination means; main object area changing means for changing the main object area to another area; and focus control means for focusing on the main object area.
Further, the foregoing object is also attained by providing a distance measuring point selection method comprising: an area discrimination step of discriminating a plurality of areas in a sensed image on the basis of a predetermined condition; a main object area determination step of determining a main object area out of the plurality of areas discriminated in the area discrimination step; a main object area changing step of changing the main object area to another area; a change instruction detection step of detecting whether or not there is any instruction to change the main object area; a control step of disabling the main object area changing step when it is determined in the change instruction detection step that there is no instruction to change the main object area; and a focus control step of focusing on the main object area.
Other features and advantages of the present invention will be apparent from the following description taken in conjunction with the accompanying drawings, in which like reference characters designate the same or similar parts throughout the figures thereof.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Preferred embodiments of the present invention will be described in detail in accordance with the accompanying drawings.
Taking the distance measurement point selection function of an automatic focusing camera as an example, a system is explained which is capable of arbitrarily changing the main object area from the area obtained as a result of automatic main object area selection to another area using a single-axis rotary operation member (dial).
A configuration of an optical system used for generating distance distribution information is explained below.
Using an optical system as described above, two images having predetermined parallax are obtained.
Note, a camera having the aforesaid configuration is disclosed in detail in Japanese Patent Application Laid-Open No. 7-134238.
In
Regarding communication signals, SO is a data signal outputted from the camera controller PRS, SI is a data signal inputted to the camera controller PRS, and SCLK is a synchronizing clock for signals SO and SI.
Further, in
When the camera controller PRS controls the selection signal CLCM to “H” and outputs predetermined data as the data signal SO in synchronization with the synchronizing clock SCLK, the lens communication buffer circuit LCM outputs buffer signals LCK and DCL, corresponding to the synchronizing clock SCLK and the data signal SO, respectively, to the lens via a communication node between the camera and the lens. At the same time, a buffer signal of the signal DLC outputted from the lens unit LNS is provided as the data signal SI, and the camera controller PRS receives the data of the lens as the data signal SI in synchronization with the synchronizing clock SCLK.
Reference DDR denotes a circuit for detecting operation of various switches SWS and for display. It is selected when the signal CDDR is “H”, and controlled by the camera controller PRS by using the data signals SO and SI, and the synchronizing clock SCLK. More specifically, the circuit DDR changes the displayed contents on a display member DSP of the camera on the basis of data sent from the camera controller PRS, and notifies the camera controller PRS of the ON/OFF state of each of the operation switches SWS of the camera. The state of the rotary operation member of the main object area changing unit 53 is also detected by the circuit DDR.
Reference OLC denotes an outside liquid crystal display provided in the upper portion of the camera, and reference ILC denotes a liquid crystal display inside of a finder.
Switches SW1 and SW2 are coupled with a shutter release button (not shown), and with a half press of the shutter release button, the switch SW1 is turned on, and with a full press of the shutter release button, the switch SW2 is turned on. The camera controller PRS performs photometry and automatic focus adjustment in response to the “on” operation of the switch SW1, and in response to the “on” operation of the switch SW2, it controls exposure, thereafter, advances the film one frame.
Note, the switch SW2 is connected to an interruption input terminal of the camera controller PRS, so that even when a program triggered by the “on” operation of the switch SW1 is under execution, the “on” operation of the switch SW2 interrupts the execution, and the camera controller PRS swiftly moves to a predetermined interrupt program.
Reference MTR1 denotes a motor for advancing the film, and reference MTR2 denotes a motor for moving the mirror up and down and charging the shutter spring; the film and the mirror are driven in the forward and reverse directions by the motors MTR1 and MTR2, respectively. Signals M1F, M1R, M2F and M2R, inputted from the camera controller PRS to driving circuits MDR1 and MDR2, are forward and reverse rotation control signals.
References MG1 and MG2 denote front and rear curtain operation magnets, which are supplied with electric power via amplifying transistors TR1 and TR2 in response to control signals SMG1 and SMG2, and the shutter is controlled by the camera controller PRS. Note, the motor driving circuits MDR1 and MDR2 and the shutter control are not directly related to the present invention; therefore, detailed explanation of them is omitted.
A buffer signal DCL, which is inputted to the lens controller LPRS in synchronization with the buffer signal LCK, is instruction data from the camera controller PRS to the lens unit LNS, and the operation of the lens unit LNS corresponding to each instruction is predetermined. The lens controller LPRS analyzes each instruction in a predetermined procedure, performs focusing control and iris diaphragm control, and outputs, as the output signal DLC, the operation states of the elements of the lens unit LNS (e.g., the operation state of the focusing control optical system and the operation state of the iris diaphragm) and various parameters (open f-number, focal length, coefficient relating the amount of movement of the focusing control optical system to the defocus amount, various focus correction amounts, etc.).
A zoom lens is explained in the first embodiment as an example. When an instruction for focus adjustment is transmitted from the camera controller PRS, the motor LMTR for focus adjustment is operated on the basis of the signals LMF and LMR, transmitted at the same time, which indicate the amount and direction of displacement. Accordingly, focus adjustment is performed by moving the optical system forward or backward along the optical axis. The amount of displacement of the optical system is obtained in the following manner: the pattern of a pulse board which rotates together with the optical system is detected by a photocoupler, and a pulse signal SENCF, outputted from an encoder ENCF which generates a number of pulses corresponding to the displacement amount, is monitored. The number of pulses of the signal SENCF is counted by a counter (not shown) provided inside the lens controller LPRS, and when the optical system has moved by the calculated amount, the lens controller LPRS sets the signals LMF and LMR to “L” to brake the motor LMTR.
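The drive-until-the-pulse-count-is-reached behavior described above can be summarized, under assumptions, by the following sketch; start_motor, stop_motor, and encoder_pulse_seen are hypothetical callbacks standing in for the LMF/LMR motor signals and the SENCF pulses, and the actual control is performed inside the lens controller LPRS, not in Python.

```python
def drive_focus(target_pulses: int, forward: bool,
                start_motor, stop_motor, encoder_pulse_seen) -> None:
    """Drive the focusing motor until the encoder has produced the requested
    number of pulses, then brake the motor.

    start_motor, stop_motor and encoder_pulse_seen are hypothetical callbacks
    standing in for the LMF/LMR motor signals and the SENCF pulses of ENCF.
    """
    count = 0
    start_motor(forward)            # corresponds to asserting LMF or LMR
    while count < target_pulses:
        if encoder_pulse_seen():    # one SENCF pulse detected
            count += 1
    stop_motor()                    # corresponds to setting LMF and LMR to "L"
```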
Therefore, after an instruction for focus adjustment is transmitted from the camera controller PRS, the camera controller PRS does not need to attend to the lens drive until the lens finishes the displacement. Further, the lens controller LPRS transmits the value of the counter to the camera controller PRS when the camera controller PRS requests it.
When an instruction for controlling the iris diaphragm is transmitted from the camera controller PRS, a known stepping motor DMTR for driving an iris diaphragm is operated on the basis of the number of iris diaphragm steps transmitted at the same time. Note, since the stepping motor DMTR can control the opening of the iris diaphragm by itself, an encoder for monitoring the operation is not necessary.
Reference ENCZ denotes an encoder attached to a zoom optical system, and the lens controller LPRS detects the zoom position by receiving a signal SENCZ from the encoder ENCZ. The lens controller LPRS stores lens parameters corresponding to respective zoom positions, and outputs a parameter corresponding to a current zoom position to the camera controller PRS when the camera controller PRS requests to do so.
Reference ICC denotes an area sensor unit including area sensors, such as CCDs, and an operation circuit for operating the area sensors, which is used for focus state detection and photometry. The area sensor unit ICC is selected when the selection signal CICC is “H”, and controlled by the camera controller PRS using the data signals SO and SI, and the synchronizing clock SCLK.
φV and φH are read signals for the area sensors, and φR is a reset signal. These sensor signals are generated by a driving circuit provided inside the area sensor unit ICC on the basis of signals from the camera controller PRS. The signals outputted from the area sensors are amplified, then inputted into an analog signal input terminal of the camera controller PRS as output signals IMAGE. Thereafter, the camera controller PRS converts the analog output signals IMAGE into digital signals, and the values of the digital signals are sequentially stored in the RAM at predetermined addresses. On the basis of these digitized signals, generation of distance distribution information in the object space, focus control, and photometry are performed.
Reference DL denotes a rotary operation member, which is used for indicating the direction to change the main object area, which will be explained later. Further, reference DLS denotes a sensor for detecting the rotated direction of the rotary operation member DL, which outputs a signal RTDR indicating the detected direction to the camera controller PRS.
Note that the camera body and the lens are described as detachable (i.e., lenses are exchangeable) in
Below, a detailed operation and overall flow of the operation of the camera having the aforesaid configuration are explained with reference to
First, referring to
Next in step S101, a sub-routine for generating distance distribution information (distance distribution generation processing) in the object space is called by the distance distribution generation unit 51.
In step S201 in
Next, the camera controller PRS transmits an instruction to start accumulation of charges, and the accumulation of charges in the area sensors starts in response to the instruction. After a predetermined period elapses, the accumulation of charges ends.
Thereafter, the control signals φV and φH are controlled to drive the area sensors to sequentially output image signals IMAGE, which are A/D converted and stored in RAM in the camera controller PRS. Accordingly, image reading operation in step S201 is completed.
The output image signals from the two area sensors are stored as IMG1 and IMG2 in a predetermined area of the RAM.
Next, each of the images IMG1 and IMG2 obtained from the two area sensors is divided into M×n blocks and defocus distribution (defocus map) is generated in step S202 and its subsequent steps.
First in step S202, variables x and y, which are for indicating a block, are initialized. Next in step S203, luminance signals necessary for performing distance measuring operation for a block B(x, y), i.e., luminance signals of the block obtained from one of the area sensors, are extracted from the image data IMG1 stored in the RAM, and copied in the RAM at predetermined addresses A.
In step S204, luminance signals necessary for performing distance measuring operation for the block B(x, y) are extracted from the image data IMG2, and copied in the RAM at predetermined addresses B.
In step S205, a known correlation operation COR(x, y) is performed on the luminance signals (luminance distribution signals) recorded at the addresses A and B, and a phase difference between the two luminance distributions is calculated.
Thereafter, in step S206, a distance value or a defocus amount is obtained using a known function F( ) on the basis of the phase difference between the luminance distributions calculated in step S205, and stored in the RAM at an address D(x, y) which is reserved for recording distance distribution information.
In step S207, the variable x is increased by 1 to move a block to be processed to the next block.
In step S208, x and the division number M in the x direction are compared. If x is less than M, the process returns to step S203, and a distance value or a defocus amount of the next block in the x direction is calculated and stored. If x is equal to or greater than M, the process moves to step S209, where x is initialized to 0 and y is increased by 1.
In step S210, y and the division number n in the y direction are compared, and if y is less than n, the process returns to step S203 and the operation for the next row starts. Whereas, if y is equal to or greater than n, the operation of obtaining a distance value or a defocus amount has been completed for every block, and the process proceeds to step S211. Accordingly, the distance distribution generation sub-routine of step S101 in
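As a rough illustration of the block-scanning flow of steps S202 to S210, the following Python sketch assumes two already-stored images, a hypothetical correlate() helper standing in for the known correlation operation COR(x, y), and a hypothetical to_defocus() standing in for the function F( ); the block counts, block size, and conversion factor are arbitrary assumptions.

```python
import numpy as np

M, N = 8, 6          # number of blocks in the x and y directions (the "M x n" division)
BW, BH = 16, 16      # block size in pixels (assumed)

def correlate(a: np.ndarray, b: np.ndarray) -> int:
    """Stand-in for the correlation operation COR(x, y): return the shift
    (phase difference) that best aligns the two luminance distributions."""
    a = a.mean(axis=0)  # collapse the block to a 1-D luminance distribution
    b = b.mean(axis=0)
    return min(range(-4, 5), key=lambda s: np.abs(np.roll(a, s) - b).sum())

def to_defocus(phase: int) -> float:
    """Stand-in for the function F( ): convert a phase difference to a defocus
    amount (the real conversion depends on the optics)."""
    return 0.1 * phase

def defocus_map(img1: np.ndarray, img2: np.ndarray) -> np.ndarray:
    """Steps S202-S210: scan all M x n blocks and store D(x, y)."""
    D = np.zeros((N, M))
    for y in range(N):                       # steps S209/S210: row loop
        for x in range(M):                   # steps S207/S208: column loop
            sl = np.s_[y*BH:(y+1)*BH, x*BW:(x+1)*BW]
            blk1, blk2 = img1[sl], img2[sl]  # steps S203/S204: extract the block
            phase = correlate(blk1, blk2)    # step S205
            D[y, x] = to_defocus(phase)      # step S206
    return D

# Example with synthetic images of matching size
rng = np.random.default_rng(0)
img1 = rng.random((N * BH, M * BW))
img2 = np.roll(img1, 2, axis=1)
D = defocus_map(img1, img2)
```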
Next, in step S102, a sub-routine for detecting a main object area (main object area detection processing) is called by the main object area determination unit 52. The contents of the main object area detection sub-routine are explained with reference to
In step S301 in
In a case of numbering (grouping) the blocks while performing raster scanning in the direction of an arrow from the left-uppermost block in
Further, the results of determination are stored in the RAM at reserved addresses G(0, 0) to G(M−1, n−1). First, the block B(x, y)=B(0, 0) is registered with a group number g=1. Whenever a new group which composes another object is detected, the group number is increased by 1 and the increased number becomes the group number of the new group. With this operation, respective blocks of the sensed image as shown in
The numbering processing as described above is a known technique called the “labeling method”, and a flowchart of the grouping of blocks is therefore omitted. Further, a method of determining whether or not two blocks belong to an identical group is disclosed in Japanese Patent Laid-Open No. 10-161013; accordingly, detailed explanation of the method is omitted.
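Since the flowchart of the labeling method is omitted here as well, the following sketch shows one common form of such grouping, purely as an illustration: 4-adjacent blocks are merged when their defocus values are close. The similarity criterion and the threshold are assumptions made for the sketch, not the determination method of the cited publication.

```python
import numpy as np

def label_blocks(D: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Assign a group number G(x, y) to every block of the defocus map D.
    Two 4-adjacent blocks are placed in the same group when their defocus
    values differ by less than `threshold` (assumed similarity criterion)."""
    rows, cols = D.shape
    G = np.zeros((rows, cols), dtype=int)   # 0 means "not yet numbered"
    group = 0
    for y in range(rows):                   # raster scan from the upper-left block
        for x in range(cols):
            if G[y, x]:
                continue
            group += 1                      # a new object: increase the group number
            G[y, x] = group
            stack = [(y, x)]
            while stack:
                cy, cx = stack.pop()
                for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                    if (0 <= ny < rows and 0 <= nx < cols and not G[ny, nx]
                            and abs(D[ny, nx] - D[cy, cx]) < threshold):
                        G[ny, nx] = group
                        stack.append((ny, nx))
    return G
```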
The number of groups detected in step S301 is set to a variable Gnum in step S302.
In step S303 and its subsequent steps, characteristics of each group forming the sensed image are evaluated, and a group which most probably includes a main object (main object area) is determined out of all the groups (object areas) on the basis of the evaluated characteristics. First in step S303, a variable Gcur which denotes a group to be processed is set to 1.
In step S304, the probability S(Gcur) that the block or blocks forming a group Gcur include the main object is calculated. The probability S(Gcur) is determined on the basis of the average distance to the object included in the group, the width and height of the group, and the location of the group in the image. The group which is considered to include the main object is determined on the basis of the calculated probabilities of all the groups. As a function for calculating the probability S(Gcur), equation (1), described above, may be used.
In step S305, the value of Gcur is increased by 1, and the operation proceeds to the next group.
In step S306, Gcur and Gnum are compared to check whether or not the operation has been performed on all the groups. If Gcur is equal to or less than Gnum, not all the groups have been processed; therefore, the process returns to step S304. Whereas, if Gcur is greater than Gnum, the process proceeds to step S307.
In step S307, using a function MAX, the group number having the maximum probability among the probabilities S(1) to S(Gnum) is found and assigned to a variable Gmain. The blocks labeled with the same group number as Gmain are determined to form the main object area.
In step S308, the main object area detection processing is completed; thus step S102 in
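A minimal sketch of the evaluation loop of steps S303 to S307 is given below; it assumes a group map G produced by a labeling step such as the sketch above, treats the block values of D as a stand-in for the object distances, and uses hypothetical weights in place of W1 to W3.

```python
import numpy as np

def select_main_group(G: np.ndarray, D: np.ndarray,
                      w1: float = 1.0, w2: float = 50.0, w3: float = 30.0) -> int:
    """Steps S303-S307: evaluate every group and return Gmain, the group
    number with the largest probability. Weights are placeholders."""
    rows, cols = G.shape
    cy0, cx0 = (rows - 1) / 2.0, (cols - 1) / 2.0   # center of the frame, in block units
    g_num = int(G.max())                             # Gnum: number of groups
    scores = {}
    for g in range(1, g_num + 1):                    # the Gcur loop
        ys, xs = np.nonzero(G == g)
        width = xs.max() - xs.min() + 1
        height = ys.max() - ys.min() + 1
        center_offset = np.hypot(ys.mean() - cy0, xs.mean() - cx0)
        average_distance = abs(D[ys, xs].mean())
        scores[g] = (w1 * width * height
                     + w2 / max(center_offset, 1e-6)
                     + w3 / max(average_distance, 1e-6))
    return max(scores, key=scores.get)               # step S307: the MAX function
```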
Thereafter, the process proceeds to step S103 in
In step S401 in
In the subsequent step S402, distance information for focus adjustment is calculated on the basis of information on the main object area set in step S102. Here, an algorithm which gives priority to the minimum distance in the object area is used, and the minimum distance to the object in the area is calculated.
Thereafter, in step S403, the minimum distance is set as the distance to an object to be focused on; thereby the focal length determination processing is completed, and the process returns to the flowchart in
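A sketch of the minimum-distance-priority rule of steps S402 and S403, assuming a hypothetical per-block distance map dist and the group map G from the earlier sketches:

```python
import numpy as np

def focus_distance(G: np.ndarray, dist: np.ndarray, g_main: int) -> float:
    """Steps S402-S403 (sketch): give priority to the minimum distance within
    the main object area and use it as the distance to focus on.
    `dist` is assumed to hold one distance value per block."""
    return float(dist[G == g_main].min())
```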
After the operation of steps S101 to S103, performed in the camera controller PRS, an instruction to control focus is transmitted to the lens from the camera controller PRS so as to focus on a point at the distance determined in step S103, and the lens controller LPRS controls the motor LMTR to focus on the object which is currently considered as the main object in step S104.
Next in step S105, whether or not the current focus state is proper, namely, whether the main object area which is currently selected includes the main object which the operator intends to focus on, is determined. In the first embodiment, whether or not the operator wishes to change the main object area is checked by detecting a state of the rotary operation member of the main object area changing unit 53 or a state of the switch SW2 of the shutter release button which triggers the image sensing operation. More specifically, if the rotary operation member of the main object area changing unit 53 is operated before the switch SW2 of the shutter release button is turned on, then it is determined that the operator wishes to change the main object area, whereas, if the switch SW2 is turned on before any operation of the rotary operation member is detected, then it is determined that the operator is satisfied with the main object area which is currently set.
First, in a case of leaving the main object area unchanged, control signals SMG1 and SMG2 to the respective front and rear curtain operation magnets MG1 and MG2 are generated at proper intervals by the camera controller PRS in step S107, the film is exposed, and the image sensing operation is completed.
Whereas, if a change in main object area is requested in step S105, then the process proceeds to step S106, where the process of selecting a main-object area is performed.
The main object area selection process is explained with reference to
When a change in the main object area is requested in step S105 as described above, namely, when the rotary operation member is operated, the direction of the rotation is detected in step S501. In the first embodiment, a system is used in which the operator inputs, with the single-axis rotary operation member DL (dial), the direction of movement in the object space from the currently selected main object area toward the object area which the operator desires. When rotation to the left is detected, the process proceeds to step S502, whereas when rotation to the right is detected, the process proceeds to step S503.
In step S502 or S503, on the basis of the rotational direction detected in step S501, an object area on the left or right of the area which is currently selected as the main object area is set as a new main object area; accordingly, the main object area selection process is completed.
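One possible reading of steps S501 to S503 is sketched below: the main object area moves to the object area whose horizontal center of mass is the nearest one on the indicated side of the currently selected area. The center-of-mass criterion is an assumption made for the sketch.

```python
import numpy as np

def change_main_area(G: np.ndarray, g_main: int, direction: str) -> int:
    """Steps S501-S503 (sketch): switch the main object area to the object area
    lying immediately to the left or right of the currently selected area,
    according to the detected rotation direction of the dial."""
    centers = {}
    for g in range(1, int(G.max()) + 1):
        _, xs = np.nonzero(G == g)
        centers[g] = xs.mean()                     # horizontal center of mass
    cur = centers[g_main]
    if direction == "left":
        candidates = {g: c for g, c in centers.items() if c < cur}
        return max(candidates, key=candidates.get) if candidates else g_main
    candidates = {g: c for g, c in centers.items() if c > cur}
    return min(candidates, key=candidates.get) if candidates else g_main
```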
After the selection of the main object area by manual operation is performed in step S106 as described above, the process returns to step S103, where a focal length is determined on the basis of information on the current main object area. Further, the lens is moved in step S104, and whether or not the current main object area is to be changed is determined in step S105; depending upon the determined result, either the exposure operation in step S107 or the main object area selection processing in step S106 is performed.
The operation of changing the main object area performed in step S501 and its subsequent steps is explained taking a scene in
In order to show which object area is currently selected as the main object area, an area selected as the main object area, namely, blocks having the identical group number, is surrounded by a line on a screen of a viewfinder. Alternatively, a mark may be displayed at a rough center of the main object area.
When the object which the operator intends to focus on is in focus, the image sensing operation is performed as desired by pressing the shutter release button until the switch SW2 is turned on.
In the first embodiment as described above, only one single-axis rotary operation member is used as an input member. However, the present invention is not limited to this, and two single-axis rotary operation members whose rotation axes are orthogonal to each other may be used; in this case, four directions of rotation (i.e., up, down, left and right) may be detected. A slide-type direction designation member may be used instead of the rotary operation member. Further, an operation member capable of freely rotating in any direction, such as a track ball, may be used; in this case, the operability of the camera would be improved.
Further, instead of designating the location of the main object area by an operation member, a priority order may be determined on the basis of the probabilities S(1) to S(Gnum), which are results of evaluation functions, by sorting the probabilities, and the main object area may be changed from an object area having a higher probability to an object area having a lower probability. In a case where only one single axis rotary operation member is provided to the camera, a system of better operability may be realized by changing the main object area in accordance with a priority order.
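A sketch of this priority-order variant, assuming the probabilities S(1) to S(Gnum) are available as a dictionary keyed by group number:

```python
def next_area_by_priority(scores: dict, g_main: int) -> int:
    """Alternative selection (sketch): sort the object areas by their
    probabilities S(1)..S(Gnum) and, on each dial click, move the main object
    area to the area with the next lower priority, wrapping around."""
    order = sorted(scores, key=scores.get, reverse=True)   # descending priority
    i = order.index(g_main)
    return order[(i + 1) % len(order)]
```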
Next, a case where the operation member is a focusing system of an optical device, such as a camera, is explained as the second embodiment.
In the second embodiment, a so-called full-time manual type camera, whose automatic focusing system can also be operated manually, is considered. The basic structure of the camera is the same as that explained in the first embodiment; therefore, the configuration elements which are important for explaining the second embodiment are mainly described.
Note, by configuring the system so that the signal flows from the lens controller 66 to the main object area changing unit 63 via the focal length determination unit 54, it is possible to simplify the configuration of the system.
In the processing shown in
More specifically, when the operator wishes to focus on an object which is at a shorter distance or at a longer distance than the distance where the currently focused object exists, the operator operates the focusing system in the desired direction, and the main object area is automatically changed to an object area which is in the direction the operator desires.
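A sketch of this behavior, assuming a hypothetical per-block distance map and a group map as in the earlier sketches; toward_near indicates the direction in which the focusing system was operated:

```python
import numpy as np

def change_area_by_focus_direction(G: np.ndarray, dist: np.ndarray,
                                   g_main: int, toward_near: bool) -> int:
    """Second embodiment (sketch): when the operator turns the focusing system
    toward a shorter or a longer distance, switch the main object area to the
    object area whose average distance is the next one in that direction."""
    averages = {g: float(dist[G == g].mean()) for g in range(1, int(G.max()) + 1)}
    cur = averages[g_main]
    if toward_near:
        nearer = {g: d for g, d in averages.items() if d < cur}
        return max(nearer, key=nearer.get) if nearer else g_main   # nearest area closer than current
    farther = {g: d for g, d in averages.items() if d > cur}
    return min(farther, key=farther.get) if farther else g_main    # nearest area farther than current
```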
The operation of changing the main object area is explained taking a scene in
According to the first and second embodiments of the present invention as described above, the main object area which is automatically determined is checked by the operator, and if the determined area is not the area which the operator intends to focus on, the operator can change the main object area to an arbitrary area by operating a predetermined operation member.
Accordingly, a camera capable of performing focus control that reflects the intention of the operator with a simple operation is realized.
The present invention is not limited to the above embodiments, and various changes and modifications can be made within the spirit and scope of the present invention. Therefore, to apprise the public of the scope of the present invention, the following claims are made.
Number | Date | Country | Kind |
---|---|---|---
9-358512 | Dec 1997 | JP | national |
Number | Date | Country |
---|---|---
61-182516 | Aug 1986 | JP |
4-67607 | Oct 1992 | JP |
7-134238 | May 1995 | JP |
9-127405 | May 1997 | JP |
9-211316 | Aug 1997 | JP |
10-161013 | Jun 1998 | JP |