This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2011-176983, filed on Aug. 12, 2011, the entire contents of which are incorporated herein by reference.
Embodiments of the present invention relate generally to an illuminating device.
It is known that illuminating light incident on the human retina acts, through the retinohypothalamic tract, on a region of the brain called the suprachiasmatic nucleus, and thereby exerts not only a visual influence but also a physiological one. It is also known that this effect is especially strong for light containing many short-wavelength components near 460 nm. For example, it has been pointed out that exposure to illuminating light at night inhibits secretion of the hormone melatonin, which may degrade sleep quality. A reduction of such influence has therefore been demanded for illumination used at night.
One way to reduce the influence is to remove light near 460 nm from the illuminating light. However, this undesirably deteriorates the color rendering properties, which are a significant function of illumination. The color rendering properties are an index indicating how close the color appearance of an object under a given illumination is to its appearance under a standard light source. Their measurement is specified in Japanese Industrial Standard JIS Z8726.
Evaluation methods for the color rendering properties include the general color rendering index and the special color rendering index. The general color rendering index is obtained by calculating the color appearances of 8 test colors of medium lightness and medium saturation that cover the whole hue circle. The special color rendering index is obtained by calculating the color appearances of 15 test colors, which include, in addition to the 8 colors used for the general color rendering index, colors more vivid than those 8 as well as skin colors. Both indices can be calculated from the spectral distribution of the illuminating light.
A known technique reduces the non-visual influence of illumination while keeping the color rendering properties above a certain level, by solving an optimization problem that minimizes the melatonin secretion inhibiting amount under a constraint that the value of the color rendering index remains above a certain level.
In this related art, however, light near 460 nm must be contained at a certain level in order to keep the color rendering index above the threshold, even when the illuminated object does not reflect light near 460 nm at all. In other words, the illuminating light contains light of wavelengths unnecessary for preserving the color appearance of the object, so the non-visual influence cannot be reduced efficiently.
In an embodiment, an illuminating device includes a light source, a light source control unit, an estimation unit, a first calculating unit, and a second calculating unit. The light source includes light emitters, each having a different spectral distribution. The light source control unit determines an emission intensity of each of the light emitters. The estimation unit estimates a spectral reflectivity of an object irradiated by the illuminating light. The first calculating unit calculates a first evaluation value, indicating the adequacy of the visually perceived color of the object, based on the spectral distributions and the spectral reflectivity. The second calculating unit calculates a second evaluation value, indicating how much influence the illuminating light exerts on factors other than the visual sense, based on the spectral distributions. The light source control unit determines emission intensities for which the first evaluation value and the second evaluation value satisfy a constraint condition determined beforehand.
Preferred embodiments of an illuminating device will be described in detail below with reference to the attached drawings.
First Embodiment
The illuminating device according to the first embodiment prevents light of wavelengths unnecessary for preserving the color appearance of an object from being contained in the illuminating light, thereby efficiently reducing the non-visual influence.
The light source 1 includes two or more types of light emitters, each having a different spectral distribution, whose emission intensities can be controlled independently. The light emitters are typically LEDs (light-emitting diodes) corresponding to the three primary colors of RGB.
LEDs are compact and lightweight. It is therefore relatively easy to incorporate plural LEDs into one illuminating device and to control the emission intensity of each LED independently.
When the spectral distribution of each LED is defined as P_i(λ) and its emission intensity as a_i, the spectral power distribution P(λ) of the whole illuminating device incorporating n types of LEDs, each having a different spectral distribution, is represented by Equation (1):

$$P(\lambda) = \sum_{i=1}^{n} a_i P_i(\lambda) \quad (1)$$
Specifically, the spectral power distribution of the light source 1 can be regarded as a function of the n-dimensional vector A = (a_1, a_2, a_3, ..., a_n).
The value of the color rendering index and the melatonin secretion inhibiting amount can both be calculated once the spectral distribution is known. Therefore, once the n-dimensional vector A is determined, both values can be calculated.
The light emitters may include LEDs of three or more colors, or white LEDs each having a different color temperature, or a combination of both.
The light source control unit 2 controls the emission of each light emitter constituting the light source 1. Typically, it controls the amount of current flowing through each light emitter; it may instead control the voltage applied to each light emitter. Either DC or AC current and voltage may be controlled, and any control method, such as PWM control or phase control, may be employed.

The light source control unit 2 holds a table of the spectral distribution of each light emitter in the light source 1. If the number of emitter types is n, this table stores the values of each P_i(λ) (i = 1, 2, ..., n) at a predetermined interval within the visible region. When the emission intensity of each of the light emitters is a_i (i = 1, 2, ..., n), the light source control unit 2 calculates the spectral power distribution P(λ) of the light source 1 according to Equation (2):

$$P(\lambda) = \sum_{i=1}^{n} a_i P_i(\lambda) \quad (2)$$
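As a concrete illustration of Equations (1) and (2), the following Python sketch combines per-emitter spectra into the overall spectral power distribution. The Gaussian emitter shapes, the 380-780 nm grid, and the 5 nm step are illustrative assumptions, not values fixed by the embodiment.

```python
import numpy as np

# Visible-region wavelength grid; the 5 nm step stands in for the
# "predetermined interval" of the emitter table (an assumption).
wavelengths = np.arange(380, 781, 5)

def total_spd(emitter_spds, intensities):
    """P(lambda) = sum_i a_i * P_i(lambda), as in Equations (1)/(2).

    emitter_spds: (n, W) array; row i samples P_i(lambda) on `wavelengths`.
    intensities:  length-n vector A = (a_1, ..., a_n), a_i >= 0.
    """
    return np.asarray(intensities) @ np.asarray(emitter_spds)

# Hypothetical RGB emitters modeled as Gaussians, for illustration only.
def gaussian_spd(center_nm, width_nm):
    return np.exp(-0.5 * ((wavelengths - center_nm) / width_nm) ** 2)

P_i = np.stack([gaussian_spd(c, 20.0) for c in (620.0, 540.0, 460.0)])
A = np.array([0.8, 1.0, 0.3])          # one candidate intensity vector
P = total_spd(P_i, A)                  # overall SPD P(lambda)
```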
The light source control unit 2 also reports the calculated P(λ) to the first calculating unit 4 and the second calculating unit 5. It further receives the evaluation values (the first evaluation value and the second evaluation value) calculated by the first calculating unit 4 and the second calculating unit 5, and determines the vector A by solving an optimization problem in which these evaluation values are the objective variables.
The estimation unit 3 estimates a spectral reflectivity of an object (not illustrated) that is to be illuminated by the illuminating light from the light source 1.
The imaging unit 301 is an image sensor such as a CCD camera or a CMOS camera, whose spectral sensitivity S(λ) is known in advance. In accordance with instructions from the imaging control unit 303, the imaging unit 301 images the object, illuminated by the illuminating light from the light source 1, through the variable filter 302 in synchronization with the filter switching. The captured image is transmitted to the image processing unit 304.
The variable filter 302 can switch among plural filters whose spectral transmittances are known, and changes the spectral transmittance in accordance with instructions from the imaging control unit 303.
The imaging control unit 303 controls the variable filter 302 and the imaging unit 301 such that the variable filter 302 changes the spectral transmittance and the imaging unit 301 then captures an image.
The image processing unit 304 estimates the spectral reflectivity of the object, which is illuminated by the illuminating light from the light source 1, from plural images captured by the imaging unit 301 under different spectral transmittances of the variable filter 302. The method of estimating the spectral reflectivity by the image processing unit 304 is as described below.
The spectral reflectivity at any portion of the object is defined as R(λ), the spectral power distribution of the light source 1 as P(λ), the spectral sensitivity of the imaging unit 301 as S(λ), and the spectral transmittances selectable by the variable filter 302 as T_j(λ) (j = 1, 2, ..., m). The values of S(λ) and T_j(λ) are held, for example, as a table (not illustrated) in the image processing unit 304.
The output value V_j of the imaging unit 301 for the object, captured through the variable filter 302 set to the j-th spectral transmittance, is represented by Equation (3):

$$V_j = \int_{\lambda_1}^{\lambda_w} P(\lambda)\,R(\lambda)\,T_j(\lambda)\,S(\lambda)\,d\lambda \quad (3)$$

Here, λ_1 and λ_w respectively denote the lower and upper limits of the wavelength range over which the sensitivity of the imaging unit 301 is guaranteed. When the above integral is approximated by discrete values, the matrix equation of Equation (4) is obtained, where Δλ is the quantization step of the discretization:

$$\begin{pmatrix} V_1 \\ V_2 \\ \vdots \\ V_m \end{pmatrix} = \begin{pmatrix} P(\lambda_1) T_1(\lambda_1) S(\lambda_1) & \cdots & P(\lambda_w) T_1(\lambda_w) S(\lambda_w) \\ \vdots & \ddots & \vdots \\ P(\lambda_1) T_m(\lambda_1) S(\lambda_1) & \cdots & P(\lambda_w) T_m(\lambda_w) S(\lambda_w) \end{pmatrix} \begin{pmatrix} R(\lambda_1) \\ \vdots \\ R(\lambda_w) \end{pmatrix} \Delta\lambda \quad (4)$$
When the matrix on the right-hand side of Equation (4) is denoted F and a pseudo inverse matrix G of F is obtained by the Wiener estimation method, the spectral reflectivity R(λ) of the object is obtained by Equation (5):

$$\begin{pmatrix} R(\lambda_1) \\ \vdots \\ R(\lambda_w) \end{pmatrix} = G \begin{pmatrix} V_1 \\ \vdots \\ V_m \end{pmatrix} \quad (5)$$
In the above description, the imaging unit 301 has a single channel (a monochromatic image). If an imaging unit 301 having the 3 channels of RGB is used, for example, the number of equations in Equation (4) is tripled, so the spectral reflectivity can be estimated with higher precision.
The method of estimating the spectral reflectivity of the object is not limited thereto. Any other techniques may be used.
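The following sketch expresses Equations (3) to (5) in code. For simplicity it substitutes a Moore-Penrose pseudo-inverse for the Wiener estimation named above; a true Wiener estimate would additionally use prior covariance matrices, as noted in the comment. All array shapes are assumptions consistent with the notation above.

```python
import numpy as np

def estimate_reflectance(V, P, T, S, dlam):
    """Recover the reflectivity samples R(lambda_1..lambda_w) from the
    filtered captures, following Equations (3)-(5).

    V:    (m,) outputs V_j of the imaging unit for one pixel
    P:    (W,) spectral power distribution P(lambda) of the light source
    T:    (m, W) filter transmittances T_j(lambda)
    S:    (W,) spectral sensitivity S(lambda) of the imaging unit
    dlam: quantization step (delta lambda) of the discretization
    """
    # F[j, k] = P(lambda_k) * T_j(lambda_k) * S(lambda_k) * dlam, Eq. (4)
    F = T * (P * S)[None, :] * dlam
    # Stand-in for the Wiener pseudo-inverse: Moore-Penrose. A Wiener
    # estimate would be G = C_r F^T (F C_r F^T + C_n)^(-1) with signal
    # and noise covariances C_r, C_n (assumptions, not in the text).
    G = np.linalg.pinv(F)
    return G @ V                       # Eq. (5): R = G V
```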
The second calculating unit 5 estimates a non-visual influence quantity Y1 of the illumination based on the spectral power distribution P(λ) of the light source 1 reported from the light source control unit 2, and reports the estimated Y1 to the light source control unit 2.
The non-visual influence quantity Y1 (second evaluation value) indicates how much influence the illuminating light exerts on non-visual factors. For example, Y1 may be the integral of the product of an action spectrum for melatonin suppression and the spectral power distribution P(λ) of the light source 1. Alternatively, the value of a melatonin secretion inhibition prediction expression that takes into account the responses of the cones, the rods, and the melanopsin-containing ganglion cells may be employed as Y1.
The integral of the product of the action spectrum for melatonin suppression M1(λ) and the spectral power distribution P(λ) of the light source 1 is defined by Equation (6):

$$Y_1 = \int_{380\,\mathrm{nm}}^{730\,\mathrm{nm}} P(\lambda)\,M_1(\lambda)\,d\lambda \quad (6)$$
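A minimal numerical version of Equation (6), assuming both curves are sampled on a common wavelength grid in nanometers:

```python
import numpy as np

def non_visual_influence(P, M1, wavelengths):
    """Y1 of Equation (6): integral of P(lambda)*M1(lambda), 380-730 nm.

    P, M1: light-source SPD and melatonin-suppression action spectrum,
           sampled on `wavelengths` (nm); M1 typically peaks near 460 nm.
    """
    mask = (wavelengths >= 380) & (wavelengths <= 730)
    return np.trapz(P[mask] * M1[mask], wavelengths[mask])
```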
The melatonin secretion inhibition prediction expression considering the responses of the cones, the rods, and the melanopsin-containing ganglion cells branches according to the value of T given by Equation (7): when T ≥ 0, it is calculated by Equation (8), and when T < 0, by Equation (9).
Here, the constants are set as k = 0.31, α1 = 0.285, α2 = 0.2, α3 = 0.72, b1 = 0.01, b2 = 0.001, and rodsat = 6.5. M2(λ) is the spectral sensitivity of the melanopsin-containing ganglion cells, V10(λ) is the spectral sensitivity of the L and M cones, V′(λ) is the spectral sensitivity of the rods, and S(λ) is the spectral sensitivity of the S cones.
The first calculating unit 4 calculates an output value (first evaluation value) indicating the adequacy of the visually perceived color of the object (the color appearance of the object), based on the spectral distribution of the light source 1 and the spectral reflectivity of the object. For example, the first calculating unit 4 estimates the color appearance of the object under a standard light source based on the spectral reflectivity R(λ) of the illuminated object estimated by the estimation unit 3 and the spectral power distribution P(λ) of the light source 1 determined by the light source control unit 2. The specific estimation method is described below.
Firstly, the first calculating unit 4 obtains the correlated color temperature of the emission color from the spectral power distribution P(λ) of the light source 1 and selects the standard light source accordingly. When the correlated color temperature of P(λ) is less than 5000 K, the first calculating unit 4 uses as the standard light source the light of a complete radiator (blackbody) having the same correlated color temperature as P(λ); when it is 5000 K or more, it uses CIE daylight having the same correlated color temperature as P(λ). The spectral distribution of the standard light source obtained here is denoted S(λ) below, and is stored as a table (not illustrated) in the first calculating unit 4, for example.
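A sketch of this standard-light-source selection follows. The CIE daylight spectrum is shown only as a hypothetical callable, since its calculation requires the tabulated S0/S1/S2 daylight components not reproduced here.

```python
import numpy as np

C2 = 1.4388e-2  # second radiation constant (m*K)

def planck_spd(wavelengths_nm, cct):
    """Relative SPD of a complete radiator (blackbody) at temperature cct."""
    lam = wavelengths_nm * 1e-9
    return lam ** -5.0 / (np.exp(C2 / (lam * cct)) - 1.0)

def standard_spd(wavelengths_nm, cct, cie_daylight):
    """Selection rule above: complete radiator below 5000 K, CIE daylight
    at 5000 K or more. `cie_daylight(cct)` is a hypothetical callable
    returning the tabulated daylight SPD at that color temperature."""
    if cct < 5000.0:
        return planck_spd(wavelengths_nm, cct)
    return cie_daylight(cct)
```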
The coordinate values (X_p, Y_p, Z_p) corresponding to the light source 1 in the XYZ color system and the coordinate values (X_s, Y_s, Z_s) corresponding to the standard light source are obtained by Equations (10) to (17):

$$X_p = K_p \int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} P(\lambda)\,\bar{x}(\lambda)\,d\lambda \quad (10)$$

$$Y_p = K_p \int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} P(\lambda)\,\bar{y}(\lambda)\,d\lambda \quad (11)$$

$$Z_p = K_p \int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} P(\lambda)\,\bar{z}(\lambda)\,d\lambda \quad (12)$$

$$X_s = K_s \int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} S(\lambda)\,\bar{x}(\lambda)\,d\lambda \quad (13)$$

$$Y_s = K_s \int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} S(\lambda)\,\bar{y}(\lambda)\,d\lambda \quad (14)$$

$$Z_s = K_s \int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} S(\lambda)\,\bar{z}(\lambda)\,d\lambda \quad (15)$$

wherein K_p and K_s are normalization constants chosen so that Y_p = Y_s = 100:

$$K_p = 100 \Big/ \int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} P(\lambda)\,\bar{y}(\lambda)\,d\lambda \quad (16)$$

$$K_s = 100 \Big/ \int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} S(\lambda)\,\bar{y}(\lambda)\,d\lambda \quad (17)$$

wherein \(\bar{x}(\lambda), \bar{y}(\lambda), \bar{z}(\lambda)\) are the color-matching functions of the XYZ color system.
Next, the coordinate values (u_p, v_p) corresponding to the light source 1 on the CIE 1960 UCS chromaticity diagram and the coordinate values (u_s, v_s) corresponding to the standard light source are obtained by Equations (18) to (21):

$$u_p = \frac{4X_p}{X_p + 15Y_p + 3Z_p} \quad (18) \qquad v_p = \frac{6Y_p}{X_p + 15Y_p + 3Z_p} \quad (19)$$

$$u_s = \frac{4X_s}{X_s + 15Y_s + 3Z_s} \quad (20) \qquad v_s = \frac{6Y_s}{X_s + 15Y_s + 3Z_s} \quad (21)$$
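Equations (18) to (21) in code form; the same function serves both the light source 1 and the standard light source:

```python
def uv_1960(X, Y, Z):
    """CIE 1960 UCS chromaticity: u = 4X/(X+15Y+3Z), v = 6Y/(X+15Y+3Z)."""
    d = X + 15.0 * Y + 3.0 * Z
    return 4.0 * X / d, 6.0 * Y / d
```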
The tristimulus values (X_pr, Y_pr, Z_pr) of the object color under the light source 1 and (X_sr, Y_sr, Z_sr) under the standard light source are obtained by Equations (22) to (29):

$$X_{sr} = K_s \int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} S(\lambda)\,R(\lambda)\,\bar{x}(\lambda)\,d\lambda \quad (22)$$

$$Y_{sr} = K_s \int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} S(\lambda)\,R(\lambda)\,\bar{y}(\lambda)\,d\lambda \quad (23)$$

$$Z_{sr} = K_s \int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} S(\lambda)\,R(\lambda)\,\bar{z}(\lambda)\,d\lambda \quad (24)$$

$$X_{pr} = K_p \int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} P(\lambda)\,R(\lambda)\,\bar{x}(\lambda)\,d\lambda \quad (25)$$

$$Y_{pr} = K_p \int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} P(\lambda)\,R(\lambda)\,\bar{y}(\lambda)\,d\lambda \quad (26)$$

$$Z_{pr} = K_p \int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} P(\lambda)\,R(\lambda)\,\bar{z}(\lambda)\,d\lambda \quad (27)$$

wherein K_s and K_p are the normalization constants given by Equations (28) and (29), identical in form to Equations (16) and (17):

$$K_s = 100 \Big/ \int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} S(\lambda)\,\bar{y}(\lambda)\,d\lambda \quad (28)$$

$$K_p = 100 \Big/ \int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} P(\lambda)\,\bar{y}(\lambda)\,d\lambda \quad (29)$$
From these values, the coordinate values (u_pr, v_pr) under the light source 1 and (u_sr, v_sr) under the standard light source on the CIE 1960 UCS chromaticity diagram are obtained by Equations (30) to (33), which take the same form as Equations (18) to (21):

$$u_{pr} = \frac{4X_{pr}}{X_{pr} + 15Y_{pr} + 3Z_{pr}} \quad (30) \qquad v_{pr} = \frac{6Y_{pr}}{X_{pr} + 15Y_{pr} + 3Z_{pr}} \quad (31)$$

$$u_{sr} = \frac{4X_{sr}}{X_{sr} + 15Y_{sr} + 3Z_{sr}} \quad (32) \qquad v_{sr} = \frac{6Y_{sr}}{X_{sr} + 15Y_{sr} + 3Z_{sr}} \quad (33)$$
Next, a chromatic-adaptation transform is executed in accordance with Equations (34) to (37), yielding the adapted chromaticity (u′_pr, v′_pr) of the object color and (u′_p, v′_p) of the light source 1:

$$u'_{pr} = \frac{10.872 + 0.404\,\dfrac{c_s}{c_p} c_{pr} - 4\,\dfrac{d_s}{d_p} d_{pr}}{16.518 + 1.481\,\dfrac{c_s}{c_p} c_{pr} - \dfrac{d_s}{d_p} d_{pr}} \quad (34)$$

$$v'_{pr} = \frac{5.520}{16.518 + 1.481\,\dfrac{c_s}{c_p} c_{pr} - \dfrac{d_s}{d_p} d_{pr}} \quad (35)$$

$$u'_p = u_s \quad (36) \qquad v'_p = v_s \quad (37)$$
Here, c_s, c_p, c_pr, d_s, d_p, and d_pr are obtained by Equations (38) to (43), each applying the following forms to the corresponding (u, v) pair:

$$c = \frac{4 - u - 10v}{v} \qquad d = \frac{1.708v + 0.404 - 1.481u}{v} \quad (38)\text{ to }(43)$$
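The adaptation step in code, following the standard CIE 13.3 formulas that Equations (34) to (43) reproduce; the function and argument names are this sketch's own:

```python
def c_coeff(u, v):
    """c of Equations (38)-(43): c = (4 - u - 10v) / v."""
    return (4.0 - u - 10.0 * v) / v

def d_coeff(u, v):
    """d of Equations (38)-(43): d = (1.708v + 0.404 - 1.481u) / v."""
    return (1.708 * v + 0.404 - 1.481 * u) / v

def chromatic_adaptation(u_pr, v_pr, u_s, v_s, u_p, v_p):
    """Equations (34)-(35): adapted chromaticity (u'_pr, v'_pr) of the
    object color observed under the light source 1."""
    c = c_coeff(u_pr, v_pr) * c_coeff(u_s, v_s) / c_coeff(u_p, v_p)
    d = d_coeff(u_pr, v_pr) * d_coeff(u_s, v_s) / d_coeff(u_p, v_p)
    den = 16.518 + 1.481 * c - d
    return (10.872 + 0.404 * c - 4.0 * d) / den, 5.520 / den
```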
The coordinates (W*_sr, U*_sr, V*_sr) under the standard light source and (W*_pr, U*_pr, V*_pr) under the light source 1 in the CIE 1964 uniform color space are obtained by Equations (44) to (49):

$$W^*_{sr} = 25\,Y_{sr}^{1/3} - 17 \quad (44)$$

$$U^*_{sr} = 13\,W^*_{sr}\,(u_{sr} - u_s) \quad (45)$$

$$V^*_{sr} = 13\,W^*_{sr}\,(v_{sr} - v_s) \quad (46)$$

$$W^*_{pr} = 25\,Y_{pr}^{1/3} - 17 \quad (47)$$

$$U^*_{pr} = 13\,W^*_{pr}\,(u'_{pr} - u'_p) \quad (48)$$

$$V^*_{pr} = 13\,W^*_{pr}\,(v'_{pr} - v'_p) \quad (49)$$
Through the procedure described above, the chromaticity difference ΔE is obtained by Equation (50):

$$\Delta E = \sqrt{(W^*_{sr} - W^*_{pr})^2 + (U^*_{sr} - U^*_{pr})^2 + (V^*_{sr} - V^*_{pr})^2} \quad (50)$$
The value of ΔE is obtained for each pixel of the image captured by the imaging unit 301. Denoting the value of each pixel as ΔE_hw (h = 1, 2, ..., H; w = 1, 2, ..., W), the total chromaticity difference over the whole image, which measures how far apart the color appearances under the two illuminations are, is given by Equation (51):

$$\Delta E_{sum} = \sum_{h=1}^{H} \sum_{w=1}^{W} \Delta E_{hw} \quad (51)$$

In Equation (51), the values are summed over the whole image; however, only the values in a specific region may be considered.
The closeness in color appearance can then be defined by any function Y2 = F(ΔE_sum) that decreases as ΔE_sum increases; one simple choice is Y2 = 1/(1 + ΔE_sum). This value Y2 becomes the output value (first evaluation value) of the first calculating unit 4.
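Putting Equations (50), (51), and Y2 = F(ΔE_sum) together for an H×W image; the specific F used here is only the simple decreasing function suggested above, not one fixed by the embodiment:

```python
import numpy as np

def delta_e(Wsr, Usr, Vsr, Wpr, Upr, Vpr):
    """Eq. (50) per pixel; each argument is an (H, W) array."""
    return np.sqrt((Wsr - Wpr) ** 2 + (Usr - Upr) ** 2 + (Vsr - Vpr) ** 2)

def closeness_y2(dE_map, region_mask=None):
    """Eq. (51) summed over the image (or a masked region), mapped to
    the first evaluation value Y2 by one possible decreasing F."""
    if region_mask is None:
        dE_sum = dE_map.sum()
    else:
        dE_sum = dE_map[region_mask].sum()
    return 1.0 / (1.0 + dE_sum)
```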
The method described above calculates, in the CIE 1964 uniform color space, the closeness between the color appearance under the standard light source and that under the light source 1 of the illuminating device 100. It can be replaced by a method that obtains the chromaticity difference in another color space, such as the L*a*b* color space, or a chromaticity difference equation such as CIEDE2000 can be used instead.
The control process of the illuminating device 100 thus configured according to the first embodiment will be described next with reference to the accompanying drawing.
The estimation unit 3 estimates the spectral reflectivity of the object illuminated by the illumination (step S101). The result of the estimation is reported to the first calculating unit 4.
The light source control unit 2 optimizes the vector A = (a_1, a_2, a_3, ..., a_n), which determines the emission intensities of the light source 1, based on an optimization condition determined beforehand (step S102). In doing so, the light source control unit 2 reports the value of P(λ) corresponding to a tentative vector A to the second calculating unit 5 and the first calculating unit 4, which return the evaluation values Y1 and Y2 for that tentative vector A. The optimization condition used here is to minimize Y1 while keeping Y2 at or above a certain value. Mathematically, this is a general constrained optimization problem, and the vector A satisfying the condition can be obtained with a general optimization method such as a gradient method or simulated annealing.
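As a sketch of step S102, the constrained problem can be handed to an off-the-shelf solver; SLSQP is used here in place of the gradient method or simulated annealing mentioned above, and all names are this example's own:

```python
import numpy as np
from scipy.optimize import minimize

def optimize_intensities(P_i, y1_of_spd, y2_of_spd, y2_min, a0):
    """Minimize Y1 subject to Y2 >= y2_min and a_i >= 0 (step S102).

    P_i:       (n, W) emitter spectra; the SPD for a candidate A is A @ P_i
    y1_of_spd: callable mapping an SPD to the non-visual influence Y1
    y2_of_spd: callable mapping an SPD to the color-appearance value Y2
    a0:        initial guess for the intensity vector A
    """
    result = minimize(
        lambda A: y1_of_spd(A @ P_i),
        x0=np.asarray(a0, dtype=float),
        method="SLSQP",
        bounds=[(0.0, None)] * len(a0),
        constraints=[{"type": "ineq",
                      "fun": lambda A: y2_of_spd(A @ P_i) - y2_min}],
    )
    return result.x   # optimized vector A
```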
The light source control unit 2 then controls the light source 1 such that each light emitter emits light according to the determined vector A (step S103).
With this process, when an object that hardly reflects blue light (especially light near 460 nm) is illuminated, for example, the light source 1 need not contain blue light even while keeping the value of Y2 large. By contrast, keeping the value of the color rendering index large, as in the related art, forces the light source 1 to contain blue light. Therefore, the present embodiment, which keeps the value of Y2 large, can reduce the value of Y1 further than the related method, which keeps the value of the color rendering index large. In other words, the present embodiment can efficiently reduce the non-visual influence of the illumination without deteriorating the color appearance of the illuminated object.
The constraint condition used in step S102 is not limited to minimizing Y1 while keeping Y2 at or above a certain value. For example, if the reduction of the non-visual influence takes priority, Y2 may be maximized while keeping Y1 at or below a certain value. If the arousal level is to be enhanced instead, Y1 may be maximized while keeping Y2 at or above a certain value.
Modification of First Embodiment
In this modification, the value of Y2 used when obtaining the vector A by solving the above optimization problem is replaced by the chromaticity difference between plural regions, each having a different spectral reflectivity. To this end, the processing of the first calculating unit 4 is replaced as described below.
Firstly, a cluster analysis is performed on the spectral reflectivity values R(λ) of the pixels estimated by the estimation unit 3, classifying each value of R(λ) into the cluster of the region it belongs to. The spectral reflectivities corresponding to the representative values (e.g., centroids) of the first and second regions are denoted R1(λ) and R2(λ).
The coordinates in the CIE 1964 uniform color space under the light source 1 can then be calculated in the same manner as above; the coordinate of the first region is denoted (W*_pr1, U*_pr1, V*_pr1) and that of the second region (W*_pr2, U*_pr2, V*_pr2).
The chromaticity difference ΔE12 between the first region and the second region can be obtained by Equation (52) described below, and this value can be used as the output value Y2 of the first calculating unit 4.
$$\Delta E_{12} = \sqrt{(W^*_{pr1} - W^*_{pr2})^2 + (U^*_{pr1} - U^*_{pr2})^2 + (V^*_{pr1} - V^*_{pr2})^2} \quad (52)$$
In this modification as well, the chromaticity difference may be obtained in another color space, such as the L*a*b* color space, or with a chromaticity difference equation such as CIEDE2000, instead of in the CIE 1964 uniform color space.
The above process assumes two regions of different spectral reflectivity, but a similar process applies when there are three or more regions. In that case, the chromaticity difference may be calculated for each pair of regions, and its average or minimum may be used as the output value of the first calculating unit 4, as in the sketch below.
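A sketch of that reduction over region representatives; the (k, 3) layout of the coordinates is an assumption of this example:

```python
import numpy as np
from itertools import combinations

def region_separation_y2(coords, reduce="min"):
    """Y2 from pairwise chromaticity differences, generalizing Eq. (52).

    coords: (k, 3) array of (W*, U*, V*) values, one row per region,
            computed under the light source 1.
    """
    dEs = [float(np.linalg.norm(coords[i] - coords[j]))
           for i, j in combinations(range(len(coords)), 2)]
    return min(dEs) if reduce == "min" else sum(dEs) / len(dEs)
```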
For example, suppose characters that look white under a white light source (reflecting the red, green, and blue wavelengths roughly equally) are written on a background that looks black under the white light source (hardly reflecting light of any wavelength). Even if the light source 1 contains no blue light, the character region still reflects the red and green wavelengths, so the characters look yellow, which is very far from black. The value of Y2 can therefore be kept large without including the blue light that exerts a large non-visual influence; that is, the present modification can reduce the non-visual influence even more efficiently.
Although the case of a background color and a character color is described above, the modification is also applicable to other cases in which different colors must be distinguished (e.g., inspection for defective goods).
Second Embodiment
In the second embodiment, the functions of the light source control unit 2-2 and the estimation unit 3-2 differ from those in the first embodiment; the other components and functions are the same as in the first embodiment.
In addition to the functions of the light source control unit 2 in the first embodiment, the light source control unit 2-2 controls the spectral distribution of the light source 1 in accordance with input from the estimation unit 3-2.
The estimation unit 3-2 has a function of estimating the spectral reflectivity of an object illuminated by the illuminating light from the light source 1, and also has a function of giving an instruction to the light source control unit 2-2 to change the spectral distribution.
The imaging control unit 303-2 requests the light source control unit 2-2 to change the spectral distribution of the light source 1 and requests the imaging unit 301 to capture images sequentially. The light source control unit 2-2 controls the light source 1 so as to produce m different types of spectral distributions according to the request.
The image processing unit 304-2 estimates the spectral reflectivity of the object, which is illuminated by the illuminating light from the light source 1, from plural images captured by the imaging unit 301 under different spectral distributions of the light source 1. The method of estimating the spectral reflectivity by the image processing unit 304-2 according to the present embodiment is as described below.
The spectral reflectivity at any portion of the object is defined as R(λ), the m types of spectral distributions of the light source 1 as P_j(λ) (j = 1, 2, ..., m), and the spectral sensitivity of the imaging unit 301 as S(λ).
The output value V_j of the imaging unit 301 for the object captured under the light source 1 having the j-th spectral distribution is represented by Equation (53):

$$V_j = \int_{\lambda_1}^{\lambda_w} P_j(\lambda)\,R(\lambda)\,S(\lambda)\,d\lambda \quad (53)$$
When the above integral is approximated by discrete values, the matrix equation of Equation (54), analogous to Equation (4), is obtained:

$$\begin{pmatrix} V_1 \\ \vdots \\ V_m \end{pmatrix} = \begin{pmatrix} P_1(\lambda_1) S(\lambda_1) & \cdots & P_1(\lambda_w) S(\lambda_w) \\ \vdots & \ddots & \vdots \\ P_m(\lambda_1) S(\lambda_1) & \cdots & P_m(\lambda_w) S(\lambda_w) \end{pmatrix} \begin{pmatrix} R(\lambda_1) \\ \vdots \\ R(\lambda_w) \end{pmatrix} \Delta\lambda \quad (54)$$
When the matrix on the right-hand side of Equation (54) is denoted F and a pseudo inverse matrix G of F is obtained by the Wiener estimation method, the spectral reflectivity R(λ) of the object is obtained by Equation (55), which takes the same form as Equation (5).
As in the first embodiment, if an imaging unit 301 having the 3 channels of RGB is used, for example, the number of equations in Equation (54) is tripled, so the spectral reflectivity can be estimated with higher precision.
As described above, the illuminating device according to the second embodiment provides the same effect as the first embodiment without including a variable filter. A variable filter may additionally be provided, as in the first embodiment; in that case, with m1 filter types and m2 different spectral distributions of the light source 1, the number of equations in Equation (54) can be at most m1 × m2, so the spectral reflectivity of the object can be estimated even more precisely.
Third Embodiment
In the third embodiment, spectral reflectivity of an object illuminated by an illuminating device is estimated based on a user's manual input.
The input unit 305 is composed of a touch panel display, for example.
The input processing unit 306 estimates a spectral reflectivity with reference to the spectral reflectivity database 307 based on the information reported from the input unit 305. The result of the estimation is reported to the first calculating unit 4.
The spectral reflectivity database 307 is a storage unit that stores, for example, the spectral reflectivity of a typical (average) sheet or ink for each color. It can be composed of a commonly used storage medium such as an HDD (hard disk drive), an optical disk, a memory card, or a RAM (random access memory).
The spectral reflectivity database 307 transmits the spectral reflectivity R(λ) corresponding to each color to the input processing unit 306 in response to its inquiry. The database may also reside on an external network such as the Web, in which case the input processing unit 306 has a function of connecting to the network.
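A minimal sketch of this lookup flow; the color keys and the constant reflectivity curves are purely illustrative stand-ins for the database contents:

```python
import numpy as np

wavelengths = np.arange(380, 781, 5)

# Hypothetical database 307: color name -> typical spectral reflectivity.
SPECTRAL_REFLECTIVITY_DB = {
    "white_sheet": np.full(wavelengths.shape, 0.85),
    "black_ink":   np.full(wavelengths.shape, 0.05),
}

def lookup_reflectivity(color_name):
    """What the input processing unit 306 would obtain from the
    database 307 for a user-designated color."""
    return SPECTRAL_REFLECTIVITY_DB[color_name]
```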
With this configuration, the estimation unit 3-3 of the third embodiment can estimate the spectral reflectivity of the object, realizing functions like those of the first and second embodiments without mounting a costly camera (imaging unit) or variable filter.
Modification of Third Embodiment
In the present modification, the user designates not the color of the illuminated object directly but the name of the illuminated object.
The spectral reflectivity database 307 stores the spectral reflectivities of the sheet and of each ink color used for the object designated through the input unit 305, and transmits the spectral reflectivity R(λ) corresponding to each used color to the input processing unit 306 in response to its inquiry. The database may be provided locally or may reside on an external network such as the Web, and its content can be updated case by case, for example when a new magazine is first published or when the sheet or the type of ink used is changed.
The present modification spares the user the trouble of designating the color of the illuminated object. Moreover, since the spectral reflectivity of the actual object can be used, the spectral distribution of the light source can be optimized based on highly precise spectral reflectivity data.
As described above, the first to third embodiments evaluate the color appearance of an object illuminated by the light source in consideration of the spectral reflectivity of the object, so that light of wavelengths unnecessary for preserving the color appearance is not contained in the illuminating light. Consequently, the non-visual influence can be reduced more efficiently.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
2011-176983 | Aug. 12, 2011 | JP | National
U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
6,577,073 | Shimizu et al. | Jun. 2003 | B2
6,817,735 | Shimizu et al. | Nov. 2004 | B2
7,008,078 | Shimizu et al. | Mar. 2006 | B2
7,656,371 | Shimizu et al. | Feb. 2010 | B2
7,939,794 | Rains et al. | May 2011 | B2
8,222,584 | Rains et al. | Jul. 2012 | B2
8,525,444 | Van Duijneveldt | Sep. 2013 | B2
2010/0060195 | Tsuboi et al. | Mar. 2010 | A1
2010/0063566 | Uchiumi et al. | Mar. 2010 | A1
2013/0328501 | Moriuchi et al. | Dec. 2013 | A1
Foreign Patent Documents

Number | Date | Country
---|---|---
2000-156102 | Jun. 2000 | JP
2008-251337 | Oct. 2008 | JP
2010-238407 | Oct. 2010 | JP
2008/069101 | Dec. 2008 | WO
2008/069103 | Dec. 2008 | WO
Other Publications

Office Action mailed Apr. 30, 2014 in counterpart Japanese Patent Application No. 2011-176983, and English translation thereof.
Publication Data

Number | Date | Country
---|---|---
2013/0039053 A1 | Feb. 2013 | US