The present application relates to electronic devices having cameras and, more particularly, to the concentricity of cameras within apertures in electronic device housings.
Concentricity of parts visible within apertures of electronic device housings contributes to product appeal. Generally, the more centered a part is, the more appealing the product is. Quality control efforts regarding the concentricity of parts, such as a camera, have previously been based on visual evaluation. The results of such evaluations have been inconsistent.
In some embodiments, a method for determining the concentricity of a camera mounted within an aperture of an electronic device housing is provided. The method includes obtaining an image of a camera mounted within the aperture of the housing and identifying a plurality of circular shapes within the image using a processor. One of the circular shapes represents the aperture in the housing and the other circular shape represents the camera. An offset between two of the circular shapes is calculated.
In some embodiments, a system for determining the concentricity of a component relative to an aperture in a housing of an electronic device is provided. The system includes a processor and a memory coupled to the processor. The memory stores instructions executable by the processor to calculate the concentricity. The system also includes a camera configured to capture an image of the component within the aperture and provide the image to the processor. The image is processed by the processor to discern the shape of the aperture and the shape of the component, and determine a concentricity therebetween.
In some embodiments, a method for determining a concentricity of a camera is provided. The method includes capturing an image of a camera mounted within an aperture of a housing and processing the image. An edge of an outer circular shape is determined, as is an edge of an inner circular shape. The inner circular shape is circumscribed by the outer circular shape. A concentricity value of the inner circular shape relative to the outer circular shape is computed.
Certain embodiments take the form of methods and/or systems related to automated processes for determining concentricity of a camera within an aperture of an electronic device housing. In particular, in some embodiments, a method is provided for determining concentricity by obtaining an image of the camera within the aperture of the housing and processing the image to identify a plurality of circular shapes within the image. Concentricity is then determined based on the relative location of at least two of the plurality of circular shapes. Generally and as used herein, “concentricity” refers to the degree to which the center of a component is also the center of the aperture (e.g., how close to being centered within the aperture the camera is). Ideally, the camera is sufficiently centered that an offset is not discernible to a user.
In some embodiments, a process for determining concentricity may include capturing an image of the camera, as mounted within the aperture, and discerning an outer circular shape and an inner circular shape. A centroid for each of the outer and inner circular shapes is determined. A distance between the two centroids may be computed and may be used to calculate a concentricity value.
In some embodiments, the concentricity value may be determined based on a ratio of a maximum distance and a minimum distance from the edges of the inner circle to the outer circle. That is, in some embodiments, a maximum distance between an edge of the inner circle and the outer circle is found, as is a minimum distance from the edge of the inner circle to the outer circle. The maximum and minimum values are expressed as a ratio indicative of the concentricity of the camera. The closer the value is to 1, the more concentric, or better centered, the camera is within the aperture.
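The ratio described above can be sketched as follows. This is a minimal illustration, not any particular implementation: the function name and arguments are hypothetical, and it assumes the centers and radii of the two circles have already been extracted from the image.

```python
import math

def concentricity_ratio(outer_center, outer_r, inner_center, inner_r):
    """Ratio of the maximum to the minimum edge-to-edge distance
    between an inner circle and the outer circle that circumscribes it.

    A value of 1.0 means perfectly concentric; larger values mean the
    inner circle is offset toward one side of the outer circle.
    """
    # Centroid offset: straight-line distance between the two centers.
    c = math.dist(outer_center, inner_center)
    max_gap = outer_r - inner_r + c  # widest gap between the edges
    min_gap = outer_r - inner_r - c  # narrowest gap between the edges
    return max_gap / min_gap

# A bezel of radius 3 offset 1 unit within an aperture of radius 7:
print(concentricity_ratio((0.0, 0.0), 7.0, (1.0, 0.0), 3.0))  # 5/3 ≈ 1.667
```

A perfectly centered camera yields a ratio of exactly 1, and the ratio grows without bound as the inner edge approaches the outer edge.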
Referring to the drawings and turning initially to
The camera 106 may be external to a housing of the system 100. In some embodiments, the camera 106 may be communicatively coupled to the processor 102 via a universal serial bus (USB) connection 110, or other suitable connection. In some embodiments, the camera 106 may include fiber optic cables that couple to a sensor positioned within the housing of the system 100. The camera 106 is coupled to a stabilizing apparatus 112 that is configured to hold the camera 106 over the aperture when the camera is capturing an image. The display 108 may also be coupled to the processor 102 and may be configured to display the captured image, the circular shapes, the concentricity value, and/or other graphics.
Additionally,
Turning to
In some embodiments, an outer circular shape may be determined and then an inner circular shape may be found within the outer circular shape. It should be appreciated that in some instances multiple circular shapes may be found within the outer circular shape. Generally, a first circular shape moving inwardly from the outer circular shape may be used for concentricity calculation purposes, although other circular shapes may also or alternatively be used. One of the plurality of circular shapes found in the image may be presumed to represent the aperture 202 and one or more circular shapes may be presumed to represent one or more layers of the camera 201. The multiple circular shapes may include one or more layers of the camera assembly, such as a bezel or housing of the camera, one or more lenses for the camera, and so forth. A concentricity of the camera may be found by determining an offset of the one or more circular shapes from the aperture 202. The offset may be determined and represented in several different ways. The determination of concentricity will be described in greater detail below.
In one embodiment, the concentricity may be represented by a raw offset of the centroids of the circular shapes. As such, the centroids of the circular shapes are determined (Block 308) and then an offset of the centroids is calculated as a raw concentricity value (Block 310). Specifically, once the inner and outer circular shapes are determined, a distance between their centroids is determined. In some embodiments, the distance may be measured in inches, millimeters, or pixels, for example. As can be appreciated, a smaller distance between the centroids indicates more concentric circles.
Additionally, in some embodiments, the concentricity may be represented by a relative concentricity value (Block 312). In some embodiments, the relative concentricity value may be calculated as a ratio of a maximum distance and a minimum distance between edges of the circular shapes. It should be appreciated that, in some embodiments, either the raw measurement or the relative measurement may be implemented without the other.
In some embodiments, the concentricity values may be compared against thresholds (Block 314). The thresholds may be based on an acceptable variance range, such as up to 0.03 mm for the raw measurements. In other embodiments, the thresholds may be determined based on empirical determinations as to what appears to be acceptable or concentric. As may be appreciated, the concentricity thresholds may vary based on the relative sizes of the circles, as well as the size of the housing relative to the circles. Upon comparing with the thresholds, it may be determined whether one or more of the concentricity values exceed a threshold (Block 316). If no threshold is exceeded, an indication that the camera is concentric may be provided (Block 318) and the electronic device housing may be routed for further processing (Block 320). Alternatively, if one or more of the concentricity values exceeds a threshold, an indication may be provided that the camera is not concentric (Block 322) and adjustments may be performed to correct the camera concentricity (Block 324). In some embodiments, adjustments may be made to the camera 106 that captures the images of the circles, the housing 200, or the camera within the housing. Additionally, in some embodiments, upon providing an indication that the camera is not concentric, the process may be repeated without making any adjustments. A determination may be made based on the comparison to the thresholds (Block 316).
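The threshold comparison of Blocks 314-316 can be sketched as a simple predicate. The 0.03 mm raw limit follows the example variance range above; the ratio limit here is a hypothetical placeholder, since, as noted, acceptable thresholds vary with the relative sizes of the circles and the housing.

```python
def is_concentric(raw_offset_mm, ratio, raw_limit_mm=0.03, ratio_limit=1.2):
    """Return True when both concentricity values are within threshold.

    raw_offset_mm is the centroid-to-centroid distance; ratio is the
    max/min edge-distance ratio. ratio_limit is illustrative only.
    """
    return raw_offset_mm <= raw_limit_mm and ratio <= ratio_limit

# A device 0.01 mm off-center with a near-unity ratio passes and would
# be routed onward; 0.05 mm off-center would be flagged for adjustment.
print(is_concentric(0.01, 1.05))  # True
print(is_concentric(0.05, 1.05))  # False
```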
To find circular shapes within the obtained image, known image processing techniques may be applied to the image to find edges of the shapes. For example, in one embodiment, characteristics such as color, darkness, and so forth, of adjacent pixels in the image may be compared to determine where an edge occurs. As one example, if one pixel is found to be significantly darker than an adjacent pixel during the comparison of adjacent pixels, it may indicate that the pixels represent an edge of a shape.
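The adjacent-pixel comparison described above can be sketched with a simple difference test. This is one assumed approach, not the only edge-detection technique that could be used; the threshold value is illustrative.

```python
import numpy as np

def edge_mask(gray, threshold=40):
    """Flag pixels where brightness changes sharply between neighbors.

    gray is a 2-D array of brightness values; threshold is an
    illustrative difference at which two adjacent pixels are taken
    to straddle the edge of a shape.
    """
    g = gray.astype(int)
    edges = np.zeros(g.shape, dtype=bool)
    # Compare each pixel with its right-hand and lower neighbors.
    edges[:, :-1] |= np.abs(np.diff(g, axis=1)) > threshold
    edges[:-1, :] |= np.abs(np.diff(g, axis=0)) > threshold
    return edges
```

Tracing the True pixels of the resulting mask yields the outlines of the circular shapes in the captured image.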
In some embodiments, one or more filters may be applied to help neutralize and/or eliminate the background (e.g., the housing 200 and any surrounding details captured by the camera) so that it does not impact the determination and treatment of the circular shapes. Specifically, a threshold for a filter may be set that corresponds to characteristics of the housing to create a binary image with the circular shapes being revealed and the housing being essentially eliminated. For example, in some embodiments, the housing 200 may have a lighter color, such as white or aluminum, such that the color of the housing dictates a threshold setting that renders the housing as featureless white space in a captured image. Thus, this background is essentially eliminated. In other embodiments, the housing may have a darker color, such as black, that may require a more nuanced filter and/or a higher threshold so that the circular shapes may be discerned, as the camera components generally may appear as a dark color.
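The binary-image step above amounts to a single thresholding operation; a minimal sketch, assuming the image has already been reduced to grayscale brightness values:

```python
import numpy as np

def binarize(gray, threshold):
    """Create a binary image: darker pixels (the camera components)
    become True, lighter housing pixels are eliminated as False.

    The threshold is assumed to be tuned per housing color: low for
    white or aluminum housings, higher and more carefully chosen for
    black housings where the components are also dark.
    """
    return gray < threshold
```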
Additionally, in some embodiments, the area of the aperture may be used to help filter the image. In particular and as discussed above, the area of the aperture may be known within a relatively small threshold. Thus, the image may be scanned for a circular shape that is within the threshold limits of the known value. The found circular shape may be assumed to be the aperture. Once found, the image may be searched within the area circumscribed by the found circular shape to the exclusion of portions of the image outside the circular shape. In some embodiments, the image may be searched inwardly from the found circular shape for an inner circular shape (e.g., the next largest circular shape circumscribed within the circular shape). That is, for example, the image may be searched from a left-most edge of the found circular shape until it reaches an edge of another circular shape.
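The known-size filtering idea can be illustrated with a single-row scan. This sketch only shows how a run of dark pixels whose width matches the aperture's known diameter would be picked out; a full implementation would confirm the circular shape across many rows. The function name and tolerance are hypothetical.

```python
import numpy as np

def find_aperture_row_span(binary, row, expected_width, tol=3):
    """Scan one row of a binary image for a dark run whose width
    matches the aperture's known diameter (in pixels) within tol.

    Returns the (start, end) column indices of the first matching
    run, or None if no run matches.
    """
    cols = np.flatnonzero(binary[row])
    if cols.size == 0:
        return None
    # Split the dark columns into contiguous runs.
    runs = np.split(cols, np.flatnonzero(np.diff(cols) > 1) + 1)
    for run in runs:
        width = run[-1] - run[0] + 1
        if abs(width - expected_width) <= tol:
            return int(run[0]), int(run[-1])
    return None
```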
The circular shapes in the image 500 are processed to determine their concentricity. As mentioned above, the concentricity may be a raw measurement, such as the distance between centroids of the circular shapes, or it may be a relative measurement, such as calculating a ratio between maximum and minimum distances from the edges of the inner circle to the outer circle.
Generally, the image 500 may be processed as an x-y coordinate grid, where the x-axis is the horizontal axis and the y-axis is the vertical axis. The centroids of the circular shapes may be determined by taking an average of the columns and rows for each circular shape. For the purposes of this discussion and the following example, the units of the x- and y-axes may be arbitrary. However, in some embodiments, the units may take the form of inches, millimeters, pixels, and the like.
Referring to
In another embodiment, the column address values of all the pixels of the circular region are summed and then divided by the total number of pixels that make up the circular region. The result is the x location (in pixels) of the center of the circular region. The process is repeated for the row address values to determine the y location of the center of the circular region. This embodiment may provide more robust results for instances where the image of the circles is ragged.
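The pixel-averaging centroid just described can be sketched directly; this assumes the circular region has already been isolated as a binary mask:

```python
import numpy as np

def region_centroid(mask):
    """Centroid of a region in a binary image.

    Averages the column (x) and row (y) addresses of every pixel in
    the region; dividing by the pixel count rather than tracing the
    outline keeps the result robust when the circle's edges are ragged.
    """
    rows, cols = np.nonzero(mask)
    return cols.mean(), rows.mean()  # (x, y) in pixel units
```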
In addition to the centroids, the radius and diameter of the circular shapes may also be determined. It should be appreciated that the diameter and the radius may be determined in a variety of different ways. For example, the radius is the difference between the centroid and either the start or the end of a row or column through it, and the diameter is the difference between the start and the end of that row or column. Specifically, the radius (r1) of the aperture is 7 (e.g., 15−8=7). The diameter of the aperture is 14 (e.g., 22−8=14). The radius (r2) of the bezel is 3 (e.g., 17−14=3). Additionally, once either the radius or the diameter is known, the other may be determined directly, as the diameter is twice the radius.
In another embodiment, the radius (e.g., in pixels) is set equal to the square root of A/pi, where A is the total number of pixels in the circular region, and pi is the ratio of the circumference of a circle to its diameter (i.e., 3.141592654 . . . ).
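The area-based radius estimate above is a one-line computation; a minimal sketch, again assuming the region is available as a binary mask:

```python
import math
import numpy as np

def region_radius(mask):
    """Estimate a circular region's radius (in pixels) from its
    pixel area, using r = sqrt(A / pi)."""
    area = np.count_nonzero(mask)  # A: total pixels in the region
    return math.sqrt(area / math.pi)
```

Because it averages over the whole area rather than relying on any single edge pixel, this estimate also tolerates ragged outlines well.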
The centroids, radii, and diameters may be used to determine the raw concentricity and relative concentricity of the circular shapes. The raw concentricity may be represented by “C”, as shown in
The computation of the relative concentricity is based on a determination of the maximum and minimum distances between outer circular shape 202 and inner circular shape 201. For convenience, as illustrated in
A=r1−r2+C (in this example, A=7−3+√5);
and the minimum distance may be computed by the equation:
B=r1−r2−C (in this example, B=7−3−√5).
The relative concentricity is a ratio of A and B. As such, in one embodiment, the relative concentricity may be A/B, or another ratio may be constructed to represent the relative position of the circular shapes. In some embodiments, such as in the case of the ratio A/B, the closer the ratio is to 1, the more concentric the circular shapes are.
The concentricity values, once computed, may be outputted to a display or otherwise provided so that decisions may be made relative to quality control. In some embodiments, both the raw and relative concentricity measurements may be outputted, while in other embodiments, only one of the concentricity measurements may be outputted. Additionally, the concentricity measurements may be compared against a threshold representative of acceptable concentricity values. For example, a raw concentricity value greater than two may be determined to be unacceptable. As such, a threshold of two may be set and a raw concentricity value greater than two may trigger a user alert indicating that the threshold has been exceeded. A user may then take appropriate action, such as manipulating the camera positioning within the housing, routing the device for correction of the camera concentricity, and so forth. In some embodiments, a determination of non-concentricity may be the result of misalignment of the camera 106. As such, in some embodiments, the user may manipulate the camera into a proper position and repeat the testing for concentricity. In some embodiments, the manipulation may be an automated process completed through mechanical or robotic devices. After manipulation, the concentricity may again be determined.
Although the present disclosure has been described with respect to particular systems and methods, it should be recognized upon reading this disclosure that certain changes or modifications to the embodiments and/or their operations, as described herein, may be made without departing from the spirit or scope of the invention. Indeed, one or more of the techniques described herein may be implemented in whole or in part and may be combined with one or more of the other techniques disclosed herein. Additionally, it should be appreciated that the techniques disclosed herein may be applicable beyond the specific examples described above. For example, concentricity may be determined for a variety of optically oriented devices, such as viewfinders or binoculars. Additionally, concentricity may be determined for the placement of displays, buttons, and so on, within a housing. Further, applications may be found in other areas such as picture frames and matting. As such, it should be appreciated that the scope may extend to non-circular objects as well.
Number | Date | Country
---|---|---
20120076363 A1 | Mar 2012 | US