The present invention relates generally to foot-sizing applications. More particularly, and in one aspect, the invention relates to a foot measuring application running on a mobile computing device with an integrated camera that can quickly and accurately inform the user of the dimensions of the foot being measured through an image capture process, and give accurate, real-time feedback during the measuring process.
Traditionally, the size of a foot is determined by a physical device, such as a “Brannock” device, which is common in many footwear stores. A Brannock device includes a sliding element to line up with the forward edge of a toe, where a heel of the foot is braced against an opposite end of the device. This same device can also measure foot width with a separate sliding element.
One advantage of these traditional devices is that the user, or those accompanying the user, can see the dimensions generated in real time. These individuals can agree with the measurements, dispute them, or attempt further adjustment of the sliding elements or foot position, and thereby refine the foot measurements.
Without a physical measuring device, traditional foot measuring involves a process of trial and error to determine an appropriate shoe size. This process involves obtaining shoes of varying sizes so that the customer can try on each shoe and determine an optimal fit. This traditional method depends on feedback from the user in reaction to different physical shoes. Such trial and error can be especially challenging with children, or those who have physical disabilities or other impairments that make it challenging to gauge accurate feedback from the user.
Furthermore, a less common method of physical size determination is the use of special shoes with the forefoot constructed from clear or transparent materials. This method allows for the toe area to be viewed through the shoe. Similarly, the goal with this method is real-time feedback for the user to gauge accurate foot size. However, the main deficiency is the requirement that all different shoe styles and sizes include clear or transparent materials.
As consumers continue to move to online shopping for shoes, there is a growing interest in alternate methods of foot-sizing that are not dependent on such traditional methods of physical devices or shoes. One alternative method includes providing directions to a user to trace an outline of a target foot on a piece of paper and choose two points on the resulting contour to generate a linear measurement. Problems with this method include demanding too much time and effort from the user, dependency on the accuracy of the traced contour, and user understanding of how to measure the resulting contour itself. Furthermore, many consumers do not wish to be bothered by this process, may draw an inaccurate outline (e.g., by drawing at an angle instead of vertical), or may incorrectly measure the traced outline (e.g., by not measuring along the major longitudinal axis of the foot).
Another alternative method includes using an application on a “multi-touch” device having a relatively large screen, such as a tablet computer. These applications attempt to measure foot size by having the user place the target foot on the screen while the tablet is situated on the ground with the screen facing up. Problems with this method include the inability to measure feet that are larger than the screen, the inability to capture features that are not at ground/screen level but affect the measured maximum size (for example, a heel that juts out higher on the foot), and user sensitivity to damaging the device by stepping on it.
Another alternative method includes an application that attempts to measure foot size by utilizing a camera built into the computing device. These applications may use a reference object of a known size (e.g., a coin) in the same field of view and obtain a picture of the foot along with the reference object. Problems with these applications include inaccuracy based on flawed processing techniques and algorithms, unacceptably long processing times if the image or multiple images are sent to a remote server for processing, and the lack of real-time feedback to the user.
Another class of alternative methods includes applications which attempt to scan an entire foot and record 3D features of the foot, or to construct a 3D digital model of the foot through multiple images. In addition to suffering from even longer processing times than the above methods due to the greater amount of data in a 3D calculation, there are doubts about the accuracy of the data generated given the complexity of the full-foot 3D model and reliance on in-situ consumer electronics and settings (e.g., lighting, background differences, etc.). This 3D-based method is also impractical given the limited commercially viable ways to convert a full 3D digital map of a foot to a completely custom shoe. Three-dimensional printing of shoes mapped to a custom 3D database is not yet viable as a mass-consumer option due to a number of issues, including the availability of shoe materials that are also available for 3D printing (e.g., specific materials for flexibility and durability), and the processing time needed to print an object the size of a shoe.
Therefore, a need exists in the field for a foot measuring application that can both provide real-time feedback to a user and provide an accurate determination of foot size in a timely and easy fashion.
According to one aspect of the present invention, a novel foot measuring application is described. The application generally involves a software application program running on a computing device that includes a built-in camera, or is coupled to a camera. The computing device may be capable of generating a “live view” of the image field of the camera when the camera function is turned on, also referred to as a “preview screen” or a “digital viewfinder.” The computing device may also be capable of connecting to remote server computers, through TCP/IP or other internet protocols. The novel foot measuring application may generate dimensional data by comparing points on the target foot with a known measuring reference that is also in the camera field of view. In some embodiments, the application utilizes a sheet of standard-sized paper, such as Letter or A4, as a reference object. For example, the application may instruct a user to place a sheet of paper on the ground, place his or her foot on the paper (with or without socks or nylons), and then to use the built-in or coupled camera with the application running on the computing device to capture video (a series of images) of the target foot in the viewfinder within a prescribed zone of distance from the foot.
Certain embodiments may display an “augmented reality” indication of measurement in the preview screen, such as zones of different colors corresponding to different shoe sizes, overlaid under or around the target foot. For example, such indications may be provided in order to give the user feedback about the dimensions being generated by the application's size-analyzing process. These measurement overlays are “target relative” in that they appear to be fixed relative to the target foot and paper reference, even if the camera or computing device are moved (e.g., if the overlays moved when the computing device moved, the overlays would be defined as “device relative”). Such embodiments better replicate the experience of current traditional physical foot measuring methods. Specifically, the user may remeasure the foot based on the user's perception of the process accuracy, and understand during the live process how and why the resulting foot size measurement was generated.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well as the singular forms, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one having ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and the present disclosure and will not be interpreted in an idealized or overly formal sense unless expressly so defined.
In describing the invention, it will be understood that a number of techniques and steps are disclosed. Each of these has individual benefit and each can also be used in conjunction with one or more, or in some cases all, of the other disclosed techniques. Accordingly, for the sake of clarity, this description will refrain from repeating every possible combination of the individual steps in an unnecessary fashion. Nevertheless, the specification and claims should be read with the understanding that such combinations are entirely within the scope of the invention and the claims.
New foot measuring applications for determining the size of feet and appropriate corresponding shoe sizes are discussed herein. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be evident, however, to one skilled in the art that the present invention may be practiced without these specific details.
The present disclosure is to be considered as an exemplification of the invention, and is not intended to limit the invention to the specific embodiments illustrated by the figures or description below. The present invention will now be described by referencing the appended figures representing preferred embodiments.
Referring now to
In some embodiments, a first step in effectively measuring a target foot involves establishing the location of the reference paper. Accordingly, the four corners of the reference paper must be established. The camera field of view may be divided into nine grid blocks, and the user is instructed to manipulate the camera such that each corner of the reference paper lies within grid blocks 14, 15, 16, and 17. The corner guides 11 may be displayed for user guidance only, and the program may not require that the four corners be precisely aligned with these guides, which may increase usability. The grid blocks, and corner search in specific blocks, are a key part of the overall measurement algorithm and greatly reduce errors and process time.
In order to find the paper corners within the four grid block areas, the invention applies aspects of “corner detection” algorithms established in the field of computer vision. A two-step process may be utilized to detect corner separation between the paper and the floor. In a first step, the invention converts the color image data into grey-scale with average intensity values. The second step includes downsampling, which may include reducing resolution but maintaining key details of the data.
Specifically, during the first step of the algorithm, red-green-blue (RGB) image data is converted into a hue-saturation-value (HSV) color space, with intensity values determined by averaging the three HSV values. In some embodiments, the camera may utilize a Bayer-type sensor with twice as many green detection elements as red or blue elements. Thus, the program may account for only half of the green values. In turn, this process ignores and/or removes all color information and converts the data into “brightness” values across all different color values. During the second step, the HSV values are transformed into a vector space through computations of first- and second-order differentials to determine the velocity and acceleration of change in HSV value. In areas where there is the most change in HSV value, the program maintains more data; where there is less change, the program increases the rate of averaging of HSV values. This “binarization” process maintains more data along the edges of the paper. Each paper corner is then determined as the vertex at the intersection of its two edges.
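The intensity conversion and differential-based edge search described above might be sketched as follows. This is a simplified illustration, not the patent's implementation: the half-weighted green term, the sample scan-line values, and the function names are assumptions made for the example.

```python
def intensity(r, g, b):
    # Weight green at half, reflecting a Bayer-pattern sensor with twice
    # as many green elements as red or blue (an interpretation; the
    # patent does not give an exact formula).
    return (r + 0.5 * g + b) / 2.5

def edge_strength(row):
    # First-order differential ("velocity" of change) along one scan
    # line; large values mark the paper/floor boundary.
    return [abs(row[i + 1] - row[i]) for i in range(len(row) - 1)]

# A hypothetical scan line crossing from dark floor (~40) to white
# paper (~230):
line = [40, 41, 39, 42, 230, 231, 229, 232]
vals = [intensity(v, v, v) for v in line]
grads = edge_strength(vals)
boundary = grads.index(max(grads))  # index of the sharpest change
```

Repeating such a search along rows and columns within a grid block yields two candidate edges, whose intersection gives the corner vertex.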
In some embodiments, the computing device may be a mobile smartphone or tablet. The image processing described above may be performed based on a video feed from at least one camera included within or coupled to the device. Typically, such devices may generate video at a rate of thirty frames per second. The corner-detection process described above may thus be performed on each successive video frame until the program determines that (i) four corners have been located, and (ii) there is one corner in each of grid blocks 14, 15, 16, and 17. In some embodiments, when a corner has been detected within a given block 14, 15, 16, or 17, the reference guide corner 11 in that corresponding grid block will change color in the camera preview screen. This color change provides the user with feedback on the progression of the overall process. If there are problems at this step (e.g., one or more corners are obscured from the camera's view, are crumpled or distorted, or if there is no paper in the camera's view), the program will continue to run the corner detection process on succeeding video frames until a defined time limit is reached. Once such limit is reached, the program may prompt the user to confirm that all reference paper corners are visible in the view screen of the camera.
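The frame-by-frame loop with a time limit can be illustrated roughly as below. The `detect` callback, the ten-second default limit, and the return conventions are hypothetical placeholders; the patent specifies only that detection repeats per frame until all four grid blocks report a corner or a defined time limit triggers a user prompt.

```python
GRID_BLOCKS = {14, 15, 16, 17}

def locate_corners(frames, detect, time_limit_s=10.0, fps=30):
    """Run corner detection on successive video frames until one corner
    is found in each of grid blocks 14-17, or a time limit is reached.

    `detect(frame)` is a hypothetical callback returning the set of grid
    blocks in which a paper corner was found for that frame. Returning
    None signals that the user should be prompted to confirm all corners
    are visible.
    """
    max_frames = int(time_limit_s * fps)
    for i, frame in enumerate(frames):
        if i >= max_frames:
            return None  # time limit reached: prompt the user
        found = detect(frame)
        if found >= GRID_BLOCKS:
            return found  # all four corners located
    return None
```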
Referring back to
Referring now to
In one embodiment, the program may perform size analysis based on the size of the object frame 38. For example, length 44 may represent a foot length, and width 45 may represent foot width. These measurements may be determined relative to the known measurements of the edges of the reference frame 34. As a first step, the program may perform filtering to disregard results that are too extreme. For example, filtering may allow the algorithm to be more efficient, as the program does not need to run the rest of the analysis if the initial analysis returns gross errors. Gross error filtering may include discarding results for length 44 or width 45 that are too small (e.g., less than the size of the smallest infant shoe size). Once the non-conforming results are filtered out, results for length 44 and width 45 are compared to determine whether the resulting aspect ratio is within the preprogrammed range of possibility for a foot. In some embodiments, once the results pass the gross error checking, a visual indication on the camera preview screen may provide an indication to the user that the foot has been detected in the overall process.
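The gross-error filter might look like the following sketch. The minimum-size and aspect-ratio thresholds here are invented for illustration; the patent states only that implausibly small results and implausible length-to-width ratios are discarded, not the actual values.

```python
# Illustrative thresholds (assumptions, not the patent's values):
MIN_LENGTH_MM = 80.0          # smaller than the smallest infant size
MIN_WIDTH_MM = 30.0
ASPECT_RANGE = (1.8, 3.5)     # plausible foot length/width ratios

def passes_gross_error_check(length_mm, width_mm):
    """Discard measurements that are too small or have an aspect ratio
    outside the preprogrammed range of possibility for a foot."""
    if length_mm < MIN_LENGTH_MM or width_mm < MIN_WIDTH_MM:
        return False
    ratio = length_mm / width_mm
    return ASPECT_RANGE[0] <= ratio <= ASPECT_RANGE[1]
```

Only measurements passing this check proceed to the averaging stage, so later stages never see frames with gross errors.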
In one embodiment, the program maintains results that are not filtered out and determines a final result based on the average of a preset number X of measurements. For example, in some embodiments, five results are averaged. The program will run this final average analysis when X consecutive video frames generate measurements, for both length 44 and width 45, within a preset tolerance Y of each other. For example, in some embodiments, the tolerance may be set at 5 mm. Depending on the application and device used (e.g., depending on camera specifications), in some embodiments, the program may utilize more or fewer consecutive values, may use a greater or smaller tolerance than 5 mm, and/or may use a different averaging methodology (e.g., accepting any five values within the accepted 5 mm tolerance within a ten-measurement block of values).
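The consecutive-within-tolerance averaging (X = 5 results, Y = 5 mm in the example embodiment) could be sketched as follows; the sliding-window implementation is one possible reading of the described behavior.

```python
def final_measurement(stream, x=5, tol_mm=5.0):
    """Return the average of the first run of `x` consecutive
    measurements that all lie within `tol_mm` of one another, or None
    if no such run appears in `stream`."""
    window = []
    for value in stream:
        window.append(value)
        if len(window) > x:
            window.pop(0)  # keep only the last x measurements
        if len(window) == x and max(window) - min(window) <= tol_mm:
            return sum(window) / x
    return None
```

Run separately on the length and width streams, this yields the two final foot dimensions once the video measurements stabilize.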
Referring now to
In some embodiments, the program may instruct the user to wear a sock before stepping on the paper. Although the processes described above may function independently of whether a user is wearing a sock, the use of a sock in this embodiment may, for example, help prevent the foot from sticking to the reference paper (e.g., due to moisture on the foot). Furthermore, while running the video analysis, the program may activate a built-in flashlight continuously (sometimes referred to as “torch” mode) until a final value is determined. This extra light source may help mitigate extreme lighting conditions, such as prominent shadows and glare, that can affect the analysis. For similar reasons, the program may instruct the user to choose a measurement environment with even lighting.
In some embodiments, the program may be run locally, without requiring access to remote server computations. This client-side calculation may significantly improve responsiveness of the application and reduce latency, compared to a solution that would rely on remote server-side computations. In practice, under the proper conditions as described above, the analysis may typically be performed by the computing device in less than five seconds and may be perceived as relatively “instantaneous” to the user.
Referring now to
Referring now to
In some embodiments, the method for identifying the foot in the field of view can be determined by other means. For example, identification may be based on analysis of skin color compared to the color of the reference paper. Another alternative identification is to detect subcutaneous blood flow through, for example, “Euler methodology,” in order to distinguish the skin of the target foot from the reference paper.
Referring now to
In one embodiment, as illustrated in
In order to accurately measure the distance to the wall that best represents the dimension of the foot (either length or width), the camera may be required to be “square” to the wall, such that the forward-facing camera surface is coplanar with the wall. In one embodiment, built-in sensors of the device, such as gyroscopes, will determine if the device is orthogonal to the ground plane. If the camera is tilted relative to the ground plane, the program will prompt the user to straighten the smartphone, for example, through on-screen directions. To determine if the camera is tilted relative to the wall plane, the device may utilize a similar edge detection process, as described in the embodiments above, to determine the wall-to-floor boundary line 60. The program may then determine if this line is angled or horizontal, and may prompt the user to turn the smartphone if needed in order to achieve a horizontal line. Once the device orientation is coplanar to the wall within preprogrammed tolerances, the distance to the wall is recorded and the program is then able to calculate the foot dimension, and thus, appropriate shoe size. The program may then follow a similar process as in the embodiments described above, for example, by displaying the foot dimension onscreen and optionally communicating with remote servers.
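The two orientation checks described above (orthogonal to the ground plane via built-in sensors, and level relative to the wall-to-floor boundary line 60) might be sketched as follows. The 3-degree tolerance and the sensor-axis convention are assumptions for illustration; the patent says only that preprogrammed tolerances are used.

```python
import math

TILT_TOL_DEG = 3.0  # assumed tolerance, not specified in the patent

def device_upright(gravity_xyz, tol_deg=TILT_TOL_DEG):
    # The device is "square" to the ground when the measured gravity
    # vector lies along the screen's vertical (y) axis; the axis
    # convention here is hypothetical.
    gx, gy, gz = gravity_xyz
    mag = math.sqrt(gx * gx + gy * gy + gz * gz)
    tilt = math.degrees(math.acos(abs(gy) / mag))
    return tilt <= tol_deg

def boundary_horizontal(p1, p2, tol_deg=TILT_TOL_DEG):
    # The detected wall-to-floor boundary line, in image coordinates,
    # should be level; otherwise prompt the user to turn the device.
    (x1, y1), (x2, y2) = p1, p2
    angle = math.degrees(math.atan2(abs(y2 - y1), abs(x2 - x1)))
    return angle <= tol_deg
```

Only when both checks pass within tolerance would the distance to the wall be recorded and the foot dimension calculated.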
In another embodiment, as shown in
Although a variety of examples are used herein to describe embodiments within the scope of the appended claims, no limitation of the claims should be implied based on particular features or arrangements in such examples. Accordingly, one of ordinary skill in the art will be able to use these examples to derive a variety of implementations. Although subject matter may have been described in language specific to examples of different structures and processes, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to these structure and processes. For example, such functionality can be distributed differently or performed in components other than those identified herein. Rather, the features are disclosed as examples of components of systems and methods within the scope of the appended claims.
This application is a continuation of U.S. patent application Ser. No. 15/796,457, entitled “FOOT MEASURING AND SIZING APPLICATION,” filed on Oct. 27, 2017, now abandoned, which is a continuation-in-part of U.S. patent application Ser. No. 15/699,861, entitled “FOOT MEASURING AND SIZING APPLICATION,” filed on Sep. 8, 2017, and now granted as U.S. Pat. No. 10,420,397, which claims priority to U.S. Provisional Ser. No. 62/434,151, entitled “FOOT MEASURING AND SIZING APPLICATION,” filed on Dec. 14, 2016. The content of each of these applications is hereby incorporated by reference in its entirety for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
5164793 | Wolfersberger et al. | Nov 1992 | A |
5689446 | Sundman et al. | Nov 1997 | A |
6289107 | Borchers et al. | Sep 2001 | B1 |
6546356 | Genest | Apr 2003 | B1 |
8065200 | Schwartz | Nov 2011 | B2 |
8567081 | Smith | Oct 2013 | B2 |
8655053 | Hansen | Feb 2014 | B1 |
8818883 | Lawrence et al. | Aug 2014 | B2 |
8868157 | Soliz | Oct 2014 | B1 |
9345957 | Geisner et al. | May 2016 | B2 |
9386889 | Fischer | Jul 2016 | B2 |
9462838 | Smith et al. | Oct 2016 | B1 |
9477980 | Zagel et al. | Oct 2016 | B2 |
9648926 | Marks | May 2017 | B2 |
9799064 | Ohnemus et al. | Oct 2017 | B2 |
9875546 | Bhole et al. | Jan 2018 | B1 |
10008040 | Lam | Jun 2018 | B2 |
10013803 | Mach Shepherd | Jul 2018 | B2 |
10067500 | Hargovan et al. | Sep 2018 | B2 |
10380794 | Hauswiesner et al. | Aug 2019 | B2 |
10420397 | Hei | Sep 2019 | B2 |
10573004 | Husheer | Feb 2020 | B2 |
10755431 | Shea | Aug 2020 | B2 |
20040081336 | Brooks | Apr 2004 | A1 |
20070005174 | Thomas | Jan 2007 | A1 |
20070056212 | Fink | Mar 2007 | A1 |
20090051683 | Goonetilleke et al. | Feb 2009 | A1 |
20090247909 | Mukumoto | Oct 2009 | A1 |
20090287452 | Stanley et al. | Nov 2009 | A1 |
20100111370 | Black et al. | May 2010 | A1 |
20100296726 | Rutschmann | Nov 2010 | A1 |
20110298897 | Sareen et al. | Dec 2011 | A1 |
20120249741 | Maciocci et al. | Oct 2012 | A1 |
20130100256 | Kirk et al. | Apr 2013 | A1 |
20130114869 | Hernandez Stark et al. | May 2013 | A1 |
20130246222 | Weerasinghe | Sep 2013 | A1 |
20130307851 | Hernandez Stark et al. | Nov 2013 | A1 |
20140040041 | Ohnemus et al. | Feb 2014 | A1 |
20140104395 | Rohaly et al. | Apr 2014 | A1 |
20140108208 | Piana | Apr 2014 | A1 |
20140156449 | Ganesan et al. | Jun 2014 | A1 |
20140270540 | Spector et al. | Sep 2014 | A1 |
20140285646 | Kahlon | Sep 2014 | A1 |
20140358738 | Ohnemus et al. | Dec 2014 | A1 |
20150012380 | Bank et al. | Jan 2015 | A1 |
20150039422 | Abraham et al. | Feb 2015 | A1 |
20150066707 | Unger et al. | Mar 2015 | A1 |
20150066712 | Altieri | Mar 2015 | A1 |
20150127132 | Nyong'o et al. | May 2015 | A1 |
20150127363 | Nyong'o et al. | May 2015 | A1 |
20150154453 | Wilf | Jun 2015 | A1 |
20150223730 | Ferrantelli | Aug 2015 | A1 |
20150228084 | Belyaev et al. | Aug 2015 | A1 |
20150258431 | Stafford et al. | Sep 2015 | A1 |
20150328016 | Summit et al. | Nov 2015 | A1 |
20150339752 | Chetuparambil et al. | Nov 2015 | A1 |
20150342266 | Cooper et al. | Dec 2015 | A1 |
20150359461 | Alfaro et al. | Dec 2015 | A1 |
20160081435 | Marks | Mar 2016 | A1 |
20160093085 | Ray et al. | Mar 2016 | A1 |
20160110479 | Li | Apr 2016 | A1 |
20160125499 | Gooch et al. | May 2016 | A1 |
20160180391 | Zabaneh | Jun 2016 | A1 |
20160183879 | Goldish et al. | Jun 2016 | A1 |
20160247017 | Sareen et al. | Aug 2016 | A1 |
20160286906 | Malal et al. | Oct 2016 | A1 |
20160350833 | Andon | Dec 2016 | A1 |
20160367191 | Esposito et al. | Dec 2016 | A1 |
20170004568 | Radner | Jan 2017 | A1 |
20170169571 | Hung et al. | Jun 2017 | A1 |
20170272728 | Rafii et al. | Sep 2017 | A1 |
20180033202 | Lam et al. | Feb 2018 | A1 |
20180035762 | Towns et al. | Feb 2018 | A1 |
20180160776 | Hei et al. | Jun 2018 | A1 |
20180160777 | Hei et al. | Jun 2018 | A1 |
20180182091 | Mackinnon et al. | Jun 2018 | A1 |
20180240238 | Husheer | Aug 2018 | A1 |
20180247426 | Gluck et al. | Aug 2018 | A1 |
20180300445 | Schouwenburg et al. | Oct 2018 | A1 |
20180300791 | Ganesan et al. | Oct 2018 | A1 |
20190037134 | Merati | Jan 2019 | A1 |
20190082794 | Liu | Mar 2019 | A1 |
20190139252 | Zaiss et al. | May 2019 | A1 |
20190188784 | Bleicher et al. | Jun 2019 | A1 |
Number | Date | Country |
---|---|---|
101707946 | May 2010 | CN |
101742048 | Jun 2010 | CN |
101819663 | Sep 2010 | CN |
102682395 | Sep 2012 | CN |
103093543 | May 2013 | CN |
103597519 | Feb 2014 | CN |
103852130 | Jun 2014 | CN |
104040580 | Sep 2014 | CN |
104170519 | Nov 2014 | CN |
6-149376 | May 1994 | JP |
2000-515088 | Nov 2000 | JP |
2002203167 | Jul 2002 | JP |
2003-127994 | May 2003 | JP |
2003331108 | Nov 2003 | JP |
2005169015 | Jun 2005 | JP |
2007526028 | Sep 2007 | JP |
2010510587 | Apr 2010 | JP |
2013050937 | Mar 2013 | JP |
2013097799 | May 2013 | JP |
2014-40231 | Mar 2014 | JP |
2016001360 | Jan 2016 | JP |
2016040649 | Mar 2016 | JP |
2016532197 | Oct 2016 | JP |
20100019067 | Feb 2010 | KR |
20100131404 | Dec 2010 | KR |
20120123842 | Nov 2012 | KR |
20120123845 | Nov 2012 | KR |
20150061089 | Jun 2015 | KR |
20150070459 | Jun 2015 | KR |
20160005977 | Jan 2016 | KR |
20160021118 | Feb 2016 | KR |
9748027 | Dec 1997 | WO |
2005006905 | Jan 2005 | WO |
2012072844 | Jun 2012 | WO |
2014159726 | Oct 2014 | WO |
2016051416 | Apr 2016 | WO |
2017127132 | Jul 2017 | WO |
2018048902 | Mar 2018 | WO |
2018109421 | Jun 2018 | WO |
Entry |
---|
Accu Foot size, “Accu Foot Size App Review”, Retrieved from the internet <URL https://www.apkmonk.com/app/com.accufootsize/> entire document, 2015, 7 pages. |
Ganesan, Anand, “Footwear Virtual Fitting Service for E-Commerce & Physical Stores”, findmeashoe.com, Jul. 2018, 27 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2017/065878, dated Jun. 27, 2019, 13 pages. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2017/065878, dated Mar. 29, 2018, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/699,861, dated May 24, 2019, 25 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/796,457, dated Aug. 27, 2019, 12 pages. |
Notice of Allowance received for U.S. Appl. No. 15/699,861, dated Jul. 11, 2019, 8 pages. |
Office Action received for Japanese Patent Application No. 2018-502047, dated Nov. 19, 2018, 4 pages. |
Anonymous: “How to Calculate Height of Android Phone from ground—Stack Overflow”. Aug. 17, 2017 (Aug. 17, 2017), XP055836361, Retrieved from the Internet: URL:https://stackoverflow.com/questions/17443936/how-to-calculate-height-of-android-phone-from-ground [retrieved on Aug. 31, 2021].
Number | Date | Country | |
---|---|---|---|
20200178651 A1 | Jun 2020 | US |
Number | Date | Country | |
---|---|---|---|
62434151 | Dec 2016 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15796457 | Oct 2017 | US |
Child | 16791572 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15699861 | Sep 2017 | US |
Child | 15796457 | US |