The present technology is directed to an image capturing device for building/construction materials. More particularly, some embodiments of the present technology relate to a portable, hand-held device for capturing images of a surface of a slab and associated methods.
Knowing the characteristics of a building material is crucial during design stages. One way to measure or collect these characteristics is to capture an image of the building material. Capturing images of building materials can be challenging, especially for materials with relatively large sizes and weights, such as slabs. Some building materials have high reflectivity, which makes capturing images of them even more challenging. One conventional method for capturing images of a slab is to bring the slab into a photography studio that has enough physical space to accommodate it. This method is, however, time-consuming, expensive, and inefficient. Therefore, there is a need for an improved device or method to address the foregoing issues.
Many aspects of the present technology can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale. Instead, emphasis is placed on illustrating the principles of the present technology.
In various embodiments, the present technology relates to a portable image capturing device (or scanner) for building materials, such as a slab, surface coating materials on a flat surface (e.g., a wall, floor, ceiling, etc.), and/or other suitable materials. The present technology also relates to methods of operating the portable image capturing device. The portable image capturing device has a compact, portable design and can be held and operated by a single operator. The portable image capturing device is configured to capture multiple images of a surface of a building material when the portable image capturing device is positioned on the surface and moved thereon. The captured images can be analyzed, adjusted, and/or stored for future use (e.g., for design projects that consider using the slab as a building material).
In some embodiments, the portable image capturing device includes a housing, an image sensor (e.g., in a camera) positioned in the housing, and one or more lighting components (e.g., one or more LED (light-emitting diode) light strips or bulbs) positioned in the housing. The housing can have an interior surface with a low-reflective or anti-reflective coating (or film). The lighting components are spaced apart from the image sensor. The lighting components are positioned such that the light rays emitted by the lighting components do not directly reach the image sensor (e.g., the first reflected rays of the emitted light rays do not reach the image sensor). In some embodiments, the lighting components can each be positioned in a recess formed with the housing such that the light rays emitted from the lighting component are not directly reflected to the image sensor.
For example, the surface of an object to be scanned (e.g., a slab) can first reflect the light rays from the lighting component (the light rays' first reflections), not directly toward the image sensor (see e.g.,
Another aspect of the present technology includes methods of analyzing, organizing, and utilizing the captured images. In some embodiments, the method can include (1) determining a boundary or an edge of the scanned object based on captured images; (2) identifying a type of the scanned object based on the captured images; (3) adjusting the color (and/or light consistency) or distortion of the captured images; (4) identifying a defect or a mark on the scanned object based on the captured images; and/or (5) consolidating (e.g., stitching, combining, incorporating, etc.) the captured images to form a processed image that is indicative of the characteristics of the scanned object.
In some embodiments, the method can include (i) determining (e.g., by an encoder or a processor) the dimensions of the scanned object based on the captured images; (ii) storing the captured and processed images based on the identified type and the determined dimensions; and/or (iii) transmitting or exporting, automatically or upon a user instruction, the stored images upon a request in various data formats (e.g., upon a request from an interior designer, exporting the stored images from a server to a client computing device with particular software installed).
Specific details of several embodiments of image capturing devices and associated systems and methods are described below.
In some embodiments, the portable image capturing device 100 can have only one handle. In some embodiments, the portable image capturing device 100 can be moved by the operator holding other suitable components such as a knob, a lever, a protrusion, etc., formed with the housing 101. In some embodiments, the portable image capturing device 100 can include more than two handles. In some embodiments, the sizes and dimensions of the two or more handles can be different.
In the illustrated embodiment, the housing 101 has a generally symmetric shape. In other embodiments, the housing 101 can have other suitable shapes. In some embodiments, the housing 101 can have an interior surface with a low-reflective or anti-reflective coating or film.
As shown in
In some embodiments, the controller 105 can be a computing system embedded in a chip, a printed circuit board (PCB), or the like. In some embodiments, the controller 105 can include a memory or other suitable storage component that is configured to store collected images or software/firmware for processing the collected images. In some embodiments, the controller 105 can be communicably coupled to other components of the device 100 (e.g., the image sensor 109, the lighting components 107, the roller 111, etc., as discussed below) and control these components. In some embodiments, the controller 105 can include a relatively small and affordable computer system such as a Raspberry Pi.
In the illustrated embodiments shown in
In some embodiments, the lighting components 107a, 107b can include one or more LED light strips or light bulbs. In some embodiments, the portable image capturing device 100 can have more than two lighting components. For example, the portable image capturing device 100 can have a plurality of lighting components circumferentially positioned inside the housing 101.
As also shown in
In some embodiments, the portable image capturing device 100 can include a distance sensor 113 coupled to the roller 111a. The distance sensor 113 is configured to measure and record the distance traveled by the portable image capturing device 100. In some embodiments, the distance sensor 113 can include an encoder that can convert distance information to a digital signal, which can later be transmitted to the controller 105. In some embodiments, the controller 105 can instruct the image sensor 109 to take an image according to the distance information measured by the distance sensor 113.
For example, at a first time point T1, the controller 105 can instruct the image sensor 109 to take a first image of a first portion of the surface 20 that is covered by the housing 101 at the first time point T1. Assume that the distance between the rollers 111a, 111b is distance D. When the distance sensor 113 measures that the portable image capturing device 100 has traveled distance D (or a distance less than distance D, such that there can be an overlap between two captured images) at a second time point T2, the controller 105 can instruct the image sensor 109 to take a second image of a second portion of the surface 20 that is covered by the housing 101 at the second time point T2. In some embodiments, the controller 105 can instruct the image sensor 109 to take additional images at other time points. For example, the image sensor 109 can take an image at a third time point T3 when the distance sensor 113 measures that the portable image capturing device 100 has traveled half of distance D. In some embodiments, the foregoing image taking process can repeat until the image sensor 109 has taken enough images to form an overall image of the whole surface 20 of the object 22.
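The distance-triggered capture scheme above can be sketched as follows. This is a minimal illustration, not the disclosed implementation; the function name, the spacing value, and the assumption of an image at distance zero are all illustrative.

```python
# Hypothetical sketch of distance-triggered capture: the controller fires
# the image sensor each time the encoder reports another `spacing` of
# travel. Choosing spacing < D (the housing/roller span) yields overlap
# between consecutive images.

def capture_positions(total_travel, spacing):
    """Return the encoder distances at which an image would be captured,
    starting with an image at distance 0."""
    positions = []
    d = 0.0
    while d <= total_travel:
        positions.append(round(d, 6))
        d += spacing
    return positions
```

For instance, with a housing span of 0.25 m and a spacing equal to that span, a 1 m pass would trigger five captures (at 0, 0.25, 0.5, 0.75, and 1.0 m); halving the spacing, as in the T3 example above, doubles the overlap.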
In some embodiments, the first and second images (as well as other images taken) can be combined and/or processed by the controller 105 so as to form a processed image. In some embodiments, the first and second images can be processed by a processor or a computer external to the portable image capturing device 100. In some embodiments, the controller 105 can use the distance measured by the encoder 113 to ensure that the first and second captured images overlap, and can then analyze the first and second images and determine how to combine them. For example, the controller 105 can combine the first and second images by removing a duplicate portion of the first or second image and then "stitching" the first and second images to form the processed image. In some embodiments, the controller 105 can identify an edge 24 of the surface 20 in the first and second images, and then remove a corresponding part (e.g., the part of the image external to the image of the edge 24) of the first and second images.
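The duplicate-removal step can be sketched as below, assuming (illustratively, not from the disclosure) that the overlap between consecutive images is known in pixels, e.g., derived from the encoder distance and the sensor's pixels-per-distance scale.

```python
import numpy as np

def stitch_pair(img1, img2, overlap_px):
    """Concatenate img2 after img1 along the travel axis (rows),
    dropping the duplicated overlap rows from the front of img2."""
    return np.vstack([img1, img2[overlap_px:]])
```

As a sanity check, splitting one image into two overlapping pieces and re-stitching them with the correct overlap reproduces the original.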
In some embodiments, the controller 105 can adjust the colors (and/or light consistency) of the first and second images (and other captured images) based on a color reference (e.g., a physical color bar, a reference object that has been scanned together with the object 22, etc.). The color reference is indicative of how a surface of a building material looks in a specific lighting environment (e.g., natural lighting during a day, a room with ceiling lights, a room with lamps, etc.). In some embodiments, the controller 105 can first compare (i) a portion of a collected image that shows the color reference with (ii) the remaining portion of the collected image. The controller 105 can then adjust the remaining portion of the collected image based on the color reference to form an adjusted image. The adjusted image can visually present the surface 20 in the specific lighting environment. It is advantageous to have such an adjusted image in a design stage when considering whether and how to use the object 22 as a building material for a project. Embodiments regarding adjusting colors are discussed below in detail with reference to
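One simple way to realize this comparison-and-adjustment, sketched below under the assumption of a per-channel gain model (an illustrative choice, not the disclosed method), is to scale each color channel so the imaged reference patch matches its known value and then apply the same gains to the rest of the image.

```python
import numpy as np

def adjust_to_reference(image, ref_region, ref_known_rgb):
    """Derive per-channel gains from the imaged color-reference patch
    (given as a (row0, row1, col0, col1) region) and apply them to the
    whole image so the patch matches its known RGB value."""
    r0, r1, c0, c1 = ref_region
    observed = image[r0:r1, c0:c1].reshape(-1, 3).mean(axis=0)
    gains = np.asarray(ref_known_rgb, dtype=float) / observed
    return np.clip(image * gains, 0, 255)
```

A per-channel gain corrects a uniform color cast from the scanner's lighting; more elaborate models (e.g., a full color matrix fit to a multi-patch bar) would follow the same compare-then-adjust pattern.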
As shown in
In some embodiments, the distance measured by the encoder 313 can be used by the controller (not shown in
In some embodiments, the functionality of the light diffuser 315 may be implemented through light mapping in software. In an example, instead of using the light diffuser 315 to provide an even light field, the brightness of each pixel that is captured by the image sensor is adjusted based on its deviation from a known value. In another example, the adjustment may be based on a baseline value for “true white” that is recorded by placing the device on a white surface and capturing an image thereof. The brightness of each captured pixel may be compared to the baseline value and adjusted, thereby approximating the functionality of the diffuser pattern discussed above.
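The white-baseline adjustment described above amounts to a flat-field correction. A minimal sketch (function name and target value are illustrative assumptions):

```python
import numpy as np

def flat_field_correct(image, white_baseline, target=255.0):
    """Approximate the diffuser in software: divide each pixel by the
    brightness recorded when scanning a plain white surface, so that
    unevenly lit regions are boosted toward the target 'true white'."""
    white = np.maximum(white_baseline, 1e-6)  # guard against division by zero
    return np.clip(image * (target / white), 0, 255)
```

Re-scanning the white surface itself through this correction yields a uniform image, which is the even light field the diffuser would otherwise provide optically.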
As shown, the housing 403 includes a center portion 4011, two side portions 4013a, 4013b, and two bottom portions 4015a, 4015b. The center portion 4011 is coupled to the side portions 4013a, 4013b. The side portions 4013a, 4013b are coupled to the bottom portions 4015a, 4015b. In some embodiments, the center portion 4011, the side portions 4013a, 4013b, and the bottom portions 4015a, 4015b can be coupled by welding, connectors, nuts/bolts, etc. In some embodiments, the center portion 4011, the side portions 4013a, 4013b, and the bottom portions 4015a, 4015b can be integrally formed (e.g., by molding).
The center portion 4011 is positioned and spaced apart (or elevated) from the surface 40 of the material during operation. By this arrangement, the light rays emitted by the LED light tubes 407a, 407b (which are at least partially positioned in the recesses 421a, 421b formed with the side portions 4013a, 4013b) do not directly reach the image sensor 409 positioned at the center of the center portion 4011.
In
As shown in
In some embodiments, the position of the corner corresponding to the first angle relative to the position of the camera (or image sensor) 409 and the light source 407a is selected to ensure that a direct reflection from the light source does not reach the camera (e.g., light rays R1, R2 and R3 reflect at least twice before reaching the image sensor).
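The placement constraint can be checked geometrically. The sketch below is an illustrative 2-D model (coordinates, heights, and the aperture value are assumptions, not from the disclosure): it uses the mirror-image construction for specular reflection off a flat slab to test whether the first reflection of a ray from the light source passes through the camera position.

```python
def specular_reaches_camera(light, camera, surface_x, aperture=0.005):
    """2-D check: reflect a ray from `light` (x, height) specularly at the
    surface point (surface_x, 0) of a flat slab, and test whether the
    reflected ray passes within `aperture` of `camera` (x, height).
    The reflected ray appears to come from the light's mirror image
    below the surface."""
    lx, ly = light
    cx, cy = camera
    s = (ly + cy) / ly                    # ray parameter at camera height
    x_at_cam = lx + s * (surface_x - lx)  # reflected ray's x at that height
    return abs(x_at_cam - cx) < aperture
```

Sweeping `surface_x` over the camera's field of view and confirming the function returns False everywhere corresponds to the condition that rays R1, R2, and R3 must reflect at least twice before reaching the image sensor.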
In some embodiments, the light sources 407a, 407b are laterally spaced apart from the image sensor 409, advantageously using dark-field illumination to illuminate the surface 40. That is, specular reflection (i.e., mirror-like reflection of light waves from a surface) is directed away from the image sensor, and only diffusely reflected light is measured and imaged. This results in an image wherein the surface 40 is brightly lit against a dark background, since the color or brightness distortion caused by the direct reflection of light is eliminated.
The two wheels 411a, 411b are positioned outside the bottom portions 4015a, 4015b and are configured to move the portable image capturing device 400 along the surface 40. When the portable image capturing device 400 is in operation, the lower sections of the bottom portions 4015a, 4015b are in close contact with the surface 40, such that no external light rays enter the housing 403. In some embodiments, to achieve this goal, the portable image capturing device 400 can include contacting components 423a, 423b (e.g., a rubber seal, a light blocker, etc.) positioned between the surface 40 and the bottom portions 4015a, 4015b, respectively.
In some embodiments, the surface scanner 500 can be moved along a curved trajectory CT. In such embodiments, the wheels 511 can include multiple rolling components such that, when the rolling components rotate at different rates, the surface scanner 500 moves along the curved trajectory CT. In a similar fashion as described above, the wheels 511 can provide information regarding how the surface scanner 500 has been moved, and the controller 505 can accordingly instruct the surface scanner 500 to capture images in the image capturing area 55. The images captured at the multiple time points can then be combined to form an overall image of the slab 50. In some embodiments, the surface scanner 500 can operate without the wheels 511.
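Recovering the curved trajectory from two wheels rotating at different rates is standard differential-drive odometry. The sketch below (an illustrative model, with the wheelbase and function name assumed, not taken from the disclosure) updates the scanner's pose from the travel distances reported by each wheel.

```python
import math

def update_pose(x, y, theta, d_left, d_right, wheelbase):
    """Differential-drive odometry: equal wheel travel moves the scanner
    straight; unequal travel turns it along a circular arc."""
    d = (d_left + d_right) / 2.0              # distance of the scanner's center
    dtheta = (d_right - d_left) / wheelbase   # change in heading
    if abs(dtheta) < 1e-9:                    # straight segment
        x += d * math.cos(theta)
        y += d * math.sin(theta)
    else:                                     # circular-arc segment
        radius = d / dtheta
        x += radius * (math.sin(theta + dtheta) - math.sin(theta))
        y -= radius * (math.cos(theta + dtheta) - math.cos(theta))
    return x, y, theta + dtheta
```

The controller can accumulate these pose updates between captures so that each image is tagged with its position along the curved trajectory CT before the images are combined.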
In some embodiments, the color reference bar can be held by a holding component (e.g., a holding arm, a clamp, etc.) inside a housing of the slab scanner. The holding component can move, rotate, and/or fold the color reference bar such that the color reference bar can be switched between a first position (where the color reference bar will be scanned) and a second position (where the color reference bar will not be scanned). Accordingly, the operator of the slab scanner can determine whether to put the color reference bar in the image 60. In some embodiments, a controller of the slab scanner can operate the holding component based on a predetermined rule (e.g., only scan the color reference bar at first five images captured by the slab scanner). In some embodiments, the colors of the image 60 can be adjusted based on the image of the color reference bar (the color reference area 65).
In some embodiments, the image 60 can include a mark 67. The mark 67 can be the image of a defect of the slab. In some embodiments, the mark 67 can be the image of a sign created by an operator (e.g., a circle drawn by a marker, etc.) before scanning the surface of the slab. When processing the image 60 with the mark 67, the operator can be notified that a further action (e.g., fix the defect, polish the slab, etc.) may be required.
In some embodiments, the image 60 can include an edge 69. The edge 69 is indicative of a boundary of the slab that has been scanned. When processing the image 60 with the edge 69, the portion of the image external to the edge 69 can be removed, and a note suggesting a further action (e.g., checking the boundary of the slab) can be sent to the operator.
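Removing the area beyond the edge can be sketched as a crop to the slab's bounding region, assuming (illustratively) that pixels beyond the edge are dark because the sealed housing admits no external light, so a simple brightness threshold separates slab from background.

```python
import numpy as np

def crop_to_slab(image, background_threshold=10):
    """Keep only the rows and columns that contain slab pixels
    (brightness above the threshold), discarding the dark area
    beyond the scanned slab's edge."""
    mask = image > background_threshold
    rows = np.where(mask.any(axis=1))[0]
    cols = np.where(mask.any(axis=0))[0]
    return image[rows.min():rows.max() + 1, cols.min():cols.max() + 1]
```

A production system would likely use a more robust edge detector, but the crop-and-notify flow described above is the same.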
At block 803, the method 800 includes moving the portable image capturing device along a trajectory. In some embodiments, the trajectory can include straight lines, curves, or a combination thereof. In some embodiments, the trajectory passes over at least a substantial part (e.g., over 95%) of the surface of the building material. In some embodiments, the trajectory can pass over a small part of the surface of the building material.
At block 805, the method 800 includes measuring, by the encoder, a distance traveled by the portable image capturing device along the trajectory. At block 807, the method 800 continues by transmitting the measured distance traveled by the portable image capturing device to the controller. At block 809, the method 800 continues by instructing, by the controller based on the determined distance, the image sensor to capture multiple images at multiple time points along the trajectory. In some embodiments, the method 800 can include storing the captured images in a storage device (e.g., a hard drive, a flash drive, etc.) or a memory of the portable image capturing device. In some embodiments, the captured images can be transmitted to a server or an external computer via a wired or wireless connection (e.g., based on communication protocols, such as, Wi-Fi, Bluetooth, NFC, etc.).
At block 903, the method 900 includes analyzing the (captured) images by identifying an edge of each of the (captured) images. In some embodiments, the method 900 includes adjusting colors (and/or light consistency) of the captured images at least partially based on a color reference. In some embodiments, the method 900 includes identifying a mark in the captured images and adjusting the captured images accordingly. At block 905, the method 900 includes combining the (captured) images based on the trajectory so as to form an overall image of the surface of the building material. The overall image of the surface can be stored for further use (e.g., for design projects considering using the building material). In some embodiments, the captured images can be combined or stitched based on control points in the images without using the trajectory.
This disclosure is not intended to be exhaustive or to limit the present technology to the precise forms disclosed herein. Although specific embodiments are disclosed herein for illustrative purposes, various equivalent modifications are possible without deviating from the present technology, as those of ordinary skill in the relevant art will recognize. In some cases, well-known structures and functions have not been shown or described in detail to avoid unnecessarily obscuring the description of the embodiments of the present technology. Although steps of methods may be presented herein in a particular order, alternative embodiments may perform the steps in a different order. Similarly, certain aspects of the present technology disclosed in the context of particular embodiments can be combined or eliminated in other embodiments. Furthermore, while advantages associated with certain embodiments of the present technology may have been disclosed in the context of those embodiments, other embodiments can also exhibit such advantages, and not all embodiments need necessarily exhibit such advantages or other advantages disclosed herein to fall within the scope of the technology. Accordingly, the disclosure and associated technology can encompass other embodiments not expressly shown or described herein.
Throughout this disclosure, the singular terms “a,” “an,” and “the” include plural referents unless the context clearly indicates otherwise. Similarly, unless the word “or” is expressly limited to mean only a single item exclusive from the other items in reference to a list of two or more items, then the use of “or” in such a list is to be interpreted as including (a) any single item in the list, (b) all of the items in the list, or (c) any combination of the items in the list. Additionally, the term “comprising” is used throughout to mean including at least the recited feature(s) such that any greater number of the same feature and/or additional types of other features are not precluded. Reference herein to “one embodiment,” “some embodiment,” or similar formulations means that a particular feature, structure, operation, or characteristic described in connection with the embodiment can be included in at least one embodiment of the present technology. Thus, the appearances of such phrases or formulations herein are not necessarily all referring to the same embodiment. Furthermore, various particular features, structures, operations, or characteristics may be combined in any suitable manner in one or more embodiments.
From the foregoing, it will be appreciated that specific embodiments of the present technology have been described herein for purposes of illustration, but that various modifications may be made without deviating from the scope of the invention. The present technology is not limited except as by the appended claims.
This application claims priority to U.S. Provisional Patent Application No. 62/877,343, filed on Jul. 23, 2019, the entire contents of which is incorporated herein by reference.