The present invention generally relates to the field of three-dimensional (3D) metrology and, more specifically, to handheld 3D scanning systems and scanners. The scanners, systems and methods described in the present document may be used in a wide variety of practical applications, including, but not limited to, manufacturing, quality control of manufactured pieces, and reverse engineering.
Three-dimensional (3D) scanning and digitization of the surface geometry of objects is commonly used in many industries and services. The shape of an object is scanned and digitized using optical sensors that measure the distance between the sensor and a set of points on the surface.
Conventionally in handheld 3D scanners, the optical sensors include one, two or more “positioning” or “geometry measurement” cameras arranged alongside one another and configured for acquiring geometric and positioning data so that measurements of surface points can be derived. In some scanners, in order to have some texture (a.k.a. color) information pertaining to that same surface, a texture (color) camera may be provided on the scanner alongside the one, two or more “geometry measurement” cameras.
In high-end metrology-grade 3D handheld scanners, since the scanner is displaced during the scanning process, it is desirable to capture an image while all pixels of one or more cameras are exposed at the same time and concurrently while a structured light pattern is being projected onto the surface being scanned. For that reason, global shutter cameras are typically used for the one, two or more “geometry measurement” cameras and for the one or more texture camera(s). A key feature of global shutter cameras is that all pixels start and stop integrating light at the same time, so that more accurate surface measurements may be obtained by the handheld scanner even in the presence of movement of the scanner. A drawback of global shutter cameras is that they are generally complex devices to manufacture and are more costly than some alternatives. While the cost may be acceptable for certain high-end applications, it is not for other applications, which creates an obstacle to adopting such scanners.
Another type of camera that may be used for the one, two or more “geometry measurement” cameras and the texture camera is a rolling shutter camera. Rolling shutters are found in image capture devices that use complementary metal oxide semiconductor (CMOS) sensors, such as digital still and video cameras, cell phone cameras, CCTV cameras, and barcode readers. With a rolling shutter, a picture is captured by scanning across a scene rapidly, whether vertically, horizontally, or rotationally. In contrast with a global shutter, where an entire frame is captured at the same instant, not all parts of the image of the scene are recorded at the same instant with a rolling shutter. Despite the time lag in capture, the entire image of the scene is displayed at once as if it represented a single instant in time.
A rolling shutter camera has the advantage of being less expensive than a global shutter camera but is only suited to applications where the camera remains substantially fixed in position while an image is being acquired. In particular, since the pixels are acquired only sequentially by a rolling shutter camera, such a camera is not ideal in cases where there is movement of the camera during acquisition of an image, as is the case for handheld scanners, where the background and the object are moving relative to the common coordinate system of the scanner's camera(s). For this reason, this type of camera is ill-suited for high-end handheld scanners. The delay in data acquisition resulting from the use of a rolling shutter camera causes temporal distortions in the image. Moreover, the use of such cameras often results in a low contrast between ambient light and an illuminating pattern of light emitted by the scanner due to the comparatively long time during which the pixels are exposed to light.
Against the background described above, it is clear that there remains a need in the industry for improved low-cost handheld 3D scanners that alleviate at least some of the deficiencies associated with the use of rolling shutter cameras in handheld 3D scanners.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify all key aspects and/or essential aspects of the claimed subject matter.
The present disclosure presents handheld scanners and associated methods and systems that use rolling shutter cameras for metrology measurements as the one, two, three or more “geometry measurement” cameras. To diminish the effect of the temporal delays of rolling shutter cameras, the handheld scanner is configured so that activation of the projector of a structured light pattern is delayed until the pixels of the cameras are concurrently active and exposed to light. Following this, after a specific time period, the structured light pattern is deactivated. This process is repeated multiple times during the scan in order to acquire texture, geometry and positioning data over multiple frames.
In some implementations, infrared (IR) light sources may be used by the projector for the projected light pattern and IR light emitting diodes (LEDs) may be used to illuminate positioning targets on or near a surface being scanned. The use of IR light may assist in addressing the problem of insufficient contrast between the projected pattern and ambient light which occurs as a result of a long period for integration of light of the rolling shutter camera (when compared to global shutter cameras). In some implementations, an IR bandpass or longpass filter is used in front of the rolling shutter geometric camera lens to reject wavelengths of light other than IR. The use of IR projected light advantageously does not conflict with the use of a color camera as part of the scanner.
In some embodiments, the handheld scanner may include a color camera positioned alongside the one, two, three or more “geometry measurement” cameras. Like the geometry measurement cameras, the color camera is also configured as a rolling-shutter camera. Additionally, the color camera may, in some implementations, be equipped with a liquid crystal device (LCD) shutter configured to permit light to pass through and be captured by the camera sensor at certain specific time intervals and block light during other time intervals. A shortpass filter (or a band-stop filter, or a bandpass filter designed to transmit only the visible spectrum, approximately 400-700 nm) used with the color camera may allow white light to be incident on the LCD shutter while blocking light in the IR spectrum range. The LCD shutter may be configured to transmit white light to acquire a color texture image either synchronized with the geometry measurement cameras or with a delay from the acquisition of the geometry measurement cameras. In a specific implementation, the LCD shutter may comprise a single optical cell that covers the entire display area and can be toggled between an open state (a clear state allowing light to pass through) and a closed state (an opaque state that partially or fully blocks light from passing through). The different states may be achieved in different manners known in the art such as, for example, by applying a square wave drive voltage to open and close the LCD shutter.
According to one aspect of the disclosure, a scanner is provided for generating 3D data relating to a surface of a target object, the scanner comprising a) a scanner frame structure on which is mounted a set of imaging modules including: i. a light projector unit for projecting a structured light pattern onto the surface of the target object, ii. a set of cameras positioned alongside the light projector unit, said set of cameras including one or more rolling shutter cameras for capturing data conveying a set of images including reflections of the light pattern projected onto the surface of the target object, the one or more rolling shutter cameras having sensor surfaces defining a plurality of pixel lines, and b) one or more processors in communication with the set of imaging modules for receiving and processing the data conveying the set of images, wherein the one or more processors are further configured to send control signals to the light projector unit to intermittently project the structured light pattern in accordance with a specific sequence.
Some specific embodiments may include one or more of the following features: the one or more processors may be configured for sending control signals to the light projector unit to intermittently project the structured light pattern in accordance with the specific sequence to cause the light projector unit to toggle between: i. an activated pattern state, during which the light projector unit projects the structured light pattern onto the surface of the target object, and ii. a deactivated pattern state, during which the light projector unit: 1) omits to project the structured light pattern onto the surface of the target object, or 2) projects a substantially attenuated version of the structured light pattern. The sensor surfaces of the one or more rolling shutter cameras may be activated in accordance with an operating pattern as part of a capture cycle, the operating pattern being characterized by: a. specific time periods during which the individual pixel lines in the plurality of pixel lines are concurrently exposed for a current specific capture cycle, and b. other time periods distinct from the specific time periods during which specific subsets of the individual pixel lines in the plurality of pixel lines are read and cease to be exposed for the current specific capture cycle, wherein the specific subsets of the individual pixel lines omit at least one of the individual pixel lines in the plurality of pixel lines. The specific subsets of the individual pixel lines may omit at least some of the individual pixel lines in the plurality of pixel lines. The activated pattern state of the light projector unit may at least partially coincide with the specific time periods during which the individual pixel lines in the plurality of pixel lines are concurrently exposed for the current specific capture cycle.
The deactivated pattern state of the light projector unit may at least partially coincide with the time periods during which subsets of the individual pixel lines in the plurality of pixel lines are read and cease to be exposed for the current specific capture cycle. The one or more processors may be configured for: a. sending a reset signal to the one or more rolling shutter cameras to start a new specific capture cycle for the plurality of pixel lines during which pixel lines in the plurality of pixel lines sequentially begin to be exposed, b. following a first delay period after the sending of the reset signal, sending an activation control signal to the light projector unit to cause it to toggle into the activated pattern state, c. following a second delay period after the sending of the activation signal to the light projector unit, sending a deactivation control signal to the light projector unit to cause it to toggle into the deactivated pattern state. In some implementations, the light projector unit may include a light source configured for emitting light with wavelengths in a specific wavelength range. The one or more rolling shutter cameras may include at least one rolling shutter geometric camera for generating image data to derive 3D measurements of the surface of the object, the at least one rolling shutter geometric camera being configured for: a. allowing light with wavelengths in the specific wavelength range to pass through onto the sensor surfaces, b. substantially attenuating light in spectrums outside the specific wavelength range. The light source may be configured to emit at least one of a white light, an infrared light and a blue light. In a very specific implementation, the specific wavelength range may be an infrared wavelength range. The light source may be configured to emit light having wavelengths between 405 nm and 1100 nm.
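The reset-and-delay sequence described above can be expressed as a short control sketch. This is an illustrative sketch only, not the disclosure's implementation: the `CameraBus` and `ProjectorBus` interfaces, method names, and delay values are hypothetical placeholders standing in for the scanner's actual hardware signaling.

```python
import time

class CaptureCycleController:
    """Hypothetical sketch of the reset / activate / deactivate sequence."""

    def __init__(self, camera_bus, projector_bus,
                 first_delay_s=0.004, second_delay_s=0.002):
        self.camera_bus = camera_bus          # sends the reset signal to the rolling shutter cameras
        self.projector_bus = projector_bus    # toggles the structured light pattern states
        self.first_delay_s = first_delay_s    # time until all pixel lines are concurrently exposed
        self.second_delay_s = second_delay_s  # duration of the activated pattern state

    def run_cycle(self):
        # a. start a new capture cycle; pixel lines sequentially begin to be exposed
        self.camera_bus.send_reset()
        # b. after the first delay period, all pixel lines are concurrently exposed:
        #    toggle the projector into the activated pattern state
        time.sleep(self.first_delay_s)
        self.projector_bus.activate_pattern()
        # c. after the second delay period, toggle back into the deactivated
        #    pattern state before the pixel lines are sequentially read out
        time.sleep(self.second_delay_s)
        self.projector_bus.deactivate_pattern()
```

Repeating `run_cycle` once per frame reproduces the intermittent projection sequence described above.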
The light source may be embodied in a variety of different devices including, for example but without being limited to, a laser and one or more light emitting diodes (LEDs). The one or more rolling shutter cameras may include at least one rolling shutter geometric camera for generating image data to derive 3D measurements of the surface of the object. The at least one rolling shutter geometric camera may include at least two rolling shutter geometric cameras. The rolling shutter geometric camera may include a near infrared camera and/or may include an infrared filter configured to let infrared light pass and to substantially attenuate light in spectrums outside infrared.
In some embodiments, the one or more rolling shutter cameras may further include a rolling shutter color camera for generating image data to derive texture information associated with the surface of the object. The rolling shutter color camera may comprise a liquid crystal device (LCD) shutter. For example, the rolling shutter color camera may comprise a. a sensor, and b. a lens, c. wherein the liquid crystal device (LCD) shutter is positioned between the sensor and the lens. The one or more processors may be configured for sending control signals to the LCD shutter for toggling the LCD shutter between an open state and a closed state, wherein in the open state the LCD shutter is translucent and wherein in the closed state the LCD shutter is at least partially opaque. In some specific implementations, in the closed state the LCD shutter may be fully opaque so that light incident on the LCD shutter is substantially blocked from passing through the LCD shutter. The toggling of the LCD shutter between the open state and the closed state may at least partially coincide with the light projector unit toggling between the activated pattern state and the deactivated pattern state so that: a. the LCD shutter is in the open state at least partially concurrently while the light projector unit is in the activated pattern state, b. the LCD shutter is in the closed state at least partially concurrently while the light projector unit is in the deactivated pattern state. The light projector unit may be a first light projector unit projecting light of a first type including the structured light pattern, and the scanner may comprise a second light projector unit including a second projector light source configured for projecting light of a second type onto the surface of the object. The second projector light source may be a white light source and the light of the second type may be a white light.
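The coordination between the LCD shutter states and the projector pattern states described above can be modeled as a periodic schedule. The following sketch is illustrative only; the function name, parameter names, and timing values are hypothetical, and the optional delay parameter models the case where texture acquisition occurs with a delay relative to the pattern projection.

```python
def lcd_shutter_state(t, pattern_on_s, pattern_off_s, period_s, shutter_delay_s=0.0):
    """Return 'open' or 'closed' for time t within a periodic capture cycle.

    pattern_on_s / pattern_off_s: start and end of the activated pattern
    state within one period. shutter_delay_s optionally offsets the shutter
    window relative to the pattern (e.g. delayed texture capture).
    """
    phase = t % period_s
    start = (pattern_on_s + shutter_delay_s) % period_s
    end = (pattern_off_s + shutter_delay_s) % period_s
    # the open window may wrap around the end of the period
    in_window = (start <= phase < end) if start < end else (phase >= start or phase < end)
    return "open" if in_window else "closed"
```

With `shutter_delay_s=0.0` the shutter is open exactly while the pattern is in the activated state, matching the coinciding states described above.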
The second projector light source may include one or more LEDs and/or lasers, for example. The rolling shutter color camera may comprise a filter for blocking at least in part wavelengths of light corresponding to the wavelengths of light projected by the first light projector unit; for example, the filter may be configured to block light in the infrared spectrum. The one or more rolling shutter cameras in the set of cameras may be mounted to have fields of view at least partially overlapping with one another. The one or more rolling shutter cameras may include two rolling shutter cameras, three rolling shutter cameras or more cameras. In some very specific implementations, the rolling shutter cameras may include at least two rolling shutter geometric cameras and at least one rolling shutter color camera.
In some embodiments, the one or more processors may be further configured to send control signals to the light projector unit to intermittently project the structured light pattern in accordance with the specific sequence, wherein the specific sequence is a periodic sequence so that the light projector unit intermittently projects the structured light pattern onto the surface of the object at regular time intervals. In some implementations, the one or more processors may be configured for processing the set of images including the reflections of the structured light pattern to perform a 3D reconstruction process of the surface of the target object. In some alternative implementations, the one or more processors are configured for transmitting the data conveying the set of images including the reflections of the structured light pattern to a remote computing system distinct from the scanner, the remote computing system being configured for performing a 3D reconstruction process of the surface of the target object using the data conveying the set of images including the reflections of the light pattern. In some specific practical implementations, the scanner may be a handheld scanner or a fixed-mounted scanner, for example.
According to another aspect of the disclosure, a scanning system is provided for generating 3D data relating to a surface of a target object, the scanning system comprising: a. a scanner as described above; b. a computing system in communication with said scanner, the computing system being configured for: i. performing a 3D reconstruction process of the surface of the target object using the data conveying the set of images including the reflections of the structured light pattern captured by the scanner; and ii. rendering on a graphical user interface displayed on a display device a visual representation of at least a portion of the surface of the target object resulting from the 3D reconstruction process.
According to another aspect of the disclosure, a method is provided for generating 3D data relating to a surface of a target object using a 3D scanner, the 3D scanner having a set of imaging modules including a light projector and a set of cameras, the light projector being configured to project a structured light pattern onto the surface of the target object, the set of cameras including one or more rolling shutter cameras for capturing data conveying a set of images including reflections of the structured light pattern projected onto the surface of the target object, the one or more rolling shutter cameras having sensor surfaces defining a plurality of pixel lines, the method comprising a. sending control signals to the light projector unit to cause it to intermittently project the structured light pattern according to a specific sequence by toggling the light projector unit between: i. an activated pattern state, during which the light projector unit projects the structured light pattern onto the surface of the target object, ii. a deactivated pattern state, during which the light projector unit: 1) omits to project the structured light pattern onto the surface of the target object; or 2) projects a substantially attenuated version of the structured light pattern; b. wherein occurrences of the activated pattern state of the light projector unit coincide at least in part with time periods during which the plurality of pixel lines are concurrently exposed in a same capture cycle; c. processing the set of images to perform a 3D reconstruction process of the surface of the target object.
Some specific embodiments may include one or more of the following features: the sensor surfaces of the one or more rolling shutter cameras are activated in accordance with an operating pattern as part of a current specific capture cycle, the operating pattern being characterized by: a. specific time periods during which the individual pixel lines in the plurality of pixel lines are concurrently exposed in a current specific capture cycle, and b. other time periods distinct from the specific time periods during which specific subsets of the individual pixel lines in the plurality of pixel lines are read and cease to be exposed for the current specific capture cycle, wherein the specific subsets of the individual pixel lines omit at least one of the individual pixel lines in the plurality of pixel lines. The specific subsets of the individual pixel lines may omit at least some of the individual pixel lines in the plurality of pixel lines. The activated pattern state of the light projector unit may at least partially coincide with the specific time periods during which the individual pixel lines in the plurality of pixel lines are concurrently exposed for the current specific capture cycle. The deactivated pattern state of the light projector unit at least partially coincides with the time periods during which subsets of the individual pixel lines in the plurality of pixel lines are read and cease to be exposed for the current capture cycle. In some embodiments, the method may further include: a. sending a reset signal to the one or more rolling shutter cameras to restart a new specific capture cycle for the plurality of pixel lines during which pixel lines in the plurality of pixel lines sequentially begin to be exposed for the new specific capture cycle, b. following a first delay period after the sending of the reset signal, sending an activation control signal to the light projector unit to cause it to toggle into the activated pattern state, c.
following a second delay period after the sending of the activation signal to the light projector unit, sending a deactivation control signal to the light projector unit to cause it to toggle into the deactivated pattern state.
In some specific embodiments, the one or more rolling shutter cameras may include a rolling shutter color camera for generating image data to derive texture information associated with the surface of the object, and the rolling shutter color camera may in some cases comprise a liquid crystal device (LCD) shutter. The method may comprise sending control signals to the LCD shutter for toggling the LCD shutter between an open state and a closed state, wherein in the open state the LCD shutter is translucent and wherein in the closed state the LCD shutter is at least partially opaque. The toggling of the LCD shutter between the open state and the closed state may at least partially coincide with the light projector unit toggling between the activated pattern state and the deactivated pattern state so that: a. the LCD shutter is in the open state at least partially concurrently while the light projector unit is in the activated pattern state, b. the LCD shutter is in the closed state at least partially concurrently while the light projector unit is in the deactivated pattern state.
In some embodiments, the one or more processors may be further configured to send control signals to the light projector unit to intermittently project the structured light pattern in accordance with the specific sequence, wherein the specific sequence is a periodic sequence so that the light projector unit intermittently projects the structured light pattern onto the surface of the object at regular time intervals. In some implementations, the method may comprise processing the set of images including the reflections of the structured light pattern to perform a 3D reconstruction process of the surface of the target object. In some alternative implementations, the method may comprise transmitting the data conveying the set of images including the reflections of the structured light pattern to a remote computing system distinct from the scanner, the remote computing system being configured for performing a 3D reconstruction process of the surface of the target object using the data conveying the set of images including the reflections of the light pattern.
According to another aspect of the disclosure, a computer program product is provided including program instructions tangibly stored on one or more tangible computer readable storage media, the instructions of the computer program product, when executed by one or more processors, cause a 3D scanner to perform operations to generate 3D data relating to a surface of a target object, the 3D scanner having a set of imaging modules including a light projector and a set of cameras, the light projector being configured to project a structured light pattern onto the surface of the target object, the set of cameras including one or more rolling shutter cameras for capturing data conveying a set of images including reflections of the structured light pattern projected onto the surface of the target object, the one or more rolling shutter cameras having sensor surfaces defining a plurality of pixel lines, the operations implementing the method described above.
According to another aspect of the disclosure, a scanner is provided for generating 3D data relating to a surface of a target object. The scanner comprises a scanner frame structure on which is mounted a set of imaging modules including (i) a light projector unit for projecting a structured light pattern onto the surface of the target object, the light projector unit having a light source configured for emitting light with wavelengths in a specific wavelength range; and (ii) a set of cameras positioned alongside the light projector unit, the set of cameras including one or more rolling shutter cameras for capturing data conveying a set of images including reflections of the structured light pattern projected onto the surface of the target object, the one or more rolling shutter cameras having sensor surfaces defining a plurality of pixel lines and including at least one rolling shutter geometric camera configured for (1) allowing light with wavelengths in the specific wavelength range to pass through onto the sensor surfaces; and (2) substantially attenuating light in spectrums outside the specific wavelength range. The scanner further comprises one or more processors in communication with the set of imaging modules for receiving and processing the data conveying the set of images, wherein the one or more processors are further configured to send control signals to the light projector unit to intermittently project the structured light pattern in accordance with a specific sequence to cause the light projector unit to toggle between an activated pattern state, during which the light projector unit projects the structured light pattern onto the surface of the target object, and a deactivated pattern state, during which the light projector unit omits to project the structured light pattern onto the surface of the target object, or projects a substantially attenuated version of the structured light pattern.
According to another aspect of the disclosure, a scanner is described for generating 3D data relating to a surface of a target object, the scanner comprising (a) a scanner frame structure on which is mounted a set of imaging modules including (i) a light projector unit for projecting a structured light pattern onto the surface of the target object, (ii) a set of cameras positioned alongside the light projector unit, said set of cameras including one or more rolling shutter cameras for capturing data conveying a set of images including reflections of the light pattern projected onto the surface of the target object, the one or more rolling shutter cameras having sensor surfaces defining a plurality of pixel lines, wherein the one or more rolling shutter cameras include at least 1) a rolling shutter geometric camera; and 2) a rolling shutter color camera comprising a liquid crystal device (LCD) shutter for generating image data to derive texture information associated with the surface of the object. The scanner further comprises one or more processors in communication with the set of imaging modules for receiving and processing the data conveying the set of images, wherein the one or more processors are further configured to send control signals to the light projector unit to intermittently project the structured light pattern in accordance with a specific sequence to cause the light projector unit to toggle between (i) an activated pattern state, during which the light projector unit projects the structured light pattern onto the surface of the target object, and (ii) a deactivated pattern state, during which the light projector unit (1) omits to project the structured light pattern onto the surface of the target object, or (2) projects a substantially attenuated version of the structured light pattern.
All features of exemplary embodiments which are described in this disclosure and are not mutually exclusive can be combined with one another. Elements of one embodiment or aspect can be utilized in the other embodiments/aspects without further mention. Other aspects and features of the present invention will become apparent to those ordinarily skilled in the art upon review of the following description of specific embodiments in conjunction with the accompanying Figures.
The above-mentioned features and objects of the present disclosure will become more apparent with reference to the following description taken in conjunction with the accompanying drawings, wherein like reference numerals denote like elements and in which:
In the drawings, exemplary embodiments are illustrated by way of example. It is to be expressly understood that the description and drawings are only for the purpose of illustrating certain embodiments and are an aid for understanding. They are not intended to be a definition of the limits of the invention.
A detailed description of one or more specific embodiments of the invention is provided below along with accompanying Figures that illustrate principles of the invention. The invention is described in connection with such embodiments, but the invention is not limited to any specific embodiment described. The scope of the invention is limited only by the claims. Numerous specific details are set forth in the following description in order to provide a thorough understanding of the invention. These details are provided for the purpose of describing non-limiting examples and the invention may be practiced according to the claims without some or all of these specific details. For the purpose of clarity, technical material that is known in the technical fields related to the invention has not been described in great detail so that the invention is not unnecessarily obscured.
The present disclosure presents handheld scanners and associated methods and systems that use rolling shutter cameras for metrology measurements as the one, two, three or more “geometry measurement” cameras. To diminish the effect of the temporal delays of rolling shutter cameras, the handheld scanner is configured so that the intermittent activation of the projector of a structured light pattern is delayed until the pixels of the cameras are concurrently active and exposed to light. Following this, after a specific time period, the structured light pattern is deactivated. This process is repeated multiple times during the scan in order to acquire geometry and positioning data over multiple frames.
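The repeated per-frame process described above can be sketched as a simple acquisition loop. The callables passed in are hypothetical stand-ins for the scanner's projector-control and camera-readout routines, which are not specified in this disclosure.

```python
def scan(run_cycle, read_frames, n_frames):
    """Repeat the delayed-projection capture cycle n_frames times.

    run_cycle: callable that delays projector activation until all pixel
        lines are concurrently exposed, projects the pattern, then
        deactivates it (hypothetical hardware routine).
    read_frames: callable returning the images read out from the cameras
        for the current cycle (geometry and positioning data).
    """
    frames = []
    for _ in range(n_frames):
        run_cycle()                    # project only while all lines are exposed
        frames.append(read_frames())   # accumulate data over multiple frames
    return frames
```

The returned list holds one set of images per capture cycle, from which geometry and positioning data are derived over the course of the scan.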
Infrared (IR) light sources may be used by the projector for the projected light pattern and IR light emitting diodes (LEDs) may be used to illuminate positioning targets on or near a surface being scanned. The use of IR light may assist in addressing the problem of insufficient contrast between the projected pattern and ambient light, which occurs as a result of the comparatively long light-integration and exposure period of the rolling shutter camera (when compared to global shutter cameras). In some applications, an IR filter may be used in front of the rolling shutter camera lens to better select reflected IR light.
In some embodiments, the handheld scanner may include a color camera positioned alongside the one, two, three or more “geometry measurement” cameras that also is configured as a rolling-shutter camera. The color camera may be equipped with a liquid crystal device (LCD) shutter configured to permit light to pass through and be captured by the rolling shutter camera sensor at certain specific time intervals and block light during other time intervals. A shortpass filter may allow white light to be incident on the LCD shutter but largely exclude IR radiation. The LCD shutter may be configured to transmit the incident white light to acquire a color texture image either synchronized with the geometry measurement cameras or with a delay from the acquisition of the geometry measurement cameras. In a specific implementation, the LCD shutter may comprise a single optical cell that covers the entire display area and can be toggled between an open state (a clear state allowing light to pass through) and a closed state (an opaque state that partially or fully blocks light from passing through). The different states may be achieved in different manners known in the art such as, for example, by applying a square wave drive voltage to open and close the LCD shutter.
Herein, a “pixel line” refers to a single linear array of connected pixels within an array of pixels. An array of pixels comprises a set of pixel lines, wherein a set of pixel lines includes two, three or more pixel lines.
Using the 3D scanner 100 with at least one processor 160, 3D points can be obtained after applying a suitable computer-implemented method where two images of a frame are captured using the two cameras C1, C2. In metrology, with a handheld scanner, the two images are captured nearly simultaneously, typically within less than 1 ms of each other, meaning that there is no relative displacement between the scene and the 3D scanner 100 during the acquisition of the images, or that this relative displacement is negligible. The cameras are synchronized to either capture the images at the same time or sequentially during a time period during which the relative position of the 3D scanner 100 with respect to the scene remains the same or varies within a predetermined negligible range. Such simultaneous capture is typically carried out using cameras with global shutters, which take an image when all pixels of each camera are exposed to incident light at the same time that the pattern of light is projected from the light projector unit P.
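The way two near-simultaneous images yield a 3D point can be sketched with the classic triangulation relation for a rectified stereo pair. This is an illustrative simplification, not the disclosed computation, and the focal length, baseline, and disparity figures below are hypothetical values chosen only for the example:

```python
# Illustrative rectified-stereo triangulation: a surface point imaged by
# both cameras C1 and C2 appears shifted (disparity) between the two
# images; depth is recovered from the calibrated geometry.
# All numeric values here are hypothetical, not from the disclosure.

def depth_from_disparity(focal_px: float, baseline_mm: float,
                         disparity_px: float) -> float:
    """Depth (mm) of a point seen by both cameras of a calibrated pair."""
    if disparity_px <= 0:
        raise ValueError("point must be visible in both images with positive disparity")
    return focal_px * baseline_mm / disparity_px

# Example: 1400 px focal length, 80 mm baseline, 28 px disparity
z = depth_from_disparity(1400.0, 80.0, 28.0)
print(z)  # -> 4000.0 mm
```

Because both images must correspond to the same scanner pose, any disparity error introduced by relative motion during capture translates directly into a depth error, which is why the synchronization described above matters.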
The 3D scanner 100 is configured to obtain distance measurements between the 3D scanner 100 and a set of points on the surface of the object 110 of interest. Since from a given viewpoint the 3D scanner 100 can only acquire distance measurements on the visible or near portion of the surface, the 3D scanner 100 is moved to a plurality of viewpoints to acquire sets of distance measurements that cover the portion of the surface of the object 110 that is of interest. Using the 3D scanner 100, a model of the object's surface geometry can be built from the set of distance measurements and rendered in the coordinate system of the object 110. The object 110 has several object visual targets 117 affixed to its surface and/or on a rigid surface adjacent to the object 110 that is stationary with respect to the object 110. In some specific practical implementations, to properly position the scanner 100 in space, the object visual targets 117 are affixed by a user to the object 110, although the object visual targets 117 may also be omitted.
In the embodiment of
The cameras C1, C2 and the light projector unit P or light projector units P1, P2 are calibrated in a common coordinate system using methods known in the art. In some practical implementations, films performing bandpass filter functions may be affixed on the camera lenses to match the wavelength(s) of the projector P. Such films performing bandpass filter functions may help reduce interference from ambient light and other sources.
In the handheld 3D scanner, one or more LEDs 38 can also be included. The LEDs 38 can be configured to all emit the same type of light as each other or be configured to emit different types of light. For example, some LEDs 38 can emit white light (e.g., the LEDs 38 closest to the third camera 34) while others of the LEDs 38 can emit IR light (e.g., the LEDs 38 closest to the first and second cameras 31, 32). In one embodiment, the LEDs 38 are configured to emit IR radiation of the same or similar wavelength as the light projector unit 36.
In some embodiments, the first and second cameras 31, 32 are monochrome cameras, and the type of camera used will depend on the type of light source of the projector unit 36. In some embodiments, the first and second cameras 31, 32 are monochrome or color visible spectrum and near infrared cameras and the projector unit 36 is an infrared light generator or near-infrared light generator. The first and second cameras 31, 32 may implement any suitable shutter technology, including but not limited to rolling shutters, full frame shutters, electronic shutters and the like. Specific embodiments of the shutters used with the first and second cameras 31, 32 are discussed in detail below.
In some implementations, the third camera 34 may be a color camera (also called a texture camera). The texture camera may implement any suitable shutter technology, including but not limited to, rolling shutters, global shutters, and the like. Specific embodiments of the shutters used with the third camera 34 are discussed in detail below.
The first camera 31 is positioned on the main member 52 of the frame structure 20 and alongside the projector unit 36. The first camera 31 is generally oriented in a first camera direction and configured to have a first camera field of view (120 in
The third camera 34 (e.g., the texture camera or color camera) is also positioned on the main member 52 of the frame structure 20 and, as depicted, may be positioned alongside the first camera 31, the second camera 32, and the projector unit 36. The third camera 34 (e.g., the texture camera) is oriented in a third camera direction and is configured to have a third camera field of view at least partially overlapping with the field of projection, with the first field of view 120, and with the second field of view 122.
A data connection 45 (such as a USB connection) can transfer data collected by the first camera 31, the second camera 32 and the third camera 34 to be processed by a computer processor and memory remote from the handheld 3D scanner 10. Such a remote computer processor and memory are in communication with the processor 160 (shown in
The first and second cameras 31, 32, as well as the third camera 34 (the texture camera), use rolling shutters.
In
In
Rolling shutter cameras have simpler electronic components than global shutter cameras and so are less expensive. However, such cameras are not normally used in metrology applications. In a rolling shutter camera, there is a temporal delay between the exposure of each pixel line of the camera. While the temporal delay between adjacent pixel lines is very small, the time delay between the first line and the last line (e.g., TN-T1) can be significant. While such a delay may not cause problems in a completely stationary setup (where the cameras in the scanner, the background, and the object are all fixed and stationary with respect to each other), in mobile handheld scanners the background and the object are moving relative to the scanner cameras' common coordinate system. The time delay between capture of the first line and the last line of a rolling shutter can cause distortions resulting in an unacceptably large measurement error. Additionally, the long exposure time, as each pixel line in turn begins acquiring a signal, can create issues due to diminished contrast of the projected pattern of light relative to the ambient light.
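The magnitude of this rolling-shutter distortion can be estimated with a short calculation. The sweep speed below is a hypothetical figure for illustration; the 14.2 ms first-to-last line span matches the 1944-line example given later in this disclosure:

```python
# Rough magnitude of rolling-shutter skew for a moving handheld scanner:
# the scene shifts between the exposure of the first and last pixel lines.
# The 100 mm/s sweep speed is hypothetical; 14.2 ms is the first-to-last
# line delay of the 1944-line example discussed below.

def skew_mm(scanner_speed_mm_s: float, first_to_last_delay_ms: float) -> float:
    """Apparent displacement of the scene between first and last pixel line."""
    return scanner_speed_mm_s * first_to_last_delay_ms / 1000.0

print(round(skew_mm(100.0, 14.2), 2))  # -> 1.42 (mm of skew across the frame)
```

A skew on the order of a millimetre would dwarf the accuracy expected of metrology-grade scanning, which motivates the projector-gating approach described next.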
In
Diagram 400 illustrates two different capture cycles 415, 425. Capture cycles 415, 425 are substantially identical to each other. Each capture cycle 415, 425 represents the acquisition of a frame or image. In each capture cycle, the pixel lines are set or reset to start capturing data for a new image, the subsequently captured data is read out, and any delay between cycles is allowed to elapse before the next cycle begins.
Beginning a first capture cycle 415, a first signal S1 is sent (e.g., by the processor or a processor within the camera itself) to reset data of the first pixel line PL1. The first pixel line PL1 then begins newly integrating the light signal incident on the first pixel line PL1 starting at T1, to form the first line of a new image or frame. Next, a second signal S2 is sent to reset the second pixel line PL2. The second signal S2 can be sent at the same time T1 that PL1 begins capture for the current capture cycle, or immediately before or after that time. Following the second signal S2, the second pixel line PL2 resets and begins newly acquiring and integrating the light signal incident on the second pixel line PL2 at T2. A series of signals are sent that trigger consecutive pixel lines to reset and begin a new capture until resetting of the final pixel line PLN is triggered by signal SN and the final pixel line begins integrating light at TN to form the last image line for the new cycle. The signals S1 to SN are sent in a timed sequence, for example, at regular intervals. The interval between S1 and S2 can be 0.0073 ms and the interval between S1 and SN can be 14.2 ms (for an array with 1944 pixel lines). The time between S1 and E1 can be 17.7 ms, and the time between E1 and EN (which is equivalent to S1 to SN) can be 14.2 ms.
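The timing figures in this example can be checked with a short calculation: the window during which all pixel lines integrate light concurrently is the per-line exposure (S1 to E1) minus the first-to-last reset span (S1 to SN):

```python
# Sketch of the timing example above: each pixel line is reset 0.0073 ms
# after the previous one, and each line integrates light for 17.7 ms.
# All lines are concurrently exposed from the reset of the last line (SN)
# until the readout of the first line (E1).

LINE_INTERVAL_MS = 0.0073   # delay between consecutive reset signals Sk, Sk+1
EXPOSURE_MS = 17.7          # per-line integration time, S1 to E1
NUM_LINES = 1944

last_line_start = (NUM_LINES - 1) * LINE_INTERVAL_MS  # SN relative to S1
first_line_end = EXPOSURE_MS                          # E1 relative to S1
concurrent_window = first_line_end - last_line_start

print(round(last_line_start, 1))    # -> 14.2 (ms, S1 to SN)
print(round(concurrent_window, 1))  # -> 3.5 (ms when all lines are exposed)
```

The resulting 3.5 ms window matches the projected structured light pattern time period LP given below, confirming that the pattern can only be projected while every line is integrating.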
At this point, all the pixels in the pixel lines PL1 to PLN are reset and concurrently acquiring data belonging to the current capture cycle 415. Only once all the pixel lines PL1 to PLN are concurrently acquiring data for the current capture cycle is the projector unit 36 (and optionally any LEDs) triggered to toggle from the deactivated pattern state, during which it does not project the structured light pattern onto the surface of the target object (or projects a substantially attenuated version of the structured light pattern), to the activated pattern state, during which the light projector unit projects the structured light pattern onto the surface of the target object. For example, the one or more processors send control signals to the light projector unit to cause it to toggle into the activated pattern state when all of the pixel lines are concurrently exposed to light for the current frame/image (and also to toggle into a deactivated pattern state when one or more pixel lines have ceased being exposed to light for the current frame/image). In another example, the one or more processors send control signals to the light projector unit to cause it to toggle into the activated pattern state after a sufficient time has elapsed since S1 to allow for all of the pixel lines to be concurrently exposed to light.
The light projected from the projector unit is reflected back from the object and received during a projected structured light pattern time period, LP. The projected structured light pattern time period LP is shown as near simultaneous with time TN where all the pixels are reset and concurrently acquiring data for the current capture cycle, and in fact takes place just after time TN (e.g., immediately after, in response to detecting all the pixel lines are concurrently exposed to light, or in response to detecting that the required time period has elapsed). The projected structured light pattern as reflected back from the object thus will be detected simultaneously by all pixels during the projected light pattern time period LP. The time period LP associated with the activated pattern state coincides with the time period during which the pixel lines have been reset and concurrently are exposed to light in the same capture cycle, and the time period associated with the deactivated pattern state of the projector unit coincides with all other time periods. The time period LP associated with the activated pattern state can be, for example, 3.5 ms.
The structured light pattern projection is then turned off in conjunction with a stop signal, sent by the processor or a processor within the camera itself, to read out the data captured by the pixels of the camera for the current frame. First, an end signal E1 triggers readout of the data just detected by the first pixel line PL1 (e.g., the light integrated since time T1). Subsequent end signals are sent to sequentially read out all the pixel lines until the final end signal EN. At this point the data of the pixels in the pixel lines PL1 to PLN have been read. A new capture cycle 425 then begins, where the previous sequence of signals and pixel reset and readout is repeated. A cycle delay time may elapse between the end of one cycle and the beginning of the next cycle (e.g., a cycle delay time between the time of the EN signal for cycle 415 and the time of the S1 signal of cycle 425). The cycle delay time may be chosen to determine the number of cycles per second. These capture cycles can occur multiple times a second during a metrology measurement, for example, 15, 30, 60 or more times per second.
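Combining the example timings above (17.7 ms exposure plus 14.2 ms readout span), the cycle delay needed to hit a target frame rate can be sketched as follows. The function name and structure are illustrative, not from the disclosure:

```python
# Sketch: choosing the inter-cycle delay for a target frame rate.
# One full cycle spans S1 to EN (exposure + line-readout span, using the
# example figures above); the remaining per-frame time budget is spent
# as the cycle delay. Illustrative only, not the disclosed controller.

def cycle_delay_ms(target_fps: float, exposure_ms: float = 17.7,
                   readout_span_ms: float = 14.2) -> float:
    budget = 1000.0 / target_fps            # total time per frame
    active = exposure_ms + readout_span_ms  # S1 to EN of one cycle
    delay = budget - active
    if delay < 0:
        raise ValueError("frame rate too high for this exposure and readout")
    return delay

print(round(cycle_delay_ms(30.0), 1))  # -> 1.4 (ms of idle time at 30 fps)
```

Note that with these particular example figures, 60 fps would require a shorter exposure; the frame rates and timing values in the disclosure are independent examples.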
Typically, high-resolution rolling shutter cameras have memory sufficient to contain only a few pixel lines of an image and transfer the data immediately as they begin a new cycle of being exposed to light. For portability, it is desirable for data transfer to be carried out with a single USB connection. For a 3D scanner such as described with respect to
The method as illustrated in
The IR rolling shutter scanner 505 has two rolling shutter cameras 515, 520 (equivalent to first and second cameras 31, 32 in
In the IR rolling shutter scanner 505, an IR projector 555 is used as a projector unit 36 (of
In the IR rolling shutter scanner 505, one or more IR LEDs 560 are used, which are also configured to emit IR light 575 towards the object 110. The IR light from the IR LEDs 560 is used to illuminate object visual targets 117 on or near the object 110. The IR light 575 from the IR LEDs 560 is emitted simultaneously with the pattern of light 565 emitted by the IR projector 555 to simultaneously get data from the object visual targets 117 and the object itself.
In the IR rolling shutter scanner 505, one or more processors 160 control signals and process data. Data is transferred from the IR rolling shutter scanner 505 along data communication line 562.
In some embodiments, the IR projector 555 emits light at a wavelength of 850 nm. Other wavelengths are possible, for example, between 405 nm and 1100 nm. The IR LEDs 560 also can emit light at a wavelength of 850 nm. Other wavelengths are possible, for example, between 405 nm and 1100 nm.
The IR rolling shutter scanner 505 is a variation on the handheld 3D scanner 10 of
Similar to the IR rolling shutter scanner 505, the color scanner 705 includes rolling shutter cameras 515, 520 and IR projector 555, which operate as discussed with respect to
Also integrated into the color scanner 705 is a rolling shutter color camera 720 (equivalent to the third camera 34). The color camera includes a rolling shutter color sensor 730 (e.g., a CMOS sensor with an array of pixels that is configured as a rolling shutter camera as described above).
In the rolling shutter color camera 720, an LCD shutter 743 is placed in front of the color sensor 730 and behind the lens 751 of the rolling shutter color camera 720. An LCD shutter such as the LCD shutter 743 embodied herein includes two polarizers set at 90 degrees with respect to each other with a liquid crystal layer in between. As is known in the art, such an LCD shutter transmits light based on the angle of the incident light and allows toggling of the light exposure of the rolling shutter color camera 720 between on and off. The arrangement of the LCD shutter 743 behind the lens 751 (rather than in front of or embedded within the lens) allows the LCD shutter to be smaller in size than if it were positioned in front of the lens. Light transmitted through the lens is more parallel, so the position of the LCD shutter has less effect on the color detected. The positioning also relaxes tolerances on the optical quality of the LCD shutter.
Placement of the LCD shutter 743 behind the lens 751 also protects the shutter from the exterior environment and contaminants such as dust. Additionally, as LCD shutters are sensitive to temperature, the placement of the LCD shutter 743 behind the lens 751 enables easier temperature control.
Unlike the IR projector 555 that emits a structured light pattern, the white light projector 755 of the color scanner 705 emits a single “spotlight” of visible light. In some embodiments, the white light projector 755 has the form of white light LEDs. In some embodiments, the white light projector 755 may be omitted, and the rolling shutter color camera 720 makes use of white light in the ambient environment.
The rolling shutter color camera 720 includes a filter for blocking at least in part wavelengths of light corresponding to wavelengths of light projected by the IR light projector unit 555. Accordingly, a shortpass filter 753 is included in the rolling shutter color camera 720, so that the majority of incident light with longer wavelengths (e.g., IR radiation) is not transmitted to the lens 751, the LCD shutter 743, or the rolling shutter color sensor 730. Incident light 770 that is reflected back from the object 110 includes light emitted 765 by the color scanner 705, which can include both IR light from the IR projector 555 (and IR LEDs 560 if used) as well as visible light from the white light projector 755 (if used) and from the ambient environment. The rolling shutter color camera 720 thus advantageously uses white light and excludes the projected IR light. The color scanner 705 is thereby able to acquire the color of the object 110 (from received white light) simultaneously with the geometry and position (from received IR light). Rather than alternating between visible projected light (e.g., from the white light projector 755) and IR projected light (from the IR projector 555), the two types of light can be projected and/or captured simultaneously. IR filters 545, 550 in front of the rolling shutter cameras 515, 520 filter out the white light, and so projected (and ambient) white light does not dilute the signal falling on the rolling shutter sensors 525, 530 that determine the 3D positions of the surface of the object 110. The two types of light do not need to be acquired in alternation with alternating patterns of projected light.
One or more processors (e.g., the processor 160) are configured for sending control signals to the LCD shutter for toggling the LCD shutter between an open state and a closed state, wherein in the open state the LCD shutter is transparent and wherein in the closed state the LCD shutter is opaque. In the closed state the LCD shutter is at least partially opaque (e.g., blocks at least 40%, more preferably at least 50% or most preferably at least 65% of the light) and, in some implementations, may be fully opaque so that a majority of light is blocked from passing through (e.g., blocks at least 75%, more preferably at least 80%, more preferably at least 85% and most preferably at least 95% of the light). In some embodiments, toggling the LCD shutter between the open state and the closed state is timed to interleave with the periods of time during which the light projector unit toggles between the activated pattern state (where P1 emits a structured IR light pattern) and the deactivated pattern state. This can occur so that when the light projector unit toggles into the activated pattern state, the LCD shutter also toggles into the open state, and when the light projector unit toggles into the deactivated pattern state, the LCD shutter toggles into the closed state. Such an arrangement advantageously allows the geometry data (acquired from the IR structured light pattern) to be acquired at the same instant as the texture data.
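The lockstep toggling described above can be sketched as a small controller in which the LCD shutter state mirrors the projector state. This is a hypothetical illustration of the synchronization logic, not the disclosed firmware; the class and method names are invented for the example:

```python
# Hypothetical sketch of the interleaved toggling: the LCD shutter opens
# when the IR projector enters the activated pattern state, and closes
# when the projector returns to the deactivated pattern state, so texture
# and geometry are captured over the same instants.

from dataclasses import dataclass

@dataclass
class SyncController:
    projector_on: bool = False  # activated pattern state when True
    lcd_open: bool = False      # LCD shutter transparent when True

    def set_projector(self, active: bool) -> None:
        """Toggle the projector; the LCD shutter follows in lockstep."""
        self.projector_on = active
        self.lcd_open = active  # open state coincides with pattern projection

ctrl = SyncController()
ctrl.set_projector(True)
print(ctrl.lcd_open)   # -> True: color texture captured with the IR pattern
ctrl.set_projector(False)
print(ctrl.lcd_open)   # -> False: shutter opaque between pattern projections
```

Driving both devices from the same control signal is one simple way to guarantee the "same instant" acquisition the text describes; the disclosure also permits a deliberate delay between the two, which this sketch does not model.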
Simultaneously, at step 820, signals are sent to control the behavior of the rolling shutter camera that is receiving white light. The LCD shutter is controlled to enter the open state (step 825) at the same time that white light is projected (step 827). The white light is projected and timed so that all pixels of the rolling shutter cameras are receiving reflections of the projected white light as permitted by both the rolling shutter and the LCD shutter. Accordingly, the LCD shutter is in the open state at least partially concurrently while the light projector unit is in the activated pattern state and the LCD shutter is in the closed state at least partially concurrently while the light projector unit is in the deactivated pattern state.
The captured white light signals are processed at step 830. The processed IR signals, which are indicative of the 3D surface configuration of the imaged object, and the processed white light signals, which are indicative of the color and appearance of the imaged object, are both output to a user at step 840. All of the steps of method 800 are repeated multiple times to fully characterize an object. Note that step 827 may be omitted in embodiments where the projector unit does not include a white light projector; in such embodiments white light from the environment is received at the LCD shutter. Steps 820 and 825 are repeated over multiple cycles.
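One cycle of this method can be summarized as a simple sequence. Only the step numbers named in the text (820, 825, 827, 830, 840) come from the disclosure; the function structure is illustrative:

```python
# Minimal sketch of one repeated cycle of method 800. Step numbers are
# from the disclosure; the function itself is an illustrative summary.

def acquisition_cycle(white_projector_present: bool = True) -> list:
    events = []
    events.append("step 820: send control signals to the white-light rolling shutter camera")
    events.append("step 825: LCD shutter enters the open state")
    if white_projector_present:
        events.append("step 827: project white light")  # omitted in ambient-only embodiments
    events.append("step 830: process the captured white light signals")
    events.append("step 840: output 3D surface and color/appearance data")
    return events

for event in acquisition_cycle():
    print(event)
```

With `white_projector_present=False`, step 827 drops out and the camera relies on ambient white light, matching the omission noted above.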
In a non-limiting example, some or all the functionality of the computer processor 992 (e.g., the processor 160 of
Those skilled in the art should appreciate that in some non-limiting embodiments, all or part of the functionality previously described herein with respect to the processing system may be implemented using pre-programmed hardware or firmware elements (e.g., microprocessors, FPGAs, application specific integrated circuits (ASICs), electrically erasable programmable read-only memories (EEPROMs), etc.), or other related components.
In other non-limiting embodiments, all or part of the functionality previously described herein with respect to a processor 160 of the 3D scanner 100 or 100′ may be implemented as software consisting of a series of program instructions for execution by one or more computing units. The series of program instructions can be tangibly stored on one or more tangible computer readable storage media, or the instructions can be tangibly stored remotely but transmittable to the one or more computing units via a modem or other interface device (e.g., a communications adapter) connected to a computer network over a transmission medium. The transmission medium may be either a tangible medium (e.g., optical or analog communications lines) or a medium implemented using wireless techniques (e.g., microwave, infrared or other transmission schemes).
The techniques described above may be implemented, for example, in hardware, software tangibly stored on a computer-readable medium, firmware, or any combination thereof. The techniques described above may be implemented in one or more computer programs executing on a programmable computer including a processor, a storage medium readable by the processor (including, for example, volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Program code may be applied to input entered using the input device to perform the functions described and to generate output. The output may be provided to one or more output devices.
Those skilled in the art should further appreciate that the program instructions may be written in a number of suitable programming languages for use with many computer architectures or operating systems.
Note that titles or subtitles may be used throughout the present disclosure for convenience of a reader, but in no way should these limit the scope of the invention. Moreover, certain theories may be proposed and disclosed herein; however, in no way should they, whether they are right or wrong, limit the scope of the invention so long as the invention is practiced according to the present disclosure without regard for any particular theory or scheme of action.
All references cited throughout the specification are hereby incorporated by reference in their entirety for all purposes.
It will be understood by those of skill in the art that throughout the present specification, the term “a” used before a term encompasses embodiments containing one or more of what the term refers to. It will also be understood by those of skill in the art that throughout the present specification, the term “comprising”, which is synonymous with “including,” “containing,” or “characterized by,” is inclusive or open-ended and does not exclude additional, un-recited elements or method steps.
Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention pertains. In the case of conflict, the present document, including definitions, will control.
As used in the present disclosure, the terms “around”, “about” or “approximately” shall generally mean within the error margin generally accepted in the art. Hence, numerical quantities given herein generally include such error margin such that the terms “around”, “about” or “approximately” can be inferred if not expressly stated.
Although various embodiments of the disclosure have been described and illustrated, it will be apparent to those skilled in the art in light of the present description that numerous modifications and variations can be made. The scope of the invention is defined more particularly in the appended claims.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/CA2022/050805 | 5/20/2022 | WO |