The present disclosure is generally related to the calibration, alignment, and/or the synchronization of multiple cameras, image sensors, and/or other sensor technologies. More particularly, the present disclosure relates to an active marker that emits light of more than one wavelength, and a calibration wand with one or more such multi-wavelength active markers, usable in the calibration, alignment, and/or the synchronization of multiple cameras, image sensors, and/or other sensor technologies, such as image sensors in a motion-capture system, and to related methods of calibration and tracking of objects.
Motion capture systems may be used to track movement(s) of a person, animal, or other object to which a computer model may be mapped to produce animation that accurately imitates real-world movement(s). Further, motion capture may allow animation movement to be produced more quickly than frame-by-frame animation. The shorter production time may permit an animation director to experiment with different movements or perspectives before mapping the movement to computer models, which may result in more flexible animation production.
Motion capture technology is also known for use in producing cinematic special effects. For example, a motion capture system may be utilized in conjunction with one or more cinema cameras to record a scene and simultaneously record the positions and movements of persons and items in the scene, and optionally the position, orientation, and movement of the cinema camera(s). The information on the positions and movements can then be utilized to produce or overlay animations or other special effects onto the scene recorded by the cinema camera(s).
Conventional motion-capture systems include multiple cameras or image sensors that detect one or more objects (including persons) within a capture volume around which the cameras or image sensors are positioned. One or more active markers with light-emitting devices, or passive markers such as retroreflectors, can be attached to the objects to facilitate precise location of the objects and parts thereof. For example, markers may be attached to key locations on a person, such as their head, nose, chin, chest, hands, fingers, joints, etc. In order to accurately capture positional information via the motion capture cameras or image sensors, e.g., via triangulation, it is necessary for the cameras or image sensors to be calibrated to account for their relative positions and orientations, and adjust for any synchronization errors. Such calibration methods may involve moving one or more active markers, or a calibration wand with one or more active markers, within the capture volume during the calibration routine, as described in U.S. Pat. No. 9,019,349 to Richardson, which is assigned to the assignee of the present application and incorporated herein by reference.
Some motion capture camera systems include image sensors that are sensitive to light waves of a particular wavelength, such as an infrared (IR) wavelength, allowing greater sensitivity to active markers emitting such wavelengths. Some motion capture cameras may also emit light waves of the particular wavelength to which their image sensors are sensitive, such as IR light of a particular wavelength, which is then reflected back to the motion capture camera by passive markers such as retroreflectors. Cinema cameras or other reference cameras that may be used in conjunction with a motion capture system have excellent fidelity in the visible spectrum, but may not be sensitive to IR light or to another wavelength of light utilized by the motion capture cameras, or may purposefully filter IR light or other wavelengths to improve color balance and video quality. Consequently, the calibration and synchronization of a cinema camera (or other reference camera) with a motion capture system may be complicated by the cinema cameras' lack of sensitivity to the wavelengths of light emitted by the active markers utilized by the motion capture cameras.
One known method of calibrating disparate camera systems sensitive to two different wavelengths involves the use of a T-shaped calibration wand having pairs of two different types of light-emitting diodes (LEDs) (i.e., active markers) mounted thereon—one that emits red light and one that emits IR light. The pairs of LEDs are spaced apart on the body of the wand, and within each pair the red and IR LEDs are positioned in side-by-side relation. The present inventors have recognized that the separation distance between the red and IR LEDs of each pair may lead to calibration errors as between the disparate camera systems. The red and IR LEDs are also recessed below the outer surface of the wand body, which limits the angle of emission of the LEDs and can result in one or both of the LEDs being occluded by the body of the wand and/or the other LED, depending on the orientation of the wand relative to the camera.
A need remains for a more accurate and reliable method of calibrating and synchronizing two or more camera systems sensitive to different wavelengths of light, and/or for improved active markers and calibration wands for use with such disparate co-calibrated camera systems.
A multi-wavelength active marker for image capture systems, such as a motion capture system, includes a first source emitting a first light wave having a first wavelength and a second source emitting a second light wave having a second wavelength different from the first wavelength. A waveguide is arranged adjacent the first source and the second source so that the first and the second light waves are transmitted into a first end of the waveguide and guided along the waveguide to a second end thereof, preferably through total internal reflection at the sides of the waveguide. A diffuser attached to the second end of the waveguide receives the first and second light waves and diffuses and emits them from the diffuser.
The waveguide may be shaped to converge the first and the second light waves from the first end to the second end, and the diffuser may be sized and shaped so that the first and second light waves are emitted from the diffuser with an emission angle of more than 180 degrees. For example, the diffuser may have a spherical emission surface.
In some embodiments, the first source is an LED that emits infrared (IR) or near-IR light, and the second source is an LED that emits visible light, such as red light. The first source and the second source may be mounted on a printed circuit board (PCB) adjacent one another in spaced-apart relation.
The multi-wavelength active markers may be used in a motion-capture system that may include a calibration wand having one or more of the multi-wavelength active markers. The calibration wand may comprise a T-shaped body with an elongated handle and a head extending transversely to the elongated handle, and with a plurality of the multi-wavelength active markers supported on the head. In some embodiments, the calibration wand may include three or more such multi-wavelength markers irregularly spaced apart. The motion capture system may include a first camera system sensitive to the first wavelength of light and a second camera system that is sensitive to the second wavelength of light and does not sense the first light wave.
Methods of calibrating camera systems are also disclosed, which include the steps of waving a calibration wand having one or more of the multi-wavelength active markers while emitting first and second light waves from a common emission surface of each of the markers, capturing the first light wave using one or more first sensors sensitive to the first wavelength, and capturing the second light wave using one or more second sensors sensitive to the second wavelength. The first and second sensors are then calibrated in response to capturing the respective first and second light waves from the multi-wavelength markers.
Additional aspects and advantages will be apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
This disclosure describes apparatuses, systems, and methods used to calibrate, align, and/or synchronize a plurality of cameras, image sensors, and/or other sensor technologies, which may be embedded in or on a motion-capture system. To achieve accurate calibration, alignment, and/or synchronization of the cameras, image sensors, and/or other sensor technologies, this disclosure describes a multi-wavelength active marker and a calibration wand. The calibration wand may include one or more multi-wavelength active markers.
In one aspect, the multi-wavelength active marker includes a first source that emits a first electromagnetic wave (first light) having a first wavelength, and a second source that emits a second electromagnetic wave (second light) having a second wavelength different from the first wavelength. The multi-wavelength active marker also includes a waveguide with a first end and a second end, where the waveguide is mechanically coupled to the first source and the second source so that the first and the second light waves are transmitted into the first end of the waveguide. The waveguide of the multi-wavelength active marker guides light from the first and second light sources from the first end to the second end. The multi-wavelength active marker also includes a diffuser that is mechanically coupled to the second end of the waveguide, where the diffuser diffuses the first and the second light waves. Thus, the emissions of the first and second light sources are optically aligned with one another, which facilitates calibration of disparate cameras or imaging systems sensitive to the different wavelengths of light.
In another aspect, even though the multi-wavelength active marker described herein includes at least two sources (e.g., light sources), the multi-wavelength active marker behaves, or operates, as if it had a single source that emits a first electromagnetic wave having a first wavelength and a second electromagnetic wave having a second wavelength that is different from the first wavelength.
For the sake of clarity, as used herein, the terms “light” and “light waves” may include infrared (IR) waves, IR radiation, or IR light; near-IR waves, near-IR radiation, or near-IR light; visible light; and ultraviolet (UV) waves, radiation, and/or light. And the term “electromagnetic waves” may include light; infrared (IR) waves, IR radiation, or IR light; near-IR waves, near-IR radiation, or near-IR light; visible light; ultraviolet (UV) waves, radiation, or light; X-rays; gamma rays; microwaves; radio waves; etc. For the sake of brevity, however, the present disclosure generally focuses on the use of IR or near-IR light and the use of visible light. Generally, IR light includes light having approximately an 800 nanometer (nm) to 1 millimeter (mm) wavelength, and near-IR light includes light having approximately an 800 to 2500 nm wavelength. Visible light may include red light (approximately 620 to 780 nm wavelength); orange light (585 to 620 nm); yellow light (570 to 585 nm); green light (490 to 570 nm); blue light (440 to 490 nm); indigo light (420 to 440 nm); violet light (400 to 420 nm); combinations thereof (e.g., white light); or any specific range of wavelengths in the visible spectrum. It is to be understood that there is an inverse relation between the wavelength and the frequency of light or any other electromagnetic wave. That is, a shorter wavelength translates to a higher frequency of the electromagnetic wave, and vice versa.
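By way of illustration (conventional physics, not specific to the disclosed marker), the inverse relation may be written as:

    \nu = \frac{c}{\lambda}, \qquad c \approx 2.998 \times 10^{8}\ \mathrm{m/s}

For example, an 850 nm IR emission corresponds to a frequency of roughly 3.5 × 10^14 Hz, while a 620 nm red emission corresponds to roughly 4.8 × 10^14 Hz.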
In some embodiments, the motion-capture system 100 may include more or fewer cameras 102 than are illustrated in
In some embodiments, the user (e.g., the director, the camera operator) may use a computing device 112 to operate the cameras 102. The computing device 112 communicates with the cameras 102 using a communication link 114, which may comprise wired and/or wireless communication links.
Communication(s) in the environment of
To facilitate motion capture, objects and bodies 116 in the capture volume 110 may be fitted with one or more markers 106 that are more easily identified by cameras 102 and motion capture software used therewith. Body 116 may belong to a person (e.g., actor, supporting staff), and the markers 106 may correspond to different joints or points of the person. In some embodiments, the body 116 may belong to an animal. In some embodiments, the body 116 may represent an object, such as a mechanized object.
In some embodiments, the markers 106 are passive markers that reflect incident light to enhance the brightness or the signal of the markers 106, and the cameras 102 can detect the reflected incident light. In some embodiments, each, some, or all of the cameras 102 may include a light source(s) that is/are substantially coaxially aligned with an axis extending perpendicularly outward from each of the camera(s) to provide illumination to the markers 106. The markers 106 of a passive variety may include retro-reflective material with corner cube reflectors, or other structures, patterns, and/or configurations to reflect incident light back to its source regardless of its angle of incidence. In some embodiments, the sources (e.g., light sources) of the cameras 102 provide IR light, near-IR light, visible light, or combinations thereof that may be reflected by the corner cube reflectors of the markers 106 back toward the cameras 102 to enhance light from the markers 106 relative to other elements of the scene.
In some embodiments, the markers 106 are active markers that emit their own light, instead of, or in addition to, reflecting incident light. In a preferred embodiment, the cameras are sensitive to active markers that emit IR light.
Regardless of the markers 106 being active or passive markers, the markers 106 may be associated with a body 116 located in the volume 110 and/or the scene 104. In some embodiments, the body 116 may be a moving body, a static body, or a combination thereof (e.g., sometimes moving, other times static). In some embodiments, one or more of the markers 106 may include or emit an identifier (e.g., a pulse-modulated code or label, or an electronic tag synchronized with light emissions) in order to track the motion and/or the position of specific objects in the capture volume 110 or specific markers 106 on the body 116.
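By way of a purely hypothetical illustration (the disclosure does not specify any particular encoding), a pulse-modulated identifier could be recovered from per-frame marker brightness samples along the following lines; the one-bit-per-frame, fixed-word-length scheme and all names are assumptions:

    # Hypothetical sketch: recover a marker identifier from a pulse-modulated
    # brightness sequence. The encoding (one bit per camera frame, fixed-length
    # words) is assumed for illustration only.
    from typing import List

    def decode_marker_id(brightness: List[float], threshold: float,
                         word_len: int = 8) -> int:
        # Threshold per-frame brightness samples into bits.
        bits = [1 if b > threshold else 0 for b in brightness[:word_len]]
        # Pack the bits (most significant bit first) into an integer ID.
        marker_id = 0
        for bit in bits:
            marker_id = (marker_id << 1) | bit
        return marker_id

    # Example: an 8-frame sample encoding the identifier 0b10110010 (178).
    samples = [0.9, 0.1, 0.8, 0.85, 0.05, 0.02, 0.9, 0.1]
    assert decode_marker_id(samples, threshold=0.5) == 0b10110010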
One type of camera operation may be referred to herein as a “marker-tracking mode,” which may produce a marker-tracking view of the scene 104 using, in part, the computing device 112. When operating in the marker-tracking mode, a combination of optical filters, components, firmware, and/or software of a camera 102 and/or the computing device 112 may be used to facilitate detection of the markers 106, as well as distinguish those markers from other elements in the scene 104 and/or the body 116.
When the cameras 102 operate in the marker-tracking mode, the positions of the markers 106 are determined from the scene data recorded by the cameras 102. In some embodiments, the cameras 102 send marker position data to the computing device 112, which then calculates the marker positions by triangulating the marker image locations in scenes simultaneously captured by different cameras from different angles. The position(s) of the markers 106 correspond to the movement of a person(s), animal(s), and/or object(s), which may be mapped by the computing device 112 to a three-dimensional (3D) model for computer-aided animation.
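As a minimal sketch of such triangulation (using the standard direct linear transform; the two-camera restriction and all names are illustrative assumptions), each calibrated camera is represented by a 3×4 projection matrix P of the kind discussed further below:

    # Minimal two-view triangulation by the direct linear transform (DLT).
    # P1, P2: 3x4 projection matrices of two calibrated cameras.
    # xy1, xy2: pixel coordinates of the same marker in the two views.
    import numpy as np

    def triangulate(P1: np.ndarray, P2: np.ndarray,
                    xy1: np.ndarray, xy2: np.ndarray) -> np.ndarray:
        A = np.vstack([
            xy1[0] * P1[2] - P1[0],
            xy1[1] * P1[2] - P1[1],
            xy2[0] * P2[2] - P2[0],
            xy2[1] * P2[2] - P2[1],
        ])
        # The homogeneous 3D point is the right singular vector of A with
        # the smallest singular value.
        _, _, Vt = np.linalg.svd(A)
        X = Vt[-1]
        return X[:3] / X[3]  # dehomogenize to (x, y, z)

In practice, observations from all cameras that see the marker would be stacked into A, and the result refined by minimizing reprojection error.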
In some embodiments, the cameras 102 may be configured to enhance light from the scene 104 by, for example, increasing the signal-to-noise ratio (SNR) of the scene 104 in order to make markers 106 appear brighter relative to other elements of the scene 104. For example, the SNR may be increased by increasing the gain of the cameras 102. As another example, the SNR may be increased by filtering light received by the image sensor(s) that may be embedded in or on the camera 102. The sensor(s) may be any combination of one or more IR sensors, visible (or optical) light sensors, UV light sensors, hyperspectral image sensors, and/or other image sensors. In some embodiments, by detecting the scene 104 where light from the markers 106 is relatively enhanced, the markers 106 may be more easily identifiable by a motion-capture program that may be executed using the computing device 112.
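Purely as an illustration of the effect (actual systems combine optical filtering, gain, and firmware, so this software-only statistical threshold is an assumption), bright marker pixels might be isolated as follows:

    # Illustrative only: keep pixels far brighter than the background, as
    # marker blobs tend to be after optical filtering and gain. The
    # mean-plus-k-sigma threshold is an assumed heuristic.
    import numpy as np

    def marker_mask(frame: np.ndarray, k: float = 5.0) -> np.ndarray:
        mu, sigma = frame.mean(), frame.std()
        return frame > (mu + k * sigma)  # boolean mask of candidate markers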
Furthermore, each, some, or all of the cameras 102 that are configured to operate in the marker-tracking mode may also be configured to enhance light from the scene 104 to a lesser extent than in the marker-tracking mode. That is, one or more of the cameras 102 may be configured to detect the scene 104 at an SNR that is different than the SNR used during the marker-tracking mode. For example, the SNR may be lowered such that the scene 104 is detected at an SNR that approximates normal video recording, where elements of the scene 104 are substantially not skewed through light enhancement or other methods. In other words, the brightness of markers 106 may be enhanced to a lesser extent, or not at all, relative to other elements of the scene 104. This more balanced type of camera operation may be referred to herein as a “scene mode,” which may produce a view of the scene 104 using, in part, the computing device 112.
Cameras 102 that are operable in the scene mode may be controlled (e.g., remotely) by the computing device 112 to selectively transition between the marker-tracking mode and the scene mode. In some cases, each, some, or all of the cameras 102 in the motion-capture system 100 may be selectively configured to operate in the marker-tracking mode, in the scene mode, or a combination of these modes. It should be appreciated that many different configurations and combinations are possible.
In some embodiments, the capture volume 110 may be defined based on, or as a result of, a camera calibration procedure, in which each of the cameras 102 performs a given calibration step substantially simultaneously (or concurrently). During calibration, the cameras 102 are simultaneously set to operate in marker-tracking mode, so that all of the cameras 102 detect the position and/or movement of markers 106 for calibration.
In some embodiments, the calibration procedure may involve a user (e.g., an actor, a supporting staff) moving a calibration wand 118 about an area, as is illustrated by a movement 120, wherein the calibration wand has one or more active markers 108 mounted thereon. The active markers 108 are detected by the cameras 102 in order to create data points organized into a calibration data set from which the capture volume 110, or a portion thereof, may be defined. The active marker 108 illustrated in
In some embodiments, a marker-defined reference plane 122 (“reference plane 122”) may be placed on the ground within the volume 110 and may be detected to create data points that are interpolated to define the plane at which the ground resides in the volume 110. For example, the plane defined by the reference plane 122 ties the position data captured by the cameras 102 to world coordinates that are local to the volume 110. As another example, various different patterns may be used to create data points, such as a grid or checkerboard pattern. It will be appreciated that, in some embodiments, the volume 110 and the camera calibration may be achieved in any suitable manner utilizing any suitable reference object (e.g., wand 118) and/or marker-defined reference plane (e.g., reference plane 122).
In some embodiments, as part of the calibration procedure, a calibration data set of data points created as a result of detecting movement of the active marker 108 and detecting the position of the reference plane 122 may be used to determine intrinsic and extrinsic properties of each of the cameras 102 using minimization techniques. The intrinsic and extrinsic properties may define parameters of each of the cameras 102 that produce a given photograph (or still) image or video (e.g., a sequence of image frames). The intrinsic properties may include parameters that define lens properties, a focal length, a distortion, pixels, pixels-per-degree, an image format, a principal point relative to the camera itself, and other intrinsic parameters. The extrinsic properties may include parameters that define a position of a camera center and the camera's heading in world coordinates, and may be defined relative to the other cameras.
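As a simplified, hypothetical sketch of such a minimization (refining only a single camera's extrinsic pose against known 3D wand points with a fixed intrinsic matrix K, which is a substantial simplification of a joint multi-camera solve; all names are illustrative):

    # Refine one camera's pose (rotation vector + translation) so that known
    # 3D wand points reproject onto their observed pixel locations.
    import numpy as np
    from scipy.optimize import least_squares
    from scipy.spatial.transform import Rotation

    def reprojection_residuals(params, K, world_pts, image_pts):
        rvec, t = params[:3], params[3:6]
        R = Rotation.from_rotvec(rvec).as_matrix()
        cam_pts = (R @ world_pts.T).T + t      # world -> camera frame
        proj = (K @ cam_pts.T).T
        proj = proj[:, :2] / proj[:, 2:3]      # perspective divide
        return (proj - image_pts).ravel()      # pixel errors, flattened

    def refine_pose(K, world_pts, image_pts, initial=None):
        x0 = np.zeros(6) if initial is None else initial
        return least_squares(reprojection_residuals, x0,
                             args=(K, world_pts, image_pts)).x

A full solver would additionally optimize the intrinsic (and distortion) parameters of every camera jointly over the whole calibration data set.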
The solution to the calibration data set resulting from application of the minimization techniques, which include optimization of the intrinsic and extrinsic parameters of each camera, may be represented, for example, by a series of transformations (e.g., a matrix of camera intrinsic parameters, a three-by-three (3×3) rotation matrix, a translation vector, etc.). This series of transformations may be referred to as a camera projection matrix.
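In conventional notation (a standard formulation rather than anything specific to this disclosure), the camera projection matrix may be written as:

    P = K\,[\,R \mid t\,], \qquad
    K = \begin{pmatrix} f_x & s & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}, \qquad
    x \sim P\,X

where the intrinsic matrix K holds the focal lengths f_x and f_y, the skew s, and the principal point (c_x, c_y); R is the 3×3 rotation matrix; t is the translation vector; X is a homogeneous 3D world point; and x is the corresponding homogeneous image point, defined up to scale.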
In some embodiments, the camera projection matrix can be used to associate points in a camera's image space with locations in 3D world space. The camera calibration procedure may provide a solution to the calibration data set of position data received from each camera 102. The solution to the calibration data set may define the intrinsic and extrinsic properties of each camera 102. Since the same solution may be used for calibration of all of the cameras 102, the intrinsic and extrinsic properties may be interpolated more quickly and easily by applying minimization techniques than by calibrating each camera 102 individually. Therefore, each camera 102 may be calibrated substantially simultaneously (or concurrently) to reduce calibration time. Nevertheless, in some embodiments, each camera 102 may be calibrated using the solution to the calibration data set without simultaneous calibration. Regardless of the particular timing of calibration or a given calibration step, it may be a substantial benefit to avoid having to use an entirely separate procedure to calibrate scene mode cameras.
It will be appreciated that the capture volume 110 and camera calibration may be achieved in any suitable manner. For example, various different types of reference markers (e.g., markers of the reference plane 122, active markers 108) may be used to create data points for interpolation of the intrinsic and extrinsic properties of each camera 102. Since the cameras 102 may be collectively calibrated using the same solution to the calibration data set, a marker-tracking view produced by cameras operating in the marker-tracking mode, and a scene view produced by cameras operating in the scene mode, may be automatically calibrated. Consequently, when the marker-tracking view is overlaid with the scene view, the markers 106 align with corresponding points on the moving (or static) body 116 (e.g., a person's joints and/or points of motion). The proper alignment of the markers 106 with the body 116 can be achieved because of the camera calibration procedure during which the properties of the cameras are learned.
Subsequent to the camera calibration, in some embodiments, the computing device 112 can create a composite and/or an overlay view where at least a portion of the marker-tracking view is overlaid on the scene view, or at least on a portion thereof. Creation of the composite and/or the overlay may be enabled based on calibration of the cameras 102 and the resulting solution to the calibration data set. Since the cameras 102 are calibrated, the markers 106 (that may be relatively enhanced compared to the scene 104) can be overlaid with the body 116 in the scene 104 so that the markers 106 are suitably aligned with points of movement (e.g., joints, head, etc.) of the body 116.
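As a minimal sketch of the projection step underlying such an overlay (assuming reconstructed 3D marker positions and the scene camera's projection matrix P from the calibration; rendering of the overlay itself is omitted):

    # Map reconstructed 3D marker positions into the scene camera's pixel
    # coordinates using its 3x4 projection matrix P.
    import numpy as np

    def project_markers(P: np.ndarray, markers_3d: np.ndarray) -> np.ndarray:
        """markers_3d: (N, 3) world points -> (N, 2) pixel coordinates."""
        homo = np.hstack([markers_3d, np.ones((len(markers_3d), 1))])
        pix = (P @ homo.T).T
        return pix[:, :2] / pix[:, 2:3]  # perspective divide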
In some embodiments, the composite and/or the overlay view may have many useful applications that can be leveraged to improve accuracy and efficiency during a motion-capture session. For example, the composite view may be used to verify that the positions of the markers 106 correspond to the actual movement of the body 116. As another example, the composite view may be used to reconstruct situations in which the markers 106 do not correctly represent movement and/or position of the body 116, in order to diagnose and solve the problem. As another example, the composite view may be enhanced by importing virtual objects, such as animations or models, that are aligned with the markers 106. As another example, in a motion-capture session for a movie, a director may view a composite view of an actor (e.g., the body 116) together with a model of a costume that may be imported into the composite view and fitted to the markers 106. Since the markers 106 are aligned with the points of movement of the actor, the costume may accurately move as the actor (e.g., the body 116) moves in the scene view. As another example, the composite view may be used by the director to quickly try different sets, costumes, camera angles, etc., which may facilitate more efficient use of a motion-capture session and may result in shorter capture sessions and reduced production costs. As another example, during a motion-capture session, one or some cameras may operate in the marker-tracking mode, and another or other cameras may operate in the scene mode. As yet another example, for every five cameras that operate in the marker-tracking mode, one camera may operate in the scene mode. This configuration may facilitate accurate marker tracking, while still being able to produce a composite view that includes at least some of the marker-tracking view overlaid on at least some of the scene view.
In some embodiments, one or more of the cameras 102 may be a cinema camera or other reference camera that is sensitive to visible wavelengths, but not sensitive to the IR wavelengths to which other ones of the cameras 102 are sensitive, or which may filter IR wavelengths to improve the fidelity of color reproduction. In some embodiments, the reference camera may have an image sensor that is sensitive to both visible and IR wavelengths but may optionally use an IR filter that prevents IR wavelengths from being transmitted to and sensed by the image sensor. In some embodiments (not illustrated), a cinema camera or other reference camera may be located within the capture volume 110 and may be an object having a marker that is tracked by the motion tracking cameras 102. To facilitate use of motion capture data in conjunction with the visible scene captured by the reference camera, it is useful for the reference camera to be calibrated and synchronized with the motion capture cameras 102 (e.g., co-calibrated). However, the reference camera may not be configured to sense IR light emitted by a conventional active marker 108 of the wand 118. Accordingly, the present inventors have created a multi-wavelength active marker for use as the active marker 108 of the wand 118 that simultaneously emits both an IR wavelength and a visible wavelength, such as red light, for use in simultaneously or substantially simultaneously calibrating the motion capture cameras 102 and the reference camera together. Alternatively, all of the cameras 102, including the reference camera, may be sensitive to both visible and IR wavelengths, but with some cameras calibrated using IR wavelengths, while others, such as the reference camera, are calibrated using visible wavelengths. Methods for calibrating and synchronizing reference cameras with motion capture cameras may utilize a software calibration routine and methodology similar to the conventional methodology described above in the context of a group of motion capture cameras that are all operative to receive IR light, with the main difference being the use of multi-wavelength active markers and disparate camera systems that are sensitive to different active marker wavelengths. Exemplary embodiments of a multi-wavelength marker and a calibration wand using such markers are described below with reference to
With reference to
As is illustrated in
The multi-wavelength active marker 200 preferably includes a waveguide 210 for gathering and directing light emitted by LEDs 202, 204. With reference to
The waveguide 210 is sized and shaped to guide the IR light (or near-IR light) and the visible light through total internal reflection, or near total internal reflection, inside the waveguide 210, due to the low (oblique) angle of the sides of the waveguide 210 relative to the incident light from visible and IR LEDs 204, 202. For example, the sides of the waveguide 210 are oriented at a sufficiently shallow angle to the incident rays of light from LEDs 202, 204 that the angle of incidence at the sides, measured from the surface normal, exceeds the critical angle for light in the medium of the waveguide 210 relative to the surrounding air. In some embodiments, the waveguide 210 includes an opaque cover or shield at or around the external surface of the waveguide 210, as further described below with reference to
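For reference, the critical angle follows from Snell's law; assuming, for example, a clear acrylic waveguide (n₁ ≈ 1.49) surrounded by air (n₂ ≈ 1.00):

    \theta_c = \arcsin\!\left(\frac{n_2}{n_1}\right) \approx \arcsin\!\left(\frac{1.00}{1.49}\right) \approx 42^{\circ}

so rays striking the sides at more than about 42 degrees from the surface normal are totally internally reflected. The acrylic refractive index is an illustrative assumption; the disclosure does not specify the waveguide material's index.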
In some embodiments, the multi-wavelength active marker 200 may utilize standard, off-the-shelf, or commercially available IR LEDs 202 and visible LEDs 204. Therefore, in some embodiments, the IR LED 202 may differ in size from the visible LED 204 (e.g., different widths, depths, and heights). Since both the IR LED 202 and the visible LED 204 are mounted on the same substantially flat surface of the PCB 206, the first end 212 of the waveguide 210 includes an entry surface 216 and an entry surface 218 to accommodate the different sizes of the IR LED 202 and the visible LED 204, respectively (e.g., as is shown in
The multi-wavelength active marker 200 may further include a diffuser 222 coupled to the second end 214 of the waveguide 210. The diffuser 222 may be constructed using the same material as, or a different material from, the waveguide 210. For example, both the diffuser 222 and the waveguide 210 may be constructed of a transparent plastic resin material. In another example, the diffuser 222 may be constructed of glass and the waveguide 210 may be constructed of plastic, or vice versa. Regardless of whether the diffuser 222 is constructed using the same material as, or a different material from, the waveguide 210, the material of the diffuser 222 is substantially transparent to, or allows propagation of, the IR light (or near-IR light) and the visible light. For example, the diffuser 222 may be constructed using clear glass or clear plastic with a matte finish on the exterior spherical surface to promote relatively homogenous and diffused emission of light therefrom.
The diffuser 222 diffusively emits the IR light (or near-IR light) originating from IR LED 202 and the visible light originating from visible LED 204 into the capture volume 110 and/or the scene 104 of
In some embodiments, the waveguide 210 and the diffuser 222 of the multi-wavelength active marker 200, collectively, increase the emission angles of all the light (e.g., the IR or near-IR light, and the visible light). The diffuser 222 is spaced apart from the circuit board 206 and is larger than the second end 214 of the waveguide 210, which allows emission angles of greater than 180 degrees and up to 270 degrees or more, including emissions from an underside of the spherical diffuser 222 adjacent where it is fitted to the waveguide 210. The increased emission angles facilitate visibility of the multi-wavelength active marker 200. Thus, a user that is calibrating the motion-capture system 100 need not face the multi-wavelength active marker 200 directly toward a particular sensor and/or camera of the motion-capture system 100 of
In some embodiments, the waveguide 210 and the diffuser 222, collectively, cause the multi-wavelength active marker 200 to behave as if it had a single source that emits a first light wave having a first wavelength (e.g., IR light, near-IR light, etc.) and a second light wave having a second wavelength (e.g., visible light, red light, etc.) that is different from the first wavelength. For clarity, the IR LED 202 is located adjacent to the visible LED 204. Even though the distance between the IR LED 202 and the visible LED 204 is relatively small, these two sources still occupy different locations on the PCB 206. The waveguide 210 is shaped to cause the IR light (or near-IR light) and the visible light to converge into the common exit surface 220 of the waveguide 210. The diffuser 222 then substantially simultaneously diffuses both the IR light (or near-IR light) and the visible light into the scene 104 and/or the volume 110 of
The use of a plurality of multi-wavelength active markers 200 (instead of only one such multi-wavelength active marker) can increase the accuracy and speed of the calibration, alignment, and/or synchronization of image sensors sensing different wavelengths, such as the motion capture cameras 102 and reference camera(s) of the motion-capture system 100.
The multi-wavelength active markers 200 may be mounted on the wand 300 in an asymmetric arrangement. For example, the distance between the first and the second multi-wavelength active markers 200 may be different (e.g., greater or less) than the distance between the second and the third multi-wavelength active markers 200. This asymmetric arrangement, with predetermined spacing, helps the sensors and/or the cameras 102 of the motion-capture system 100 of
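As a hypothetical sketch of how such asymmetric spacing could be exploited to identify the wand markers (the 160 mm and 240 mm gaps and the 5% ratio tolerance are illustrative assumptions, not dimensions from this disclosure):

    # Identify the wand's three collinear markers by their asymmetric spacing:
    # compare the observed short-gap/long-gap ratio to the wand's known ratio.
    import numpy as np

    WAND_RATIO = 160.0 / 240.0  # assumed short-gap / long-gap ratio

    def label_wand_markers(p1, p2, p3, tol=0.05):
        """Order three 3D points so the short gap comes first; None if no match."""
        pts = [np.asarray(p, dtype=float) for p in (p1, p2, p3)]
        # The middle marker has the smallest summed distance to the others.
        sums = [sum(np.linalg.norm(a - b) for b in pts) for a in pts]
        mid = int(np.argmin(sums))
        ends = [i for i in range(3) if i != mid]
        gaps = [np.linalg.norm(pts[e] - pts[mid]) for e in ends]
        if gaps[0] > gaps[1]:
            ends.reverse()
            gaps.reverse()
        if abs(gaps[0] / gaps[1] - WAND_RATIO) > tol:
            return None  # spacing does not match the wand
        return pts[ends[0]], pts[mid], pts[ends[1]]

Because the ratio of the gaps is scale-invariant, this kind of test can be applied before the cameras are fully calibrated, when absolute distances are not yet known.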
In some embodiments, the elongated handle 302 may include a plurality of sections (or posts), and the sections allow a user to adjust the length of the elongated handle 302. The elongated handle 302 may be hollow and may be configured as a housing (or enclosure) for various electrical, electronic, and/or mechanical components, wiring, or combinations thereof. In some embodiments, the head 304 may also be configured as a housing (or enclosure) for various electronic, electrical, and/or mechanical components, wiring, or combinations thereof. For example, the housing of the head 304 may enclose the respective PCBs 206 of the plurality of multi-wavelength active markers 200. The housing of the head 304 may include bottom covers 308. The bottom covers 308 may be mechanically coupled to the head 304 using screws (as is illustrated), clamps, or another coupling mechanism.
In step 402, the method 400 may include a user waving a calibration wand (e.g., the calibration wand 300 of
In step 404 of method 400, each multi-wavelength active marker of the calibration wand substantially simultaneously emits a first light wave of a first wavelength and a second light wave of a second wavelength that is different from the first wavelength. In some embodiments, the first light wave may be an IR or a near-IR light, and the second light wave may be visible light (e.g., red light or another light in the visible spectrum). In some embodiments, the multi-wavelength active marker may emit more than two light waves all having different wavelengths.
In step 406 of method 400, the first light wave is captured, sensed or recorded using one or more first image sensors sensitive to the first light wave, and the second light wave is captured, sensed or recorded using one or more second image sensors sensitive to the second light wave. In some embodiments, one or some of the cameras 102 of
In step 408, the method 400 may include calibrating and/or synchronizing the one or more first sensors, the one or more second sensors, and/or the cameras 102 of
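One way the temporal component of step 408 could be estimated (an illustrative approach, not a method prescribed by the disclosure) is to cross-correlate one coordinate of a marker's trajectory as observed by the two sensor systems, resampled to a common frame rate, and take the best-aligning lag as the frame offset:

    # Illustrative temporal synchronization: the lag maximizing the
    # cross-correlation of the two mean-removed trajectories is taken as
    # the frame offset between the sensor systems.
    import numpy as np

    def frame_offset(track_a: np.ndarray, track_b: np.ndarray) -> int:
        a = track_a - track_a.mean()
        b = track_b - track_b.mean()
        corr = np.correlate(a, b, mode="full")
        return int(np.argmax(corr)) - (len(b) - 1)  # lag in frames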
Although embodiments are described herein in the context of a motion capture system used with a reference camera, the multi-wavelength markers and calibration wands according to the present disclosure may be used with any two or more image sensors or other sensors sensing two or more different wavelengths, regardless of whether the image sensors are used for motion capture. Also, the multi-wavelength markers may be used for various purposes such as calibration, tracking, identification, and/or marking of any sort.
As described herein, in some embodiments the adjustment of the sensors and/or cameras may include determining intrinsic properties and/or extrinsic properties of each, some, or all of the sensors and/or cameras. For example, the intrinsic properties may include parameters that define lens properties, a focal length, a distortion, pixels, pixels-per-degree, an image format, a principal point, or combinations thereof of the sensors and/or cameras. As another example, the extrinsic properties may include parameters that define a position of each, some, or all of the sensors and/or cameras relative to each other, each multi-wavelength active marker, to a scene, or combinations thereof.
Having identified various components and methodologies in the present disclosure, it should be understood that any number of components and arrangements may be employed to achieve the desired functionality within the scope of the present disclosure. For example, although some components are depicted as single components, many of the elements described herein may be implemented as discrete or distributed components or in conjunction with other components, and in any suitable combination and location. Some elements may be omitted altogether.
Moreover, various functions described herein as being performed by one or more entities may be carried out by hardware, firmware, and/or software, as described herein. For instance, various functions may be carried out by a processor executing instructions stored in memory. As such, other arrangements and elements (e.g., machines, interfaces, functions, orders, and groupings of functions, etc.) can be used in addition to or instead of those shown.
The subject matter of examples is described with specificity herein to meet statutory requirements. However, the description itself is not intended to limit the scope of this disclosure. Rather, the inventors have contemplated that the claimed subject matter might also be embodied in other ways, to include different steps or combinations of steps similar to the ones described in this document, in conjunction with other present or future technologies.
Further, while examples of the present disclosure may generally refer to the apparatuses, the systems, and the schematics described herein, it is understood that the techniques described may be extended to other implementation contexts.
Examples disclosed herein are intended in all respects to be illustrative rather than restrictive. It will be obvious to those having skill in the art that many changes may be made to the details of the above-described examples without departing from the underlying principles of the disclosure.