The present application relates to an electronic device, a parameter calibration method, and a non-transitory computer readable storage medium. More particularly, the present application relates to an electronic device, a parameter calibration method, and a non-transitory computer readable storage medium with a SLAM module.
Self-tracking devices, such as VR headsets or trackers, use a SLAM module to determine their positions in the real space from the images captured by the cameras within the self-tracking devices. However, changes to the self-tracking devices, such as damage or breakage during delivery or usage, can affect the relative position and the relative rotation between the cameras. The preset extrinsic parameters between the cameras, including the preset relative position parameter and the preset relative rotation parameter, may then no longer be usable, and the performance of the SLAM module may be degraded.
When the changes to the cameras (such as the relative position and the relative rotation between the cameras) of the self-tracking devices become significant, the self-tracking devices may become unable to track themselves with the SLAM module even if the cameras themselves are functioning properly. Several methods have been proposed to recalculate the extrinsic parameters of the cameras of the self-tracking devices, such as recalculating the extrinsic parameters with a checkerboard or a Deltille grid. However, it is impractical for users to carry a checkerboard or a Deltille grid at all times.
Therefore, how to calibrate the extrinsic parameters between the cameras of the self-tracking device without a checkerboard or a Deltille grid is a problem to be solved.
The disclosure provides an electronic device. The electronic device includes a memory, several cameras, and a processor. The memory is configured to store a SLAM module. The several cameras are configured to capture several images of a real space. The processor is coupled to the several cameras and the memory. The processor is configured to: process the SLAM module to establish an environment coordinate system in correspondence to the real space and to track a device pose of the electronic device within the environment coordinate system according to the several images; and perform a calibration process. The operation of performing the calibration process includes: calculating several poses of the several cameras within the environment coordinate system according to several light spots within each of the several images, in which the several light spots are generated by a structured light generation device; and calibrating several extrinsic parameters between the several cameras according to the several poses.
The disclosure provides a parameter calibration method suitable for an electronic device. The parameter calibration method includes the following operations: capturing several images of a real space by several cameras; processing, by a processor, a SLAM module to establish an environment coordinate system in correspondence to the real space and to track a device pose of the electronic device within the environment coordinate system according to the several images; and performing a calibration process by the processor. The operation of performing the calibration process includes the following operations: calculating several poses of the several cameras within the environment coordinate system according to several light spots within each of the several images, wherein the several light spots are generated by a structured light generation device; and calibrating several extrinsic parameters between the several cameras according to the several poses.
The disclosure provides a non-transitory computer readable storage medium with a computer program to execute aforesaid parameter calibration method.
It is to be understood that both the foregoing general description and the following detailed description are by examples and are intended to provide further explanation of the invention as claimed.
Aspects of the present disclosure are best understood from the following detailed description when read with the accompanying figures. It is noted that, according to the standard practice in the industry, various features are not drawn to scale. In fact, the dimensions of the various features may be arbitrarily increased or reduced for clarity of discussion.
Reference will now be made in detail to the present embodiments of the disclosure, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the description to refer to the same or like parts.
It will be understood that, in the description herein and throughout the claims that follow, although the terms “first,” “second,” etc. may be used to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the embodiments.
It will be understood that, in the description herein and throughout the claims that follow, the terms “comprise” or “comprising,” “include” or “including,” “have” or “having,” “contain” or “containing” and the like used herein are to be understood to be open-ended, i.e., to mean including but not limited to.
It will be understood that, in the description herein and throughout the claims that follow, the phrase “and/or” includes any and all combinations of one or more of the associated listed items.
Reference is made to
Reference is made to
It should be noted that, in
One or more programs are stored in the memory 110 and the memory 210 and are configured to be executed by the processor 130 or the processor 230, in order to perform a parameter calibration method.
In some embodiments, the electronic device 100 and the electronic device 200 may be an HMD (head-mounted display) device, a tracking device, or any other device with a self-tracking function. The HMD device may be worn on the head of a user.
In some embodiments, the memory 110 and the memory 210 store a SLAM (simultaneous localization and mapping) module. The electronic device 100 and the electronic device 200 may be configured to process the SLAM module. The SLAM module includes functions such as capturing images, extracting features from the images, and localizing according to the extracted features. In some embodiments, the SLAM module includes a SLAM algorithm, in which the processor 130 accesses and processes the SLAM module so as to localize the electronic device 100 according to the images captured by the cameras 150A to 150C. Similarly, the processor 230 accesses and processes the SLAM module so as to localize the electronic device 200 according to the images captured by the cameras 250A to 250C. The details of the SLAM system will not be described herein.
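As an illustrative, non-limiting sketch of such a localization loop (the helper functions are hypothetical placeholders for illustration and do not represent the actual SLAM module of the present disclosure):

```python
def track_device(camera_images, feature_map, previous_pose):
    # Extract 2D feature points from each captured image.
    keypoints = [extract_features(img) for img in camera_images]   # hypothetical helper
    # Match the extracted features against the stored 3D map points.
    matches = match_to_map(keypoints, feature_map)                 # hypothetical helper
    # Estimate the new device pose from the 2D-3D correspondences,
    # using the pose from the previous period of time as the initial guess.
    return estimate_pose(matches, initial_guess=previous_pose)    # hypothetical helper
```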
Specifically, in some embodiments, the electronic device 100 may be applied in a virtual reality (VR)/mixed reality (MR)/augmented reality (AR) system. For example, the electronic device 100 may be realized by, for example, a standalone head-mounted display (HMD) device or a VIVE HMD.
In some embodiments, the processors 130 and 230 can be realized by, for example, one or more processing circuits, such as central processing circuits and/or micro processing circuits, but are not limited in this regard. In some embodiments, the memories 110 and 210 include one or more memory devices, each of which includes, or a plurality of which collectively include, a computer readable storage medium. The non-transitory computer readable storage medium may include a read-only memory (ROM), a flash memory, a floppy disk, a hard disk, an optical disc, a flash disk, a flash drive, a tape, a database accessible from a network, and/or any storage medium with the same functionality that can be contemplated by persons of ordinary skill in the art to which this disclosure pertains.
The cameras 150A to 150C and the cameras 250A to 250C are configured to capture one or more images of the real space in which the electronic devices 100 and 200 are operated. In some embodiments, the cameras 150A to 150C and the cameras 250A to 250C may be realized by camera circuit devices or any other camera circuits with image capture functions.
In some embodiments, the electronic devices 100 and 200 include other circuits, such as a display circuit and an I/O circuit. In some embodiments, the display circuit covers a field of view of the user and shows a virtual image within the field of view of the user.
For ease of illustration, the following description takes the electronic device 100 as illustrated in
Reference is made to
As illustrated in
In other embodiments, the mixed reality environment coordinate system M could be an augmented reality environment coordinate system or an extended reality environment coordinate system. The following takes the mixed reality environment coordinate system M as an example for illustrative purposes; however, the embodiments of the present disclosure are not limited thereto.
In some embodiments, the device pose of the electronic device 100 includes a position and a rotation angle.
When calculating the device pose of the electronic device 100 according to the images captured by the cameras 150A to 150C, the intrinsic parameters and the extrinsic parameters of each of the cameras 150A to 150C are considered. In some embodiments, the extrinsic parameters represent a rigid transformation from the 3D world coordinate system to the 3D camera coordinate system. The intrinsic parameters represent a projective transformation from the 3D camera coordinates into the 2D image coordinates. In some embodiments, the extrinsic parameters of the cameras 150A to 150C include the differences between the poses of the cameras.
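As a non-limiting numerical illustration of these two transformations (standard pinhole-camera math with example values, not parameters of the disclosed devices): a 3D world point is first mapped into camera coordinates by the extrinsic rigid transformation [R|t], and then projected into 2D pixel coordinates by the intrinsic matrix K:

```python
import numpy as np

# Example intrinsic matrix K (focal lengths fx, fy and principal point cx, cy).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

R = np.eye(3)                        # extrinsic rotation (world -> camera)
t = np.array([0.1, 0.0, 0.0])        # extrinsic translation (world -> camera)

X_world = np.array([0.5, 0.2, 2.0])  # a 3D point in the world coordinate system

X_cam = R @ X_world + t              # rigid transformation (extrinsic parameters)
uvw = K @ X_cam                      # projective transformation (intrinsic parameters)
u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]  # 2D image coordinates in pixels
```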
Reference is made to
When the processor 130 tracks the device pose of the electronic device 100 within the mixed reality environment coordinate system M, the extrinsic parameters preset within the SLAM module between each two of the cameras are considered. However, during the operation of the electronic device 100, the positions and the rotation angles of the camera 150A and the camera 150B relative to the electronic device 100 may change, and the SLAM module may no longer work properly with the images captured by the cameras 150A and 150B and the preset extrinsic parameter between the cameras 150A and 150B. Therefore, a method for calibrating the extrinsic parameters between the cameras of the electronic device 100 is needed. In some embodiments, the extrinsic parameters are stored in the memory 110 for the processor 130 to access and operate with the SLAM module.
Reference is made to
As shown in
In operation S510, the SLAM module is processed to track the device pose of the electronic device within the mixed reality environment coordinate system according to several images. In some embodiments, the processor 130 tracks the device pose of the electronic device 100 within the mixed reality environment coordinate system M according to several space feature points within the images captured by the cameras 150A to 150C.
In operation S520, it is determined whether the SLAM module is working properly with the extrinsic parameters. In some embodiments, when the SLAM module is working properly with the extrinsic parameters stored in the memory 110, operation S530 is performed. On the other hand, when the SLAM module is not working properly with the extrinsic parameters stored in the memory 110, operation S540 is performed.
In some embodiments, the processor 130 of the electronic device 100 determines the pose of the electronic device 100 once every period of time. When determining the pose of the electronic device 100, the processor 130 refers to the previous pose of the electronic device 100 determined in a previous period of time. In some embodiments, the processor 130 further refers to the positions of the space feature points determined previously when determining the pose of the electronic device 100.
When the processor 130 is unable to calculate the pose of the electronic device 100 in reference to the space feature points determined previously and/or the pose of the electronic device 100 determined at a previous period of time, it is determined that the SLAM module is not working properly with the extrinsic parameters. On the other hand, when the processor 130 is able to calculate the pose of the electronic device 100 in reference to the space feature points determined previously and/or the pose of the electronic device 100 determined at a previous period of time, it is determined that the SLAM module is working properly with the extrinsic parameters.
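A non-limiting sketch of this health check follows (the function names and the inlier criterion are assumptions for illustration, not the exact test of the present disclosure):

```python
def slam_working_properly(matches, previous_pose, min_inliers=30):
    # Try to compute the current pose from the previously determined space
    # feature points, seeded with the pose from the previous period of time.
    pose, num_inliers = estimate_pose_ransac(matches, seed=previous_pose)  # hypothetical helper
    # If no pose can be computed, or too few feature points agree with it,
    # the SLAM module is treated as not working properly with the current
    # extrinsic parameters, and the reset process is triggered instead.
    return pose is not None and num_inliers >= min_inliers
```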
In operation S530, a calibration process is performed. Details of the calibration process will be described with reference to
Reference is made to
In operation S532, several poses of several cameras are calculated within the mixed reality environment coordinate system according to several light spots within each of the several images.
In some embodiments, the light spots are generated by the structured light generation device 170 as illustrated in
In some embodiments, the structured light generation device 170 generates and emits several light spots with a fixed frequency. The processor 130 adjusts the exposure of each of the cameras 150A to 150C, so that the cameras 150A to 150C are able to capture the images with the light spots.
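One plausible way to recover such light spots from a short-exposure image is simple intensity thresholding; the following is a non-limiting sketch (the present disclosure does not prescribe a specific detector, and the threshold value is an assumption):

```python
import cv2

def detect_light_spots(gray_image, intensity_threshold=200):
    # With the exposure lowered, the projected light spots are the brightest
    # regions of the image; keep only pixels above the intensity threshold.
    _, binary = cv2.threshold(gray_image, intensity_threshold, 255, cv2.THRESH_BINARY)
    # Each connected bright blob is treated as one light spot; its centroid
    # serves as a 2D feature point for the calibration process.
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```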
Details of operation S532 will be described with reference to
Reference is made to
In operation S532A, several space feature points are detected from a first image captured by a first camera and a second image captured by a second camera.
Reference is made to
The processor 130 as illustrated in
Similarly, the processor 130 as illustrated in
Reference is made to
In some embodiments, the processor 130 selects the same area circled by the same space feature points of
After the processor 130 selects the area FPA from
In operation S532C, a pose of the first camera is calculated according to the first image and a pose of the second camera is calculated according to the second image. Reference is made to
It should be noted that, in operation S532C, the pose of the camera 150A and the pose of the camera 150B may be calculated according to several light spots.
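For illustration, if the 3D positions of the shared feature points or light spots within the mixed reality environment coordinate system M are known, the pose of each camera can be recovered from its own image with a standard perspective-n-point (PnP) solver. The following non-limiting sketch uses OpenCV's solver as one possible choice (an assumption; the present disclosure does not name a specific solver):

```python
import cv2
import numpy as np

def camera_pose_from_points(points_3d, points_2d, K, dist_coeffs=None):
    # Solve the perspective-n-point problem: find the rotation and translation
    # that map the known 3D points onto their observed 2D projections.
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(points_3d, dtype=np.float64),
        np.asarray(points_2d, dtype=np.float64),
        K,
        dist_coeffs,
    )
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    return R, tvec              # the camera pose (world -> camera transform)
```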
Reference is made to
Reference is made to
In operation S534A, a difference between the first pose of the first camera and the second pose of the second camera is calculated. Reference is made to
Reference is made to
Through the operations of S530, by calculating the poses of the cameras according to the same feature points within the mixed reality environment coordinate system M, the extrinsic parameters between the cameras can be calibrated.
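In terms of standard multi-camera geometry, the difference calculated in this manner is the relative transformation between the two camera poses, which is what the extrinsic parameter between the cameras encodes. A minimal sketch, assuming the poses are represented as 4x4 homogeneous world-to-camera transforms (an assumed representation, not mandated by the present disclosure):

```python
import numpy as np

def pose_difference(T_cam1, T_cam2):
    # T_cam1 and T_cam2 are 4x4 world->camera transforms expressed in the same
    # environment coordinate system. The relative transform taking camera 1
    # coordinates to camera 2 coordinates is the extrinsic parameter between
    # the two cameras.
    return T_cam2 @ np.linalg.inv(T_cam1)
```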
Reference is made to
Reference is made to
In operation S541, the extrinsic parameter between the first camera and the second camera is reset. Reference is made to
In operation S543, a first pose of the first camera is obtained according to the image captured by the first camera and a second pose of the second camera is obtained according to the image captured by the second camera. Reference is made to
In operation S545, a difference between the first pose and the second pose is calculated. In some embodiments, the processor 130 as illustrated in
In operation S546, the differences between the first pose and the second pose over a period of time are recorded when the first pose and the second pose are stably calculated. In some embodiments, in operation S546, the camera 150A, the camera 150B and the processor 130 as illustrated in
It should be noted that, in operation S546, the first pose and the second pose are stably calculated. In some embodiments, when the first pose and the second pose are not stably calculated, the processor 130 asks the user to change the pose of the electronic device 100. In some embodiments, the processor 130 sends a signal to the display circuit (not shown) of the electronic device 100 so as to display a prompt asking the user to change the pose of the electronic device 100. In some other embodiments, when the first pose and the second pose are not stably calculated, the processor 130 resets or adjusts the extrinsic parameter between the first camera and the second camera again.
In operation S547, it is determined whether the differences within the period of time are smaller than a threshold value. In some embodiments, the threshold value is stored in the memory 110 as illustrated in
In some embodiments, when not all of the differences between the poses of the camera 150A and the poses of the camera 150B recorded in operation S546 are smaller than the threshold value, the extrinsic parameter between the first camera and the second camera is adjusted by the processor 130 before performing operation S543. In some embodiments, the adjustment to the extrinsic parameter includes increasing/decreasing a distance value between the camera 150A and the camera 150B. In some other embodiments, the adjustment to the extrinsic parameter includes increasing/decreasing a relative rotation value between the camera 150A and the camera 150B.
In some embodiments, after the extrinsic parameter between the camera 150A and the camera 150B is adjusted, operation S543 is performed so as to recalculate the pose of the camera 150A and the pose of the camera 150B with the adjusted extrinsic parameter between the camera 150A and the camera 150B.
In some embodiments, operation S540 is repeated until all of the differences between the poses of the camera 150A and the poses of the camera 150B over the period of time are smaller than the threshold value.
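Putting operations S541 to S547 together, the reset process can be read as the following iterative loop. This is a minimal sketch under assumptions; all helper functions are hypothetical and merely mirror the operations described above:

```python
def reset_process(camera_a, camera_b, threshold, num_samples):
    extrinsic = default_extrinsic()                        # operation S541: reset (hypothetical)
    while True:
        diffs = []
        for _ in range(num_samples):                       # operation S546: record over a period of time
            pose_a = pose_from_image(camera_a, extrinsic)  # operation S543 (hypothetical)
            pose_b = pose_from_image(camera_b, extrinsic)
            diffs.append(pose_difference_size(pose_a, pose_b))  # operation S545 (hypothetical)
        if all(d < threshold for d in diffs):              # operation S547: all differences small enough
            return extrinsic                               # keep the current extrinsic parameter
        extrinsic = adjust_extrinsic(extrinsic, diffs)     # adjust distance/rotation values and retry
```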
The examples mentioned above take the camera 150A and the camera 150B as illustrated in
It should be noted that, in the embodiments of the present disclosure, the pose and/or the positions of the devices and the feature points are obtained with the SLAM module.
The structured light generation devices 170 and 900 mentioned above are devices with the function of projecting a known pattern (often grids or horizontal bars) onto a scene. The way these patterns deform when striking surfaces allows vision systems to calculate the depth and surface information of the objects in the scene, as used in structured light 3D scanners. The embodiments of the present disclosure utilize this function of projecting a known pattern of light spots so as to mimic the feature points of a checkerboard or a Deltille grid, and to compensate for the problem of insufficient feature points in general environments, such as the real space R as mentioned above. By increasing the number of feature points, the accuracy of the calibration of the extrinsic parameters between the cameras is improved.
Through the operations of various embodiments described above, an electronic device, a parameter calibration method, and a non-transitory computer readable storage medium are implemented. The extrinsic parameters between the cameras of the self-tracking device can be calibrated with the structured light generation device, in which the deviations of the extrinsic parameters between the cameras can be corrected and the accuracy of the calibration of the extrinsic parameters between the cameras can be improved.
Furthermore, in the embodiments of the present disclosure, a checkerboard or a Deltille grid is not necessary, and the users can perform the calibration process without one, which is more convenient. Moreover, by generating the light spots in the real space R, the number of feature points within the real space R is increased, which improves the accuracy of the calculation of the poses of the devices and thereby the accuracy of the calibration of the extrinsic parameters between the cameras.
Additionally, when critical situations occur, for example, when the SLAM module is not working properly, the embodiments of the present disclosure can perform a reset process so as to recalculate the extrinsic parameters.
It should be noted that in the operations of the abovementioned parameter calibration method 500, no particular sequence is required unless otherwise specified. Moreover, the operations may also be performed simultaneously or the execution times thereof may at least partially overlap.
Furthermore, the operations of the parameter calibration method 500 may be added to, replaced, and/or eliminated as appropriate, in accordance with various embodiments of the present disclosure.
Various functional components or blocks have been described herein. As will be appreciated by persons skilled in the art, the functional blocks will preferably be implemented through circuits (either dedicated circuits, or general purpose circuits, which operate under the control of one or more processing circuits and coded instructions), which will typically include transistors or other circuit elements that are configured in such a way as to control the operation of the circuitry in accordance with the functions and operations described herein.
Although the present disclosure has been described in considerable detail with reference to certain embodiments thereof, other embodiments are possible. Therefore, the scope of the appended claims should not be limited to the description of the embodiments contained herein.
It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the present invention without departing from the scope or spirit of the invention. In view of the foregoing, it is intended that the present invention cover modifications and variations of this invention provided they fall within the scope of the following claims.
This application claims priority to U.S. Provisional Application Ser. No. 63/483,760, filed Feb. 8, 2023, which is herein incorporated by reference.