The present invention relates to a system and method for delivering interactive laboratory experiments in real time in simulated environments. In particular, the present invention provides a set of purpose-specific apparatuses having multiple motion- and position-sensors for providing spatial and kinetic information to a platform incorporating a program that processes and manipulates the information in order to provide a realistic tactile and haptic experience for users of the platform in a remotely simulated augmented reality environment.
Online teaching has become an indispensable part of education over the past two decades. The advent and continued popularity of Massive Open Online Courses (MOOCs) has transformed education beyond the restraints of traditional classroom settings. Global crises, such as the COVID-19 pandemic in the early 21st century, can limit the access of students and teachers to schools on a global scale, leaving online teaching as the only viable medium of education over a sustained period of time. Online teaching is particularly crucial for education in developing countries where access to schools, libraries and laboratories is scarce. In many cases, online teaching is beginning to rival traditional classroom teaching in effectiveness. It is foreseeable that certain aspects of online teaching will continue to complement, and may even replace, traditional learning experiences.
Despite the obvious benefits of online teaching, not all aspects of education can be conducted virtually and remotely. For instance, most science courses contain a substantial component of laboratory classes, in which students learn technical skills for collecting and analyzing data. Educators worldwide have developed creative ideas and technologies for teaching laboratory work online. These typically include watching videos that depict experimental procedures, supported by online discussions on experimental design and data analysis. The demand for lab videos is so great that a commercial publisher (Journal of Visualised Experiments) has recently produced more than 60 videos for university-level chemistry experiments. However, initial surveys have indicated that feedback from science students towards this format of learning has been negative. Meanwhile, some students are guided remotely by their instructors to conduct unsupervised experiments at, or in the neighborhood of, their own homes. But these practices are limited to “kitchen science” and cannot be applied to laboratory work that involves expensive equipment or toxic reagents. To overcome this limitation, some educators have created “gamified” versions of research laboratories to give students a simulated experience of performing experiments. However, concrete proof of the benefits of lab simulations on learning performance remains to be seen.
A recent extensive survey of undergraduate students in the chemistry department of a North American university indicated that the learning experience most students missed during the COVID-19 lockdown of 2020-2021 was “hands-on experience”. Laboratory classes are quintessentially physical experiences, involving the learning of haptic and manual skills that can seldom be reproduced in lab simulations. For example, in acid-base titration, one of the most fundamental volumetric analytical techniques in chemistry, the student needs to set up a burette over a conical flask. A precise volume of the analyte needs to be pipetted into the conical flask, while the volume of the titrant solution in the burette is accurately recorded. By carefully controlling the stopcock of the burette, the student needs to add just enough titrant into the conical flask to change the color of the indicator in the analyte, while constantly swirling the flask for effective mixing. Moreover, the burette has to be set up perfectly vertical to ensure accurate reading of the initial and final titrant volumes. Hence, the successful completion of this experiment requires not only a theoretical understanding of the procedure but also dexterous skills and hand-eye coordination that cannot be replicated in the absence of the apparatus. In most augmented or virtual reality simulations of titration, the control of the burette stopcock is usually replaced by the click of a button or by an empty hand gesture, thus depriving learners of the essential haptic experience. Moreover, most videoed demonstrations and lab simulations only show the procedure performed correctly and under perfect conditions. Learners are not able to “learn by mistakes”, a cornerstone of science education. Proficiency in the types of haptic and tactile skills involved in titration is required in almost all laboratory work in chemistry, and the acquisition of these skills represents a profound challenge in the age of virtual learning.
A need therefore exists for an interactive learning platform that allows learners to gain tactile and haptic experience and to practice the dexterous skills involved in a real laboratory experiment, while an instructor can review and evaluate individual performance and skills from real-time data received by the platform, in order to address the inadequacies and drawbacks of existing remote or virtual learning platforms.
Accordingly, a first aspect of the present invention provides a platform integrating both hardware and software for delivering interactive laboratory experiments in real time in simulated environments. In particular, the present invention provides a platform integrating a set of replicas of laboratory apparatuses (or phantoms) configured with multiple motion- and position-sensors and electronics for detecting and providing spatial and/or kinetic information of the corresponding apparatus(es) (or phantoms) to a data processing unit (or a network) of the platform, which processes and translates the data derived therefrom into signals to be output by one or more augmented reality devices. Users of the platform, including learners and instructors, are able to obtain the output signals from the corresponding augmented reality device(s), react to a particular output signal, act on a subsequent step following a particular output signal, and repeat, modify, or terminate a session of an interactive laboratory experiment, etc.
The platform in the first aspect includes one or more phantoms incorporated with at least an assembly of motion- and position-sensors, and at least one marker for a corresponding imaging module to capture.
In certain embodiments, the assembly of motion- and position-sensors includes an inertial measurement unit (IMU).
In other embodiments, the assembly of motion- and position-sensors may include one or more wireless monitoring sensors including, but not limited to, a dual phase resistive position sensor.
In certain embodiments, the at least one marker is an OpenCV ArUco synthetic marker.
In other embodiments, the at least one marker may include one or more computer-vision markers.
In certain embodiments, the assembly of motion- and position-sensors is embedded into each of the phantoms.
In certain embodiments, the at least one marker is affixed on a surface of each of the phantoms in a way to be visualized and captured by the imaging module.
In certain embodiments, the assembly of motion- and position-sensors is configured to passively track spatial position, acceleration (linear or angular), and velocity (linear or angular) of the corresponding phantom.
In certain embodiments, the at least one marker is configured to be captured and identified by the imaging module in order for the spatial position, acceleration (linear or angular), and velocity (linear or angular) of the corresponding phantom to be actively tracked.
In certain embodiments, the imaging module, the data processing unit and the augmented reality device may reside in separate devices or in the same device.
In certain embodiments, the augmented reality device is a portable device with a simulation module for realizing augmented reality and an imaging module for capturing the at least one marker on a surface of the corresponding phantom and determining identity thereof.
In certain embodiments, the augmented reality device is configured to provide a timestamp for each of the images captured by the imaging module in order to facilitate sequencing of the captured images and synchronization with the corresponding spatial and kinetic information obtained by the assembly of motion- and position-sensors.
In certain embodiments, the one or more phantoms are fabricated by 3-D printing.
In certain embodiments, the one or more phantoms are configured to have substantially similar shape, size, weight and surface roughness to those of their corresponding genuine objects.
A second aspect of the present invention provides a method for using the platform described herein to deliver an interactive laboratory experiment lecture in real time in simulated environments. The method includes:
In certain embodiments, the spatial and kinetic information is transformed into dynamic images to be integrated into the virtual laboratory setting and displayed as augmented reality images, using a physically-based rendering approach to produce physically accurate renderings.
In certain embodiments, the physically accurate renderings are incorporated into an augmented reality projection to be displayed on the display screen of the corresponding portable device to show the position of individual components of each of the phantoms and the juxtaposition of the phantoms relative to the user of the portable device in real time.
In certain embodiments, a software component incorporated into any of the portable devices and the network renders, from a database, images of the genuine counterparts of the corresponding phantoms in the augmented reality projection, upon identification of the corresponding computer-vision marker affixed on the corresponding phantom by an imaging module of the portable device and synchronization of the renderings with the received spatial and kinetic information of the corresponding phantom.
In certain embodiments, the imaging module includes a camera and an image processing unit.
In certain embodiments, the spatial and kinetic information, and/or other information of one or more of the phantoms, is/are detected by an inertial measurement unit (IMU) embedded in each of the phantoms.
In certain embodiments, the portable device may connect to another augmented reality projection device or incorporate an augmented reality projection module.
In certain embodiments, the software component is configured to communicate with other devices in order to exchange data, including motion and position information of one or more of the phantoms or images of their genuine counterparts, to be rendered in a corresponding augmented reality projection displayed, from time to time, on different portable devices or augmented reality projection devices.
In certain embodiments, the software component is also configured to provide feedback on the received motion and position information of one or more phantoms, in the form of physically accurate renderings in an augmented reality projection or in any other format, depending on the preference of a respondent.
In certain embodiments, the other format of motion and position information or feedback being displayed in the augmented reality projection can be textual, sensory, audio, visual, and/or olfactory.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. Other aspects of the present invention are disclosed as illustrated by the embodiments hereinafter.
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
The appended drawings, where like reference numerals refer to identical or functionally similar elements, contain figures of certain embodiments to further illustrate and clarify the above and other aspects, advantages and features of the present invention. It will be appreciated that these drawings depict embodiments of the invention and are not intended to limit its scope. The invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been depicted to scale.
It will be apparent to those skilled in the art that modifications, including additions and/or substitutions, may be made without departing from the scope and spirit of the invention. Specific details may be omitted so as not to obscure the invention; however, the disclosure is written to enable one skilled in the art to practice the teachings herein without undue experimentation.
The present disclosure provides a set of replicas of laboratory apparatuses (also referred to herein as phantoms) that mimic their corresponding genuine objects in terms of shape, size, weight, and surface features in order to provide a similar haptic and tactile sensation when users hold the phantoms to conduct a virtual laboratory experiment. In certain embodiments, the phantoms are fabricated by methods including 3-D printing via stereolithography and direct light processing, so that surface features of the phantoms, such as smoothness (or roughness) and other details such as the specific curvatures of a particular piece of labware, can be reproduced as faithfully as possible. For example, a customized friction-adjusting mechanism can be incorporated into a phantom stopcock to reproduce the haptic feel of using a genuine burette. In addition, the phantoms are incorporated with both wireless monitoring sensors and computer-vision markers to allow real-time detection of their spatial and kinetic information. Again, taking the stopcock phantom as an example, a dual phase resistive position sensor can be incorporated as one of the wireless monitoring sensors for sensing the turning of the stopcock to an angular precision of at least 5 arc-seconds.
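By way of illustration only, the following sketch shows one possible way an imaging module could detect an OpenCV ArUco marker affixed to a phantom and estimate the phantom's pose. The marker dictionary, marker size and camera intrinsics are placeholder assumptions, and the exact ArUco API differs slightly between OpenCV versions (the detector class below is available in OpenCV 4.7 and later).

```python
import cv2
import numpy as np

# Placeholder camera intrinsics; in practice these come from camera calibration.
CAMERA_MATRIX = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
DIST_COEFFS = np.zeros(5)
MARKER_SIDE_M = 0.03  # assumed 3 cm marker printed on the phantom surface

# 3-D coordinates of the four marker corners, centred at the marker origin.
OBJ_POINTS = np.array([[-MARKER_SIDE_M / 2,  MARKER_SIDE_M / 2, 0.0],
                       [ MARKER_SIDE_M / 2,  MARKER_SIDE_M / 2, 0.0],
                       [ MARKER_SIDE_M / 2, -MARKER_SIDE_M / 2, 0.0],
                       [-MARKER_SIDE_M / 2, -MARKER_SIDE_M / 2, 0.0]])


def detect_phantom_poses(frame):
    """Return (marker_id, rotation_vector, translation_vector) for each marker found."""
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = detector.detectMarkers(gray)
    poses = []
    if ids is not None:
        for marker_corners, marker_id in zip(corners, ids.flatten()):
            ok, rvec, tvec = cv2.solvePnP(OBJ_POINTS,
                                          marker_corners.reshape(4, 2),
                                          CAMERA_MATRIX, DIST_COEFFS)
            if ok:
                poses.append((int(marker_id), rvec, tvec))
    return poses
```

The recovered pose can then be fused with the sensor data described above to track each phantom in the shared coordinate frame of the augmented reality scene.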
In certain embodiments of the present invention, the software component is used to calculate the virtual volume of liquid from the AR outputs of the corresponding phantoms by counting particles within signed distance field (SDF) boundaries, where each particle is a representation of a quantum of the liquid (e.g., 0.1 ml). A visualization of a virtual conical flask generated from a corresponding phantom conical flask by SDF is illustrated in the accompanying drawings.
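As a minimal illustration of this particle-counting approach, the sketch below counts liquid particles (each representing an assumed quantum of 0.1 ml) that fall inside the signed distance field of a simplified cylindrical vessel. The cylindrical SDF and the particle quantum are assumptions for illustration only; the actual platform would use the SDF of the rendered labware.

```python
import numpy as np

ML_PER_PARTICLE = 0.1  # assumed liquid quantum represented by each particle


def vessel_sdf(points, radius=0.03, height=0.10):
    """Signed distance (negative = inside) to a simplified cylindrical vessel.

    A real implementation would evaluate the SDF of the rendered conical
    flask or burette; a cylinder is used here purely for illustration.
    """
    radial = np.linalg.norm(points[:, :2], axis=1) - radius
    axial = np.maximum(points[:, 2] - height, -points[:, 2])
    return np.maximum(radial, axial)


def virtual_volume_ml(particle_positions):
    """Estimate the liquid volume as the count of particles inside the SDF boundary."""
    inside = vessel_sdf(particle_positions) < 0.0
    return np.count_nonzero(inside) * ML_PER_PARTICLE
```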
At the beginning of a typical laboratory session, the user will be instructed, either by another user or by a corresponding virtual instructor with a certain level of artificial intelligence, to pour an arbitrary amount of titrant from a phantom beaker into the phantom burette. The volume of the titrant delivered to the phantom burette is calculated according to the extent of tilting of the phantom beaker during the pouring action, by the particle count within the SDF boundaries, where each particle is a representation of a quantum of the fluid (e.g., 0.1 ml). This volume (V1) is indicated by the position of the meniscus of the liquid in the phantom burette. During titration, the titrant is delivered from the phantom burette to the phantom conical flask by turning the phantom burette's stopcock. The rotational (angular) motion of the stopcock is mapped to the fluid emitter with a linear interpolation (LERP) having customizable values, which reflects the effective diameter of the liquid passage channel formed by the opening of the stopcock and the duration for which the liquid passage channel is open. This information is converted into a liquid volume (V2) using data collected from empirical experiments. During the titration experiment, the value of V2 increases. The volume of titrant remaining in the phantom burette is calculated in real time by subtracting V2 from V1, until V1 − V2 = 0, at which point the phantom burette will appear empty.
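The following sketch illustrates, under assumed calibration values, how the stopcock angle could be mapped by linear interpolation to a flow rate and integrated over time to obtain the delivered volume V2 and the remaining volume V1 − V2. The fully-open angle and maximum flow rate are hypothetical placeholders standing in for the empirically calibrated values described above.

```python
FULLY_OPEN_DEG = 90.0    # assumed stopcock angle at which the channel is fully open
MAX_FLOW_ML_PER_S = 2.0  # assumed maximum delivery rate (empirically calibrated)


def lerp(a, b, t):
    """Linear interpolation between a and b, with t clamped to [0, 1]."""
    t = max(0.0, min(1.0, t))
    return a + (b - a) * t


class PhantomBurette:
    def __init__(self, initial_volume_ml):
        self.v1 = initial_volume_ml  # titrant poured in at the start (V1)
        self.v2 = 0.0                # cumulative titrant delivered (V2)

    def update(self, stopcock_angle_deg, dt_s):
        """Advance the simulation by dt_s seconds at the given stopcock angle."""
        flow = lerp(0.0, MAX_FLOW_ML_PER_S, stopcock_angle_deg / FULLY_OPEN_DEG)
        self.v2 = min(self.v1, self.v2 + flow * dt_s)
        return self.remaining()

    def remaining(self):
        """Volume shown at the meniscus of the phantom burette (V1 - V2)."""
        return self.v1 - self.v2
```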
Before the experiment, the user, e.g., a student, will be instructed to deliver a fixed volume (e.g., 25 ml or 50 ml, as V3) of an analyte from a phantom beaker to a phantom conical flask, using a phantom pipette. During the titration experiment, the volume of liquid in the phantom conical flask is calculated, in real time, by adding V2 to V3. The pH of the mixture in the phantom conical flask is thereby calculated, in real time, from the concentration of each of the reactants in the titrant and the analyte with reference to the corresponding published dissociation constants of the reactants. In one example, the titrant is a solution of hydrochloric acid of known concentration and the analyte is a solution of sodium carbonate of a concentration that is unknown to the student. When the titrant is added into the analyte, the following reaction occurs:
2HCl + Na2CO3 → H2O + CO2 + 2NaCl   (1)
The pH (i.e., the concentration of hydrogen ions) of the mixture at any point during the titration experiment (at any concentration of the titrant) is calculated from the dissociation constants of the two reactions of CO3(2−) and HCO3(−) below, as well as from the concentration of excess HCl, if any:
CO3(2−) + H(+) → HCO3(−) + H2O (dissociation constant = 4.0 × 10^−6 M)   (2)
HCO3(−) + H(+) → CO2 + H2O (dissociation constant = 2.5 × 10^−11 M)   (3)
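A simplified, illustrative calculation of the mixture's pH during this titration is sketched below. It applies a piecewise buffer (Henderson-Hasselbalch) approximation in the two buffering regions and treats excess strong acid directly after the second equivalence point. The default dissociation constants are textbook values for carbonic acid and serve only as placeholders; they may be replaced by the constants quoted in equations (2) and (3), and a production implementation would solve the full equilibrium system rather than this approximation.

```python
import math


def titration_ph(v2_ml, v3_ml, c_titrant, c_analyte, ka1=4.3e-7, ka2=4.7e-11):
    """Approximate pH after v2_ml of HCl (c_titrant, mol/L) has been added
    to v3_ml of Na2CO3 analyte (c_analyte, mol/L).

    Illustrative piecewise approximation only; ka1 and ka2 are the acid
    dissociation constants of H2CO3 and HCO3(-) (textbook defaults).
    """
    n_h = c_titrant * v2_ml      # mmol of H+ added so far (from V2)
    n_co3 = c_analyte * v3_ml    # mmol of CO3(2-) initially present (from V3)
    v_total = v2_ml + v3_ml      # total volume in the phantom flask (V2 + V3), ml
    eps = 1e-9                   # avoids taking the log of zero at region edges

    if n_h < n_co3:              # CO3(2-)/HCO3(-) buffer region
        ph = -math.log10(ka2) + math.log10((n_co3 - n_h + eps) / (n_h + eps))
    elif n_h < 2 * n_co3:        # HCO3(-)/H2CO3 buffer region
        ph = -math.log10(ka1) + math.log10(
            (2 * n_co3 - n_h + eps) / (n_h - n_co3 + eps))
    else:                        # excess strong acid; mmol/ml equals mol/L
        excess = (n_h - 2 * n_co3) / v_total
        ph = -math.log10(max(excess, 1e-12))
    return min(14.0, max(0.0, ph))
```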
The concentration of hydrogen ions at any point during the titration experiment is then converted into the color of the mixture in the phantom conical flask based on the published absorption spectrum of the pH indicator at the given pH. In one example, the pH indicator is phenolphthalein. The student will be instructed to use a phantom dropper to deliver a small volume of a phenolphthalein solution of known concentration into the phantom conical flask which contains the analyte. The resulting phenolphthalein concentration is calculated in real time according to the total volume V2 + V3. The color of the mixture in the phantom conical flask is generated from the effective concentration of phenolphthalein at any point of the titration experiment and the absorption spectrum of phenolphthalein at the hydrogen ion concentration at that point of the experiment, as calculated from equations (2) and (3). A realistic rendering of solution mixing and color change in the phantom conical flask is made possible by calculating the dispersion value of the mixed fluid.
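One simple way to turn the computed pH into a displayed tint, sketched below, is to compute the fraction of phenolphthalein in its colored (pink) basic form from an assumed indicator pKa and scale the rendered tint by the effective indicator concentration. The pKa value, the visibility threshold and the linear color ramp are simplifying assumptions standing in for the published absorption spectrum referred to above.

```python
PHENOLPHTHALEIN_PKA = 9.4  # assumed indicator pKa (approximate literature value)


def indicator_tint(ph, indicator_conc_m, max_visible_conc_m=1e-5):
    """Return an (R, G, B) tint in [0, 1] for the phantom flask contents.

    The fraction of the pink basic form follows the Henderson-Hasselbalch
    relation; color intensity scales with the effective indicator
    concentration up to an assumed saturation threshold.
    """
    basic_fraction = 1.0 / (1.0 + 10 ** (PHENOLPHTHALEIN_PKA - ph))
    intensity = min(1.0, indicator_conc_m / max_visible_conc_m) * basic_fraction
    colorless = (1.0, 1.0, 1.0)  # appears colorless/white through the liquid
    pink = (1.0, 0.0, 0.5)       # assumed fully developed phenolphthalein pink
    return tuple(c0 + (c1 - c0) * intensity for c0, c1 in zip(colorless, pink))
```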
In any of the foregoing examples, the software component will timestamp each of the output AR images to correspond with the time of detection of the relative motion and/or position data of the phantom by the corresponding motion- and position-sensors.
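To illustrate how such timestamps might be used, the sketch below pairs each captured AR frame with the nearest-in-time motion/position sample; the data structures and the 20 ms tolerance are assumptions for illustration rather than requirements of the platform.

```python
from bisect import bisect_left


def pair_frames_with_samples(frame_times, sample_times, tolerance_s=0.02):
    """Pair each timestamped AR frame with the nearest motion/position sample.

    frame_times and sample_times are sorted lists of timestamps in seconds.
    Returns a list of (frame_index, sample_index or None) tuples; a frame is
    left unpaired when no sample lies within the assumed tolerance.
    """
    pairs = []
    for i, t in enumerate(frame_times):
        j = bisect_left(sample_times, t)
        candidates = [k for k in (j - 1, j) if 0 <= k < len(sample_times)]
        best = min(candidates, key=lambda k: abs(sample_times[k] - t), default=None)
        if best is not None and abs(sample_times[best] - t) <= tolerance_s:
            pairs.append((i, best))
        else:
            pairs.append((i, None))
    return pairs
```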
The software component also allows connection of data from multiple mobile devices into a network, so that users can see one another's AR images and record data from one another's phantoms. In one example, the AR images of a titration experiment (601a, 601b, 601c . . . , 601z) set up by one individual 601, e.g., an instructor, can be shown in the same virtual space 610 as another individual 602, e.g., a pupil, so that the pupil can follow the step-by-step demonstration from the instructor (601a′, 601b′, 601c′ . . . , 601z′).
In certain embodiments where multiple participants log in to the same session, remote procedure calls (RPCs) from Netcode for GameObjects, a networking library, are utilized in the network to communicate among multiple users of the present platform. At a high level, when an RPC is called on the client side, a software development kit (SDK) will take note of the object, component, method and any parameters for that RPC and send that information over the network. A server will receive that information, find the specified object, find the specified method and call it on the specified object with the received parameters on all available clients. This can be done either in a) a client-hosted multi-user session, where one of the users, presumably the teacher, hosts a session for students to join; or b) a dedicated server hosted on the cloud or on location. Alternatively, other web service approaches can also be used in the context of setting up multi-user communication and synchronization of a relatively large amount of data, for example by calling to a resource.
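Netcode for GameObjects is a C#/Unity library; purely to illustrate the message flow described above, the sketch below shows a hypothetical, simplified RPC payload and dispatcher in which the client notes the target object, component, method and parameters, and the server looks up the target, invokes the method and forwards the call to every connected client. None of the names below belong to the Netcode for GameObjects API.

```python
import json


def make_rpc(object_id, component, method, *params):
    """Client side: note the object, component, method and parameters of the RPC."""
    return json.dumps({"object": object_id, "component": component,
                       "method": method, "params": list(params)})


class RpcDispatcher:
    """Toy server-side dispatcher mirroring the flow described above."""

    def __init__(self):
        self.objects = {}  # object_id -> {component_name: component instance}
        self.clients = []  # connected client connections (hypothetical objects)

    def register(self, object_id, component_name, component):
        self.objects.setdefault(object_id, {})[component_name] = component

    def handle_rpc(self, payload_json):
        """Find the specified object/component/method, call it with the received
        parameters, and forward the same payload to all connected clients."""
        payload = json.loads(payload_json)
        component = self.objects[payload["object"]][payload["component"]]
        getattr(component, payload["method"])(*payload["params"])
        for client in self.clients:
            client.send(payload_json)  # hypothetical client connection API
```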
Although the invention has been described in terms of certain embodiments, other embodiments apparent to those of ordinary skill in the art are also within the scope of this invention. Accordingly, the scope of the invention is intended to be defined only by the claims which follow.
The present invention is applicable to providing remote online learning of laboratory techniques in different subjects, particularly STEM subjects, by providing learners with physical replicas of lab apparatuses. By handling these replicas, together with the AR experience provided by the present platform, students will learn the dexterous skills and hand-eye coordination that cannot be replicated by other conventional remote learning platforms.
The present invention makes it possible to collaborate on group experiments, in which students can manipulate the phantoms at their own homes while visualising the instructor and other students in the same virtual spaces. In particular, instructors can demonstrate to the class how to handle the phantoms, and then remotely check, in real time, how accurately the students follow the instructions. In the long term, big data on learning performance can be collected, allowing the effectiveness of new pedagogies to be systematically quantified. The present invention also allows students to carry out lab techniques repeatedly, without restriction as to location and time and with substantially no cost or safety implications, unlike traditional face-to-face lab classes where the number of times a student can attempt an experiment is limited by the availability of the lab and the student's class schedule. Thus, the present invention enables students to “learn by mistakes”, wherever and whenever they wish or are available, while still under the guidance of their instructors.