None.
This invention relates generally to the laboratory device field, and more specifically to a new and useful system and method for calibrating and controlling the temperature of an apparatus in the laboratory device field.
There continues to be an emphasis in the laboratory device field to calibrate and control the ambient environment of samples. However, current laboratory devices do not allow users to calibrate and control the temperature within said devices. Thus, there is a need in the laboratory device field to create a new and useful system and method for calibrating and controlling the temperature of a laboratory apparatus.
The following description of preferred embodiments of the invention is not intended to limit the invention to these preferred embodiments, but rather to enable any person skilled in the art to make and use this invention.
As shown in
As shown in
The apparatus 100 is preferably capable of interacting with an environment. The apparatus can have any suitable form factor, but is preferably structured to feature an internal cavity capable of storing a laboratory specimen shown in
The output of the apparatus functions to interact with the physical environment surrounding the apparatus, with one or more users, with other apparatuses, or with any other suitable endpoint. The outputs can include visual outputs 120, audio output 130, or any other suitable output. The outputs are preferably arranged on the apparatus, but can alternatively be remote outputs controlled by the apparatus, or be arranged in any other suitable location.
The visual outputs 120 can include controllable lighting system(s), a graphical display, a tabular display, or any other suitable visual display. The audio output 130 can include speakers, transducers, or any other suitable mechanism capable of generating audio waves. However, the apparatus can include any other suitable output.
The input of the apparatus functions to receive user inputs at the apparatus, receive inputs from other apparatuses, receive inputs from auxiliary sensors remote from the apparatus, measure parameters of the ambient environment, measure apparatus operational parameters, or provide any other suitable information. The apparatus can respond to the inputs according to the programming. The apparatus can additionally or alternatively stream the input information to a remote user device, wherein the remote user device can process, store, or otherwise handle the input information.
The inputs can be one or more sensors 140, but can alternatively or additionally be interfaces for communicatively coupling with one or more sensors (e.g., connectors, etc.). Sensor inputs can include light sensors, wind sensors, radiation sensors, pressure sensors, temperature sensors, humidity sensors, touch sensors (e.g., a set of electrodes, etc.), user inputs (e.g., buttons, analog controls, digital controls, etc.), and/or any suitable type of input. The sensors can additionally include system monitoring sensors that function to monitor apparatus operational parameters, ambient environment parameters, or any other suitable parameters. Examples of monitoring sensors include motor monitoring systems (e.g., rotary encoders, mechanical encoders, magnetic encoders, optical encoders, resolvers, Hall effect sensors, back EMF monitoring systems, etc.), light sensors, audio sensors (e.g., microphones), temperature sensors, and pressure sensors, but the apparatus can include any other suitable sensor.
The communication module 150 of the apparatus functions to transfer information between the apparatus and a data endpoint. The data endpoint can be the programming interface application, a user device, a server system, or be any other suitable device. The communication module 150 is preferably a transceiver, but can alternatively be a receiver, transmitter, or be any other suitable communication system. The communication module 150 can be wired or wireless. The communication module 150 can be an IR system, RF system, beacon system (e.g., ultrasound, RF), light modulation system, NFC system, Wi-Fi system, GSM system, Bluetooth system, mesh system, cellular system, Ethernet system, powerline communication system, or be any other suitable communication system.
The apparatus 100 can additionally include a power storage unit 160 that functions to store energy and supply power to active apparatus components. The power storage unit is preferably arranged on-board the apparatus, but can alternatively be remote. The power storage unit 160 can be a primary battery, secondary battery (i.e., a rechargeable battery), fuel cell, or be any other suitable power supply.
The apparatus 100 can additionally include a processing unit 170 that functions to control the apparatus output, communication system, or other components. The processing unit 170 can independently and/or automatically control the apparatus based on sensor measurements and stored control instructions. The processing unit 170 can additionally or alternatively operate the apparatus based on control instructions received from the programming interface application 200, user device 210, or other remote control system. The processing unit 170 can additionally or alternatively adjust or otherwise modify the received control instructions (e.g., based on stored user profile, sensor measurements, etc.). The processing unit 170 can be a processor, microprocessor, GPU, CPU, or be any other suitable processing unit. The processing unit can additionally include digital memory (e.g., flash memory, RAM, solid state, etc.) that functions to permanently or temporarily store information. The stored information can be control instructions (e.g., a user profile), sensor measurements or other input, identifier information (e.g., apparatus identifier information, user identifying information, user device identifier information, etc.), or be any other suitable information. The processing unit can include a local control system that functions to control the apparatus independent of the programming interface application, and can additionally include a remote control system that functions to control the apparatus based on control instructions received from the remote control device. The remote control system is preferably accessed through a programming interface application, but can alternatively be accessed through a remote cloud computing system or accessed in any other suitable manner. The local control system can store inputs, process programming configuration, direct output control, and provide any suitable form of control. In some variants, the local control system can be configured with a user profile configuration.
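By way of a non-limiting illustrative sketch in Python (the class and method names, such as ProcessingUnit, receive_remote, and next_instruction, are hypothetical and not part of the described system), the following shows one way the processing unit 170 could arbitrate between control instructions received from a remote control system and locally stored control instructions:

    # Hypothetical sketch: a processing unit that prefers control instructions
    # received from the programming interface application and otherwise falls
    # back to locally stored instructions.
    class ProcessingUnit:
        def __init__(self, local_instructions):
            # local_instructions: list of (condition, instruction) pairs, where
            # condition is a callable over sensor measurements.
            self.local_instructions = local_instructions
            self.remote_instructions = None  # latest remote instruction, if any

        def receive_remote(self, instructions):
            # Called by the communication module when the application sends
            # control instructions.
            self.remote_instructions = instructions

        def next_instruction(self, sensor_measurements):
            # Remote (delegated) control takes priority when available.
            if self.remote_instructions is not None:
                instruction, self.remote_instructions = self.remote_instructions, None
                return instruction
            # Local (autonomous) control: first stored instruction whose
            # condition is satisfied by the current measurements.
            for condition, instruction in self.local_instructions:
                if condition(sensor_measurements):
                    return instruction
            return None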
A user profile configuration of an embodiment of the apparatus functions to supply operational pattern directives. The user profile configuration preferably characterizes the types of actions and control instructions that are executed by the apparatus. The user profile configuration can define output responses to inputs. For example, the user profile configuration can specify that the apparatus should initialize lighting when it detects motion, initialize an alarm when a component malfunction is detected, notify the user at the completion of a set of programming instructions, or perform any suitable logic. The user profile configuration is preferably updatable, and preferably evolves or otherwise updates according to interactions and programming received from the programming interface application. The user profile configuration preferably initializes in a new instance of an apparatus as a base user profile. In one preferred implementation, the base user profile defines default or minimal response logic, which functions to simulate a new apparatus. The user profile configuration preferably updates through apparatus and/or application interactions. Over time, the user profile configuration updates to provide customized response logic at least partially set through interactions of a user. At least a portion of the user profile configuration is stored and maintained on the apparatus such that the apparatus can conform to user profile-based behaviors independent of the application (e.g., when the apparatus is disconnected from or not controlled by a user device). The user profile configuration can additionally or alternatively be stored and managed remotely (e.g., by the application or in a remote cloud platform).
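As a non-limiting sketch in Python (the trigger and action names are hypothetical), a user profile configuration can be represented as a mapping from detected events to output responses, which is updated over time:

    # Hypothetical base user profile: trigger -> response rules.
    base_user_profile = {
        "motion_detected": "initialize_lighting",
        "component_malfunction": "initialize_alarm",
        "program_complete": "notify_user",
    }

    def respond(profile, event):
        # Return the output action associated with a detected event, if any.
        return profile.get(event)

    # The profile evolves as programming is received from the application.
    base_user_profile["motion_detected"] = "initialize_lighting_and_notify_user"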
In a specific variation, the apparatus 100 includes a set of opposing motorized mounting points configured to removably connect to a set of accessories and/or rotate about a shared rotational axis; a set of visual output mechanisms (e.g., individually indexed and controllable light emitting elements); a set of audio output mechanisms (e.g., speakers); a set of light sensors; a set of audio sensors; a motor monitoring system for each or a subset of the apparatus motors; a set of buttons; a wireless communication mechanism (e.g., a Bluetooth communication mechanism). The apparatus can additionally include a processor, non-volatile memory, on-board power storage (e.g., a secondary or rechargeable battery) electrically connected to the active apparatus components, and/or include any other suitable component. However, the apparatus can have any other suitable component or configuration.
The programming interface application 200 functions to provide a programming and interaction control interface to the apparatus. The programming interface application 200 functions to receive programming inputs from a user, and can additionally or alternatively transform the programming input into a second computer language (e.g., target language, such as assembly language or machine code). The programming interface application 200 can additionally or alternatively provide audio and/or visual feedback to the user.
The programming interface application 200 preferably runs on (e.g., is supported by) a user device 210, but can alternatively be run on a remote server or on any other suitable computing system. The user device is preferably remote from the apparatus (e.g., separate and distinct from the apparatus, not physically connected to the apparatus, etc.), but can alternatively be connected to the apparatus, mounted to the apparatus, or otherwise associated with the apparatus. The user device can be any suitable computing device, such as a mobile device (e.g., smartphone, tablet, etc.), wearable computer, a desktop computer, a TV-connected computer, a mobile phone, another electronic apparatus, or any suitable computing device. The system can include one or more programming interface applications that can interact with the apparatus.
The programming interface application 200 preferably includes a user interface configured to promote programming and setting of apparatus logic. Various approaches to programming can be applied as described, such as visual programming and direct programming. When in communication with the apparatus, the programming interface application preferably provides a substantial portion of the control instructions. Input data captured by the apparatus can be communicated to the programming interface application (e.g., in near-real time, at a predetermined frequency, at a variable frequency, at a fixed frequency, etc.), where the input data is processed and transformed into response data, which is then communicated to the apparatus to be executed. Alternatively, the control instructions can have any suitable distribution between the apparatus and the programming interface application. The use of the programming interface application preferably facilitates updating and modification of the user profile instance of an apparatus.
The programming interface application 200 preferably uses an apparatus application programming interface or a software development kit, which functions to facilitate interfacing with the apparatus. Any suitable programmatic interface can be used. The interface is preferably generalized for use with various applications and uses. Preferably, there are multiple programming interface applications that can be selectively (or simultaneously) in control of the apparatus.
The programming interface application 200 can additionally supplement the components of the apparatus. For example, the programming interface application can be used to supply audio output. The programming interface application can similarly use sensors of the computing device to supplement or replace the inputs of the apparatus.
The system can additionally include a remote cloud platform that can facilitate account management, user profile synchronization, and other suitable features.
As shown in
The method can additionally function to enable a user to program the apparatus in real- or near-real time. The method preferably uses an apparatus that obtains environmental information and then responds through actions. Apparatus control can be partially or entirely directed through programming obtained from an application.
In one variation of apparatus operation, the apparatus can stream sensor information (recorded by the apparatus, such as sensor measurements) to the programming interface application (supported by a remote user device), wherein the user device can generate control instructions for the apparatus based on the sensor information. The user device can stream the control instructions to the apparatus, wherein the apparatus operates based on the control instructions, such that the user device can remotely control the apparatus. The apparatus can stream the sensor information in real- or near-real time (e.g., as the measurements are recorded), in batches, at a predetermined frequency, in response to a transmission event (e.g., the full execution of a control instruction), or at any other suitable frequency. In a second variation of apparatus operation, the apparatus can automatically operate based on a user profile configuration or other stored control information. However, the apparatus can operate in any other suitable manner.
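The following non-limiting Python sketch illustrates this remote-control variation (the apparatus and user_device interfaces, such as read_sensors, send, receive, and execute, are hypothetical placeholders for the communication and control mechanisms described above):

    import time

    # Hypothetical sketch: the apparatus streams sensor measurements to the
    # user device, which returns control instructions for execution.
    def remote_control_loop(apparatus, user_device, period_s=0.1):
        while apparatus.is_connected():
            measurements = apparatus.read_sensors()   # record sensor data
            user_device.send(measurements)            # stream to the application
            instructions = user_device.receive()      # control instructions back
            if instructions is not None:
                apparatus.execute(instructions)       # operate per the instructions
            time.sleep(period_s)                      # predetermined streaming frequency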
Receiving a set of programming inputs at the user device S100 functions to obtain a programming configuration from a user. The programming inputs can be programming components, programming routines, scripts, application logic, compiled application objects, or any suitable configuration that can direct control instructions of an apparatus. The set of programming inputs can include one or more programming inputs, and can define a control path. When the set includes multiple programming inputs, the set can be time-ordered (e.g., be a series or sequence of programming inputs), be unordered, or have any other suitable relationship between programming inputs of the set. The programming inputs are preferably programming statements expressing apparatus actions to be carried out, but can additionally or alternatively be simple statements, compound statements, or be any other suitable programming statements. However, the programming inputs can be expressions or be any other suitable programming input.
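As a non-limiting sketch in Python (the ProgrammingInput structure and the example actions are hypothetical), a time-ordered set of programming inputs defining a control path could be represented as follows:

    from dataclasses import dataclass, field
    from typing import Optional

    # Hypothetical representation of a single programming input.
    @dataclass
    class ProgrammingInput:
        action: str                        # apparatus action to be carried out
        parameters: dict = field(default_factory=dict)
        condition: Optional[str] = None    # optional conditional statement

    # A time-ordered control path composed of programming inputs.
    program = [
        ProgrammingInput("rotate_mount", {"speed": 0.5, "duration_s": 2}),
        ProgrammingInput("set_light", {"color": "green"}, condition="no_malfunction"),
        ProgrammingInput("play_sound", {"clip": "done"}),
    ]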
The set of programming inputs is preferably received through a programming interface application running on a user device (example shown in
The set of programming inputs can be received in a variety of different ways, through a variety of different programming interface applications, wherein each programming interface application is capable of interfacing with the apparatus. Alternatively, the programming input can be received through a single programming interface application, received through a set of different programming interface modes, or received in any other suitable manner. The apparatus interface, more preferably an apparatus software application interface but alternatively any other suitable apparatus interface, can additionally or alternatively provide a variety of programming input modes, each capable of interfacing with one or more programming interface applications, but the programming input modes can alternatively be natively enabled within an application. The various programming input modes can be used separately or in any suitable combination.
Receiving sensor data from the apparatus at the user device S200 functions to receive feedback of apparatus control instruction performance, receive apparatus inputs for further control instruction generation (e.g., for continued remote apparatus control), receive data for apparatus performance analysis, receive data for control path determination, or receive data for any other suitable functionality. Receiving sensor data can include: sending sensor data from the apparatus to the user device, and receiving the sensor data from the apparatus at the user device. The sensor data can be sent by the apparatus at a predetermined frequency (e.g., a fixed or variable frequency), sent in response to the occurrence of a transmission event (e.g., in response to depression of an apparatus button, in response to receipt of a transmission command from the user device, etc.), sent as the sensor data is generated or recorded (e.g., in real- or near-real time), sent in response to apparatus connection with the user device, or be sent at any other suitable time. The apparatus can additionally compress, encrypt, or otherwise process the sensor data before transmission. The sensor data can be raw sensor data (e.g., raw sensor signals), processed measurements (e.g., wherein the signals are processed into sensor measurements), sensor summaries (e.g., wherein the measurements can be processed into higher-level summaries), or be any other suitable data. The sensor data can be captured and provided by the apparatus, by the user device, by a remote server system, by a set of secondary apparatuses, by an auxiliary sensor remote from the apparatus (e.g., external to the apparatus), or by any other suitable computing system. The sensor data can be received by the user device at a predetermined frequency, received in response to the transmission event, received in real- or near-real time, received as the data is sent, or received at any other suitable frequency.
Receiving data from the apparatus S200 can additionally include connecting the user device to the apparatus. The user device can be wirelessly connected to the apparatus, connected to the apparatus by a wire, or otherwise connected to the apparatus. The user device is preferably removably or transiently connected to the apparatus. The user device is preferably connected to the apparatus in response to selection of a connection icon (e.g., in response to selection of an icon indicative of the apparatus), but can alternatively be connected in response to the occurrence of any other suitable connection event. The user device can be simultaneously connected to a single apparatus or multiple apparatuses. In one variation, the user device can be connected to the apparatus via a Bluetooth or other short-range connection, wherein the apparatus can periodically or continuously broadcast a signal upon power-up, the user device can search for and display graphics indicative of apparatuses physically proximal to the user device (e.g., limited by the range of the short-range communication) upon selection of a search icon, and the user device can establish a transient connection to the apparatus in response to selection of the corresponding apparatus icon. However, the apparatus can connect to the user device in any suitable manner.
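By way of a non-limiting Python sketch (the radio and ui objects, and their scan, show_icons, wait_for_selection, and connect methods, are hypothetical stand-ins for the short-range communication stack and the application interface), the connection flow can proceed as follows:

    # Hypothetical sketch: discover broadcasting apparatuses, display them,
    # and establish a transient connection to the one the user selects.
    def connect_to_apparatus(radio, ui):
        nearby = radio.scan(timeout_s=5)             # apparatuses broadcasting nearby
        ui.show_icons([d.name for d in nearby])      # display a graphic per apparatus
        index = ui.wait_for_selection()              # user selects an apparatus icon
        return radio.connect(nearby[index])          # transient connection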
Processing the programming inputs into control instructions based on the sensor data S300 functions to remotely generate control instructions for the apparatus, based on the programming inputs received from the user, that respond to the near-instantaneous apparatus operation conditions. Processing the sensor measurements into control instructions based on the programming inputs can additionally enable the apparatus to dynamically respond to unexpected environmental or operational conditions. Processing the sensor measurements into control instructions based on the programming inputs can additionally enable the apparatus to dynamically reflect (e.g., in real- or near-real time) the newly-entered programming input (e.g., perform the operations associated with the programming input).
As shown in
Each programming input of the set can be processed together with the remainder of the set, processed in subsets, or processed individually. The programming inputs are preferably processed automatically by the user device, apparatus, or other computing system, but can alternatively be processed in response to receipt of a user input, processed manually, or processed in any other suitable manner. The programming inputs can be processed into control instructions before the sensor data is received, after the sensor data is received, before the last set of control instructions is determined to have been performed, after the last set of control instructions is determined to have been performed, or be processed into control instructions at any other suitable time.
Processing the programming inputs S300 can include, at the user device: processing a first programming input into a first set of control instructions based on a first set of sensor data in response to receipt of the first set of sensor data; receiving a second set of sensor data; then processing the next programming input in the programming input sequence (e.g., the next unperformed programming input, second programming input, etc.) or subset thereof into a second set of control instructions based on the second set of sensor data (e.g., the subsequently received sensor data). However, the programming inputs can be otherwise processed. The method can additionally include iteratively repeating the method for successive sensor data sets and successive, unperformed programming inputs until the last programming input has been processed and performed. The first and second sets of sensor data can be recorded before the respective programming input is received at the programming interface application, recorded as the respective programming input is received at the programming interface application, or recorded after the respective programming input is received at the programming interface application. The control instructions can additionally or alternatively be generated based on the programming input and secondary data, wherein the secondary data can be user device sensor data, information received from a remote server, or be any other suitable data.
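A non-limiting Python sketch of this iteration follows (the apparatus_link interface and the compile_input helper are hypothetical; compile_input stands in for the processing of S300):

    # Hypothetical sketch: process each unperformed programming input against
    # the most recently received set of sensor data, then send the resulting
    # control instructions to the apparatus.
    def run_program(program, apparatus_link, compile_input):
        for programming_input in program:
            sensor_data = apparatus_link.receive_sensor_data()            # latest sensor set
            instructions = compile_input(programming_input, sensor_data)  # S300
            apparatus_link.send_control_instructions(instructions)        # S400
            apparatus_link.wait_until_performed()                         # before the next input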
In one variation, as shown in
As shown in
The method can additionally include dynamically modifying the set of programming inputs during programming input execution. This can function to permit the user to change the apparatus programming (e.g., the control path, conditional statements, etc.) on the fly. In this variation, the method can include receiving a programming input set modification as a programming input of the set is being executed, and generating a modified series of programming inputs in response to receipt of the modification. The method can additionally include automatically generating control instructions based on the subsequently received sensor data and the modified series of programming inputs. This can be performed irrespective of which programming input was modified or where the new programming input was inserted (e.g., wherein the apparatus performs the modified programming input after modification receipt) or be performed only if the modified or new programming input is after the instantaneous execution position within the programming input series. However, the programming input set can be otherwise modified, and apparatus control can be otherwise affected.
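The following non-limiting Python sketch illustrates one such on-the-fly modification (the list-based program representation and the index arguments are hypothetical):

    # Hypothetical sketch: insert a new programming input into the series while
    # it is executing; in this variant only modifications ahead of the
    # instantaneous execution position take effect.
    def modify_program(program, execution_position, new_input, insert_at):
        modified = list(program)
        modified.insert(insert_at, new_input)
        if insert_at > execution_position:
            return modified      # remaining, unperformed inputs follow the modified series
        return program           # change is behind the execution position; leave unchanged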
Controlling the apparatus based on the control instructions S400 functions to remotely control the apparatus based on the programming inputs received at the user device. Controlling the apparatus can include, at the apparatus: receiving the control instructions at the apparatus and controlling apparatus sub-components to execute the control instructions. The control instructions can be received over the same communication channel as that used to send the sensor data, received over a different communication channel or protocol, or received in any other suitable manner. The control instructions can be executed (e.g., the apparatus operated based on the control instructions) in response to control instruction receipt, within a threshold time period of control instruction receipt (e.g., immediately, as soon as possible, within 5 seconds, etc.), in response to determination of a performance event (e.g., when a conditional event is met), or be executed at any other suitable time.
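As a non-limiting Python sketch (the apparatus interface and the condition callable are hypothetical), execution can be immediate or deferred until a performance event is determined:

    import time

    # Hypothetical sketch: execute a control instruction immediately, or wait
    # until a conditional performance event is met (up to a threshold period).
    def execute_when_ready(apparatus, instruction, condition=None, timeout_s=5):
        deadline = time.monotonic() + timeout_s
        while condition is not None and not condition(apparatus.read_sensors()):
            if time.monotonic() > deadline:
                return False                  # performance event never occurred
            time.sleep(0.01)
        apparatus.execute(instruction)        # control the apparatus sub-components
        return True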
As shown in
Controlling an apparatus according to a user profile S500 functions to execute application logic for directing apparatus actions. Controlling an apparatus according to a user profile can include controlling the apparatus in an autonomous mode and in a delegated control mode (e.g., the user device programming mode). The autonomous mode preferably engages a local control system. The local control system can be used independently from an application. The control instructions in an autonomous mode can be automatically generated based on at least a portion of a user profile stored on the apparatus. For example, when a user operates the apparatus without an open application, the user profile can be used directly to control apparatus action. A local version of the user profile preferably specifies various behavioral trigger-response patterns. A delegated control mode is preferably substantially similar to that described above, and can be engaged when an application is in communication with the apparatus and executing control directives. The delegated mode can additionally or alternatively communicate with a remote control system accessible over the internet or any suitable network infrastructure.
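A non-limiting Python sketch of the mode selection follows (the apparatus, application, and user_profile interfaces are hypothetical):

    # Hypothetical sketch: use the delegated control mode when an application
    # is connected and issuing directives; otherwise fall back to the
    # autonomous mode driven by the locally stored user profile.
    def control_step(apparatus, application, user_profile):
        measurements = apparatus.read_sensors()
        if application is not None and application.is_connected():
            instructions = application.request_instructions(measurements)  # delegated mode
        else:
            instructions = user_profile.match(measurements)                # autonomous mode
        if instructions is not None:
            apparatus.execute(instructions)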
The method can additionally include generating a user profile for the apparatus S520. The user profile can be automatically generated (e.g., based on patterns or associations with stored, historical apparatus actions and/or programming input sets), manually generated, predetermined, or otherwise determined.
In one variation, the history of programming input can impact the user profile configuration. The user profile configuration preferably reflects the combined “learning” of an apparatus, which is achieved through receiving programming input. As a user operates the apparatus, initializes programming instructions, and designs customized instructions for the apparatus, the user profile can be updated to reflect the types of programming logic or patterns in programming logic. In this variation, determining the user profile can include storing a history of programming inputs associated with the apparatus, identifying a pattern of control instructions or programming inputs from the history of programming inputs, and generating the user profile based on the identified pattern.
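As a non-limiting Python sketch (the (trigger, response) history representation is hypothetical), a simple pattern identification could take the most frequently programmed response for each trigger as the profile default:

    from collections import Counter

    # Hypothetical sketch: derive user profile entries from a stored history
    # of programming inputs, expressed as (trigger, programmed_response) pairs.
    def generate_user_profile(history):
        counts = {}
        for trigger, response in history:
            counts.setdefault(trigger, Counter())[response] += 1
        # The most frequently programmed response becomes the default.
        return {trigger: c.most_common(1)[0][0] for trigger, c in counts.items()}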
The user profile can be generated by the user device, by a remote computing system, by the apparatus, or by any other suitable system. The user profile can be stored on the apparatus (e.g., wherein the user profile is sent to the apparatus if generated on a separate system), on the user device (e.g., wherein the user device can remotely control the apparatus, independent of programming inputs), on a remote computing system (e.g., wherein the user profile is retrieved by the apparatus or user device), or on any other suitable computing system.
The user profile can additionally be gradually modified (updated) to reflect the way a user typically programs the apparatus to operate. Additionally, the user profile configuration can define actions for particular types of event triggers. For example, if a user typically requires a notification signal during a particular event, that notification signal can be used as a default for that event. When new program feedback is received, the system can process the current program, identify any patterns, and calculate user profile control instructions as a function of the new program feedback and past control instructions.
The user profile configuration update preferably occurs across all applications, but user profile updating can be disabled for select applications. Each user account is preferably associated with a single user profile for a given apparatus identified by an apparatus identifier (a globally unique or non-unique identifier), but can alternatively be associated with multiple user profiles for the apparatus. In the latter variation, the user profile instantaneously assigned to the apparatus can be manually selected by the user, automatically selected (e.g., based on time of day, estimated frequency, etc.), or selected in any other suitable manner. Each apparatus can support (e.g., store) one or more user profiles. In one implementation, multiple users can use an apparatus, where one user logs in and becomes the active user of the apparatus at any particular time. The user profile can be scoped by apparatus, by user, by application, by set of applications, or by any suitable scope. User profiles can additionally be saved, shared, forked, or modified in any suitable manner. Additionally, a user account or apparatus can have a set of different user profiles that can be selectively activated or used in aggregate. In one implementation, there is a global user profile configuration for an account and a secondary, application-specific user profile that augments the global configuration only when that particular application is in use. Similarly, a new user profile can be formed through combination of other user profiles. For example, a default user profile configuration of an apparatus can be used when there is no specific active user, wherein the default user profile configuration is a combination of user profile configurations from multiple users of the apparatus.
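The following non-limiting Python sketch shows one way profiles could be combined (the dictionary representation and the example entries are hypothetical; later profiles override earlier ones on conflicting triggers):

    # Hypothetical sketch: form a new user profile by combining others, e.g.
    # an application-specific profile augmenting a global profile.
    def combine_profiles(*profiles):
        combined = {}
        for profile in profiles:
            combined.update(profile)   # later profiles take precedence
        return combined

    global_profile = {"motion_detected": "initialize_lighting"}
    app_specific_profile = {"motion_detected": "play_sound", "low_battery": "notify_user"}
    active_profile = combine_profiles(global_profile, app_specific_profile)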
The method can additionally include supplementing or modifying the control instructions based on the user profile S540, wherein the control instructions are generated based on the programming input (example shown in
In one variation, the control instructions can be modified based on the user profile, wherein the output manipulated by the control instructions can be entirely or partially influenced by the user profile instructions. In this variation, the method can include identifying the control instructions as one of a set (e.g., wherein the set is associated with an apparatus sub-component, wherein the set specifies control instructions that can be modified, etc.) and modifying the control instructions in a predetermined manner, based on the user profile. However, the control instructions can be otherwise modified or supplemented.
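As a non-limiting Python sketch (the instruction dictionaries and the profile keyed by instruction type are hypothetical), modification based on the user profile could proceed as follows:

    # Hypothetical sketch: supplement or modify generated control instructions
    # according to the user profile, e.g. adjusting an output parameter.
    def apply_profile(control_instructions, user_profile):
        adjusted = []
        for instruction in control_instructions:
            modifier = user_profile.get(instruction["type"])  # one of the modifiable set?
            if modifier is not None:
                instruction = {**instruction, **modifier}     # modify in a predetermined manner
            adjusted.append(instruction)
        return adjusted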
The method can additionally include updating the programming inputs in real time. Programming input can be automatically pushed to the apparatus. The apparatus behavior can be affected as a program is edited and/or created. Alerts and warnings can be expressed in the user interface as well as in the operation of the apparatus. In some cases, for example, the user profile configuration can alter how the apparatus expresses a warning or notification.
Additionally, while the method is described for a one-to-one relationship of applications and apparatuses, the method and system can additionally support scenarios in which many apparatuses are controlled by one controller, one apparatus is controlled by multiple controllers, and many apparatuses are controlled by many controllers, as shown in
The system and method can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions. The instructions are preferably executed by computer-executable components preferably integrated with the apparatus system and programming interface application. The computer-readable medium can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory, EEPROMs, optical devices (CD or DVD), hard drives, floppy drives, or any suitable device. The computer-executable component is preferably a general or application specific processor, but any suitable dedicated hardware or hardware/firmware combination device can alternatively or additionally execute the instructions.
As a person skilled in the art will recognize from the previous detailed description and from the figures and claims, modifications and changes can be made to the preferred embodiments of the invention without departing from the scope of this invention defined in the following claims.