INTERACTIVE BUILD PLATE

Information

  • Publication Number
    20230415051
  • Date Filed
    June 22, 2022
  • Date Published
    December 28, 2023
Abstract
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for tracking object movement within a three-dimensional workspace. A method includes receiving, by a controller, break beam sensor data indicating object detection by a break beam sensor of a plurality of break beam sensors configured to detect objects within the workspace. The method includes receiving plate sensor data indicating object detection by a plate sensor of a plurality of plate sensors configured to detect objects resting on a surface of a plate defining a floor of the workspace. The method includes determining that an object passed through the workspace to rest at a position on the surface; comparing the position of the object to a target position of the object; and in response to determining that the position of the object does not satisfy similarity criteria for matching the target position, performing one or more actions.
Description
TECHNICAL FIELD

The present invention relates to the field of building systems. More particularly, the invention relates to releasably interconnecting building elements and workspaces.


BACKGROUND

Certain toy pieces in the form of toy bricks have releasable couplings between bricks, which allow them to be connected to form a larger structure. In their simplest form they build inanimate objects such as castles or houses. In some cases, the toy created using toy bricks can be supported on a baseplate, which can include coupling elements to provide stability or proper positioning, or both.


SUMMARY

In general, this disclosure relates to an interactive build plate system for tracking movement of objects within a three-dimensional workspace. The build plate system can be used to track movement and time of interaction with objects, e.g., modular building blocks. The workspace has a floor defined by a surface of a plate. The plate includes plate sensors for detecting objects resting on the plate. Break beam sensors are arranged to detect objects passing through the workspace. The plate sensors and break beam sensors output sensor data to a controller.


In some examples, the plate surface is configured to be used with releasably coupleable toy pieces, such as toy building blocks. The plate surface can include toy piece coupling elements. A user may reach into the workspace in order to place a toy piece on the plate surface. The break beam sensors can be arranged in an array. A break beam sensor includes an electromagnetic emitter and an electromagnetic receiver. In some examples, emitters, receivers, or both are supported by structures that are attached to the plate and extend non-parallel to the plate surface. The structures can be attached to the edges of the plate. The break beam sensors can detect the presence of the user's hand and/or of the toy piece passing through the workspace. Break beam sensor data can be output to a controller. The break beam sensor data can include data indicating, for a particular break beam sensor that detects an object, an array address of the break beam sensor, a start time of object detection by the break beam sensor, and/or a duration of detection by the particular break beam sensor.
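The break beam sensor data fields described above can be sketched as a simple record. The class and field names below are illustrative assumptions, not terms from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class BreakBeamEvent:
    # Hypothetical field names mirroring the data items the text says a
    # break beam sensor can report to the controller.
    array_address: tuple  # (row, column) of the sensor in its array
    start_time: float     # when the beam was first broken, in seconds
    duration: float       # how long the beam stayed broken, in seconds

event = BreakBeamEvent(array_address=(2, 5), start_time=1.25, duration=0.4)
```

A stream of such records, ordered by `start_time`, is the raw material for the path tracking discussed later.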


The plate sensors can be integrated with the plate, positioned below the plate, or positioned on the plate surface. The plate sensors can include, for example, pressure sensors, contact sensors, or proximity sensors. Each plate sensor can be configured to detect objects resting on the plate surface within a proper subset of the area of the plate surface. When a toy piece is placed on the plate surface, the plate sensors can detect the presence of the toy piece on the plate surface. Plate sensor data can be output to the controller. The plate sensor data can include data indicating, for a particular plate sensor that detects an object, a location of the plate sensor, a start time of object detection by the plate sensor, a duration of object detection by the plate sensor, a weight of the object, etc.


In some examples, the controller can determine, based on the plate sensor data, that more than one object is located at a particular location, e.g., due to being stacked. In some examples, two or more plate sensors can detect objects resting on the plate surface. The controller can determine, based on the plate sensor data from the two or more plate sensors, a size of the object and/or a number of objects resting on the plate surface.
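One way a controller might group plate sensor detections into distinct objects is a connected-components pass over the triggered cells. This is a sketch under the assumption that each plate sensor maps to one grid cell; the patent does not specify the grouping method:

```python
def group_triggered_cells(cells):
    """Group 4-adjacent triggered plate cells into distinct footprints.

    `cells` is a set of (row, col) grid positions reported as occupied.
    Returns a list of sets, one set per contiguous object footprint.
    """
    remaining = set(cells)
    groups = []
    while remaining:
        seed = remaining.pop()
        group, frontier = {seed}, [seed]
        while frontier:
            r, c = frontier.pop()
            for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if nb in remaining:
                    remaining.remove(nb)
                    group.add(nb)
                    frontier.append(nb)
        groups.append(group)
    return groups

# Two separate 2x1 blocks resting on the plate yield two footprints:
footprints = group_triggered_cells({(0, 0), (0, 1), (4, 4), (5, 4)})
```

The size of each footprint (number of cells) gives a coarse object size, and the number of footprints gives the object count.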


Using the break beam sensor data and the plate sensor data, the controller can track object paths within the workspace and placement of objects on the plate surface. For example, the controller can track motion of a user's hand placing a first toy piece on a first location of the plate surface, removing a second toy piece from a second location of the plate surface, and placing the second toy piece on a third location of the plate surface. The controller can determine a time duration of each action and a path of movement for each action.


In some examples, the controller can perform actions based on the tracked object motion. For example, the controller may determine that the user requires assistance, and perform a feedback action to assist the user. The controller can determine that the user requires assistance, e.g., based on determining that a placement of an object differs from an expected placement of the object, based on the user's hand moving slowly within the workspace, based on the user's hand being within the workspace for longer than an expected time duration, based on the user moving a toy piece repeatedly between locations of the plate surface, etc. To assist the user, the controller can perform a feedback action such as illuminating a warning light, generating an alert sound, outputting audible instructions, outputting visual instructions, energizing a laser pointer, etc.
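A minimal sketch of one assistance trigger mentioned above (the user's hand remaining in the workspace longer than an expected duration). The function name, parameters, and the 1.5x slack factor are assumptions, not values from the disclosure:

```python
def needs_assistance(hand_entry_time, current_time, expected_duration, slack=1.5):
    """Flag the user for help when the hand has been inside the workspace
    longer than `slack` times the expected duration for the current step.
    All times are in seconds; the slack factor is illustrative."""
    return (current_time - hand_entry_time) > slack * expected_duration

# Expected step takes 4 s; the hand has been inside for 10 s -> assist.
flag = needs_assistance(hand_entry_time=0.0, current_time=10.0, expected_duration=4.0)
```

The other triggers listed (slow hand motion, repeated back-and-forth moves) would be analogous threshold checks on speed and on move counts.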


In some examples, the controller can generate a visualization of object paths and placement within the workspace. The visualization can include, for example, a heat map showing the path of an object through the workspace and the placement location of the object on the plate surface during a user session. The heat map can be presented on a display in near-real time, can be stored for later viewing by a user, or both. In some examples, the controller can generate an aggregated visualization of object paths and placement within the workspace. The aggregated visualization can represent multiple user sessions by the same user, or multiple user sessions by multiple users.
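A heat map of the kind described can be built by accumulating detections into a grid. This is a sketch, not the disclosed method; the grid granularity is an assumption:

```python
def build_heat_map(samples, rows, cols):
    """Accumulate (row, col) detections from a session into a 2D grid.
    Cells where the object or hand spent more samples get higher counts,
    which a display layer would render as 'hotter' regions."""
    grid = [[0] * cols for _ in range(rows)]
    for r, c in samples:
        grid[r][c] += 1
    return grid

# Aggregating multiple sessions amounts to summing their grids, or simply
# feeding all sessions' samples into one call:
grid = build_heat_map([(0, 0), (0, 0), (1, 2)], rows=2, cols=3)
```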


Understanding how people interact with toys and other objects can provide information about the person's development, engagement with the toy, and ability to follow instructions. Sets of modular building blocks can be used to track these parameters. An interactive build plate can use break beam sensors, e.g., including LEDs and photodiodes, to detect user motion and time of interaction. The interactive build plate can record how many times, where, and for how long optical sensor sets have their optical paths broken. The interactive build plate can be configured such that, when in use, block sets or models are built on top of the build plate. Each block can pass over the build plate when being put into place on the plate.


Actions performed by a user can be compared with building plans. Based on the actions of the user compared with the building plans, the build plate system can perform actions. Actions can include providing feedback by activating guidance signals to guide the user. Actions can include generating a visualization of object movement. Actions can include comparing average user performance to expected performance specified by the building plans. Based on determining that average user performance does not satisfy performance criteria for a particular building plan, instructions for the building plan can be adjusted. For example, a lower-than-expected performance for a construction project by multiple users can indicate that the instructions for the construction project are not accurate and/or are not adequately specific.
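The aggregate check described above (average user performance against the plan's expectation) might look like the following sketch. The 25% tolerance and the use of completion time as the performance metric are assumptions:

```python
def plan_needs_revision(completion_times, expected_time, tolerance=0.25):
    """Return True when the average build time across user sessions
    exceeds the plan's expected time by more than `tolerance` (25% here),
    suggesting the instructions may be inaccurate or not specific enough."""
    average = sum(completion_times) / len(completion_times)
    return average > expected_time * (1.0 + tolerance)

# Sessions averaging 150 s against an expected 100 s would flag the plan.
flagged = plan_needs_revision([140, 150, 160], expected_time=100)
```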


In general, one innovative aspect of the subject matter described in this specification can be embodied in a system including a plate having a surface defining a floor of a three-dimensional workspace. The plate includes toy piece coupling elements and is configured to be used with releasably coupleable toy pieces. The system includes a plurality of plate sensors configured to detect objects resting on the surface of the plate; a plurality of break beam sensors configured to detect objects within the workspace; and a controller. The controller is configured to perform operations including: receiving break beam sensor data indicating object detection by at least one break beam sensor of the plurality of break beam sensors, receiving plate sensor data indicating object detection by at least one plate sensor of the plurality of plate sensors; and determining, based on the break beam sensor data and the plate sensor data, that a toy piece passed through the workspace to rest on the surface.


These and other embodiments may each optionally include one or more of the following features. In some implementations, the system includes a plurality of support structures, each support structure extending from an edge of the plate in a non-parallel direction to a plane of the surface. Each break beam sensor includes: an emitter configured to emit electromagnetic radiation; and a receiver configured to receive electromagnetic radiation emitted by the emitter. Emitters and receivers of the plurality of break beam sensors are supported by the plurality of support structures.


In some implementations, each break beam sensor of the plurality of break beam sensors includes: an emitter supported by a first support structure coupled to the plate; and a receiver supported by a second support structure coupled to the plate. Electromagnetic energy traveling from the emitter to the receiver passes through the workspace.


In some implementations, the plurality of break beam sensors are arranged in an array, the break beam sensor data including data indicating an array address of the at least one break beam sensor.


In some implementations, receiving break beam sensor data indicating object detection by at least one break beam sensor of the plurality of break beam sensors includes: receiving break beam sensor data indicating simultaneous object detection by two or more break beam sensors at a first time, the break beam sensor data including data indicating an array address of the two or more break beam sensors; and based on the break beam sensor data, determining a three-dimensional coordinate location of the toy piece within the three-dimensional workspace at the first time.
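Under the simplifying assumption of an orthogonal beam arrangement (one beam parallel to the x-axis at a known y and height, one parallel to the y-axis at a known x and height), two simultaneous breaks fix a 3D point. The geometry here is an illustration; the patent does not commit to this layout:

```python
def locate_object(x_beam, y_beam):
    """Estimate a 3D location from two simultaneously broken beams.

    `x_beam` is the (y, z) position of a beam running parallel to the
    x-axis; `y_beam` is the (x, z) position of a beam running parallel
    to the y-axis. Both must sit at the same height for the crossing
    point to be well defined.
    """
    y, z1 = x_beam
    x, z2 = y_beam
    if z1 != z2:
        raise ValueError("simultaneous breaks at different heights")
    return (x, y, z1)

# A beam at y=3, z=2 and a beam at x=7, z=2 break at the same instant,
# placing the object near (x=7, y=3, z=2).
coord = locate_object(x_beam=(3, 2), y_beam=(7, 2))
```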


In some implementations, the break beam sensor data includes data indicating a time of object detection.


In some implementations, the break beam sensor data includes data indicating a sequence of detections, the operations including determining, based on the break beam sensor data, a path traveled through the workspace by the toy piece.


In some implementations, each plate sensor of the plurality of plate sensors is configured to detect objects resting on the surface within a respective proper subset of an area of the surface, the operations including: determining, based on the plate sensor data, a location of the toy piece on the surface, the location including a particular proper subset of the area of the surface.


In some implementations, the operations include comparing a position of the toy piece on the surface to a target position of the toy piece on the surface; and in response to determining that the position of the toy piece does not satisfy similarity criteria for matching the target position, performing one or more actions.


In some implementations, the one or more actions include at least one of: activating a visual alarm; activating an audible alarm; outputting visual instructions; or outputting audible instructions.


In some implementations, the plurality of plate sensors are integrated with the plate and arranged in an array.


In some implementations, the operations include determining, using the break beam sensor data and the plate sensor data, at least one of a size of the toy piece or a shape of the toy piece.


In some implementations, the operations include: generating a visualization showing: a path of the toy piece through the workspace; and a placement of the toy piece on the surface; and providing the visualization for presentation on a display.


In some implementations, the plurality of break beam sensors includes a plurality of infrared break beam sensors.


In some implementations, the plurality of plate sensors includes a plurality of weight sensors, proximity sensors, or contact sensors.


In general, one innovative aspect of the subject matter described in this specification can be embodied in a method for tracking object movement within a three-dimensional workspace. The method includes receiving, by a controller, break beam sensor data indicating object detection by at least one break beam sensor of a plurality of break beam sensors. The plurality of break beam sensors is configured to detect objects within the workspace. The method includes receiving, by the controller, plate sensor data indicating object detection by at least one plate sensor of a plurality of plate sensors. The plurality of plate sensors is configured to detect objects resting on a surface of a plate that defines a floor of the workspace. The method includes determining, by the controller and based on the break beam sensor data and the plate sensor data, that an object passed through the workspace to rest at a position on the surface; comparing, by the controller, the position of the object on the surface to a target position of the object on the surface; and in response to determining that the position of the object does not satisfy similarity criteria for matching the target position, performing, by the controller, one or more actions.
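The comparison-and-respond flow of this method can be sketched as follows. The Euclidean-distance tolerance is one possible similarity criterion; the claim language leaves the actual criteria unspecified, and the action names are placeholders:

```python
import math

def position_matches_target(position, target, tolerance=0.5):
    """Assumed similarity criterion: the placed position matches the
    target when the Euclidean distance between the two points on the
    plate surface is within `tolerance` units."""
    dx = position[0] - target[0]
    dy = position[1] - target[1]
    return math.hypot(dx, dy) <= tolerance

def check_placement(position, target):
    """Mirror the method's flow: compare the position to the target,
    then perform one or more actions on a mismatch."""
    if not position_matches_target(position, target):
        return ["output_visual_instructions", "output_audible_instructions"]
    return []
```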


These and other embodiments may each optionally include one or more of the following features. In some implementations, the one or more actions include at least one of: activating a visual alarm; activating an audible alarm; outputting visual instructions; or outputting audible instructions.


In some implementations, the method includes obtaining data indicating a plan for a construction to be built on the plate; and determining the target position of the object on the surface using the obtained data.


In some implementations, the plate includes toy piece coupling elements and is configured to be used with releasably coupleable toy pieces.


Other embodiments of these aspects include corresponding computer systems, apparatus, computer program products, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods. A system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions. One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions.


The details of one or more implementations of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the subject matter will become apparent from the description, the drawings, and the claims.





BRIEF DESCRIPTION OF THE DRAWINGS


FIGS. 1A and 1B illustrate an example interactive build plate system.



FIG. 2 is a block diagram of the example interactive build plate system.



FIGS. 3A and 3B illustrate tracking of an object by the example interactive build plate system.



FIGS. 4A and 4B illustrate tracking of multiple objects by the example interactive build plate system.



FIG. 5 is a flow diagram of an example process for tracking object movement within a three-dimensional workspace.



FIG. 6 shows an example of a computing device and a mobile computing device that can be used to implement the techniques described herein.





Like reference numbers and designations in the various drawings indicate like elements.


DETAILED DESCRIPTION


FIGS. 1A and 1B illustrate an example interactive build plate system 100 for tracking movement of objects within a three-dimensional workspace 108. The workspace 108 has a floor defined by a surface 112 of a plate 104. The surface 112 extends in a horizontal plane, e.g., in the x-y plane. Sides of the workspace 108 can be defined by side edges 124a, 124b (“edges 124”) of the plate 104. For example, a plane extending vertically in the z-direction from the edges 124 of the plate 104 can define the sides of the workspace 108.


The plate 104 includes plate sensors 106 for detecting objects resting on the plate. In some examples, the surface 112 is configured to be used with releasably coupleable toy pieces, such as block 110. In some examples, the block 110 is a toy building block. The surface 112 can include toy piece coupling elements 114. The coupling elements 114 can provide stability or proper positioning when the block 110 is placed on the surface 112. A user may reach into the workspace in order to place the block 110 on the surface 112 and couple the block 110 to the coupling elements 114.


Break beam sensors are arranged to detect objects passing through the workspace 108. A break beam sensor includes an electromagnetic emitter 120 and an electromagnetic receiver 122. The plate sensors 106 and break beam sensors can output sensor data to a controller. The plate sensors 106 can be integrated with the plate 104, positioned below the plate 104, or positioned on the surface 112.


The build plate system 100 can have any appropriate size. In some examples, the plate 104 can have an area of thirty square centimeters or greater (e.g., forty square centimeters or greater, fifty square centimeters or greater, sixty square centimeters or greater). In some examples, the plate 104 can have an area of five hundred square centimeters or less (e.g., four hundred square centimeters or less, three hundred square centimeters or less, two hundred square centimeters or less).


The plate 104 can have any appropriate shape. In some examples, the plate 104 has a polygonal shape in the x-y plane. For example, the plate 104 can have a triangular, rectangular, square, pentagonal, or hexagonal shape in the x-y plane. In some examples, the plate 104 has a non-polygonal shape in the x-y plane. For example, the plate 104 can have an elliptical, oval, or circular shape in the x-y plane.


The build plate system 100 includes structures 102a, 102b (“structures 102”). In some examples, emitters, receivers, or both are supported by the structures 102. In the example of FIG. 1A, the emitter 120 is supported by the structure 102a and the receiver 122 is supported by the structure 102b. In some examples, electromagnetic energy travels from the emitter 120 to the receiver 122 in a diagonal direction, e.g., a direction non-parallel to the x-y plane of the surface 112. In some examples, electromagnetic energy travels from an emitter to a receiver in a direction parallel to the x-y plane of the surface 112.


The structures 102 are coupled to the plate 104. The structures 102 extend non-parallel to the surface 112. For example, an inner surface 105a, 105b of each structure 102 can be perpendicular to the surface 112. In some examples, the structures 102 are attached to the edges 124 of the plate 104. In some examples, each structure 102 is attached to a respective edge 124 of the plate 104. In some examples, the structures 102 are attached to opposing edges of the plate 104, e.g., such that the inner surface 105a faces the inner surface 105b across the plate 104.


Although shown as having two structures, the build plate system 100 can include any number of structures. For example, the build plate system 100 can include three structures, four structures, or five or more structures. The structures 102 can be positioned in an arrangement that does not impede access to the workspace 108. For example, the number, size, shape, and positioning of the structures 102 can be such that a user can reach into the workspace 108 between the structures 102. A ceiling of the workspace 108 can be defined by top edges 103a, 103b of the structures 102a, 102b. For example, a plane extending from the top edge 103a of the structure 102a to the top edge 103b of the structure 102b can define the ceiling of the workspace 108.


In some examples, the structures 102 can be coupled to the plate 104 at corners of the plate. For example, the plate 104 can have a rectangular shape in the x-y plane. A structure 102 can be coupled to the plate 104 at each of the four corners of the plate 104.



FIG. 1B illustrates a perspective view of the example build plate system 100. The build plate system 100 can track the block 110 as a user's hand 140 enters the workspace 108 while holding the block. The build plate system 100 can track the block 110 as the user's hand 140 places the block 110 on the surface 112. The build plate system 100 can determine a location and positioning of the block 110 on the surface 112.


The break beam sensors of the build plate system 100 can be arranged in an array and supported by the structures 102a, 102b. Each break beam sensor includes an emitter, e.g., emitter 130, and a receiver, e.g., receiver 132. The break beam sensors can detect the presence and movement of objects within the workspace 108, e.g., block 110 being carried by a user's hand 140. The break beam sensors can detect and track movement of the user's hand 140 passing through the workspace 108.


In the example of FIG. 1B, emitters are represented by white circles, and receivers are represented by black circles. In some examples, an emitter and corresponding receiver of a break beam sensor can be positioned across from each other and supported by opposing structures 102. In some examples, the direction of travel between an emitter and a receiver is parallel to the x-y plane and parallel to the x-z plane, e.g., path 152. In some examples, the direction of travel between an emitter and a receiver is parallel to the x-y plane and non-parallel to the x-z plane, e.g., path 154. In some examples, the direction of travel between an emitter and a receiver is non-parallel to the x-y plane and parallel to the x-z plane, e.g., path 156. In some examples, the direction of travel between an emitter and a receiver is non-parallel to the x-y plane and non-parallel to the x-z plane, e.g., path 158.


Each structure 102 supports an array of emitters, receivers, or both. In the example of FIG. 1B, the structure 102a supports an array 160a including both emitters and receivers. The structure 102b also supports an array 160b including both emitters and receivers. In some examples, each structure 102 supports an array of only emitters or only receivers.



FIG. 2 shows a block diagram of the example build plate system 100. The build plate system 100 includes break beam sensors 204 and plate sensors 106. The build plate system 100 optionally includes a camera 206. The build plate system 100 includes a controller 210. The controller 210 includes a movement tracker 212 and a placement tracker 214. The build plate system 100 includes a memory 220. The build plate system 100 optionally includes a visualization generator 222 and a signal device 224.


The break beam sensors 204 output break beam sensor data 234 to the controller 210. The break beam sensor data 234 can include data indicating, for a particular break beam sensor that detects the block 110, an array address of the break beam sensor, a start time of object detection by the break beam sensor, a duration of detection by the particular break beam sensor, or any combination of these. The array address of a break beam sensor can be, for example, a coordinate position of the emitter in the respective array, a coordinate position of the receiver in the respective array, or both.


In some examples, the break beam sensors 204 include infrared emitters and receivers. An example infrared emitter includes an infrared LED. The infrared LED can have a diameter of approximately 3.0 millimeters (mm) (e.g., 2.0 mm or greater, 2.5 mm or greater, 3.5 mm or greater). Infrared break beam sensors can be used to detect object presence and object motion. An infrared emitter sends out a beam of infrared light, which is invisible to humans. A receiver, such as a photodiode, that is sensitive to the infrared light is positioned across the workspace from the emitter. When the block 110 passes between the emitter and the receiver and is not transparent to infrared, the beam is broken and the receiver detects the interruption. An array of break beam sensors can be used to detect and localize objects. The array of break beam sensors can be used to detect motion of objects, to determine speed of object motion, and to determine two-dimensional and three-dimensional direction of object motion.


The build plate system 100 includes one or more plate sensors 106. A plate sensor 106 is a sensor that detects the presence of the block 110 on the plate 104. The plate sensor 106 can output data indicating a size, shape, location, weight, or any combination of these to the controller 210. The plate sensors 106 can include, for example, pressure sensors, weight sensors, proximity sensors, load cells, contact sensors, capacitive sensors, or any combination of these.


Each plate sensor can be configured to detect objects resting on the plate surface within a proper subset of the area of the plate surface. When a toy piece, e.g., block 110, is placed on the surface 112, the plate sensors 106 can detect the presence of the toy piece on the surface 112. Plate sensor data 236 can be output to the controller 210. Plate sensor data 236 output by a particular plate sensor can include data indicating a location of the plate sensor that detected the object, a size of the object, a shape of the object, an orientation of the object, a start time of object detection by the plate sensor, a duration of object detection by the plate sensor, a weight of the object, or any combination of these.


The plate sensors 106 can be arranged in an array. The build plate system 100 can include any appropriate number of plate sensors 106. The resolution of the build plate system 100 can depend on the number of plate sensors. For example, a greater number of plate sensors 106 results in a higher resolution, while a lesser number of plate sensors 106 results in a lower resolution. A build plate system having a higher resolution can determine more precise locations and sizes of objects resting on the plate 104, compared to a build plate system having a lower resolution.
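The resolution relationship described above reduces to simple arithmetic under the assumption that each plate sensor covers one cell of a uniform grid (the numbers below are illustrative, not from the disclosure):

```python
def cell_size_cm(plate_width_cm, plate_depth_cm, rows, cols):
    """With plate sensors in a rows x cols array, each sensor covers one
    cell of the surface. More sensors mean smaller cells, and therefore
    finer localization of objects resting on the plate."""
    return (plate_width_cm / cols, plate_depth_cm / rows)

# A 20 cm x 20 cm plate with an 8x8 sensor array gives 2.5 cm cells;
# doubling the array to 16x16 halves the cell size to 1.25 cm.
size = cell_size_cm(20, 20, rows=8, cols=8)
```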


The controller 210 includes a movement tracker 212 and a placement tracker 214. Using the break beam sensor data 234 from the break beam sensors 204, the movement tracker 212 can track object paths within the workspace 108. For break beam sensor data 234 generated at a particular time, the movement tracker 212 can determine a coordinate location of the block 110 within the workspace 108 at the particular time. The coordinate location can be a three-dimensional coordinate location within the workspace 108 where the block 110 is detected. In some examples, the coordinate location can be a location of an estimated center of the block 110 at the particular time.


In some examples, the movement tracker 212 can determine a size of the block 110 based on a number of break beam sensors 204 that are simultaneously interrupted at a given time. In some examples, the movement tracker 212 can determine a shape of the block 110, a location of an estimated center of the block 110, or both, using the break beam sensor data 234. The movement tracker 212 can determine the shape of the block 110 and/or the location of the center of the block 110 based on the number of break beam sensors 204 that are simultaneously interrupted at a given time, the array addresses of the break beam sensors 204 that are simultaneously interrupted at the given time, or both.


The movement tracker 212 can determine a path of the block 110 moving through the workspace 108. The path can include a trajectory of the block 110, a speed of the block 110, a time of travel of the block 110, or any combination of these. In some examples, the trajectory of the block 110 includes a series of coordinate locations of the block 110 in the workspace 108, with each coordinate location being associated with a time of detection.
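A trajectory stored as the timestamped coordinate series just described supports straightforward distance, duration, and speed estimates. This is a sketch assuming (t, x, y, z) samples in seconds and centimeters, not the tracker's actual implementation:

```python
import math

def path_stats(trajectory):
    """Compute total distance, elapsed time, and average speed from a
    trajectory given as a list of (t, x, y, z) samples ordered by time."""
    distance = 0.0
    for (t0, *p0), (t1, *p1) in zip(trajectory, trajectory[1:]):
        distance += math.dist(p0, p1)
    elapsed = trajectory[-1][0] - trajectory[0][0]
    speed = distance / elapsed if elapsed else 0.0
    return distance, elapsed, speed

# A hand lowering a block: across, then straight down to the plate.
stats = path_stats([(0.0, 0, 0, 10), (1.0, 0, 3, 6), (2.0, 0, 3, 0)])
```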


Using the plate sensor data 236 from the plate sensors 106, the placement tracker 214 can track placement of objects on the surface 112. For plate sensor data 236 generated at a particular time, the placement tracker 214 can determine a coordinate location of the block on the plate 104. The coordinate location can be a two-dimensional coordinate location on the plate 104 where the block 110 is detected. In some examples, the coordinate location can be a location of an estimated center of the block 110 at the particular time.


In some examples, the placement tracker 214 can determine a size of the block 110 based on a number of plate sensors 106 that detect the block 110. In some examples, the placement tracker 214 can determine a shape of the block 110, a location of an estimated center of the block 110, an orientation of the block 110, or any combination of these, using the plate sensor data 236. The placement tracker 214 can determine the shape, location, and orientation of the block 110 based on the number of plate sensors 106 that detect the block, the array addresses of the plate sensors 106 that detect the block, or both.


In an example scenario, using the break beam sensor data 234 and the plate sensor data 236, the controller 210 can track motion of a user's hand 140 moving through the workspace 108 to place the block 110 at a particular location on the plate 104. The controller 210 can determine a time duration of movement of the hand 140 within the workspace 108, a speed of the hand 140 moving through the workspace 108, a path that the hand 140 travels through the workspace 108, and a coordinate address of the particular location on the plate 104.


In some examples, the build plate system 100 includes a camera 206. The camera 206 can capture images of the workspace 108. The camera 206 can be activated when the user (or user's parents/guardians) consent to having the camera capture user interaction with the build plate system 100 and toy pieces.


The controller 210 can use camera image data 238 captured by the camera 206 to track movement and placement of objects within the workspace 108. In some examples, the controller 210 can overlay camera image data 238 captured by the camera 206 with break beam sensor data 234 from the break beam sensors 204, with plate sensor data 236 from the plate sensors 106, or both. Camera image data 238 captured by the camera 206 can be used to verify and/or validate trajectories and placement of objects determined by the controller 210.


The build plate system 100 includes a memory 220. In some examples, the memory 220 can store calibration data. Calibration data can include data associating break beam sensor data with location and movement patterns of objects within the workspace 108. Calibration data can include data associating plate sensor data 236 with locations, shapes, and sizes of objects placed on the plate 104.


In some examples, the memory 220 can store a building plan 202. In some examples, the building plan 202 can be loaded into the memory 220. The building plan 202 can include a plan for a construction to be built on the plate 104. The building plan 202 can include an arrangement of blocks. The arrangement can include a number of blocks to be placed on the plate 104, a location for each block on the plate 104, a type of block to be placed at each location, an orientation for each block on the plate 104, a number of blocks to be placed at each location of the plate 104, or any combination of these.
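One possible in-memory shape for such a building plan is sketched below; the application does not specify a data format, so the class and field names are hypothetical.

```python
# Hypothetical representation of a building plan 202: each entry gives a
# block type, a target plate location, and an orientation in degrees.
from dataclasses import dataclass, field

@dataclass
class PlannedBlock:
    block_type: str
    location: tuple            # (row, col) plate address
    orientation_deg: int = 0

@dataclass
class BuildingPlan:
    blocks: list = field(default_factory=list)

    @property
    def block_count(self):
        return len(self.blocks)

plan = BuildingPlan(blocks=[
    PlannedBlock("2x4", (3, 4)),
    PlannedBlock("2x2", (3, 8), orientation_deg=90),
])
```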


In some examples, the building plan 202 includes a sequence of actions to be performed by the user within the workspace 108. In some examples, the building plan 202 includes a sequence of block movement within the workspace 108. In some examples, the building plan 202 includes a sequence of block placement on the plate 104. The building plan 202 can include an expected time duration for building the construction, an expected time duration for placing each block on the plate 104, an expected speed of the user's hand 140 through the workspace 108, or any of these.


The controller 210 can access the building plan 202 and can compare object movement within the workspace 108, and object placement on the plate 104, to the building plan 202. For example, the controller 210 can compare a sequence of object movement within the workspace 108 to a sequence of object movement specified by the building plan 202. The controller 210 can compare a placement of a block on the plate 104 to a placement specified by the building plan 202. The controller 210 can determine whether detected movement and placement of objects satisfies similarity criteria for matching the movement and placement of objects specified by the building plan 202.


The controller 210 can determine target motion patterns of the block 110 using the building plan 202. The controller 210 can then determine whether motion of an object, e.g., a trajectory of the block 110, satisfies similarity criteria for matching the target motion of the block 110 specified by the building plan 202. In response to determining that the motion of the block 110 satisfies the similarity criteria, the controller 210 can determine that motion of the block 110 matches the building plan 202. In response to determining that the motion of the block 110 does not satisfy the similarity criteria, the controller 210 can determine that the motion of the block 110 does not match the building plan 202.
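The motion similarity check can be sketched, for example, as a mean point-to-point distance test between the tracked trajectory and the target motion; the equal-length resampling assumption and the tolerance value are illustrative, not taken from the disclosure.

```python
# Sketch of one possible trajectory similarity criterion: the tracked
# path matches the target motion when the mean point-to-point distance
# stays under a tolerance. Tolerance value is an assumption.
import math

def trajectory_matches(path, target, tolerance_mm=15.0):
    """path, target: equal-length lists of (x, y, z) points."""
    mean_err = sum(math.dist(p, t) for p, t in zip(path, target)) / len(path)
    return mean_err <= tolerance_mm
```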


The controller 210 can determine a target placement of the block 110 using the building plan 202. The controller 210 can determine whether placement of the block 110 on the plate 104 satisfies similarity criteria for matching the target placement of the block 110 specified by the building plan 202. Similarity criteria can include, for example, a threshold distance between the placement of the block 110 on the plate 104 and the target placement specified by the building plan 202. In response to determining that the placement of the block 110 satisfies the similarity criteria, the controller 210 can determine that the block 110 placement matches the building plan 202. In response to determining that the placement of the block 110 does not satisfy the similarity criteria, the controller 210 can determine that the placement of the block 110 does not match the building plan 202.
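The threshold-distance similarity check described above can be sketched as follows; the 10 mm threshold is an assumed value.

```python
# Sketch of the threshold-distance placement criterion: a placement
# matches the building plan if it lies within a threshold distance of
# the target placement.
import math

def placement_matches(placed, target, threshold_mm=10.0):
    """Return True if the placed (x, y) position is within threshold_mm
    of the target (x, y) position specified by the building plan."""
    return math.dist(placed, target) <= threshold_mm

# 6 mm off target: within the 10 mm threshold, so the placement matches.
near = placement_matches((100.0, 46.0), (100.0, 40.0))
# 25 mm off target: does not satisfy the similarity criteria.
far = placement_matches((100.0, 65.0), (100.0, 40.0))
```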


In some examples, the controller 210 can perform actions based on the tracked object motion. For example, the controller 210 may determine that the motion and/or placement of the block 110 is inaccurate based on determining that the motion and/or placement does not satisfy criteria for matching the building plan 202. In response to determining that the motion and/or placement of the block 110 does not match the building plan 202, the controller 210 can determine that the user requires assistance and can determine to perform an action to provide feedback and/or assistance to the user. In some examples, the controller 210 can determine that the user requires assistance based on determining that a placement of the block 110 differs from a target placement of the block 110, based on the user's hand 140 moving at a speed that is slower than a target speed within the workspace, based on the user's hand being within the workspace for longer than an expected time duration, based on the user moving a block repeatedly between multiple locations of the surface 112, and/or based on other detected object or user movements.


To guide and assist the user, the controller 210 can perform one or more actions. An example action includes providing feedback to the user by activating a signal device 224. The signal device 224 can include, for example, a visual alarm, a light, an audible alarm, a speaker, a laser pointer, or any combination of these. In some examples, the controller 210 can activate the signal device 224 by activating a visible signal such as a light or laser pointer that illuminates a location of the plate 104 where the block 110 should be placed, e.g., in accordance with the building plan 202. In some examples, the controller 210 can activate the signal device 224 by illuminating a light of a particular color. For example, a red light can indicate that the block 110 has been placed incorrectly or is on the wrong path, and a green light can indicate that the block 110 has been placed correctly or is on the correct path.


In some examples, the controller 210 can activate the signal device 224 by broadcasting audible sound through a speaker. The sound can include, for example, an alert sound indicating that the block 110 has been placed incorrectly. In some examples, the sound can include verbal instructions. In some examples, the controller 210 can activate the signal device 224 by displaying visual instructions on a display coupled to the build plate system 100. The visual instructions can include, for example, textual or graphical instructions. The instructions can specify one or more actions to be performed by the user in order to place the block 110 correctly per the building plan 202.


The controller 210 can operate the build plate system 100 in different operating modes, e.g., an easy mode, a medium mode, and a hard mode. The controller 210 can be configured to provide different levels of feedback and/or assistance in the different operating modes. For example, in the easy mode, the controller 210 can provide more assistance to the user than in the medium and hard modes. In the medium mode, the controller 210 can provide more assistance to the user than in the hard mode, but less assistance than in the easy mode. For example, in the easy mode, the controller 210 can provide more assistance by providing assistance more quickly after determining that the user needs assistance, by providing more specific assistance, or both.
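The mode-dependent assistance levels can be sketched as a simple lookup; the delay and specificity values are assumptions that mirror the one-, three-, and ten-second delays used in the example scenario below.

```python
# Illustrative mapping (assumed values) of operating mode to feedback
# behavior: easier modes respond sooner and reveal more.
FEEDBACK = {
    "easy":   {"delay_s": 1,  "reveal_correct_location": True},
    "medium": {"delay_s": 3,  "reveal_correct_location": False},
    "hard":   {"delay_s": 10, "reveal_correct_location": False},
}

def feedback_for(mode):
    """Return the feedback settings for the given operating mode."""
    return FEEDBACK[mode]
```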


In an example scenario, the controller 210 may determine that the user placed the block 110 at an incorrect or inaccurate location of the surface 112. In the easy mode, the controller 210 can perform an action to assist the user, e.g., by illuminating a light under the correct location of the surface 112 after a one second delay. In the medium mode, the controller 210 can perform an action by illuminating a light supported by one of the structures 102 after a three second delay. The light can indicate the incorrect location of the block 110 without revealing the correct location. In the hard mode, the controller 210 can perform an action by illuminating the light after a ten second delay.


The build plate system 100 can operate in the different operating modes, e.g., based on the building plan 202, based on user input, or both. For example, the build plate system 100 can provide a user interface for receiving user input indicating the operating mode. In some examples, the build plate system 100 can receive user input, through a user interface, specifying various settings of operation. For example, the build plate system can receive user input specifying a setting for a preferred type of signal device 224 to be used for providing user feedback and assistance. The setting for the preferred type of signal device 224 can indicate, for example, a user preference for visual guidance over audible guidance.


The build plate system 100 can include a visualization generator 222. In some examples, the visualization generator 222 can generate a visualization of object paths and placement within the workspace. The visualization can include, for example, a heat map showing the path of the block 110 through the workspace 108 and the placement location of the block 110 on the surface 112 during a user session. The visualization can be presented on a display device. In some examples, the visualization can be presented in near-real time, can be stored for later viewing by a user, or both. In some examples, the visualization generator 222 can generate an aggregated visualization of object paths and placement within the workspace. The aggregated visualization can represent multiple user sessions by a same user, or multiple user sessions by multiple users.
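A placement heat map, as one example visualization, can be sketched as a simple count grid over plate addresses; the grid dimensions and function name are hypothetical.

```python
# Hypothetical sketch: accumulating a placement heat map over plate
# (row, col) addresses, aggregated across one or more user sessions.

def heat_map(placements, rows, cols):
    """Count how often each plate address received a placement."""
    grid = [[0] * cols for _ in range(rows)]
    for r, c in placements:
        grid[r][c] += 1
    return grid

session = [(0, 1), (0, 1), (2, 2)]
grid = heat_map(session, rows=3, cols=3)
```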


The memory 220 can store data generated from user sessions. For example, the memory 220 can store, for a user session, break beam sensor data 234, plate sensor data 236, camera image data 238, or any of these. In some examples, the memory 220 can store data indicating object movement paths determined by the movement tracker 212, and object placement determined by the placement tracker 214.



FIGS. 3A and 3B illustrate tracking of an object by the example interactive build plate system 100. Referring to FIG. 3A, the user's hand 140 holds the block 110 and moves the block 110 through the workspace 108. The build plate system 100 can track movement of the hand 140 and the block 110 using the break beam sensors. For example, the build plate system 100 can determine a time of interruption of the beam 302 between the emitter 120 and the receiver 122. In some examples, the build plate system 100 can determine a three-dimensional coordinate location of the block 110 within the workspace 108 at a particular time based on the array addresses of break beam sensors that detected the block at the particular time. In some examples, the build plate system 100 can determine a size of the block 110, an orientation of the block 110, a shape of the block 110, a speed of movement of the block 110, a direction of movement of the block 110, or any combination of these based on break beam sensor data generated by the break beam sensors while the block 110 is in the workspace 108.
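One way the three-dimensional coordinate determination could work is sketched below, assuming two orthogonal sets of beams (x-aligned and y-aligned) stacked in layers at a known spacing; the geometry and the 10 mm spacing are assumptions, not details of the disclosure.

```python
# Hypothetical sketch: recovering a 3-D coordinate from the array
# addresses of interrupted break beams. An x-aligned beam at index i,
# layer l constrains y and z; a y-aligned beam constrains x and z.

def beam_location(x_beams, y_beams, spacing_mm=10.0):
    """x_beams / y_beams: sets of (index, layer) addresses of beams
    interrupted at one instant. Returns an estimated (x, y, z) in mm."""
    x = spacing_mm * sum(i for i, _ in y_beams) / len(y_beams)
    y = spacing_mm * sum(i for i, _ in x_beams) / len(x_beams)
    # z: average layer height across all interrupted beams.
    z = spacing_mm * (
        sum(l for _, l in x_beams) + sum(l for _, l in y_beams)
    ) / (len(x_beams) + len(y_beams))
    return x, y, z
```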


Referring to FIG. 3B, the block 110 is placed on the surface 112 of the plate 104. The build plate system 100 can determine the location and orientation of the block 110 based on plate sensor data 236 generated by the plate sensors 106. In some examples, the build plate system 100 can determine a size, shape, and weight of the block 110 based on the plate sensor data 236.



FIGS. 4A and 4B illustrate tracking of multiple objects by the example interactive build plate system 100. Referring to FIG. 4A, blocks 110, 410 are placed side-by-side on the plate 104. Plate sensor data 236 generated by the plate sensors 106 can indicate array addresses of the plate sensors 106 that detect the presence of the blocks 110, 410.


The build plate system 100 can determine, based on the plate sensor data 236, a number of objects resting on the plate 104, a size of each object, a weight of each object, an orientation of each object, a shape of each object, or any combination of these. The build plate system 100 can determine, based on the plate sensor data 236, that two blocks are resting on the plate 104.


The build plate system 100 can determine an accuracy of block placement by comparing the placement of the blocks 110, 410 to target placement of the blocks 110, 410, e.g., as specified by the building plan 202. Based on determining that the placement of the blocks 110, 410, does not satisfy similarity criteria for matching the target placement of the blocks 110, 410, the build plate system 100 can perform an action to provide feedback and assist the user in correct placement of the blocks 110, 410.


The build plate system 100 can track individual objects moving between the workspace 108 and the surface 112. For example, a user may place the block 110 in a first location on the surface 112, place the block 410 in a second location on the surface 112, remove the block 110 from the first location, and place the block 110 in a third location. When a particular block transitions from the workspace 108 to the surface 112, the build plate system 100 can compare the break beam sensor data 234 with the plate sensor data 236 to determine which block, e.g., block 110 or 410, is being moved.


In some examples, the build plate system 100 can generate an identifier for each object that enters the workspace 108. For example, when each block 110, 410 enters the workspace 108, the build plate system 100 can generate an identifier for each block. The identifier can be stored in the memory 220 with data indicating the location of the respective block and associated characteristics of the respective block. The associated characteristics can be determined using the break beam sensor data 234, the plate sensor data 236, the camera image data 238, or any of these. Characteristics can include, for example, a size, shape, weight, and/or color of the block. When the build plate system 100 detects movement of one of the blocks, the build plate system 100 can determine the identifier of the block that is moving. The build plate system 100 can determine the identifier of the block that is moving based on characteristics of the moving block, based on the starting location of the moving block, or both. The build plate system 100 can then track movement of the identified block from its starting location to an ending location. The build plate system 100 can thus track location and movement of individual objects by storing identifiers, locations, and/or characteristics of each object within the memory 220.
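The identifier bookkeeping described above can be sketched as follows; the class, its method names, and the re-identification by starting location are assumptions.

```python
# Hypothetical sketch: per-object identifier bookkeeping. Each object
# entering the workspace gets an id stored with its location and
# characteristics; a moving object is re-identified by its start point.
import itertools

class ObjectRegistry:
    def __init__(self):
        self._ids = itertools.count(1)
        self.objects = {}  # id -> {"location": ..., "traits": ...}

    def register(self, location, traits):
        """Assign an identifier to an object entering the workspace."""
        oid = next(self._ids)
        self.objects[oid] = {"location": location, "traits": traits}
        return oid

    def identify_moving(self, from_location):
        """Return the id of the object that started at from_location."""
        for oid, rec in self.objects.items():
            if rec["location"] == from_location:
                return oid
        return None

    def move(self, oid, to_location):
        """Record the identified object's new resting location."""
        self.objects[oid]["location"] = to_location

reg = ObjectRegistry()
block_110 = reg.register((3, 4), {"color": "red", "size": "2x4"})
block_410 = reg.register((3, 8), {"color": "blue", "size": "2x2"})
```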


Referring to FIG. 4B, block 420 is stacked on top of block 110 on the plate 104. The build plate system 100 can determine, based on the plate sensor data 236, a number of objects placed in a same location, e.g., due to being stacked. For example, the plate sensor data 236 can indicate a first increase in weight detected by a first plate sensor at a first time. The plate sensor data 236 can indicate a second increase in weight detected by the first plate sensor at a second time after the first time. Based on the first increase in weight, followed by the second increase in weight, the build plate system 100 can determine that two objects are stacked at a location of the plate 104 corresponding to the first plate sensor.
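The stacked-object count from successive weight increases can be sketched as follows; the reading format and the minimum step threshold are assumptions.

```python
# Sketch of stack counting at one plate sensor: each sustained weight
# increase is treated as one added object. Readings are hypothetical
# (time_s, weight_grams) samples from a single sensor.

def count_stacked(readings, min_step_g=1.0):
    """Count objects stacked at a sensor from its weight history."""
    count = 0
    prev = 0.0
    for _, weight in readings:
        if weight - prev >= min_step_g:
            count += 1
        prev = weight
    return count

# One block placed at t=1.0, a second stacked on it at t=4.0:
events = [(0.0, 0.0), (1.0, 12.0), (2.0, 12.0), (4.0, 24.0)]
```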


In some examples, objects can be detected by break beam sensors while resting on the plate 104. For example, the block 420, stacked on top of the block 110, breaks the beam 302. The block 420 is therefore detected by both the break beam sensors 204 and the plate sensors 106 while stacked on the block 110. The build plate system 100 can overlay the break beam sensor data 234 and the plate sensor data 236 to determine a precise location and other attributes of the block 420. For example, using the break beam sensor data 234 and the plate sensor data 236, the build plate system 100 can determine a three-dimensional coordinate location of the block 420, a height of the block 420 when stacked on the block 110, a shape of the block 420, a size of the block 420, a weight of the block 420, a three-dimensional orientation of the block 420, or any of these. Information about the block 420 determined from both the break beam sensor data 234 and the plate sensor data 236 can be more precise and accurate than information about the block 420 determined from only the break beam sensor data 234 or only the plate sensor data 236.



FIG. 5 is a flow diagram of an example process 500 for tracking object movement within a three-dimensional workspace.


The process 500 includes receiving, by a controller, break beam sensor data indicating object detection by at least one break beam sensor of a plurality of break beam sensors (502). The plurality of break beam sensors is configured to detect objects within the workspace. For example, the array of break beam sensors 205 can detect and track objects moving within the workspace 108.


The process 500 includes receiving, by the controller, plate sensor data indicating object detection by at least one plate sensor of a plurality of plate sensors (504). The plurality of plate sensors is configured to detect objects resting on a surface that defines a floor of the workspace. For example, the plate sensors 106 can detect the block 110 resting on the surface 112 that defines the floor of the workspace 108.


The process 500 includes determining, based on the break beam sensor data and the plate sensor data, that an object passed through the workspace to rest on the surface (506). For example, the controller 210 can determine, based on the break beam sensor data 234 and the plate sensor data 236, that the block 110 passed through the workspace 108 to rest on the surface 112. The controller 210 can determine a trajectory of the block 110 through the workspace 108, and a location of placement of the block 110 on the surface 112.
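A minimal control-flow sketch of process 500, with hypothetical event formats, might look like this; the numbered comments map to steps (502), (504), and (506).

```python
# Hypothetical sketch of process 500: receive break beam data, receive
# plate data, and decide whether an object passed through the workspace
# to rest on the surface. Each event is a (time_s, sensor_address) pair.

def process_500(break_beam_events, plate_events):
    """Return the resting address if an object crossed the workspace and
    then appeared on the plate, else None."""
    if not break_beam_events or not plate_events:
        return None
    last_beam_t = max(t for t, _ in break_beam_events)   # (502)
    first_plate = min(plate_events)                      # (504)
    # (506): a plate detection after the last beam interruption implies
    # the object passed through the workspace to rest on the surface.
    if first_plate[0] >= last_beam_t:
        return first_plate[1]
    return None
```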



FIG. 6 shows an example of a computing device 600 and a mobile computing device 650 that can be used to implement the techniques described here. The computing device 600 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The mobile computing device 650 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smart-phones, and other similar computing devices. The components shown here, their connections and relationships, and their functions, are meant to be examples only, and are not meant to be limiting.


The computing device 600 includes a processor 602, a memory 604, a storage device 606, a high-speed interface 608 connecting to the memory 604 and multiple high-speed expansion ports 610, and a low-speed interface 612 connecting to a low-speed expansion port 614 and the storage device 606. Each of the processor 602, the memory 604, the storage device 606, the high-speed interface 608, the high-speed expansion ports 610, and the low-speed interface 612, are interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate. The processor 602 can process instructions for execution within the computing device 600, including instructions stored in the memory 604 or on the storage device 606 to display graphical information for a GUI on an external input/output device, such as a display 616 coupled to the high-speed interface 608. In other implementations, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. Also, multiple computing devices may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).


The memory 604 stores information within the computing device 600. In some implementations, the memory 604 is a volatile memory unit or units. In some implementations, the memory 604 is a non-volatile memory unit or units. The memory 604 may also be another form of computer-readable medium, such as a magnetic or optical disk.


The storage device 606 is capable of providing mass storage for the computing device 600. In some implementations, the storage device 606 may be or contain a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations. Instructions can be stored in an information carrier. The instructions, when executed by one or more processing devices (for example, processor 602), perform one or more methods, such as those described above. The instructions can also be stored by one or more storage devices such as computer- or machine-readable mediums (for example, the memory 604, the storage device 606, or memory on the processor 602).


The high-speed interface 608 manages bandwidth-intensive operations for the computing device 600, while the low-speed interface 612 manages lower bandwidth-intensive operations. Such allocation of functions is an example only. In some implementations, the high-speed interface 608 is coupled to the memory 604, the display 616 (e.g., through a graphics processor or accelerator), and to the high-speed expansion ports 610, which may accept various expansion cards (not shown). In the implementation, the low-speed interface 612 is coupled to the storage device 606 and the low-speed expansion port 614. The low-speed expansion port 614, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet) may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.


The computing device 600 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 620, or multiple times in a group of such servers. In addition, it may be implemented in a personal computer such as a laptop computer 622. It may also be implemented as part of a rack server system 624. Alternatively, components from the computing device 600 may be combined with other components in a mobile device (not shown), such as a mobile computing device 650. Each of such devices may contain one or more of the computing device 600 and the mobile computing device 650, and an entire system may be made up of multiple computing devices communicating with each other.


The mobile computing device 650 includes a processor 652, a memory 664, an input/output device such as a display 654, a communication interface 666, and a transceiver 668, among other components. The mobile computing device 650 may also be provided with a storage device, such as a micro-drive or other device, to provide additional storage. Each of the processor 652, the memory 664, the display 654, the communication interface 666, and the transceiver 668, are interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate.


The processor 652 can execute instructions within the mobile computing device 650, including instructions stored in the memory 664. The processor 652 may be implemented as a chipset of chips that include separate and multiple analog and digital processors. The processor 652 may provide, for example, for coordination of the other components of the mobile computing device 650, such as control of user interfaces, applications run by the mobile computing device 650, and wireless communication by the mobile computing device 650.


The processor 652 may communicate with a user through a control interface 658 and a display interface 656 coupled to the display 654. The display 654 may be, for example, a TFT (Thin-Film-Transistor Liquid Crystal Display) display or an OLED (Organic Light Emitting Diode) display, or other appropriate display technology. The display interface 656 may comprise appropriate circuitry for driving the display 654 to present graphical and other information to a user. The control interface 658 may receive commands from a user and convert them for submission to the processor 652. In addition, an external interface 662 may provide communication with the processor 652, so as to enable near area communication of the mobile computing device 650 with other devices. The external interface 662 may provide, for example, for wired communication in some implementations, or for wireless communication in other implementations, and multiple interfaces may also be used.


The memory 664 stores information within the mobile computing device 650. The memory 664 can be implemented as one or more of a computer-readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units. An expansion memory 674 may also be provided and connected to the mobile computing device 650 through an expansion interface 672, which may include, for example, a SIMM (Single In Line Memory Module) card interface. The expansion memory 674 may provide extra storage space for the mobile computing device 650, or may also store applications or other information for the mobile computing device 650. Specifically, the expansion memory 674 may include instructions to carry out or supplement the processes described above, and may include secure information also. Thus, for example, the expansion memory 674 may be provided as a security module for the mobile computing device 650, and may be programmed with instructions that permit secure use of the mobile computing device 650. In addition, secure applications may be provided via the SIMM cards, along with additional information, such as placing identifying information on the SIMM card in a non-hackable manner.


The memory may include, for example, flash memory and/or NVRAM memory (non-volatile random access memory), as discussed below. In some implementations, instructions are stored in an information carrier. The instructions, when executed by one or more processing devices (for example, processor 652), perform one or more methods, such as those described above. The instructions can also be stored by one or more storage devices, such as one or more computer- or machine-readable mediums (for example, the memory 664, the expansion memory 674, or memory on the processor 652). In some implementations, the instructions can be received in a propagated signal, for example, over the transceiver 668 or the external interface 662.


The mobile computing device 650 may communicate wirelessly through the communication interface 666, which may include digital signal processing circuitry where necessary. The communication interface 666 may provide for communications under various modes or protocols, such as GSM voice calls (Global System for Mobile communications), SMS (Short Message Service), EMS (Enhanced Messaging Service), or MMS messaging (Multimedia Messaging Service), CDMA (code division multiple access), TDMA (time division multiple access), PDC (Personal Digital Cellular), WCDMA (Wideband Code Division Multiple Access), CDMA2000, or GPRS (General Packet Radio Service), among others. Such communication may occur, for example, through the transceiver 668 using a radio-frequency. In addition, short-range communication may occur, such as using a Bluetooth, WiFi, or other such transceiver (not shown). In addition, a GPS (Global Positioning System) receiver module 670 may provide additional navigation- and location-related wireless data to the mobile computing device 650, which may be used as appropriate by applications running on the mobile computing device 650.


The mobile computing device 650 may also communicate audibly using an audio codec 660, which may receive spoken information from a user and convert it to usable digital information. The audio codec 660 may likewise generate audible sound for a user, such as through a speaker, e.g., in a handset of the mobile computing device 650. Such sound may include sound from voice telephone calls, may include recorded sound (e.g., voice messages, music files, etc.) and may also include sound generated by applications operating on the mobile computing device 650.


The mobile computing device 650 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a cellular telephone 680. It may also be implemented as part of a smart-phone 682, personal digital assistant, or other similar mobile device.


Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.


These computer programs (also known as programs, software, software applications or code) include machine instructions for a programmable processor, and can be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms machine-readable medium and computer-readable medium refer to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term machine-readable signal refers to any signal used to provide machine instructions and/or data to a programmable processor.


To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.


The systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (LAN), a wide area network (WAN), and the Internet.


The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.


Although a few implementations have been described in detail above, other modifications are possible. For example, while a client application is described as accessing the delegate(s), in other implementations the delegate(s) may be employed by other applications implemented by one or more processors, such as an application executing on one or more servers. In addition, the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. In addition, other actions may be provided, or actions may be eliminated, from the described flows, and other components may be added to, or removed from, the described systems. Accordingly, other implementations are within the scope of the following claims.

What is claimed is:

Claims
  • 1. A system comprising: a plate having a surface defining a floor of a three-dimensional workspace, wherein the plate includes toy piece coupling elements and is configured to be used with releasably coupleable toy pieces; a plurality of plate sensors configured to detect objects resting on the surface of the plate; a plurality of break beam sensors configured to detect objects within the workspace; and a controller configured to perform operations comprising: receiving break beam sensor data indicating object detection by at least one break beam sensor of the plurality of break beam sensors; receiving plate sensor data indicating object detection by at least one plate sensor of the plurality of plate sensors; and determining, based on the break beam sensor data and the plate sensor data, that a toy piece passed through the workspace to rest on the surface.
  • 2. The system of claim 1, comprising a plurality of support structures, each support structure extending from an edge of the plate in a direction non-parallel to a plane of the surface, wherein: each break beam sensor comprises: an emitter configured to emit electromagnetic radiation; and a receiver configured to receive electromagnetic radiation emitted by the emitter, and emitters and receivers of the plurality of break beam sensors are supported by the plurality of support structures.
  • 3. The system of claim 1, wherein each break beam sensor of the plurality of break beam sensors comprises: an emitter supported by a first support structure coupled to the plate; and a receiver supported by a second support structure coupled to the plate, wherein electromagnetic energy traveling from the emitter to the receiver passes through the workspace.
  • 4. The system of claim 1, wherein the plurality of break beam sensors are arranged in an array, the break beam sensor data comprising data indicating an array address of the at least one break beam sensor.
  • 5. The system of claim 4, wherein receiving break beam sensor data indicating object detection by at least one break beam sensor of the plurality of break beam sensors comprises: receiving break beam sensor data indicating simultaneous object detection by two or more break beam sensors at a first time, the break beam sensor data comprising data indicating an array address of the two or more break beam sensors; and based on the break beam sensor data, determining a three-dimensional coordinate location of the toy piece within the three-dimensional workspace at the first time.
  • 6. The system of claim 1, wherein the break beam sensor data comprises data indicating a time of object detection.
  • 7. The system of claim 1, wherein the break beam sensor data includes data indicating a sequence of detections, the operations comprising determining, based on the break beam sensor data, a path traveled through the workspace by the toy piece.
  • 8. The system of claim 1, wherein each plate sensor of the plurality of plate sensors is configured to detect objects resting on the surface within a respective proper subset of an area of the surface, the operations comprising: determining, based on the plate sensor data, a location of the toy piece on the surface, the location comprising a particular proper subset of the area of the surface.
  • 9. The system of claim 1, the operations comprising: comparing a position of the toy piece on the surface to a target position of the toy piece on the surface; and in response to determining that the position of the toy piece does not satisfy similarity criteria for matching the target position, performing one or more actions.
  • 10. The system of claim 9, wherein the one or more actions comprise at least one of: activating a visual alarm; activating an audible alarm; outputting visual instructions; or outputting audible instructions.
  • 11. The system of claim 1, wherein the plurality of plate sensors are integrated with the plate and arranged in an array.
  • 12. The system of claim 1, the operations comprising determining, using the break beam sensor data and the plate sensor data, at least one of a size of the toy piece or a shape of the toy piece.
  • 13. The system of claim 1, the operations comprising: generating a visualization showing: a path of the toy piece through the workspace; and a placement of the toy piece on the surface; and providing the visualization for presentation on a display.
  • 14. The system of claim 1, wherein the plurality of break beam sensors comprise a plurality of infrared break beam sensors.
  • 15. The system of claim 1, wherein the plurality of plate sensors comprises a plurality of weight sensors, proximity sensors, or contact sensors.
  • 16. A computer-implemented method for tracking object movement within a three-dimensional workspace, the method comprising: receiving, by a controller, break beam sensor data indicating object detection by at least one break beam sensor of a plurality of break beam sensors, wherein the plurality of break beam sensors is configured to detect objects within the workspace; receiving, by the controller, plate sensor data indicating object detection by at least one plate sensor of a plurality of plate sensors, wherein the plurality of plate sensors is configured to detect objects resting on a surface of a plate that defines a floor of the workspace; determining, by the controller and based on the break beam sensor data and the plate sensor data, that an object passed through the workspace to rest at a position on the surface; comparing, by the controller, the position of the object on the surface to a target position of the object on the surface; and in response to determining that the position of the object does not satisfy similarity criteria for matching the target position, performing, by the controller, one or more actions.
  • 17. The method of claim 16, wherein the one or more actions comprise at least one of: activating a visual alarm; activating an audible alarm; outputting visual instructions; or outputting audible instructions.
  • 18. The method of claim 16, comprising: obtaining data indicating a plan for a construction to be built on the plate; and determining the target position of the object on the surface using the obtained data.
  • 19. The method of claim 16, wherein the plate includes toy piece coupling elements and is configured to be used with releasably coupleable toy pieces.
  • 20. A non-transitory computer storage medium storing instructions that, when executed by one or more processors, cause the one or more processors to perform operations for tracking object movement within a three-dimensional workspace, the operations comprising: receiving, by a controller, break beam sensor data indicating object detection by at least one break beam sensor of a plurality of break beam sensors, wherein the plurality of break beam sensors is configured to detect objects within the workspace; receiving, by the controller, plate sensor data indicating object detection by at least one plate sensor of a plurality of plate sensors, wherein the plurality of plate sensors is configured to detect objects resting on a surface of a plate that defines a floor of the workspace; determining, by the controller and based on the break beam sensor data and the plate sensor data, that an object passed through the workspace to rest at a position on the surface; comparing, by the controller, the position of the object on the surface to a target position of the object on the surface; and in response to determining that the position of the object does not satisfy similarity criteria for matching the target position, performing, by the controller, one or more actions.
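As a further illustrative sketch, not part of the claims, determining a three-dimensional coordinate location from simultaneous break beam detections at known array addresses can be shown under an assumed (hypothetical) geometry: one bank of beams runs parallel to the x-axis and is addressed by (y, z), and a second bank runs parallel to the y-axis and is addressed by (x, z). Simultaneous trips of one beam from each bank at the same height locate the piece at their crossing point:

```python
def locate_3d(detections):
    """Resolve a 3-D coordinate from simultaneous beam trips.

    detections: list of (bank, a, b) tuples, where bank "X" denotes a
    beam parallel to the x-axis addressed by (y, z) = (a, b), and bank
    "Y" denotes a beam parallel to the y-axis addressed by
    (x, z) = (a, b). Returns (x, y, z) when one detection from each
    bank shares a height (the beams cross), otherwise None.
    """
    x_bank = [(a, b) for bank, a, b in detections if bank == "X"]
    y_bank = [(a, b) for bank, a, b in detections if bank == "Y"]
    for y, z1 in x_bank:
        for x, z2 in y_bank:
            if z1 == z2:  # same beam layer, so the beams intersect
                return (x, y, z1)
    return None
```

Under this assumed geometry, a piece tripping the x-parallel beam at (y, z) = (2, 5) and the y-parallel beam at (x, z) = (7, 5) at the same time would be located at (7, 2, 5); detections at different heights yield no crossing and hence no coordinate.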