The present disclosure relates to systems and methods for collecting trash outdoors, such as at beaches and parks.
Collecting trash outdoors, such as at beaches and parks, can be challenging. Often, trash is collected manually, but the trash typically includes smaller pieces that can be difficult for the personnel assigned to such tasks to spot and/or collect. In some cases, larger machinery may be pulled behind a vehicle (e.g., a tractor) or pushed manually to collect trash, but such machinery still requires personnel to operate it continuously. Such machinery also tends to be large and can require significant energy to operate. Accordingly, there is a need in the art for trash collection systems and methods that address one or more of these challenges.
A robotic system is provided for collecting trash. The robotic system comprises a vehicle, a camera, a trash collection unit, an actuator, and a controller. The vehicle includes a frame and a drive. The camera is coupled to the vehicle to move with the vehicle to capture images of the trash while the vehicle moves along ground. The trash collection unit is carried by the vehicle to collect the trash. The trash collection unit includes a vacuum pump and a collection conduit. The actuator is operatively coupled to the collection conduit to adjust a position of the collection conduit relative to the vehicle to position the collection conduit adjacent to the trash to collect the trash. The controller is coupled to the drive, the camera, the trash collection unit, and the actuator to coordinate movement of the vehicle, positioning of the collection conduit, and operation of the vacuum pump to collect the trash.
A method is provided for robotically collecting trash with a robotic system including a vehicle, a camera, a trash collection unit, an actuator, and a controller. The method comprises: moving the vehicle along ground; capturing images of the trash while the vehicle moves along the ground; adjusting, with the actuator, a position of a collection conduit of the trash collection unit relative to the vehicle to position the collection conduit adjacent to the trash to collect the trash; and coordinating, with the controller, movement of the vehicle, positioning of the collection conduit, and operation of a vacuum pump of the trash collection unit to collect the trash.
Advantages of the present disclosure will be readily appreciated as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings wherein:
Referring to
Referring to
Reflective material RM may be attached to the vehicle 12 to assist with spotting the vehicle 12 during low light conditions. The reflective material RM may include one or more reflective elements, such as reflective tapes, reflective coatings, or reflective paint.
As shown in
Axles 22a, 22b are rotatably supported by the supports 20 via bearings, bushings, combinations thereof, and the like. In some versions, pillow blocks B having internal bearings may be mounted to one or more of the supports 20 to rotatably support the axles 22a, 22b so that the axles 22a, 22b are able to rotate about a plurality of drive axes R1, R2, R3 relative to the frame 14. As shown, there is a first set of axles 22a to which the wheels 18a on the first side of the vehicle 12 are mounted and a second set of axles 22b to which the wheels 18b on the second side of the vehicle 12 are mounted. The wheels 18a on the first side are independently drivable from the wheels 18b on the second side as described further below. In some versions, axles connect the wheels on both sides of the vehicle.
The drive 16 includes a first drive motor 24a operatively coupled to the first plurality of wheels 18a to rotate the first plurality of wheels 18a about the respective axes R1, R2, R3 via the first set of axles 22a. The drive 16 also includes a second drive motor 24b operatively coupled to the second plurality of wheels 18b to rotate the second plurality of wheels 18b about the respective axes R1, R2, R3 via the second set of axles 22b. The drive motors 24a, 24b are independently operable so that the first plurality of wheels 18a can be rotated independently of the second plurality of wheels 18b. This can be useful for turning maneuvers in which the first drive motor 24a drives the first plurality of wheels 18a in a forward direction, while the second drive motor 24b drives the second plurality of wheels 18b in a reverse direction, causing tight rotation of the vehicle 12 (e.g., a zero turn).
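The independently operable drive motors described above form a differential (skid-steer) drive. As a minimal sketch in Python, assuming a simple unicycle model with illustrative names (nothing below is taken from the disclosure), the left- and right-side wheel speeds can be mixed from a commanded forward speed and turning rate; with zero forward speed the two sides spin in opposite directions, producing the zero-turn behavior described:

```python
# Hypothetical differential-drive mixing under a unicycle model.
# track_width is the lateral distance between the wheel sets (assumed).
def wheel_speeds(v_linear, v_angular, track_width):
    """Return (left, right) wheel speeds (m/s) for a commanded linear
    velocity (m/s) and angular velocity (rad/s) about the vehicle center."""
    left = v_linear - v_angular * track_width / 2.0
    right = v_linear + v_angular * track_width / 2.0
    return left, right
```

For example, a pure rotation command (v_linear = 0) yields equal and opposite wheel speeds, which is the tight in-place rotation described above.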
The drive 16 further includes a first transmission 26a operatively interconnecting the first drive motor 24a to the first set of axles 22a and a second transmission 26b operatively interconnecting the second drive motor 24b to the second set of axles 22b. The transmissions 26a, 26b may include any suitable gear train to convert rotations of the drive motors 24a, 24b into rotations of the drive axles 22a, 22b. In the version shown, the first drive motor 24a drives a single one of the axles 22a through transmission 26a, and the remaining drive axles 22a of the first set of drive axles 22a are driven through drive chains 28a connecting to the remaining drive axles 22a via sprockets 30a fixed to the drive axles 22a. Likewise, the second drive motor 24b drives a single one of the axles 22b through transmission 26b, and the remaining drive axles 22b of the second set of drive axles 22b are driven through drive chains 28b connecting to the remaining drive axles 22b via sprockets 30b fixed to the drive axles 22b. Ultimately, the drive 16 is operable to maneuver the vehicle 12 along the ground G in response to command signals so that the robotic system 10 moves into a position to collect the trash T found along the ground G.
Referring back to
Referring to
As shown in
The trash canister 46 is shown mounted to a bracket of the frame 14 to be thereby fixed relative to the frame 14. The trash canister 46 may be mounted via fasteners, welding, clamps, straps, or the like. In some versions the trash canister 46 is disposed on an upper surface of the frame 14, such as on a bed of the frame 14 near the battery BATT or located more forward on the frame 14, e.g., near a front of the vehicle 12. The trash canister 46 may be positioned at any suitable location to collect the trash T. In some versions, the trash canister 46 may be attached to a separate cart pulled by the vehicle 12.
Periodically, or once a certain volume or weight of such particles is collected in the trash canister 46 on the movable bottom 54, the movable bottom 54 is moved (e.g., pivoted) and a bottom of the trash canister 46 is opened to allow the collected particles (e.g., sand and dirt) to be deposited back on the ground G (see the pile P illustrated in
Referring to
A bracket, strap, clamp, fastener, or other suitable mounting device 68 mounts the vacuum head 48 of the collection conduit 44 to the carriage 60 to move in the lateral direction with the carriage 60 upon actuation of the drive screw 62. The mounting device 68 effectively secures the vacuum head 48 to the carriage 60 so that the actuator 58 is capable of accurately adjusting a position of the vacuum head 48 to ultimately position the vacuum head 48 adjacent each piece of trash T identified in the images of the camera 38 so that the trash T can thereafter be collected by the trash collection unit 40 and captured in the trash canister 46 via activation of the vacuum pump 42.
Operation of the actuator 58 and operation of the drive 16 are coordinated to cause suitable movement of the vacuum head 48 in longitudinal and lateral directions so that the vacuum head 48 is positioned adjacent to each piece of trash T. In the version shown, the drive 16 is largely responsible for driving the vehicle 12 in a longitudinal direction perpendicular to the lateral direction (e.g., perpendicular to axis A2) to move the vacuum head 48 in the longitudinal direction and the actuator 58 is responsible for moving the vacuum head 48 in the lateral direction (e.g., parallel to axis A2) so that the vacuum head 48 is moved in two degrees of freedom (x, y) to reach the trash T.
The vacuum head 48 may be mounted to the carriage 60 so that the vacuum head 48 is spaced a suitable fixed distance from the ground G to be able to collect the trash T. In some cases, the vacuum head 48 may be spaced 1, 2, or 3 inches from the ground G. Other spacings for the vacuum head 48 are also contemplated. In some versions, the vacuum head 48 may also be adjustable in its spacing from the ground G, i.e., in a third degree of freedom (z) via a separate motor (not shown). In some versions, the vacuum head 48 may be adjustable in more than three degrees of freedom.
Referring to
In some versions, the controller 72 is mounted to the frame 14 of the vehicle 12 but can be mounted at any suitable location of the robotic system 10. Memory 74 may be any memory suitable for storage of data and computer-readable instructions, such as the instructions provided by programs used to carry out the algorithms/functions described herein. For example, the memory 74 may be a local memory, an external memory, or a cloud-based memory embodied as random-access memory (RAM), non-volatile RAM (NVRAM), flash memory, or any other suitable form of memory. Power to the various components of the robotic system 10 may be provided by a battery power supply and/or an external power source, such as by one or more batteries BATT located on the vehicle 12, solar power, wind power, and the like. Additionally, or alternatively, a gas/diesel-powered generator may be used to generate power for the various components.
In some versions, the controller 72 includes an internal clock to keep track of time. In some versions, the internal clock is a microcontroller clock. The microcontroller clock may include a crystal resonator, a ceramic resonator, a resistor-capacitor (RC) oscillator, or a silicon oscillator. Internal clocks other than those disclosed herein are fully contemplated. The internal clock may be implemented in hardware, software, or both. In some embodiments, the memory 74, microprocessors, and microcontroller clock cooperate to send signals to and operate the various components shown in
A user interface UI with display is coupled to the controller 72. The user interface UI has one or more user input devices 76 (also referred to as controls), which transmit corresponding input signals to the controller 72, and the controller 72 may control certain functions of the robotic system 10 based on the input signals. The user input devices 76 may include any device capable of being actuated by the user and may be provided on a control panel, touchscreen, or the like. The user input devices 76 may be configured to be actuated in a variety of different ways, including but not limited to, mechanical actuation (hand, foot, finger, etc.), hands-free actuation (voice, foot, etc.), and the like. The user input devices 76 may include buttons, a gesture sensing device for monitoring motion of hands, feet, or other body parts of the user (such as through the camera 38), a microphone for receiving voice activation commands, a foot pedal, and sensors (e.g., infrared sensor such as a light bar or light beam to sense a user's body part, ultrasonic sensors, capacitive sensors, etc.). Additionally, the buttons/pedals can be physical buttons/pedals, such as pushbuttons, or virtually implemented buttons/pedals such as through optical projection or on a touchscreen. The buttons/pedals may also be mechanically connected or drive-by-wire type buttons/pedals where a user applied force actuates a sensor, such as a switch or potentiometer. It should be appreciated that any combination of user input devices may also be utilized. The user interface UI can also be implemented on a remote-control pendant with associated controls to control operation of the robotic system 10, and may be implemented, for example, on a portable electronic device, such as an iPhone®, iPad®, or the like.
The controller 72 is also coupled to one or more sensors S associated with the components shown in
In some versions, the drive motors 24a, 24b include position sensors S in the form of hall-effect sensors or encoders for determining a rotational position and angular velocity of their associated drive shafts so that the controller 72 is able to determine position, velocity, and acceleration of the vehicle 12 by correlating the drive shaft positions, velocities, and/or accelerations to vehicle position, velocity, and/or acceleration in one or more degrees of freedom. The hall-effect sensors/encoders may be integrated into the drive motors 24a, 24b, or may be separate. Additionally, or alternatively, position sensors may be responsive to rotations of the axles 22a, 22b. The dump motor 56 may have a position sensor to determine whether the movable bottom 54 is in the open or closed state. In some cases, the position sensor for the dump motor 56 is in the form of one or more limit switches attached to the trash canister 46 or elsewhere that are associated with the open and/or closed states and that change signal states when the movable bottom 54 moves to/from the open/closed states. The vacuum pump/motor 42 may also include a pressure sensor that indicates whether a vacuum of suitable pressure is being generated in the trash canister 46 to draw in the trash T. The rail motor 66 may also have a position sensor to determine a position of the carriage 60 as described further below and/or may use one or more limit switches LS to calibrate the position of the carriage 60.
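As a rough illustration of how the controller might correlate drive-shaft rotation to vehicle speed, the following Python sketch converts an encoder count delta over a sampling interval into ground speed. The encoder resolution, gear ratio, and wheel radius are assumed values for illustration only, not taken from the disclosure:

```python
import math

# Hypothetical constants (assumed, not from the disclosure).
COUNTS_PER_REV = 2048   # encoder counts per motor shaft revolution
GEAR_RATIO = 10.0       # motor revolutions per wheel revolution
WHEEL_RADIUS_M = 0.15   # wheel radius in meters

def vehicle_speed(delta_counts, delta_t):
    """Estimate ground speed (m/s) from encoder counts over delta_t seconds."""
    wheel_revs = delta_counts / (COUNTS_PER_REV * GEAR_RATIO)
    return wheel_revs * 2.0 * math.pi * WHEEL_RADIUS_M / delta_t
```

Differencing the left- and right-side estimates in the same way would give an estimate of the turning rate.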
The control system 70 processes instructions from one or more computer programs, such as programs created in one or more programming languages (e.g., Python, C++, etc.) with third-party libraries including, for example, opencv, scikit-learn, scikit-image, and numpy. Such programs, or subroutines of such programs, may form part of one or more program modules executed by the controller 72. Such modules may include, for example, an identification module M1 to identify trash T in the images captured by the camera 38, a behavior module M2 to determine how the vacuum head 48 needs to move to reach the trash T, and a control module M3 to execute necessary movements of the vacuum head 48 via the drive motors 24a, 24b and via the rail motor 66 and to execute operation of the vacuum pump/motor 42 to collect the trash T. The behavior module M2 receives, as input, data from the identification module M1 regarding locations of trash T. Based on this input data, the behavior module M2 determines desired movement of the vacuum head 48 and timing for activating suction, and outputs data regarding desired movements and timing to the control module M3. The control module M3 can then command appropriate movements from the drive motors 24a, 24b and rail motor 66 and command appropriate operation of the vacuum pump/motor 42.
In the example illustrated, the control system 70 initially operates the drive 16 so that the vehicle 12 moves at a constant velocity (vrobot) in a straight direction with respect to a travel path PATH. The control system 70 may utilize the one or more position sensors previously described as feedback so that the controller 72 is able to use closed-loop speed control to maintain the constant velocity of the vehicle 12 to adjust for varying terrain encountered by the vehicle 12. For example, a velocity control loop may be implemented by the control system 70 with a preset velocity and the error between the preset velocity and the measured velocity being used to adjust power output to the drive motors 24a, 24b, e.g., using a PID control loop. Separate control loops may be used for each drive motor 24a, 24b and associated sets of wheels 18a, 18b. In some versions, the feedback may be provided by sensors S that measure rotation of the wheels 18a, 18b so that the rotations of the wheels 18a, 18b are controlled to maintain a travel direction/speed of the vehicle 12 along the travel path PATH. In some versions, a global positioning system (GPS) could be used to determine a current location, velocity, etc. of the vehicle 12 for purposes of controlling the vehicle 12 to keep along the travel path PATH at the constant velocity (vrobot).
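The closed-loop speed control described above could be sketched as a PID law per drive motor. The gains, the output clamp, and all names below are illustrative assumptions, not the actual implementation:

```python
# Minimal PID speed-control sketch: the error between a preset velocity and
# the measured velocity adjusts the power output to a drive motor.
class SpeedPID:
    def __init__(self, kp, ki, kd, out_min=-1.0, out_max=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        """Return a motor power command clamped to [out_min, out_max]."""
        error = setpoint - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(self.out_min, min(self.out_max, out))
```

One instance per drive motor, as described above, lets each side of the vehicle hold its own wheel speed over varying terrain.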
Still referring to
The controller 72 locates the trash T within a camera coordinate system CCS of the camera 38, which may be a two-dimensional or three-dimensional coordinate system, by assigning the trash T coordinates in the camera coordinate system CCS. In the version shown in
Any suitable form of image processing algorithm may be used to identify the trash T in the images. In some versions, the images captured by the camera 38 are processed using dynamic background subtraction along with basic color detection to locate the trash T within the images. Techniques such as R-CNN, Fast R-CNN, Faster R-CNN, Scale-Invariant Feature Transform (SIFT), Speeded Up Robust Features (SURF), You Only Look Once (YOLO), and/or other techniques may be employed for object localization and/or recognition. The identification module M1 first receives the images from the camera 38 as input. The identification module M1 then processes the images using one or more of the techniques referenced above, and outputs bounding boxes defined by a point (coordinates), width, and/or height around the trash T in the camera coordinate system CCS. The identification module M1 may additionally classify the trash T found in the images (e.g., with one or more integers that are mapped to class labels).
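As a toy illustration of the identification module's input/output contract, the following pure-Python stand-in differences a frame against a background image and reports a bounding box, in image pixels, around the changed region. A real implementation would use OpenCV (e.g., cv2.absdiff and cv2.findContours) or one of the detectors named above; this sketch only shows the shape of the result:

```python
# Toy background subtraction: images are 2D lists of grayscale values.
def detect_trash(background, frame, threshold=30):
    """Return (x, y, w, h) around pixels differing from background, or None."""
    xs, ys = [], []
    for y, (brow, frow) in enumerate(zip(background, frame)):
        for x, (b, f) in enumerate(zip(brow, frow)):
            if abs(f - b) > threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # no trash found in this frame
    return (min(xs), min(ys), max(xs) - min(xs) + 1, max(ys) - min(ys) + 1)
```

The returned box corresponds to the point/width/height output described above; a classifier stage could then attach an integer class label to each box.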
Once the trash T has been identified and located via the pair of coordinates (ximage, yimage), the controller 72 is configured to coordinate operation of the drive 16, the trash collection unit 40, and the actuator 58 based on the location of the trash T to move the vehicle 12 along the ground G and adjust the position of the vacuum head 48 so that the vacuum head 48 is positioned adjacent to the trash T to collect the trash when the vacuum pump/motor 42 is activated. More specifically, the behavior module M2 translates the coordinates (ximage, yimage) of the trash T into a series of commands for the drive motors 24a, 24b, vacuum pump/motor 42, and rail motor 66 to move the vacuum head 48 over the trash T and collect it. This may include the controller 72 computing, based on the location of the trash T, values of one or more operational parameters for the drive motors 24a, 24b, vacuum pump/motor 42, and/or rail motor 66. Such operational parameters may include one or more of position, velocity, acceleration, current, voltage, and torque.
Still referring to
Incremental sensing methods can be combined with a calibration procedure to provide absolute coordinates of the vacuum head 48 in the camera coordinate system CCS. The calibration procedure may include the carriage 60 and vacuum head 48 being adjusted to extreme ends of the drive screw 62 until the limit switches LS, which are placed at the ends on the supports 34 (see
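One way the limit-switch calibration might be realized is as a linear map from raw motor step counts to camera-frame coordinates, anchored by the step counts recorded when the carriage trips each limit switch. All names and values below are illustrative assumptions:

```python
# Hedged sketch: pair the step counts at the two extremes with the known
# camera-frame x coordinates of those extremes to get an absolute mapping.
def make_step_to_x(steps_at_left, steps_at_right, x_left, x_right):
    """Return a function converting a raw step count to a camera x coordinate."""
    scale = (x_right - x_left) / (steps_at_right - steps_at_left)
    return lambda steps: x_left + (steps - steps_at_left) * scale
```

After calibration, incremental step counts from the rail motor's position sensor yield absolute coordinates of the vacuum head 48 in the camera coordinate system CCS.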
Once the current location (x1, y1) of the vacuum head 48 is determined, then target coordinates (xtarget, ytarget) for the vacuum head 48 can be determined in the camera coordinate system CCS. The target coordinates (xtarget, ytarget) are based on the relationships: (i) xtarget=ximage; and (ii) ytarget=y1. The head-to-trash relationships (Δx, Δy) between the current location of the vacuum head 48 and the trash T can then be determined by the calculations Δx=x1−xtarget and Δy=ytarget−yimage. Once these relationships are established, then the time to activate the vacuum pump/motor 42 (tsuction) and a velocity (vhead) of the vacuum head 48 needed to reach the target coordinates in time (tsuction) to collect the trash T are calculated by the calculations: (i) tsuction=(Δy/vrobot)+t1; and (ii) vhead=Δx/(tsuction−t1). These relationships and calculations are also shown in
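The relationships and calculations above can be written out directly in Python; the symbols follow the text, with (x1, y1) the head's current camera coordinates at time t1, (ximage, yimage) the trash location, and vrobot the vehicle's forward speed:

```python
def plan_pickup(x1, y1, t1, ximage, yimage, vrobot):
    """Return (t_suction, v_head) per the relationships in the text."""
    x_target, y_target = ximage, y1          # xtarget = ximage; ytarget = y1
    dx = x1 - x_target                       # Δx = x1 − xtarget
    dy = y_target - yimage                   # Δy = ytarget − yimage
    t_suction = dy / vrobot + t1             # tsuction = (Δy / vrobot) + t1
    v_head = dx / (t_suction - t1)           # vhead = Δx / (tsuction − t1)
    return t_suction, v_head
```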
Once the velocity (vhead) and time (tsuction) are calculated, then these values can be input into the control module M3 and the controller 72 can then command the rail motor 66 to operate at the commanded velocity until the time (tsuction) is reached and the vacuum pump/motor 42 can also be commanded to operate at that time (tsuction). In some versions, the velocity (vhead) required for moving the vacuum head 48 may simply be a check to confirm that the vacuum head 48 can be moved quickly enough to align the vacuum head 48 with the trash T in order to collect the trash T, but the rail motor 66 may actually be operated at a faster speed such that the position of the vacuum head 48 is changed until it reaches the target position (xtarget, ytarget), but before the time (tsuction). In some cases, if the calculated velocity (vhead) to move the vacuum head 48 to reach the trash T exceeds a maximum velocity of the vacuum head 48 (vmax), the controller 72 can reduce the velocity of the vehicle 12 (vrobot) as needed. For example, when two pieces of trash T are at/near the same y coordinate, but substantially separated from one another in the x-direction, the rail motor 66 may not be able to move quickly enough to align the vacuum head 48 with both pieces of trash T at the current velocity of the vehicle 12. In this case, the controller 72 may calculate tsuction first using vmax, compute vrobot, and then control the drive motors 24a, 24b accordingly. In some versions, movement of the vacuum head 48 can be constant and the velocity of the vehicle 12 (vrobot) varied. The vehicle 12 may also be stopped by the controller 72 as needed to provide enough time for the vacuum head 48 to reach the trash T. In some versions, when multiple pieces of trash T are identified in the images, the controller 72 selects the trash T that has the y-coordinate with the largest value (i.e., closest to the axis A2) to process first, and then proceeds in succession with the next closest, etc. 
In other words, the controller 72 processes the trash T in an order to collect all trash that is identified in the images while still moving the vehicle 12, if possible.
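The fallback described above, in which the vehicle is slowed when the required head velocity would exceed vmax, could be sketched as follows. With the head held at vmax, the time the head needs to traverse Δx is fixed, and the vehicle speed is recomputed so the trash arrives in line with the head in that same time. This is a sketch of the described logic, not the actual implementation:

```python
def adjust_for_vmax(dx, dy, t1, vrobot, vmax):
    """Return (t_suction, v_head, v_robot), slowing the vehicle if needed."""
    t_suction = dy / vrobot + t1
    v_head = dx / (t_suction - t1)
    if abs(v_head) <= vmax:
        return t_suction, v_head, vrobot     # head is fast enough as-is
    travel_time = abs(dx) / vmax             # time the head needs at vmax
    t_suction = t1 + travel_time             # tsuction recomputed from vmax
    new_vrobot = dy / travel_time            # vehicle speed matching that time
    return t_suction, vmax if dx > 0 else -vmax, new_vrobot
```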
Referring to
The controller 72 is configured to operate the drive 16 to move the vehicle 12 autonomously along the ground G while collecting the trash T and to cover the predefined area 78 for trash T removal. A user may initiate and/or cease such autonomous movement via the user interface UI, or it may be initiated/stopped at preset times each day, or according to a predefined schedule. In some versions, instead of being a fully autonomous vehicle 12, the user interface UI may include remote control operation of the vehicle 12, including remote control of the speed and steering of the vehicle 12.
Referring to
In step 102, the vehicle 12 is moved along the ground G, e.g., autonomously, via remote control, etc., and may be moved initially at a constant speed. In step 104, one or more images of the trash T are captured while the vehicle 12 moves along the ground G. The trash T is then identified and the coordinates (ximage, yimage) of the trash T in the camera coordinate system are determined in steps 106 and 108 using one or more of the object localization and recognition algorithms previously mentioned. The current coordinates (x1, y1) of the vacuum head 48 are retrieved from memory in step 110. The controller 72 computes required travel (Δx, Δy) of the vacuum head 48 in step 112 to reach the target position (xtarget, ytarget). The time (tsuction) to activate the vacuum pump/motor 42 is calculated in step 114 and the velocity (vhead) of the vacuum head 48 needed to reach the target coordinates in time (tsuction) to collect the trash T is calculated in step 116.
In step 118, the controller 72 determines if the calculated velocity (vhead) exceeds a maximum velocity of the vacuum head 48 (vmax). If not, then the controller 72 commands the rail motor 66 to operate at the target velocity (vhead) until the time (tsuction) is reached. This results in the rail motor 66 moving the vacuum head 48 to the target position (xtarget, ytarget) at the target velocity (vhead) in step 120. The vacuum pump/motor 42 is commanded in step 122 to activate at the time (tsuction) when the target position (xtarget, ytarget) is reached. If the controller 72 determines that the calculated velocity (vhead) does exceed the maximum velocity (vmax) of the vacuum head 48, the controller 72 may move the vacuum head 48 at the maximum velocity (vmax) in step 124 and then, in step 126, recompute the time (tsuction) using vmax. In step 128, the controller 72 can then compute a new target robot velocity (vrobot) and control the drive motors 24a, 24b accordingly to move the vehicle 12 at the new target velocity (vrobot) in step 130. The method then continues back to step 122 to activate the vacuum pump/motor 42. In some versions, the vacuum pump/motor 42 can be activated/operational a predetermined amount of time (e.g., 1, 2, 3 seconds or more) before and/or after the time (tsuction) is reached to ensure trash collection. When not active/operational, the vacuum pump/motor 42 may be turned off to avoid additional sand, dirt, or other material being collected from the ground G.
In some versions, during each frame of operation of the identification module M1 and the behavior module M2, a single piece of trash T is identified and computations made to determine the required travel of the vacuum head 48 and/or vehicle 12 to reach the target position (xtarget, ytarget) and the time (tsuction) to activate the vacuum pump/motor 42 before processing a subsequent piece of trash T present in the same image. In some versions, in one frame of operation of the identification module M1, target positions are determined for all of the pieces of trash T captured in the one or more images taken at the first time (t1). The behavior module M2 then determines the appropriate sequence of movements of the vacuum head 48 and/or the vehicle 12 to reach all of the pieces of trash T, and the appropriate sequence of times (tsuction) to activate the vacuum pump/motor 42, which are then executed in the manner described herein. In some versions, the machine vision unit 32 may operate in a batch manner to capture images periodically based on the movement of the vehicle 12, e.g., to avoid significant overlap in the one or more images that are captured. In other words, once one or more images are captured of an area on the ground G, then the machine vision unit 32 waits until the vehicle 12 traverses a distance substantially equal to a dimension of the field of view FOV and then activates the machine vision unit 32 to capture another set of one or more images to determine the trash T to be collected. In some versions, images are constantly being captured at a predetermined capture rate.
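The ordering rule described earlier, in which the controller selects the trash with the largest y coordinate (closest to the vacuum head's rail) first and then proceeds to the next closest, amounts to a sort of the detections. A minimal sketch, with detections represented as (x, y) tuples in the camera coordinate system:

```python
def pickup_order(detections):
    """Return detections sorted for processing: largest y coordinate first."""
    return sorted(detections, key=lambda p: p[1], reverse=True)
```

The behavior module could then plan head movements and suction times for the sorted list in sequence.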
In step 132, the controller 72 evaluates the last time that the trash canister 46 was dumped, i.e., the last time the movable bottom 54 was opened. If the elapsed time is greater than a predefined threshold time, the controller 72 operates the dump motor 56 in step 134 to empty the contents of the trash canister 46 that have passed through the screens 50, 52. In some versions, the controller 72 is configured to operate the dump motor 56 if a weight/pressure sensor detects a load over a predefined threshold load.
In step 136, the controller 72 determines if the vehicle 12 has completed traversing a leg of the travel path PATH and requires turning. If so, in step 138, the controller 72 operates the drive motors 24a, 24b as needed to turn the vehicle 12. In some versions, the vehicle 12 may have a steering system to enable such maneuvers, in which case the steering system would be controlled in step 138 to make the turn.
In step 140, the controller 72 determines how much time has elapsed since the last time that the controller 72 performed calibration of the actuator 58. If the elapsed time is greater than a predetermined threshold time, then the method continues with a new calibration at step 100, otherwise the method continues moving the vehicle 12 at step 102.
Several embodiments have been discussed in the foregoing description. However, the embodiments discussed herein are not intended to be exhaustive or limit the invention to any particular form. The terminology which has been used is intended to be in the nature of words of description rather than of limitation. Many modifications and variations are possible in light of the above teachings and the invention may be practiced otherwise than as specifically described.
Number | Name | Date | Kind |
---|---|---|---|
3621919 | Olson | Nov 1971 | A |
5199996 | Jonas | Apr 1993 | A |
5309592 | Hiratsuka | May 1994 | A |
5317783 | Williamson | Jun 1994 | A |
20110004342 | Knopow | Jan 2011 | A1 |
20120260944 | Martins, Jr. | Oct 2012 | A1 |
Number | Date | Country |
---|---|---|
205421127 | Aug 2016 | CN |
115323973 | Nov 2022 | CN |
WO-2014111898 | Jul 2014 | WO |
Entry |
---|
Website Printout: https://yellrobot.com/beach-cleaning-robots-thailand/, published Sep. 27, 2018; pages of website downloaded on Jan. 6, 2021; 5 pages. |
Website Printout: http://www.en.psu.ac.th/international/international-news/701-first-thai-beach-cleaning-robot-developed-by-pttep-and-psu/; pages of website downloaded on Jan. 6, 2021; 8 pages. |
Website Printout: https://en.beach-trotters.com/products/kangur-12; pages of website downloaded on Jan. 6, 2021; 5 pages. |
YouTube Video entitled “Maquina limpieza playas”, https://www.youtube.com/watch?v=0slvB5hZCJw, published Nov. 4, 2010. |
Website Printout: https://makezine.com/2016/05/20/learnings-robot-shoreline/, published May 20, 2016; pages of website downloaded on Jan. 6, 2021; 18 pages. |
Website Printout: https://www.beach-tech.com/ita/en/models/hotel-and-lake-beaches/beachtech-sweepy.html?keyword=&device=c&network=g&gclid=EAlalQobChMl4vPioYeF7glVj4bACh08EwowEAAYASAAEgL8byD BwE; pages of website downloaded on Jan. 6, 2021; 11 pages. |
Website Printout: https://www.dronyx.com/solarino-beach-cleaner/; pages of website downloaded on Jan. 6, 2021; 13 pages. |
YouTube Video entitled “Technical Video for Solarino Beach Cleaner Robot”, https://www.youtube.com/watch?v=17FHbc67iQY, published Apr. 30, 2015. |
Website Printout: https://ottawa.citynews.ca/local-news/high-hopes-for-ottawa-beach-cleaning-robot-965558, published Jun. 25, 2018; pages of website downloaded on Jan. 6, 2021; 3 pages. |
YouTube Video entitled “Beach Cleaning—Beach Cleaner—Compact Tractor Beach Cleaning Machine”, https://www.youtube.com/watch?v=MA-g5ezAcRk, published May 30, 2020. |
YouTube Video entitled “RF Controlled Beach Cleaner Robotic Vehicle”, https://www.youtube.com/watch?v=tiD65VBrrRM, published Jun. 29, 2018. |
YouTube Video entitled “Sand Cleaning Robot”, https://www.youtube.com/watch?v=9sbQN2VB9DI, published Jun. 6, 2013. |
Schmoeller da Roza, Felippe; Ghizoni da Silva, Vinicius; Pereira, Patrick Jose; and Bertol, Douglas Wildgrube. Modular Robot Used as a Beach Cleaner, Ingeniare. Revista Chilena de Ingeniería (Chilean Engineering Journal), Mar. 7, 2016, pp. 643-653, vol. 24, No. 4, Department of Automation and Systems, Federal University of Santa Catarina, Brazil. |
Al Enezi, Nouf F.; Al Ajmi, Omar K.; Al Sharhan, Sarah A.; and Khudada, Sarah T., Autonomous Beach Cleaner, Department of Electrical and Computer Engineering, ELEG/CPEG 480-Capstone Design Project II, May 15, 2019, 166 pages, American University of Kuwait. |
Dromedar Beach Cleaner. Unicorn Beach Cleaners, 6 pages. Beach Trotters S.L., C. Joan Güell 19, P.O. Box: 207, 43830 Torredembarra, Tarragona, Spain. |
Number | Date | Country |
---|---|---|
20220097236 A1 | Mar 2022 | US |