This disclosure relates to systems and methods for operating a refuse collection vehicle to engage a refuse container.
Refuse collection vehicles have been used for generations for the collection and transfer of waste. Traditionally, collection of refuse with a refuse collection vehicle required two people: (1) a first person to drive the vehicle and (2) a second person to pick up containers containing waste and dump the waste from the containers into the refuse collection vehicle. Technological advances have recently been made to reduce the amount of human involvement required to collect refuse. For example, some refuse collection vehicles include features that allow for collection of refuse with a single operator, such as mechanical or robotic lift arms.
Many aspects of the disclosure feature operating a mechanical lift arm and fork assembly to perform refuse collection.
In an example implementation, a refuse collection vehicle includes a fork assembly that is operable to engage one or more fork pockets of a refuse container, a lift arm that is operable to lift a refuse container, and at least one sensor that is configured to collect data indicating a position of the one or more fork pockets of the refuse container. A position of at least one of the fork assembly or the lift arm is adjusted in response to the data collected by the at least one sensor.
In an aspect combinable with the example implementation, adjusting the position of at least one of the fork assembly or the lift arm in response to the data collected by the at least one sensor includes adjusting a relative positioning of the lift arm.
In another aspect combinable with any of the previous aspects, adjusting the position of at least one of the fork assembly or the lift arm in response to the data collected by the at least one sensor includes adjusting an angular position of one or more forks of the fork assembly.
In another aspect combinable with any of the previous aspects, adjusting the position of at least one of the fork assembly or the lift arm in response to the data collected by the at least one sensor includes aligning one or more ends of one or more forks of the fork assembly with the position of the one or more fork pockets.
In another aspect combinable with any of the previous aspects, adjusting the position of at least one of the fork assembly or the lift arm in response to the data collected by the at least one sensor includes aligning the center of one or more ends of one or more forks of the fork assembly with the center of the one or more fork pockets.
In another aspect combinable with any of the previous aspects, the at least one sensor is a camera.
In another aspect combinable with any of the previous aspects, the at least one sensor is an analog ultrasonic sensor.
Another aspect combinable with any of the previous aspects further includes at least one sensor that is arranged to collect data indicating an angular position of the fork assembly, at least one sensor that is arranged to collect data indicating a relative positioning of the lift arm, and an onboard computing device coupled to the at least one sensor arranged to collect data indicating an angular position of the fork assembly and the at least one sensor arranged to collect data indicating a relative positioning of the lift arm.
In another aspect combinable with any of the previous aspects, adjusting the position of at least one of the fork assembly or the lift arm in response to the data collected by the at least one sensor includes determining, by the onboard computing device, a relative positioning of the lift arm based on data provided by the at least one sensor arranged to collect data indicating a relative positioning of the lift arm, determining, by the onboard computing device, a height of one or more ends of one or more forks of the fork assembly based on the relative positioning of the lift arm, determining, by the onboard computing device, an amount and a direction of travel of the lift arm required to align the one or more ends of the one or more forks with the one or more fork pockets based on the height of the one or more ends of the one or more forks and the position of the one or more fork pockets, and moving the lift arm in the determined amount and the determined direction of travel.
In another aspect combinable with any of the previous aspects, adjusting the position of at least one of the fork assembly or the lift arm in response to the data collected by the at least one sensor includes determining, by the onboard computing device, an angular position of one or more forks of the fork assembly based on data provided by the at least one sensor arranged to collect data indicating an angular position of the fork assembly, determining, by the onboard computing device, an amount and a direction of rotation of the fork assembly required to align the one or more forks of the fork assembly with the fork pockets based on the angular position of the one or more forks of the fork assembly and the position of the one or more fork pockets, and rotating the fork assembly in the determined amount and the determined direction of rotation.
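For illustration only, the determinations described in the two preceding aspects can be sketched as follows. This is a minimal sketch, not the claimed implementation; the function name, the input units, and the sign conventions are assumptions made for the example.

```python
# Minimal sketch of the alignment determinations described above.
# Names, units, and sign conventions are illustrative assumptions.

def alignment_commands(fork_end_height_m: float, fork_angle_rad: float,
                       pocket_height_m: float, pocket_angle_rad: float):
    """Determine the amount and direction of lift-arm travel and of
    fork-assembly rotation needed to align forks with fork pockets."""
    dz = pocket_height_m - fork_end_height_m
    dtheta = pocket_angle_rad - fork_angle_rad
    lift = (abs(dz), "raise" if dz >= 0 else "lower")
    rotate = (abs(dtheta), "tilt up" if dtheta >= 0 else "tilt down")
    return lift, rotate
```

An onboard computing device could then command the lift and rotation actuators using the returned amounts and directions.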
Potential benefits of the one or more implementations described in the present specification may include increased waste collection efficiency and reduced operator error in refuse collection. The one or more implementations may also reduce the likelihood of damaging refuse containers and refuse collection vehicles during the refuse collection process.
It is appreciated that methods in accordance with the present specification may include any combination of the aspects and features described herein. That is, methods in accordance with the present specification are not limited to the combinations of aspects and features specifically described herein, but also include any combination of the aspects and features provided.
The details of one or more implementations of the subject matter described in this disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the subject matter will be apparent from the description and drawings, and from the claims.
The body components 104 of the vehicle 102 can include various components that are appropriate for the particular type of vehicle 102. For example, a garbage collection vehicle may be a truck with an automated side loader (ASL). Alternatively, the vehicle may be a front-loading truck, a rear-loading truck, a roll-off truck, or some other type of garbage collection vehicle. A vehicle with an ASL may include body components 104 involved in the operation of the ASL, such as an arm and/or grabbers, as well as other body components such as a pump, a tailgate, a packer, and so forth. A front-loading vehicle, such as the example shown in the accompanying drawings, can include body components such as a lift arm and a fork assembly.
The vehicle 102 can include any number of body sensor devices 106 that sense body component(s) 104 and generate sensor data 110 describing the operation(s) and/or the operational state of various body components. The body sensor devices 106 are also referred to as sensor devices, or sensors. Sensors may be arranged in the body components, or in proximity to the body components, to monitor the operations of the body components. The sensors 106 emit signals that include the sensor data 110 describing the body component operations, and the signals may vary appropriately based on the particular body component being monitored. Sensors may also be arranged to provide sensor data 110 describing the position of external objects, such as a refuse container.
Sensors 106 can be provided on the vehicle body to evaluate cycles and/or other parameters of various body components. For example, as described in further detail herein, the sensors 106 can detect and/or measure the particular position and/or operational state of body components such as a lift arm, a fork assembly, and so forth.
Sensors 106 can include, but are not limited to, an analog sensor, a digital sensor, a CAN bus sensor, a magnetostrictive sensor, a radio detection and ranging (RADAR) sensor, a light detection and ranging (LIDAR) sensor, a laser sensor, an ultrasonic sensor, an infrared (IR) sensor, a stereo camera sensor, a three-dimensional (3D) camera, in-cylinder sensors, or a combination thereof.
In some implementations, the sensor data 110 may be communicated from the sensors to an onboard computing device 112 in the vehicle 102. In some instances, the onboard computing device is an under-dash device (UDU), and may also be referred to as the Gateway. Alternatively, the device 112 may be placed in some other suitable location in or on the vehicle. The sensor data may be communicated from the sensors to the onboard computing device 112 over a wired connection (e.g., an internal bus) and/or over a wireless connection. In some implementations, a Society of Automotive Engineers standard J1939 bus in conformance with International Organization for Standardization (ISO) standard 11898 connects the various sensors with the onboard computing device. In some implementations, a Controller Area Network (CAN) bus connects the various sensors with the onboard computing device. For example, a CAN bus in conformance with ISO standard 11898 can connect the various sensors with the onboard computing device. In some implementations, the sensors may be incorporated into the various body components. Alternatively, the sensors may be separate from the body components. In some implementations, the sensors digitize the signals that communicate the sensor data before sending the signals to the onboard computing device, if the signals are not already in a digital format.
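As an illustration of the bus monitoring described above, the following sketch receives frames with the python-can library on a socketcan channel. The channel name, the parameter group number (PGN), and the payload layout are assumptions made for the example, and the PGN extraction ignores the J1939 data-page bit for brevity.

```python
# Sketch: receive body-sensor frames from a CAN/J1939 bus (python-can).
import can

bus = can.interface.Bus(channel="can0", bustype="socketcan")

LIFT_ARM_PGN = 0xFF10  # hypothetical proprietary PGN for lift-arm position

def pgn_of(arbitration_id: int) -> int:
    """Extract a J1939 parameter group number from a 29-bit CAN ID."""
    pf = (arbitration_id >> 16) & 0xFF      # PDU format byte
    if pf >= 0xF0:                          # PDU2: PS byte is part of the PGN
        return (arbitration_id >> 8) & 0xFFFF
    return (arbitration_id >> 8) & 0xFF00   # PDU1: PS is a destination address

while True:
    msg = bus.recv(timeout=1.0)
    if msg is None:
        continue
    if msg.is_extended_id and pgn_of(msg.arbitration_id) == LIFT_ARM_PGN:
        raw = int.from_bytes(msg.data[0:2], "little")  # assumed payload layout
        print(f"lift arm raw position: {raw}")
```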
The analysis of the sensor data 110 is performed at least partly by the onboard computing device 112, e.g., by processes that execute on the processor(s) 114. For example, the onboard computing device 112 may execute processes that perform an analysis of the sensor data 110 to determine the current position of the body components, such as the lift arm position or the fork assembly position. In some implementations, an onboard programmable logic controller or an onboard mobile controller performs analysis of the sensor data 110 to determine the current position of the body components 104.
The onboard computing device 112 can include one or more processors 114 that provide computing capacity, data storage 116 of any suitable size and format, and network interface controller(s) 118 that facilitate communication of the device 112 with other device(s) over one or more wired or wireless networks.
In some implementations, a vehicle includes a body controller that manages and/or monitors various body components of the vehicle. The body controller of a vehicle can be connected to multiple sensors in the body of the vehicle. The body controller can transmit one or more signals over the J1939 network, or other wiring on the vehicle, when the body controller senses a state change from any of the sensors. These signals from the body controller can be received by the onboard computing device 112 that is monitoring the J1939 network.
In some implementations, the onboard computing device 112 is a multi-purpose hardware platform. The device can include an under-dash unit (UDU) and/or a window unit (WU) (e.g., a camera) to record video and/or audio operational activities of the vehicle. The onboard computing device hardware subcomponents can include, but are not limited to, one or more of the following: a CPU, a memory or data storage unit, a CAN interface, a CAN chipset, NIC(s) such as an Ethernet port, a USB port, a serial port, I2C line(s), and so forth, I/O ports, a wireless chipset, a global positioning system (GPS) chipset, a real-time clock, a micro SD card, an audio-video encoder and decoder chipset, and/or external wiring for CAN and for I/O. The device can also include temperature sensors, battery and ignition voltage sensors, motion sensors, CAN bus sensors, an accelerometer, a gyroscope, an altimeter, a GPS chipset with or without dead reckoning, and/or a digital CAN interface (DCI). The DCI hardware subcomponent can include the following: a CPU, memory, a CAN interface, a CAN chipset, an Ethernet port, a USB port, a serial port, I2C lines, I/O ports, a wireless chipset, a GPS chipset, a real-time clock, and external wiring for CAN and/or for I/O. In some implementations, the onboard computing device is a smartphone, tablet computer, and/or other portable computing device that includes components for recording video and/or audio data, processing capacity, transceiver(s) for network communications, and/or sensors for collecting environmental data, telematics data, and so forth.
In some implementations, one or more cameras 134 can be mounted on the vehicle 102 or otherwise present on or in the vehicle 102. The camera(s) 134 each generate image data 128 that includes one or more images of a scene external to and in proximity to the vehicle 102. In some implementations, one or more cameras 134 are arranged to capture image(s) and/or video of a container 130 before, after, and/or during the operations of body components 104 to engage and empty a container 130. For example, for a front-loading vehicle, the camera(s) 134 can be arranged to image objects in front of the vehicle 102. As another example, for a side loading vehicle, the camera(s) 134 can be arranged to image objects to the side of the vehicle, such as a side that mounts the ASL to lift containers. In some implementations, camera(s) 134 can capture video of a scene external to, internal to, and in proximity to the vehicle 102.
In some implementations, the camera(s) 134 are communicably coupled to a graphical display 120 to communicate images and/or video captured by the camera(s) 134 to the graphical display 120. In some implementations, the graphical display 120 is placed within the interior of the vehicle. For example, the graphical display 120 can be placed within the cab of vehicle 102 such that the images and/or video can be viewed by an operator of the vehicle 102 on a screen 122 of the graphical display 120. In some implementations, the graphical display 120 is a heads-up display that projects the images and/or video captured by the camera(s) 134 onto the windshield of the vehicle 102 for viewing by an operator of the vehicle 102. In some implementations, the images and/or video captured by the camera(s) 134 can be communicated to a graphical display 120 of the onboard computing device 112 in the vehicle 102. Images and/or video captured by the camera(s) 134 can be communicated from the camera(s) 134 to the onboard computing device 112 over a wired connection (e.g., an internal bus) and/or over a wireless connection. In some implementations, a network bus (e.g., a J1939 network bus, a CAN network bus, etc.) connects the camera(s) with the onboard computing device 112. In some implementations, the camera(s) are incorporated into the various body components. Alternatively, the camera(s) may be separate from the body components.
One or more position sensors 106 can be situated to determine the state and/or detect the operations of the body components 104.
In the example shown, the vehicle 102 includes position sensors 106a, 106b that are arranged to detect the position of the lift arm 111 and/or the forks 113. For example, the position sensors 106a, 106b can provide data about the current position of the lift arm 111 and the fork assembly 113, respectively, relative to the surface 190 on which the vehicle 102 is positioned, which, as described in further detail herein, can be used to determine any adjustments to the lift arm 111 position necessary to engage a refuse container 130.
Position sensors 106a, 106b can include, but are not limited to, an analog sensor, a digital sensor, a CAN bus sensor, a magnetostrictive sensor, a RADAR sensor, a LIDAR sensor, a laser sensor, an ultrasonic sensor, an infrared (IR) sensor, a stereo camera sensor, a three-dimensional (3D) camera, in-cylinder sensors, or a combination thereof.
In some implementations, the position sensors are located in one or more cylinders of the refuse collection vehicle 102. In some examples, a first position sensor 106a is located inside a cylinder 150 used for raising the lift arm 111 and a second position sensor (not shown) is located inside a cylinder (not shown) used for moving the fork assembly 113. In some implementations, position sensor 106a is located on the outside of a housing containing the cylinder 150 coupled to the lift arm 111. In some examples, the position sensors, such as sensor 106a, are in-cylinder, magnetostrictive sensors.
In some implementations, the position sensors (e.g., sensor 106a) include one or more radar sensors inside one or more cylinders of the lift arm 111 and/or fork assembly 113. In some examples, the position sensors coupled to a cylinder of the vehicle 102 (e.g., sensor 106a coupled to cylinder 150) include one or more proximity sensors coupled to a cross shaft of the lift arm 111.
The vehicle 102 also includes a fork assembly position sensor 106b arranged to detect the position of the fork assembly 113. For example, the fork assembly position sensor 106b can be used to detect the angle of the fork assembly 113 relative to the surface 190 on which the vehicle 102 is positioned. As described in further detail herein, the fork assembly position sensor 106b can be used to detect the angle of the fork assembly 113 as the vehicle 102 approaches a refuse container 130 to be emptied. Fork assembly position sensor 106b can include, but is not limited to, an analog sensor, a digital sensor, a CAN bus sensor, a magnetostrictive sensor, a RADAR sensor, a LIDAR sensor, a laser sensor, an ultrasonic sensor, an infrared (IR) sensor, a stereo camera sensor, a three-dimensional (3D) camera, in-cylinder sensors, or a combination thereof.
In some implementations, the distance 270 between the center of an end 126 of one or more forks 116 of the fork assembly 113 and the surface 190 on which the vehicle 102 is located is determined by the one or more body sensors 106. For example, by determining the position of the lift arm 111 and the angle of the fork assembly 113 relative to the surface 190 on which the vehicle 102 is positioned, the distance 270 between the center of an end 126 of one or more forks 116 of the fork assembly 113 and the surface 190 on which the vehicle 102 is positioned can be determined.
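One simplified way to compute the distance 270 is to treat the lift arm and forks as a planar linkage. The sketch below assumes that simplified geometry; the pivot height, arm length, and fork length are illustrative constants, not dimensions taken from this disclosure.

```python
import math

# Simplified planar-linkage geometry; all dimensions are illustrative.
ARM_PIVOT_HEIGHT_M = 1.2   # lift-arm pivot height above surface 190
ARM_LENGTH_M = 2.5         # pivot-to-fork-heel length of lift arm 111
FORK_LENGTH_M = 1.3        # heel-to-end length of forks 116

def fork_end_height(arm_angle_rad: float, fork_angle_rad: float) -> float:
    """Estimate distance 270: the height of the center of fork ends 126
    above surface 190, given the measured arm and fork angles."""
    heel = ARM_PIVOT_HEIGHT_M + ARM_LENGTH_M * math.sin(arm_angle_rad)
    return heel + FORK_LENGTH_M * math.sin(fork_angle_rad)
```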
A computing device can receive one or more images from the container detection sensor 160 and process the one or more images using machine learning based image processing techniques to detect the presence of a refuse container 130 in the one or more images. For example, sensor 160 can be a camera, and images and/or video captured by the sensor 160 can be provided to a computing device, such as onboard computing device 112, for image processing. In some implementations, a computing device can receive an image from container detection sensor 160 and determine, based on machine learning image processing techniques, that the vehicle 102 is positioned within a sufficient distance to engage a refuse container 130. In some implementations, a video feed of the refuse container 130 is provided by the sensor 160 and transmitted to a computing device for machine learning based image processing to detect the presence of a refuse container 130.
The data captured by sensor 160 can be further processed by the onboard computing device 112 to determine the location of various components of the detected refuse container 130. In some implementations, a computing device 112 receives images or video captured by the sensor 160 and uses machine learning based image processing techniques to determine the position of one or more fork pockets 180 on a refuse container 130. In some implementations, images captured by the sensor 160 are processed by a computing device 112 to detect the sides of one or more fork pockets 180 to determine one or more dimensions of each of the fork pockets 180, such as the height and width of each of the fork pockets 180. In some examples, a computing device can process images provided by sensor 160 to determine a location of one or more corners of the one or more fork pockets 180 of a detected refuse container 130. The detected corners of the fork pockets 180 can be provided as GPS coordinates, and based on these coordinates, the height and angular position of the fork pockets 180 relative to the surface 190 on which the vehicle 102 is positioned can be determined.
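The disclosure contemplates machine learning based image processing for this step. As a classical-vision stand-in for such a model, the sketch below uses OpenCV to find dark, letterbox-shaped openings that could be candidate fork pockets; the threshold and aspect-ratio limits are assumptions, and a trained detector would replace this logic in practice.

```python
# Stand-in for ML detection: find dark rectangular openings in a frame.
import cv2

def find_fork_pockets(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 50, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    pockets = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w > 40 and 1.5 < w / h < 6.0:    # wide, letterbox-shaped openings
            pockets.append((x, y, w, h))    # pixel-space position and size
    return pockets
```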
Once the position of the fork pockets 180 of a refuse container 130 is determined based on the image data captured by sensor 160, a signal conveying the position of the fork pockets 180 is transmitted to an onboard computing device 112 of the vehicle 102. In some implementations, the position of the fork pockets 180 is provided as GPS coordinates identifying the coordinates of the corners of each of the fork pockets 180. In some examples, the position of the fork pockets is provided as a height of the fork pockets relative to the surface 190 on which the vehicle 102 is positioned. In some implementations, the position of the fork pockets is provided as a height of the center of the fork pockets relative to the surface 190 on which the vehicle 102 is positioned.
In some implementations, the container detection sensor 160 includes one or more optical sensors. In other implementations, container detection sensor 160 includes one or more analog ultrasonic sensors. In some implementations, container detection sensor 160 is an ultrasonic sensor and is configured to detect the presence of one or more fork pockets 180 of a refuse container 130. In some examples, container detection sensor 160 is an ultrasonic sensor and is configured to detect the height of the center of one or more fork pockets 180 relative to the surface 190 on which the vehicle is positioned. In some examples, container detection sensor 160 is an ultrasonic sensor and is configured to detect the angular position of one or more fork pockets 180 relative to the surface 190 on which the vehicle is positioned.
In some implementations, container detection sensor 160 transmits a signal conveying data indicating the position of the fork pockets 180 to an onboard computing device 112 of the vehicle 102. In some examples, container detection sensor 160 transmits a signal conveying data indicating the height of the center of one or more fork pockets 180 relative to the surface 190 on which vehicle 102 is positioned. In some implementations, onboard computing device 112 receives the data from an ultrasonic container sensor 160 and determines the position of the fork pockets 180 based on the data received from the sensor 160.
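For context, an ultrasonic range measurement follows the standard time-of-flight relation, sketched below with the speed of sound taken at a nominal room-temperature value.

```python
SPEED_OF_SOUND_M_S = 343.0  # nominal speed of sound in air at about 20 °C

def range_from_echo(round_trip_time_s: float) -> float:
    """Convert an ultrasonic echo's round-trip time to a one-way range."""
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0
```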
Upon receiving the data collected by one or more container detection sensors 160 describing the position of one or more fork pockets 180 of a refuse container 130 proximate the vehicle 102, the position of the lift arm 111 and the fork assembly 113 of the vehicle 102 can be automatically adjusted to engage the detected refuse container 130. For example, the position of the lift arm 111 and the fork assembly 113 of the vehicle 102 can be automatically adjusted to align one or more ends 126 of the forks 116 of the fork assembly 113 with the detected fork pockets 180 of the detected refuse container 130, such as by aligning the height of the center of the one or more ends 126 with the height of the center of the detected fork pockets 180. As previously discussed, the current position of the lift arm 111 and the angle of the fork assembly 113 relative to the surface 190 on which the vehicle 102 is positioned are determined based on data received from the body sensors 106. Based on this determination, the distance 270 between a center of the ends 126 of the forks 116 of fork assembly 113 and the surface 190 on which the vehicle 102 is located can be determined. In some examples, the computing device determines the GPS coordinates of the one or more ends 126 of the forks 116 of the fork assembly 113 based on data provided by the body sensors 106.
The computing device 112 can compare the position of the one or more ends of the forks 116 of the fork assembly 113 with the position of the one or more fork pockets 180 of the refuse container 130 to determine adjustments to the lift arm 111 position and the fork assembly 113 angle necessary to align the forks 116 of the fork assembly 113 with the fork pockets 180. For example, the onboard computing device determines the adjustments to the lift arm 111 position and fork assembly 113 angle necessary to align the height of the center of the ends 126 of the forks 116 with the height of the center of the fork pockets 180. In some implementations, the onboard computing device determines the adjustments to the lift arm 111 position and fork assembly 113 angle necessary to align the center of the ends 126 of the forks 116 with the center of the fork pockets 180.
In some examples, both the position of the lift arm 111 and the angle of the fork assembly 113 are adjusted in response to the onboard computing device 112 receiving data indicating a position of one or more fork pockets 180. For example, the position of the lift arm 111 and the angle of the fork assembly 113 can both be adjusted to accommodate for differences in both the height and the angle between the position of the forks 116 and the position of the fork pockets 180.
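Taken together, the height and angle adjustments can be framed as a simple closed-loop routine. In the sketch below, the vehicle interface methods and the tolerance values are hypothetical placeholders for whatever actuator and sensor interfaces a given vehicle provides.

```python
HEIGHT_TOL_M = 0.02    # assumed alignment tolerances
ANGLE_TOL_RAD = 0.01

def align_forks_to_pockets(vehicle, pocket_height_m, pocket_angle_rad):
    """Iteratively adjust lift arm 111 and fork assembly 113 until the
    fork ends 126 line up with the detected fork pockets 180."""
    while True:
        dz = pocket_height_m - vehicle.fork_end_height()
        dtheta = pocket_angle_rad - vehicle.fork_angle()
        if abs(dz) < HEIGHT_TOL_M and abs(dtheta) < ANGLE_TOL_RAD:
            return
        vehicle.move_lift_arm(dz)             # signed: positive raises
        vehicle.rotate_fork_assembly(dtheta)  # signed: positive tilts up
```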
The positioning of the body components 104 based on fork pocket 180 position data can be conducted automatically, with minimal or no operator involvement. For example, the position of the lift arm 111 and the position of the fork assembly 113 can be automatically adjusted in response to the onboard computing device 112 receiving data indicating the position of one or more fork pockets 180. In some examples, the position of the lift arm 111 and the position of the fork assembly 113 are automatically adjusted based on receiving data indicating the position of one or more fork pockets 180 and in response to an operator of the vehicle manually engaging a switch to initiate a dump cycle. In some implementations, the switch to initiate the dump cycle is provided as one or more foot pedals positioned on the floorboard of the vehicle 102. U.S. patent application Ser. No. 16/781,857, filed Feb. 4, 2020, discloses foot pedals for initiating and controlling a dump cycle. The entire content of U.S. patent application Ser. No. 16/781,857 is incorporated by reference herein.
In some implementations, the position of the container 130 at the time the dump cycle is initiated (the "pick position"), as determined based on the position of the fork pockets 180, is recorded by the onboard computing device 112. At the end of the dump cycle, the container 130 can be automatically returned to a position that is within 1 inch of the recorded pick position. U.S. patent application Ser. No. 16/781,857, filed Feb. 4, 2020, discloses systems and methods for recording and returning refuse containers to pre-recorded pick positions. The entire content of U.S. patent application Ser. No. 16/781,857 is incorporated by reference herein.
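The record-and-return behavior can be sketched in the same spirit; the vehicle interface below is a hypothetical placeholder, and the tolerance reflects the 1 inch figure noted above.

```python
import math

PICK_TOL_M = 0.0254  # the "within 1 inch" return tolerance noted above

def run_dump_cycle(vehicle):
    """Record the pick position, empty the container, and return it."""
    pick = vehicle.container_position()   # e.g., (x, y, z) in meters
    vehicle.empty_container()             # raise, dump, and lower
    vehicle.move_container_to(pick)
    return math.dist(vehicle.container_position(), pick) <= PICK_TOL_M
```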
While this specification contains many specifics, these should not be construed as limitations on the scope of the disclosure or of what may be claimed, but rather as descriptions of features specific to particular implementations. Certain features that are described in this specification in the context of separate implementations may also be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation may also be implemented in multiple implementations separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination may in some examples be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems may generally be integrated together in a single software product or packaged into multiple software products.
A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. For example, various forms of the flows shown above may be used, with steps re-ordered, added, or removed. Accordingly, other implementations are within the scope of the following claim(s).
This application claims the benefit under 35 U.S.C. § 119(e) of U.S. Patent Application No. 62/837,595, entitled “Refuse Collection Vehicle Positioning,” filed Apr. 23, 2019, which is incorporated herein by reference in its entirety.