Refuse vehicles collect a wide variety of waste, trash, and other material from residences and businesses. Operators of the refuse vehicles transport the material from various waste receptacles within a municipality to a storage or processing facility (e.g., a landfill, an incineration facility, a recycling facility, etc.).
One embodiment of the present disclosure relates to a refuse vehicle, comprising a chassis, a body assembly coupled to the chassis, the body assembly defining a refuse compartment, one or more sensors coupled to the body assembly and configured to provide data relating to the presence of an obstacle within an area near the refuse vehicle, and a controller configured to receive the data from the one or more sensors, determine, using an obstacle detector and the data, the presence of an obstacle within the area, and initiate a control action, wherein the control action includes at least one of controlling the movement of the refuse vehicle, controlling the movement of a lift assembly attached to the body assembly, or generating an alert.
Another implementation of the present disclosure relates to a refuse vehicle comprising a chassis, a body assembly coupled to the chassis, the body assembly defining a refuse compartment, one or more sensors coupled to the body assembly and configured to provide data relating to the presence of an obstacle within a defined proximity of the refuse vehicle, wherein the defined proximity of the refuse vehicle is a portion of area around the refuse vehicle that cannot be seen by an operator of the refuse vehicle, and a controller configured to receive the data from the one or more sensors, determine, using an obstacle detector and the data, the presence and at least one of a position, a speed, or a direction of travel of an obstacle within the defined proximity of the refuse vehicle, and initiate a control action based on at least one of the presence, position, speed, or direction of travel of the obstacle, wherein the control action includes at least one of controlling the movement of the refuse vehicle, controlling the movement of a lift assembly attached to the body assembly, or generating an alert.
Yet another implementation of the present disclosure relates to a refuse vehicle comprising a chassis, a body assembly coupled to the chassis, the body assembly defining a refuse compartment, one or more sensors coupled to the body assembly and configured to provide data relating to the presence of an obstacle within a defined proximity of the refuse vehicle, wherein the defined proximity of the refuse vehicle is a portion of area around the refuse vehicle that cannot be seen by an operator of the refuse vehicle, and a controller configured to receive the data from the one or more sensors, determine, using an obstacle detector and the data, the presence and at least one of a position, a speed, or a direction of travel of an obstacle within the defined proximity of the refuse vehicle, classify, based on an output of the obstacle detector, the obstacle based on a determination regarding at least one of a position, speed, or direction of travel of the obstacle, associate a risk with the obstacle, the risk based on a determination regarding at least one of the position, speed, or direction of travel of the obstacle, generate an alert based on at least one of the presence, position, class, or risk associated with the obstacle, and initiate a control action based on at least one of the presence, position, speed, or direction of travel of the obstacle, wherein the control action includes at least one of controlling the movement of the refuse vehicle or controlling the movement of a lift assembly attached to the body assembly.
This summary is illustrative only and is not intended to be in any way limiting. Other aspects, inventive features, and advantages of the devices or processes described herein will become apparent in the detailed description set forth herein, taken in conjunction with the accompanying figures, wherein like reference numerals refer to like elements.
Before turning to the figures, which illustrate certain exemplary embodiments in detail, it should be understood that the present disclosure is not limited to the details or methodology set forth in the description or illustrated in the figures. It should also be understood that the terminology used herein is for the purpose of description only and should not be regarded as limiting.
According to an exemplary embodiment, a refuse vehicle includes a spatial awareness system configured to detect obstacles around the vehicle. The system includes various sensors and cameras positioned on the vehicle to provide the system with data necessary to determine the presence and/or the motion of an obstacle. The sensors detect obstacles around the vehicle and within operator blind spots. The system provides alerts based on the detected obstacles. The alerts may notify the operator of the detected obstacle and/or notify the obstacle (e.g., a pedestrian) of the presence of the vehicle.
Overall Vehicle
As shown in
In one embodiment, the refuse vehicle 10 is a completely electric refuse vehicle. For example, motor 18 includes one or more electric motors coupled to frame 12 (e.g., a hybrid refuse vehicle, an electric refuse vehicle, etc.). In other embodiments, the refuse vehicle 10 includes an internal combustion generator that utilizes one or more fuels (e.g., gasoline, diesel, propane, natural gas, hydrogen, etc.) to generate electricity to power motor 18, power actuators, and/or power the other accessories (e.g., a hybrid refuse vehicle, etc.). For example, the refuse vehicle 10 may have an electric motor augmented by motor 18 (e.g., a combustion engine) to cooperatively provide power to wheels 19 and/or other systems of the refuse vehicle 10. In other embodiments, the refuse vehicle 10 may consume electrical power from an external power source (e.g., overhead power lines, etc.) and provide power to the systems of the refuse vehicle 10.
As shown in
According to the exemplary embodiments shown in
As shown in
As shown in
The attachment assembly 58 may be coupled to the lift arms 52 of the front-lift assembly 40. The attachment assembly 58 is configured to engage with a first attachment, shown as refuse container 60, to selectively and releasably secure refuse container 60 to the front-lift assembly 40. As denoted herein, refuse container 60 may include any type of residential, commercial, or industrial refuse can. Refuse container 60 may also be a first lift container attachment 60. In some embodiments, the attachment assembly 58 is configured to engage with a second attachment, such as a fork attachment (not shown), to selectively and releasably secure the second attachment to the front-lift assembly 40. In some embodiments, the attachment assembly 58 is configured to engage with another type of attachment (e.g., a street sweeper attachment, a snowplow attachment, a snow blower attachment, a towing attachment, a wood chipper attachment, a bucket attachment, a cart tipper attachment, a grabber attachment, etc.).
According to an exemplary embodiment shown in
The grabber assembly 42 is shown to include a pair of actuators, shown as actuators 44. The actuators 44 are configured to releasably secure a refuse can to the grabber assembly 42, according to an exemplary embodiment. The actuators 44 are selectively repositionable (e.g., individually, simultaneously, etc.) between an engaged position or state and a disengaged position or state. In the engaged position, the actuators 44 are rotated toward one another such that the refuse can may be grasped therebetween. In the disengaged position, the actuators 44 rotate outwards (e.g., as shown in
In operation, the refuse vehicle 10 may pull up alongside the refuse can, such that the refuse can is positioned to be grasped by the grabber assembly 42. The grabber assembly 42 may then transition into an engaged state to grasp the refuse can. After the refuse can has been securely grasped, the grabber assembly 42 may be transported along the track 20 (e.g., by an actuator) with the refuse can. When the grabber assembly 42 reaches the end of track 20, grabber assembly 42 may tilt and empty the contents of the refuse can into the refuse compartment 30. The tilting is facilitated by the path of track 20. When the contents of the refuse can have been emptied into refuse compartment 30, grabber assembly 42 may descend along track 20 and return the refuse can to the ground. Once the refuse can has been placed on the ground, the grabber assembly 42 may transition into the disengaged state, releasing the refuse can.
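The lift cycle described above can be sketched as a simple state sequence. This is a minimal, illustrative sketch only; the state names and the `next_state` helper are assumptions introduced here for clarity and do not appear in the disclosure.

```python
from enum import Enum, auto

class GrabberState(Enum):
    """States for one grabber lift cycle (hypothetical names)."""
    DISENGAGED = auto()   # actuators rotated outward, can released
    ENGAGED = auto()      # actuators closed, can grasped
    LIFTING = auto()      # carriage translating up the track
    DUMPING = auto()      # tilted at the end of the track, emptying into the hopper
    LOWERING = auto()     # descending along the track, returning the can

# One pass through the cycle: grasp, lift, tilt/empty, lower, release.
LIFT_CYCLE = [
    GrabberState.ENGAGED,
    GrabberState.LIFTING,
    GrabberState.DUMPING,
    GrabberState.LOWERING,
    GrabberState.DISENGAGED,
]

def next_state(current: GrabberState) -> GrabberState:
    """Advance the grabber to the next state in the cycle."""
    idx = LIFT_CYCLE.index(current)
    return LIFT_CYCLE[(idx + 1) % len(LIFT_CYCLE)]
```

In practice each transition would be gated on actuator feedback (e.g., grip confirmed, end of track reached) rather than advancing unconditionally.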
According to an exemplary embodiment as shown in
The carriage 26 is slidably coupled to the track 20. In operation, the carriage 26 may translate along a portion or all of the length of the track 20. The carriage 26 is removably coupled (e.g., by removable fasteners) to a body or frame of the grabber assembly 42, shown as grabber frame 46. Alternatively, the grabber frame 46 may be fixedly coupled to (e.g., welded to, integrally formed with, etc.) the carriage 26. The actuators 44 are each pivotally coupled to the grabber frame 46 such that they rotate about a pair of axes 45. The axes 45 extend substantially parallel to one another and are longitudinally offset from one another. In some embodiments, one or more actuators configured to rotate the actuators 44 between the engaged state and the disengaged state are coupled to the grabber frame 46 and/or the carriage 26.
According to an exemplary embodiment shown in
The second sidewall 240 of the refuse can 202 defines a cavity, shown as recess 242. The collection arm assembly 270 is coupled to the refuse can 202 and may be positioned within the recess 242. In other embodiments, the collection arm assembly 270 is otherwise positioned (e.g., coupled to the rear wall 214, coupled to the first sidewall 230, coupled to the front wall 210, etc.). According to an exemplary embodiment, the collection arm assembly 270 includes an arm, shown as arm 272; a grabber assembly, shown as grabber 276, coupled to an end of the arm 272; and an actuator, shown as actuator 274. The actuator 274 may be positioned to selectively reorient the arm 272 such that the grabber 276 is extended laterally outward from and retracted laterally inward toward the refuse can 202 to engage (e.g., pick up, etc.) a refuse can (e.g., a garbage can, a reclining bin, etc.) for emptying refuse into the container refuse compartment 260.
As shown in
Spatial Awareness System
According to an exemplary embodiment shown in
The memory 414 may be any volatile or non-volatile computer-readable storage medium capable of storing data or computer code relating to the activities described herein. The memory 414 may include one or more devices (e.g., memory units, memory devices, storage devices, etc.) for storing data and/or computer code for completing and/or facilitating the various processes described in the present disclosure. Memory 414 may include random access memory (RAM), read-only memory (ROM), hard drive storage, temporary storage, non-volatile memory, flash memory, optical memory, or any other suitable memory for storing software objects and/or computer instructions. Memory 414 may include computer code modules (e.g., executable code, object code, source code, script code, machine code, etc.) configured for supporting the various activities and information structures described herein. The memory 414 may be communicably connected to processor 412 via processing circuit 410 and may include computer code for executing (e.g., by processor 412) one or more of the processes described herein.
According to the exemplary embodiment shown in
The sensor(s) 422 may be disposed at any number of locations throughout and/or around the refuse vehicle 10 for capturing image and/or object data from any direction with respect to the refuse vehicle 10. For example, sensor(s) 422 may include a plurality of visible light cameras, radar sensors, and LIDAR cameras/sensors mounted on the forward and lateral sides of the refuse vehicle 10 for capturing data as the refuse vehicle 10 moves down a path (e.g., a roadway). In some embodiments, one or more of sensor(s) 422 may be located on an attachment utilized by the refuse vehicle 10, such as container attachment 60 described above. It should be understood that sensor(s) 422 may be positioned anywhere on the refuse vehicle 10.
According to the exemplary embodiment shown in
In some embodiments, the obstacle detector 416 classifies detected obstacles based at least in part on the data received from the sensor(s) 422. For example, obstacle detector 416 may classify obstacles as static obstacles or dynamic obstacles depending on their motion. For example, the obstacle detector 416 may classify a moving vehicle as a dynamic obstacle and a parked vehicle as a static obstacle. In some embodiments, the obstacle detector 416 determines a subclass of an obstacle. For example, the obstacle detector 416 may determine that a dynamic obstacle is a person, and that a static obstacle is a refuse container. In some embodiments, the obstacle detector 416 determines a risk associated with the obstacle. For example, the obstacle detector 416 may classify a high-speed obstacle as high risk and a low-speed obstacle as low risk.
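The static/dynamic classification and speed-based risk labeling described above can be sketched as simple threshold functions. The function names and threshold values (0.2 m/s, 5.0 m/s) are illustrative assumptions, not values from the disclosure; a fielded system would tune them to sensor noise and vehicle context.

```python
def classify_obstacle(speed_mps: float, moving_threshold: float = 0.2) -> str:
    """Classify an obstacle as static or dynamic from its measured speed.

    The 0.2 m/s threshold is an assumed tolerance for sensor noise,
    so a parked vehicle with a slightly jittery speed estimate is
    still classified as static.
    """
    return "dynamic" if speed_mps > moving_threshold else "static"

def assess_risk(speed_mps: float, high_speed_mps: float = 5.0) -> str:
    """Label high-speed obstacles high risk and low-speed obstacles low risk."""
    return "high" if speed_mps > high_speed_mps else "low"
```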
In some embodiments, the obstacle detector 416 is configured to generate a safety zone around a refuse vehicle. For example, the obstacle detector 416 may establish a safety zone of two feet around the perimeter of the refuse vehicle 10. In some embodiments, the safety zone may extend to the outer range limit of the sensor(s) 422. In some other embodiments, the safety zone may only include the refuse vehicle 10 and its immediate area. In some embodiments, the safety zone may be set by an operator of the refuse vehicle 10. In some embodiments, the safety zone may extend only partially around the refuse vehicle 10. For example, referring now to
In some embodiments, the safety zone dynamically changes based on aspects of the refuse vehicle 10 and/or its surroundings. For example, the safety zone may extend 60 feet in front of the refuse vehicle 10 when it is traveling at highway speeds, and adjust to just 20 feet in front of the refuse vehicle 10 when traveling at low speeds. In some embodiments, the obstacle detector 416 is configured to only detect obstacles within the safety zone. In some embodiments, the obstacle detector 416 detects obstacles both within and outside of the safety zone.
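The speed-dependent forward safety zone can be sketched as an interpolation between the two example distances above. The breakpoint speeds (25 mph and 55 mph) and the linear interpolation are assumptions introduced for illustration; the disclosure only gives the 20-foot and 60-foot endpoints.

```python
def forward_zone_length_ft(speed_mph: float,
                           low_speed_mph: float = 25.0,
                           highway_mph: float = 55.0,
                           min_len_ft: float = 20.0,
                           max_len_ft: float = 60.0) -> float:
    """Scale the forward safety zone with vehicle speed.

    20 feet at or below city speeds, 60 feet at highway speeds, and
    linearly interpolated in between (breakpoints are assumptions).
    """
    if speed_mph <= low_speed_mph:
        return min_len_ft
    if speed_mph >= highway_mph:
        return max_len_ft
    frac = (speed_mph - low_speed_mph) / (highway_mph - low_speed_mph)
    return min_len_ft + frac * (max_len_ft - min_len_ft)
```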
In some embodiments, the safety zone changes based on detected obstacles. For example, the safety zone may extend to cover a refuse container when a refuse container is detected by the obstacle detector 416. In some embodiments, the safety zone may change based on inputs from an operator of the refuse vehicle 10. For example, referring now to
In some embodiments, the obstacle detector 416 is configured to generate a trajectory for the refuse vehicle 10 or its systems. For example, the obstacle detector 416 may determine the path of the front-lift assembly 40 and use the sensor(s) 422 to detect obstacles within said path. The obstacle detector 416 may generate the trajectory based on preinstalled information regarding the refuse vehicle 10 and its systems. In some embodiments, obstacle detector 416 generates the trajectory based on data collected by the sensor(s) 422. In some embodiments, the trajectory falls within the safety zone. In some embodiments, the trajectory covers only the safety zone. In some embodiments, the safety zone and trajectory both encompass the path of refuse container 60 and front-lift assembly 40. For example, the obstacle detector 416 may detect obstacles within the trajectory and/or safety zone and provide an indication of the presence of the obstacle.
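One way to test whether a detected obstacle lies within a planned trajectory is to model the path as a polyline and check the obstacle's distance to it. This is a simplified 2-D sketch under stated assumptions: the waypoint/corridor representation and the function names are introduced here for illustration, whereas a real system would check the lift assembly's full swept volume in 3-D.

```python
import math

def point_segment_distance(p, a, b):
    """Shortest distance from 2-D point p to the segment a-b."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:                      # degenerate segment
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def obstacle_in_trajectory(obstacle_xy, waypoints, half_width):
    """True if the obstacle lies within half_width of the planned path,
    where the path is the polyline through the given waypoints."""
    return any(
        point_segment_distance(obstacle_xy, a, b) <= half_width
        for a, b in zip(waypoints, waypoints[1:])
    )
```

A detection inside the corridor would then trigger the indication described above.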
According to the exemplary embodiment shown in
In some embodiments, the alert module 418 alerts the obstacle to the presence of the refuse vehicle 10. For example, the alert module 418 may generate an audio warning for an obstacle determined to be a pedestrian detected in a blind spot of the refuse vehicle 10. As a further example, in addition and/or alternatively to the audio warning, the alert module 418 may generate a visual warning (e.g., flashing lights) to alert a pedestrian to the presence of the refuse vehicle 10. In some embodiments, the alert module 418 generates an alert for an operator of the refuse vehicle 10 that is outside of cab 16. For example, the alert module 418 may generate an audio alert for an approaching high-risk obstacle to warn an operator of its approach.
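The alert-channel selection described above can be sketched as a small routing function. The channel names and the `operator_in_cab` flag are illustrative assumptions; the disclosure describes audio warnings, flashing lights, and in-cab versus exterior operator alerts without naming a specific API.

```python
def select_alerts(obstacle_class: str, operator_in_cab: bool) -> list:
    """Choose alert channels for a detected blind-spot obstacle.

    Pedestrians are warned directly; the operator is alerted in the
    cab or, if working outside, via an exterior audible alert.
    """
    alerts = []
    if obstacle_class == "pedestrian":
        # Warn the pedestrian of the vehicle, audibly and visually.
        alerts += ["exterior_audio_warning", "flashing_lights"]
    if operator_in_cab:
        alerts.append("cab_display_alert")
    else:
        alerts.append("exterior_operator_audio_alert")
    return alerts
```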
In some embodiments, the alert module 418 initiates, additionally or alternatively to generating an alert, a control action which controls the movement of the refuse vehicle 10 and its various systems in order to avoid the obstacle. For example, referring now to
According to the exemplary embodiment shown in
According to the exemplary embodiment shown in
The processing circuit 510 can be communicably connected to a network interface 526 and an input/output (I/O) interface 524, such that the processing circuit 510 and the various components thereof can send and receive data via the interfaces 524 and 526. In some embodiments, the controller 500 is communicably coupled with a network 528 via the network interface 526, for transmitting and/or receiving data from/to network-connected devices. The network 528 may be any type of network (e.g., intranet, Internet, VPN, a cellular network, a satellite network, etc.) that allows the controller 500 to communicate with other remote systems. For example, the controller 500 may communicate with a server (i.e., a computer, a cloud server, etc.) to send and receive information regarding operations of controller 500 and/or the refuse vehicle 10.
The network interface 526 may include any type of wireless interface (e.g., antennas, transmitters, transceivers, etc.) for conducting data communications with the network 528. In some embodiments, the network interface 526 includes a cellular device configured to provide the controller 500 with Internet access by connecting the controller 500 to a cellular tower via a 2G network, a 3G network, an LTE network, a 5G network, etc. In some embodiments, the network interface 526 includes other types of wireless interfaces such as Bluetooth, Wi-Fi, ZigBee, etc.
In some embodiments, the controller 500 receives over-the-air (OTA) updates or other data from a remote system (e.g., a server, a computer, etc.) via the network 528. The OTA updates may include software and firmware updates for the controller 500, for example. Such OTA updates may improve the robustness and performance of the controller 500. In some embodiments, the OTA updates may be received periodically to keep the controller 500 up-to-date.
In some embodiments, the controller 500 is communicably coupled to any number of subsystems and devices of the refuse vehicle 10 via I/O interface 524. The I/O interface 524 may include wired or wireless interfaces (e.g., antennas, transmitters, transceivers, wire terminals, etc.) for conducting data communications with subsystems and/or devices of the refuse vehicle 10. In some embodiments, the I/O interface 524 includes a Controller Area Network (CAN) bus, a Local Interconnect Network (LIN) bus, a Media Oriented Systems Transport (MOST) bus, an SAE J1850 bus, an Inter-Integrated Circuit (I2C) bus, etc., or any other bus commonly used in the automotive industry. As shown in
The vehicle systems 534 shown in
The lift assembly 536 shown in
According to the exemplary embodiments shown in
According to the exemplary embodiment shown in
According to the exemplary embodiment shown in
As shown in
According to the exemplary embodiment shown in
In some embodiments, the sensor(s) 612 are configured to detect obstacles such as pedestrian 908. The sensor(s) 612 are positioned on a rearward portion of the refuse vehicle 10. For example, the sensor(s) 612 may be positioned on the sides of tailgate 34. Additionally or alternatively, the sensor(s) 612 may be positioned elsewhere. For example, the sensor(s) 612 may be positioned on a top of the refuse vehicle 10. It should be understood that the sensor(s) 612 may be positioned anywhere on the refuse vehicle 10. In some embodiments, the sensor(s) 612 are integrated with the controller described above with reference to
In brief summary, a refuse vehicle 10 with spatial awareness may operate according to the following example illustrated in scenario 900. An operator of the refuse vehicle 10 puts the refuse vehicle 10 in a reverse gear, and in response, the controller (e.g., controller 500, not shown) and sensor(s) (e.g., sensors 422 or sensor(s) 520), shown as sensor(s) 612, activate. The sensor(s) 612 collect data that may indicate the presence of obstacles around the refuse vehicle 10 and send the data to the controller. In some embodiments, the controller is configured to classify the obstacles. For example, the obstacle detector 516 of the controller 500 may classify an obstacle as a static obstacle or a dynamic obstacle. In some embodiments, the controller determines a sub-classification for an obstacle. For example, the obstacle detector 516 may determine obstacle 908 is moving and therefore a dynamic obstacle, and further that its subclass is a pedestrian. In some embodiments, the spatial awareness system reclassifies an obstacle after a change in an aspect of the obstacle. For example, a dynamic obstacle that comes to a stop may be reclassified as a static obstacle. In some embodiments, the spatial awareness system determines a risk associated with the obstacle. For example, the spatial awareness system may label a high-speed obstacle as high risk and a low-speed obstacle as low risk. The spatial awareness system may highlight a medium-risk obstacle in a yellow box on a user display and highlight a high-risk obstacle in a red box on the user display. In some embodiments, sensor(s) 612 determine other characteristics associated with an obstacle. For example, sensor(s) 612 may determine a speed and direction of travel of an obstacle. In some embodiments, the controller of the refuse vehicle with spatial awareness predicts a path of an obstacle based on the speed and direction of travel of the obstacle. In some embodiments, the controller uses machine-learning techniques to classify obstacles and/or predict their location.
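Predicting an obstacle's path from its speed and direction of travel can be sketched as constant-velocity dead reckoning. This is a minimal sketch under a constant-velocity assumption; the function name and heading convention are introduced here, and a fielded system might instead use a tracking filter (e.g., a Kalman filter) as one of the machine-learning or estimation techniques mentioned above.

```python
import math

def predict_position(x: float, y: float, speed: float,
                     heading_deg: float, dt: float):
    """Dead-reckon an obstacle's position dt seconds ahead.

    Assumes constant speed and heading; heading is measured in
    degrees counterclockwise from the +x axis (assumption).
    """
    heading = math.radians(heading_deg)
    return (x + speed * math.cos(heading) * dt,
            y + speed * math.sin(heading) * dt)
```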
Still referring to the operation of refuse vehicle 10 with spatial awareness in scenario 900, the operator may reverse the refuse vehicle 10 in direction 910. The sensor(s) 612 may determine the presence of pedestrian 908 and alert the operator. For example, the controller may display a graphic on a user interface in refuse vehicle 10 (not shown) for the operator. In some embodiments, the alert is an auditory alert (e.g., a beep, etc.). In some embodiments, in a semi-autonomous or autonomous mode, the spatial awareness system automatically limits the movement of the refuse vehicle 10 to avoid contact with pedestrian 908. For example, the spatial awareness system may, upon detection of pedestrian 908, operate various vehicle systems 534 (e.g., brakes, not shown). In some embodiments, the spatial awareness system first displays an alert, but unless the alert is addressed by an operator of the refuse vehicle 10, the spatial awareness system then initiates a follow-up or successive control action.
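The alert-then-escalate behavior can be sketched as a small decision function. The grace period and the action names are illustrative assumptions; the disclosure says only that an unaddressed alert is followed by a successive control action.

```python
def respond_to_obstacle(alert_acknowledged: bool, elapsed_s: float,
                        grace_period_s: float = 2.0) -> str:
    """Escalate from an alert to an automatic brake command when the
    operator does not address the alert within a grace period.

    The 2-second grace period is an assumed value for illustration.
    """
    if alert_acknowledged:
        return "alert_only"          # operator has taken over
    if elapsed_s >= grace_period_s:
        return "apply_brakes"        # successive control action
    return "alert_pending"           # alert shown, still waiting
```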
According to the exemplary embodiment shown in
In some embodiments, the controller does not initiate a control action until an object is within a minimum distance of the refuse vehicle 10. For example, the controller may generate an alert for an operator based on the distance between the side-lift assembly 100 and the barrier 1030: a low-volume alert when the side-lift assembly 100 is four feet from the barrier 1030, and a high-volume alert when the side-lift assembly 100 is two feet from the barrier 1030. In some embodiments, the controller generates an alert and controls an aspect of the refuse vehicle 10 and/or the side-lift assembly 100. For example, the controller may generate an audible alert but not limit control of the side-lift assembly 100 when it is four feet from the barrier 1030, and may generate an audible alert and limit control of the side-lift assembly 100 when it is two feet from the barrier 1030. As a further example, the controller may allow the side-lift assembly 100 to operate until six inches of distance remain between the side-lift assembly 100 and the barrier 1030, at which point the controller stops the movement of the side-lift assembly 100. It should be appreciated that the minimum distance may be any desired distance between the refuse vehicle 10 and the detected obstacle, and the examples given are not intended to be limiting.
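The graded distance thresholds above can be sketched as a simple ladder. The action names are assumptions introduced for illustration; the distances (four feet, two feet, six inches) come from the examples above and, as noted, could be any desired values.

```python
def lift_control_action(distance_ft: float) -> str:
    """Graded responses as the side-lift assembly nears a barrier.

    Matches the illustrative thresholds above: low-volume alert at
    four feet, high-volume alert plus limited motion at two feet,
    and a full stop at six inches (0.5 ft).
    """
    if distance_ft <= 0.5:
        return "stop_lift"                       # six inches: stop outright
    if distance_ft <= 2.0:
        return "high_volume_alert_and_limit"     # two feet: alert and limit
    if distance_ft <= 4.0:
        return "low_volume_alert"                # four feet: advisory alert
    return "no_action"
```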
According to the exemplary embodiment shown in
In some embodiments, the controller initiates a control action upon detection of power lines 1120. For example, the controller may generate an alert for an operator of the refuse vehicle 10 indicating the presence and/or location of the power lines 1120. The controller may display a graphic on a user interface for the operator indicating the presence and/or location of power lines 1120. In some embodiments, the user interface displays a distance between the refuse vehicle 10 and power lines 1120. The distance may be displayed numerically. In some embodiments, the user interface displays the distance graphically with a digital representation of the refuse vehicle 10 and power lines 1120.
In some embodiments, the controller determines the trajectory 1430 of refuse container 60 based on information regarding the range of motion and/or path of front-lift assembly 40, the trajectory 1430 described above with reference to
Still in reference to
According to the exemplary embodiments shown in
In some embodiments, the controller generates alerts based on the position of refuse container 1230 and the refuse vehicle 10. For example, referring now specifically to
In the exemplary embodiment shown in
Still referring to
Referring now to the exemplary embodiment shown in
According to the exemplary embodiment shown in
In some embodiments, scenario 1300 illustrates the refuse vehicle 10 underneath a barrier, shown as barrier 1330. Barrier 1330 may be a parking structure, overhang, bridge, overpass, or any other obstacle that may be above the refuse vehicle 10. In scenario 1300 the refuse vehicle 10 is traveling along direction 1340 towards and under barrier 1330. In some embodiments, barrier 1330 is located in a blind spot, i.e., an area that cannot be seen by an operator of the refuse vehicle 10. In some embodiments, the sensor(s) 1210 are positioned on the top of the refuse vehicle 10. For example, the sensor(s) 1210 may be placed on top of the refuse vehicle 10 at the front and rear of the vehicle and detect obstacles.
In some embodiments, the sensor(s) 1210 detect barrier 1330 and the controller initiates a control action when barrier 1330 enters safety zone 1320. In some embodiments, the control action includes generating an alert to the operator of the refuse vehicle 10 indicating the presence of barrier 1330 above the refuse vehicle 10. In some embodiments, the control action additionally and/or alternatively includes controlling an aspect of the refuse vehicle 10. For example, the control action may include limiting the movement of the refuse vehicle 10 so as to prevent it from coming into contact with barrier 1330. For example, as the refuse vehicle 10 approaches barrier 1330, the controller may automatically stop the movement of the refuse vehicle 10 as barrier 1330 enters safety zone 1320. The controller may detect barrier 1330 and initiate a control action that includes generating an alert including an alarm indicating the presence of barrier 1330 to the operator of the refuse vehicle 10. As a further example, the refuse vehicle 10 may not be operable until an operator clears the alert indicating the presence of barrier 1330.
According to the exemplary embodiment shown in
According to the exemplary embodiment shown in
Interface 1500 includes a top-down view of the refuse vehicle 10 and various detected obstacles. As shown in
Another example interface, interface 1500, is shown in
As shown, interface 1500 includes a top-down view of a path being traversed by the refuse vehicle 10. In this example, interface 1500 presents a graphical representation of a roadway. In some embodiments, interface 1500 does not include an illustration of the path and only indicates a position of a refuse container 1590 with respect to the refuse vehicle 10. Also shown in
In some embodiments, interface 1500 is generated from aerial or satellite images of a location of the refuse vehicle 10. For example, satellite imagery may be retrieved via a network based on a determined location of the refuse vehicle 10. In this example, the location of the refuse vehicle 10 may be determined based on GPS coordinates, triangulation (e.g., via a cellular network), or by any other methods for determining a location. In other embodiments, interface 1500 is generated from images captured by sensor(s) 1510 located at various points around the refuse vehicle 10. In some embodiments, multiple images or data are combined from sensor(s) 1510 to form a panoramic or top-down view of the area around the refuse vehicle 10. In yet other embodiments, the background (e.g., the roadway) of interface 1500 is a generated graphical element.
As illustrated in
According to the exemplary embodiment shown in
At step 1602, data is received from one or more sensors (e.g., sensor(s) 422) positioned at various locations of a refuse vehicle. In some embodiments, data is received from at least a radar and a camera sensor. Received data may include raw data from one or more cameras (e.g., visible light cameras) and/or data from one or more sensors (e.g., LIDAR, radar, etc.), as described above. In some embodiments, the data includes still images, video, or other data that can be used to detect an object or objects. In some embodiments, the received data includes at least raw image data and LIDAR data. As described above with respect to
At step 1604, the data is inputted into a controller, such as the controller described above with reference to
At step 1606, a determination is made if an obstacle is detected. In some embodiments, the controller processes the data to detect one or more obstacles in an area surrounding the entire refuse vehicle. In some embodiments, the controller only detects obstacles within a safety zone (as shown in
At step 1608, the controller classifies an obstacle. In some embodiments, the controller classifies an obstacle as static or dynamic. For example, the controller may classify a moving obstacle as dynamic and a stationary obstacle as static. In some embodiments, the controller applies sub-classifications to an obstacle (e.g., pedestrian, refuse container, car, etc.).
At step 1610, the controller determines the position of an obstacle. In some embodiments, the controller determines a speed and direction of travel for an obstacle in addition to determining the position of an obstacle. In some embodiments, the controller determines the position and/or speed and direction of an obstacle using secondary information (e.g., satellite or GPS location information provided over network 528) in addition to data from the one or more sensors. In some embodiments, the controller determines a risk associated with an obstacle. In some embodiments, the risk is associated with an obstacle's position and/or speed. For example, a controller may classify a nearby slow-moving obstacle as high risk, and a distant slow-moving obstacle as low risk. It should be appreciated by those skilled in the art who read the present application that the risk may be determined by considering at least one of the position, speed, and direction of travel or any combination thereof, and that the combinations listed are merely exemplary and are not intended to be limiting. The risk may also be determined with reference to the refuse vehicle and its position, speed, and direction of travel. The output of the controller may be an indication of an obstacle, its classification, its sub-classification, and/or the risk associated with it (e.g., a red bounding box for a high-risk obstacle).
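A risk determination that combines position, speed, and direction of travel can be sketched as a simple scoring function. All thresholds and weights here are illustrative assumptions; the disclosure specifies only that nearby slow-moving obstacles may be high risk while distant slow-moving obstacles are low risk.

```python
def risk_level(distance_m: float, speed_mps: float, closing: bool) -> str:
    """Combine position (distance), speed, and direction of travel
    (whether the obstacle is closing on the vehicle) into a coarse
    risk label. Thresholds and weights are assumed for illustration.
    """
    score = 0
    if distance_m < 5.0:
        score += 3          # nearby obstacles are high risk regardless of speed
    elif distance_m < 15.0:
        score += 1
    if speed_mps > 3.0:
        score += 1          # fast obstacles raise the risk
    if closing:
        score += 1          # obstacles heading toward the vehicle raise the risk
    if score >= 3:
        return "high"
    if score == 2:
        return "medium"
    return "low"
```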
At step 1612, a response is initiated based on the detection and/or classification of an obstacle. The response may include any number of automated control actions. For example, the response may include presenting a notification or alert of a detected pedestrian in a blind spot to an operator via a user interface (e.g., user interface 420). As another example, the control action(s) may include automatically moving the refuse vehicle and/or systems of the refuse vehicle to avoid the obstacle. The control actions initiated by step 1612 are described in detail above.
According to the exemplary embodiment shown in
At step 1702, a refuse vehicle including a lift assembly is provided with a spatial awareness system, including a controller (e.g., controller 400, controller 500, etc.) and one or more sensors (e.g., sensor(s) 422, etc.). As described above, the refuse vehicle may be a front-lifting, side-lifting, or rear-loading refuse vehicle. The one or more sensors may be coupled to the refuse vehicle at any point to facilitate detection of obstacles. In some embodiments, the sensors are positioned to detect obstacles in an operator's blind spot.
At step 1704, the sensors are employed to collect data about the area near the refuse vehicle. The area may be limited to blind spots of the refuse vehicle. In some embodiments, the area includes the entire sensing arc of the sensors. In some embodiments, the area may be represented by a safety zone that extends around the perimeter of the refuse vehicle. In some embodiments, the area may only cover a portion of the refuse vehicle. For example, the sensors may be positioned so as to sense behind a refuse vehicle.
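The safety zone extending around the perimeter of the refuse vehicle may be sketched, purely for illustration, as a rectangular region enlarged by a fixed margin around the vehicle footprint. The vehicle dimensions, margin, and coordinate convention (vehicle centered at the origin) below are assumptions, not part of the disclosed system.

```python
def in_safety_zone(obstacle_xy, vehicle_length_m=10.0, vehicle_width_m=2.5,
                   margin_m=1.5):
    """Check whether a sensed point lies within a rectangular safety zone
    extending a fixed margin around the vehicle's footprint. The vehicle
    is assumed centered at the origin, with x along its length."""
    x, y = obstacle_xy
    half_length = vehicle_length_m / 2 + margin_m
    half_width = vehicle_width_m / 2 + margin_m
    return abs(x) <= half_length and abs(y) <= half_width
```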
At step 1706, an obstacle is detected and classified based on data provided by the one or more sensors. As described above, the data may be any type of data that can be collected from the sensors provided. For example, the data may be proximity data from a radar sensor as shown in
At step 1708, process 1700 is shown to include generating an alert based on at least one of the presence, classification, or location of a detected obstacle. In some embodiments, the alert informs an operator of the presence of a detected obstacle. In some embodiments, the alert includes information regarding the location of the obstacle. For example, referring now to
At step 1710, the controller may operate a display of the refuse vehicle to provide data from the one or more sensors to an operator. As explained above with reference to
At step 1712, the controller initiates a control action apart from the alert of step 1708. As described above, the control action may itself be an alert. In some embodiments, the control action is an alert and an action controlling an aspect of the refuse vehicle and its systems. The control action may be based on at least one of the status of the vehicle, the presence of the obstacle, the class of the obstacle, and the location of the obstacle. As described above in the various embodiments, the control action may include controlling the movement of the refuse vehicle and the systems of the refuse vehicle, such as an attached lift. For example, the control action may include preventing the movement of the lift assembly when an obstacle is detected within its path (e.g., trajectory 1430). In some embodiments, the control action prevents movement of the refuse vehicle itself. In some embodiments, the control action is based on the risk associated with an obstacle. For example, a controller may provide a low-volume alert for a low-risk obstacle and a high-volume alert for a high-risk obstacle.
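The mapping from a detected obstacle to control actions described above may be sketched as follows. The dictionary keys, action names, and selection logic are illustrative assumptions; an actual embodiment may select actions in any of the ways described in this disclosure.

```python
def select_control_actions(obstacle: dict) -> list:
    """Map a detected obstacle to illustrative control actions: an alert
    whose volume tracks risk, inhibiting lift motion when the obstacle is
    in the lift path, and inhibiting vehicle motion for high-risk
    obstacles. Keys and action names are assumptions for this sketch."""
    actions = []
    volume = "high" if obstacle.get("risk") == "high" else "low"
    actions.append(f"alert:{volume}-volume")
    if obstacle.get("in_lift_path"):
        actions.append("inhibit-lift")
    if obstacle.get("risk") == "high":
        actions.append("inhibit-vehicle")
    return actions
```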
As utilized herein, the terms “approximately,” “about,” “substantially,” and similar terms are intended to have a broad meaning in harmony with the common and accepted usage by those of ordinary skill in the art to which the subject matter of this disclosure pertains. It should be understood by those of skill in the art who review this disclosure that these terms are intended to allow a description of certain features described and claimed without restricting the scope of these features to the precise numerical ranges provided. Accordingly, these terms should be interpreted as indicating that insubstantial or inconsequential modifications or alterations of the subject matter described and claimed are considered to be within the scope of the disclosure as recited in the appended claims.
It should be noted that the term “exemplary” and variations thereof, as used herein to describe various embodiments, are intended to indicate that such embodiments are possible examples, representations, or illustrations of possible embodiments (and such terms are not intended to connote that such embodiments are necessarily extraordinary or superlative examples).
The term “coupled” and variations thereof, as used herein, means the joining of two members directly or indirectly to one another. Such joining may be stationary (e.g., permanent or fixed) or moveable (e.g., removable or releasable). Such joining may be achieved with the two members coupled directly to each other, with the two members coupled to each other using a separate intervening member and any additional intermediate members coupled with one another, or with the two members coupled to each other using an intervening member that is integrally formed as a single unitary body with one of the two members. If “coupled” or variations thereof are modified by an additional term (e.g., directly coupled), the generic definition of “coupled” provided above is modified by the plain language meaning of the additional term (e.g., “directly coupled” means the joining of two members without any separate intervening member), resulting in a narrower definition than the generic definition of “coupled” provided above. Such coupling may be mechanical, electrical, or fluidic.
References herein to the positions of elements (e.g., “top,” “bottom,” “above,” “below”) are merely used to describe the orientation of various elements in the FIGURES. It should be noted that the orientation of various elements may differ according to other exemplary embodiments, and that such variations are intended to be encompassed by the present disclosure.
The hardware and data processing components used to implement the various processes, operations, illustrative logics, logical blocks, modules and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose single- or multi-chip processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor or any conventional processor, controller, microcontroller, or state machine. A processor also may be implemented as a combination of computing devices, such as a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. In some embodiments, particular processes and methods may be performed by circuitry that is specific to a given function. The memory (e.g., memory, memory unit, storage device) may include one or more devices (e.g., RAM, ROM, Flash memory, hard disk storage) for storing data and/or computer code for completing or facilitating the various processes, layers and modules described in the present disclosure. The memory may be or include volatile memory or non-volatile memory, and may include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present disclosure. According to an exemplary embodiment, the memory is communicably connected to the processor via a processing circuit and includes computer code for executing (e.g., by the processing circuit or the processor) the one or more processes described herein.
The present disclosure contemplates methods, systems, and program products on any machine-readable media for accomplishing various operations. The embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data, which cause a general-purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Although the figures and description may illustrate a specific order of method steps, the order of such steps may differ from what is depicted and described, unless specified differently above. In addition, two or more steps may be performed concurrently or with partial concurrence, unless specified differently above. Such variation may depend, for example, on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations of the described methods could be accomplished with standard programming techniques with rule-based logic and other logic to accomplish the various connection steps, processing steps, comparison steps, and decision steps.
It is important to note that the construction and arrangement of the refuse vehicle 10 and the systems and components thereof as shown in the various exemplary embodiments is illustrative only. Additionally, any element disclosed in one embodiment may be incorporated or utilized with any other embodiment disclosed herein. Although only one example of an element from one embodiment that can be incorporated or utilized in another embodiment has been described above, it should be appreciated that other elements of the various embodiments may be incorporated or utilized with any of the other embodiments disclosed herein.
This application claims the benefit of U.S. Provisional Patent Application No. 63/011,619, filed Apr. 17, 2020, which is incorporated herein by reference in its entirety.
Prior Publication: US 20210325529 A1, Oct. 2021.