Refuse vehicles collect a wide variety of waste, trash, and other material from residences and businesses. Operators of the refuse vehicles transport the material from various waste receptacles within a municipality to a storage or processing facility (e.g., a landfill, an incineration facility, a recycling facility, etc.).
At least one embodiment relates to a refuse vehicle. The refuse vehicle includes a chassis coupled to a wheel, a motor configured to drive the wheel, a body assembly coupled to the chassis and defining a refuse compartment, a lift assembly, and a vehicle control system having a sensor integrated into the body assembly and a controller in communication with the lift assembly and the sensor. The controller includes a processor and at least one memory and is configured to receive sensor data from the sensor, the sensor data indicating a potential event, receive control data indicating a state of the lift assembly, and compare the sensor data to the control data to determine if the potential event is a false event associated with the state of the lift assembly or an actual event.
Another embodiment relates to a refuse vehicle that includes a chassis, a body assembly coupled to the chassis and defining a refuse compartment, a lift assembly, a refuse container configured to selectively couple to the lift assembly and having a carry can sensor, and a vehicle control system having a vehicle sensor integrated into the body assembly and a controller in communication with the lift assembly, the carry can sensor, and the vehicle sensor. The controller includes a processor and at least one memory and is configured to determine that the lift assembly is in a collection mode and in response to determining that the lift assembly is in the collection mode, deactivate the vehicle sensor and activate the carry can sensor.
Another embodiment relates to a refuse vehicle that includes a chassis, a cab supported by the chassis, a body assembly coupled to the chassis and defining a refuse compartment, a lift assembly, and a vehicle control system having a vehicle sensor integrated into the body assembly and a front camera mounted to the cab. The front camera defines a field of view that intersects with a ground plane. A predefined vision distance defined between the front camera and a point where the field of view intersects with the ground plane is less than or equal to about 7 meters.
This summary is illustrative only and is not intended to be in any way limiting. Other aspects, inventive features, and advantages of the devices or processes described herein will become apparent in the detailed description set forth herein, taken in conjunction with the accompanying figures, wherein like reference numerals refer to like elements.
The disclosure will become more fully understood from the following detailed description, taken in conjunction with the accompanying figures, wherein like reference numerals refer to like elements, in which:
Before turning to the figures, which illustrate certain exemplary embodiments in detail, it should be understood that the present disclosure is not limited to the details or methodology set forth in the description or illustrated in the figures. It should also be understood that the terminology used herein is for the purpose of description only and should not be regarded as limiting.
According to an exemplary embodiment, a vocational vehicle (e.g., refuse truck, refuse vehicle, mixer vehicle, fire fighting vehicle, etc.) includes a vehicle control system configured to operate as an advanced driver-assistance system (ADAS). The ADAS system includes one or more sensors positioned in and around the vocational vehicle. In some embodiments, the sensors include a three hundred sixty degree camera system, a three hundred sixty degree radar system, and a forward collision detection system. Further, the ADAS system can include sensors for monitoring an activity of the vocational vehicle. For example, the sensors may monitor a refuse collection mode of a refuse vehicle including monitoring the position of a lift assembly and the contents of a refuse compartment. In some embodiments, the ADAS system can filter the sensor data through the vehicle controls to identify false events. The false events can be due to the position of components of the vocational vehicle itself that are detected by the sensors. For example, a forward facing radar sensor may detect an object in front of the refuse vehicle, but the ADAS system can compare the sensor data to vehicle control data to determine that the detected object is a lift assembly of the vocational vehicle, and the ADAS system can disregard the data. In one embodiment, the sensors are integrated into a body of the vocational vehicle. In some embodiments, the exterior of the vocational vehicle can be modified to include radar-friendly materials at select locations in front of the sensors to facilitate operation of the radar sensors. In some embodiments, the ADAS system can detect operation modes, allowing for (a) automatic adjustment to a user interface of the ADAS system, and (b) automatic driver-assistance, based on the detected mode.
For example, the ADAS system can provide a view on a console display of a hopper camera and curbside camera when a collection mode is detected, and a view of a different combination of video feeds in a forward travel mode. In some embodiments, the ADAS system combines vocational activity awareness and ADAS operations into a single user interface system that can include an instrument cluster display and a main console display. The ADAS controls the displays in tandem, providing each display with context-specific information based on the detected mode.
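The mode-to-view mapping described above can be sketched as a simple lookup. This is an illustrative sketch only; the mode names and camera feed identifiers below are assumptions for illustration, not identifiers from the disclosed system.

```python
# Illustrative mode-based display selection; mode and feed names are
# hypothetical placeholders, not the disclosed identifiers.
MODE_VIEWS = {
    "collection": ["hopper_camera", "curbside_camera"],
    "forward": ["front_camera", "blind_spot_left", "blind_spot_right"],
    "reverse": ["rear_camera_left", "rear_camera_right"],
}

def select_views(detected_mode, default=("front_camera",)):
    """Return the list of video feeds to show for the detected mode."""
    return MODE_VIEWS.get(detected_mode, list(default))
```

A controller following this pattern would re-run the lookup each time the detected mode changes and route the selected feeds to the console display.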
As shown in
As shown in
According to an exemplary embodiment, the energy storage and/or generation system 20 is configured to (a) receive, generate, and/or store power and (b) provide electric power to (i) the electric motor 18 to drive the wheels 22, (ii) electric actuators of the refuse vehicle 10 to facilitate operation thereof (e.g., lift actuators, tailgate actuators, packer actuators, grabber actuators, etc.), and/or (iii) other electrically operated accessories of the refuse vehicle 10 (e.g., displays, lights, etc.). The energy storage and/or generation system 20 may include one or more rechargeable batteries (e.g., lithium-ion batteries, nickel-metal hydride batteries, lithium-ion polymer batteries, lead-acid batteries, nickel-cadmium batteries, etc.), capacitors, solar cells, generators, power buses, etc. In one embodiment, the refuse vehicle 10 is a completely electric refuse vehicle. In other embodiments, the refuse vehicle 10 includes an internal combustion generator that utilizes one or more fuels (e.g., gasoline, diesel, propane, natural gas, hydrogen, etc.) to generate electricity to charge the energy storage and/or generation system 20, power the electric motor 18, power the electric actuators, and/or power the other electrically operated accessories (e.g., a hybrid refuse vehicle, etc.). For example, the refuse vehicle 10 may have an internal combustion engine augmented by the electric motor 18 to cooperatively provide power to the wheels 22. The energy storage and/or generation system 20 may thereby be charged via an on-board generator (e.g., an internal combustion generator, a solar panel system, etc.), from an external power source (e.g., overhead power lines, mains power source through a charging input, etc.), and/or via a power regenerative braking system, and provide power to the electrically operated systems of the refuse vehicle 10. 
In some embodiments, the energy storage and/or generation system 20 includes a heat management system (e.g., liquid cooling, heat exchanger, air cooling, etc.).
According to an exemplary embodiment, the refuse vehicle 10 is configured to transport refuse from various waste receptacles within a municipality to a storage and/or processing facility (e.g., a landfill, an incineration facility, a recycling facility, etc.). As shown in
As shown in
According to the exemplary embodiment shown in
As shown in
As shown in
The extension mechanism 320 includes an extendable/telescoping arm, shown as can arm 322, and a first actuator, shown as extension actuator 324, positioned to facilitate selectively extending and retracting the can arm 322 and, thereby, the lift mechanism 340 and the grabber mechanism 360 between a nominal, non-extended position (see, e.g.,
Referring generally to
According to an exemplary embodiment shown in
According to an exemplary embodiment, the controller 402 includes a processing circuit 404, a processor 406, and memory 408. The processing circuit 404 can be communicably connected to a communications interface such that the processing circuit 404 and the various components thereof can send and receive data via the communications interface. The processor 406 can be implemented as a general purpose processor, an application specific integrated circuit (ASIC), one or more field programmable gate arrays (FPGAs), a group of processing components, or other suitable electronic processing components.
According to an exemplary embodiment, the memory 408 (e.g., memory, memory unit, storage device, etc.) can include one or more devices (e.g., RAM, ROM, Flash memory, hard disk storage, etc.) for storing data and/or computer code for completing or facilitating the various processes, layers and modules described in the present application. The memory 408 can be or include volatile memory or non-volatile memory. The memory 408 can include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present application. According to some embodiments, the memory 408 is communicably connected to the processor 406 via the processing circuit 404 and includes computer code for executing (e.g., by the processing circuit 404 and/or the processor 406) one or more processes described herein.
According to an exemplary embodiment, the controllable elements 410 include steering components, suspension components, power transmission or drive components, braking components, actuators, assemblies, systems, subsystems, and/or accessories of the refuse vehicle 10 that can be controlled by an operator. For example, the controllable elements 410 can include the lift assembly 40. Control data may include the position, speed, status, etc. of the controllable elements 410. For example, the control data may indicate that the lift assembly 40 is positioned in a collection position in front of the cab 16 of the refuse vehicle 10. In some embodiments, the controller 402 facilitates control of the controllable elements 410 by providing control signals based on sensor data, control data, and/or user inputs. In some embodiments, the controllable elements 410 provide control data to the controller 402.
According to an exemplary embodiment, the ADAS 400 can include the remote network 412 with which the controller 402 is configured to communicate. In some embodiments, the controller 402 is configured to wirelessly communicate with the remote network 412. In some embodiments, any user inputs, sensor data, display data, control signals, control data, etc., as obtained, determined, generated, output, etc., by the controller 402 are provided to the remote network 412. In some embodiments, the remote network 412 includes a processing circuit or processing circuitry similar to the processing circuit 404 of the controller 402 so that the remote network 412 can be configured to perform any of the functionality (e.g., the driver-assistance functions) of the controller 402. In this way, the functionality of the controller 402 as described herein may be performed locally at the controller 402 of the refuse vehicle 10, remotely by the remote network 412, or distributed across the controller 402 and the remote network 412 so that some of the functionality as described herein is performed locally at the refuse vehicle 10 while other of the functionality as described herein is performed remotely at the remote network 412.
According to an exemplary embodiment, the ADAS 400 includes one or more sensors, shown as sensors 414. The sensors 414 may be disposed at various locations around the refuse vehicle 10 to identify obstacles and/or obtain other contextual information useful to the controller 402. The sensors 414 include any one and/or a combination of proximity sensors, infrared sensors, electromagnetic sensors, capacitive sensors, photoelectric sensors, inductive sensors, radar sensors, ultrasonic sensors, Hall effect sensors, fiber optic sensors, Doppler effect sensors, magnetic sensors, laser sensors (e.g., LIDAR sensors), sonar, and/or the like. In some embodiments, the sensors 414 include an image capture device such as visible light cameras, full-spectrum cameras, image sensors (e.g., charge-coupled device (CCD) sensors, complementary metal oxide semiconductor (CMOS) sensors, etc.), or any other type of suitable object sensor or imaging device. Data captured by the sensors 414 may include, for example, raw image data from one or more cameras (e.g., visible light cameras) and/or proximity data from one or more sensors (e.g., LIDAR, radar, etc.) that may be used to detect objects. In some embodiments, the sensors 414 are active during operation of the refuse vehicle 10. Additionally or alternatively, the sensors 414 may become active in response to a detected operation mode of the refuse vehicle 10. For example, a hopper camera may activate in response to the refuse vehicle 10 being put into a collection mode.
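The mode-driven activation described above (e.g., a hopper camera activating in collection mode) can be sketched as a small state manager. The class, sensor names, and activation rules below are illustrative assumptions, not the disclosed implementation.

```python
# Hedged sketch of mode-driven sensor activation/deactivation;
# sensor and mode names are illustrative assumptions.
class SensorManager:
    def __init__(self):
        self.active = set()  # currently active sensor identifiers

    def on_mode(self, mode):
        """Activate/deactivate sensors when an operation mode is detected."""
        if mode == "collection":
            self.active.add("hopper_camera")
            self.active.discard("vehicle_front_radar")
        elif mode == "forward":
            self.active.add("vehicle_front_radar")
            self.active.discard("hopper_camera")
        return sorted(self.active)
```

The same pattern would extend to the carry can sensor swap described in the summary, where entering the collection mode deactivates a vehicle sensor and activates a sensor on the carry can.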
In some embodiments, the sensor data is video feed data obtained from the sensors 414 regarding one or more areas in and/or surrounding the refuse vehicle 10. For example, the sensor data may be or include video feed data (e.g., live or real-time video feed data) of the front, sides, rear, and/or interior of the refuse compartment 30 of the refuse vehicle 10. In some embodiments, the sensors 414 provide the controller 402 video feed data for generating a 360-degree composite view of the refuse vehicle 10 and/or its surroundings. The 360-degree composite video feed can be an image of the refuse vehicle 10 from above combined with the video feeds from one or more cameras, such as the cameras 510, and the controller 402 can be configured to stitch together the video feed data from the one or more cameras to create the 360-degree composite video feed. In some embodiments, the sensors 414 are radar sensors and the sensor data is proximity data. For example, the sensor data may include proximity data indicating the position, speed, direction of travel, and/or acceleration of one or more objects surrounding the refuse vehicle 10. In some embodiments, the sensors 414 include both cameras and radar sensors and provide both video feed data and proximity data. In some embodiments, the sensor data also includes thermal imaging data from one or more sensors 414. For example, sensor data from a visible light sensor and sensor data from a thermal imaging sensor in hopper camera 522 can be sent to the controller 402 as part of the sensor data.
Referring still to
In some embodiments, the alert devices 424 can provide auditory alerts to an operator of the refuse vehicle 10. The alert devices 424 may include speakers, sound output devices, alarms, buzzers, etc. based on the display/alert data provided by the controller 402. In some embodiments, the alert devices 424 are associated with a corresponding automatic action undertaken by the ADAS 400. For example, audible natural language based alerts indicating a lane change can accompany a corresponding automatic lane change initiated by the ADAS 400. The audible natural language based alerts can be provided in one or more languages.
According to an exemplary embodiment shown in
Refuse vehicle 10 is shown on a vehicle axis system with an x-axis 1002 and y-axis 1004 that follow the International Organization for Standardization (ISO) Road Vehicles—Vehicle Dynamics and road-holding ability—Vocabulary (ISO Standard No. 8855:2011) Vehicle Axis System 2.10 convention, published December 2012, the entirety of which is herein incorporated by reference. The x-axis 1002 is a horizontal axis parallel to the vehicle's heading and in the forward direction of the vehicle such that it is also parallel to the refuse vehicle 10's longitudinal plane of symmetry. The y-axis 1004 is perpendicular to the x-axis 1002 and the refuse vehicle 10's longitudinal plane of symmetry and is in the left direction of the refuse vehicle 10. The z-axis 1006 (shown in
Referring now to
According to an exemplary embodiment shown in
Still referring to
In some embodiments, to aid in the screening process, the hopper camera 522 can include a thermal imaging sensor (e.g., FLIR) in the place of or in addition to a visible light sensor. The hopper camera 522 with thermal imaging sensors can detect creatures or other objects based on differences in the temperature of the object and its surroundings to improve an operator's ability to screen the contents of the refuse compartment 30. For example, thermal imaging data from the hopper camera 522 can detect that a creature such as a squirrel is within the refuse compartment 30. In other embodiments, the thermal imaging capabilities of the hopper camera 522 are used for confirmation purposes, with vision techniques for processing the image data serving as the initial mode of detection of unwanted objects and/or creatures. In some embodiments, the ADAS 400 is configured to automatically monitor the refuse stream in the refuse compartment 30 using the hopper camera 522. The ADAS 400 can use the normal visible light sensor and/or an additional thermal imaging sensor to automatically detect contaminants in the refuse compartment 30. In some embodiments, upon detecting a contaminant in the refuse compartment 30 based on the data provided by the hopper camera 522, the ADAS 400 can generate one or more control signals and/or alerts. For example, the ADAS 400 can generate a control signal to stop the dumping operation when a contaminant is detected and provide an alert to an operator. The ADAS 400 may be configured to display the image data from either the visible light sensor and/or the thermal imaging sensor in the hopper camera 522 to an operator.
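The thermal screening described above (detecting a creature or contaminant by its temperature difference from the surroundings) can be sketched as a simple threshold over a grid of temperature readings. The function name, grid format, and threshold value below are assumptions for illustration, not the disclosed detection algorithm.

```python
# Minimal sketch of thermal screening: flag any cell whose temperature
# differs from the frame-wide mean by more than a threshold.
# The 8 degree C default threshold is an illustrative assumption.
def detect_hotspots(thermal_grid, delta_c=8.0):
    """Return (row, col) cells that stand out from the mean temperature."""
    flat = [t for row in thermal_grid for t in row]
    mean = sum(flat) / len(flat)
    return [(r, c)
            for r, row in enumerate(thermal_grid)
            for c, t in enumerate(row)
            if abs(t - mean) > delta_c]
```

In a system like the one described, a non-empty result could trigger the control signal that stops the dumping operation and the operator alert.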
According to an exemplary embodiment shown in
In some embodiments, the alley system 2600 is a component of the camera system 500, and the alley cameras 2610 are part of the cameras 510. For example, the alley cameras 2610 may form the rear cameras 514 of the camera system 500. In some embodiments, the alley system 2600 is a separate system and the data from the alley system 2600 is provided to the ADAS 400 in the same manner and for the same purpose as data from the camera system 500. In some embodiments, the data from the alley system 2600 may be separately addressable. For example, the ADAS 400 can be configured to automatically display the combined feed from one or more of the alley cameras 2610 when the ADAS 400 determines the refuse vehicle 10 is in a reverse mode. In some embodiments, the data from the alley system 2600 can be combined with the data from one or more other systems of the ADAS 400 (i.e., the camera system 500, radar system 600, and/or the collision detection system 1600 described herein).
Turning to
According to an exemplary embodiment, the radar system 600 includes two radar sensors 610 positioned on the front of the cab 16 and with the radar FOV 620 directed in a generally forward direction (e.g., a centerline of the radar FOV 620 is generally parallel to the x-axis 1002 or a forward direction of travel of the refuse vehicle 10). In some embodiments, two radar sensors 610 are positioned on the front corners of the cab 16 and positioned so that the radar FOV 620 is directed more toward the rear of the refuse vehicle 10. In some embodiments, two radar sensors 610 are integrated into the rear of the body 14 and positioned to face a generally rearward direction (e.g., a centerline of the radar FOV 620 is generally parallel to the x-axis and faces a reverse direction of travel of the refuse vehicle 10). In some embodiments, two radar sensors 610 can be integrated into the rear corners of the body 14 and positioned at an angle relative to the two radar sensors arranged in the rear of the body 14. For example, the radar sensors 610 arranged in the rear corners of the body 14 may be oriented at an approximately 45 degree angle relative to the radar sensors 610 positioned in the rear of the body 14. While the radar sensors 610 are shown in the configuration described above, it should be understood that the number and position of the radar sensors 610 in the radar system 600 may vary without departing from the scope of the invention. For example, the radar system 600 may include only front-facing and rear-facing radar sensors 610, rather than additional sensors in the corners.
As shown in
In some embodiments, each of the radar sensors 610 is positioned behind a respective one of the covers 704 and attached to a firewall of the cab 16. The radar sensors 610 can be positioned with a gap between the radar sensors 610 and the cover 704. In some embodiments, the radar sensors 610 must be placed within a maximum distance of any protruding metal feature (e.g., bumper) of cab 16. For example, the radar sensors 610 may be at most one inch behind a protruding metal bumper to minimize interference to the radar sensors 610 due to the metal bumper. As shown in
In some embodiments, the radar sensors 610 may include an external case. The thickness of the external case may be limited to minimize the interference with the radar sensors 610 from the external case. For example, the external case may have a maximum thickness of 1.8 mm. In some embodiments, the radar sensors 610 are mounted with a gap between the external case and an outer face of the radar sensors 610. For example, the radar sensors 610 can be mounted with a 0.5 mm gap between the outer face of the radar sensors 610 and the external case. In some embodiments, the external case is made of a plastic (e.g., polycarbonate).
As shown in
As shown in
With specific reference to
Referring now to
In some embodiments, a position of the forward camera 1602 depends on the wiper path of wipers on the windshield 516. In general, the forward camera 1602 can be positioned to not be obstructed by a parked wiper blade. In some embodiments, the clearance between the forward camera FOV 1606 and the edge of a wiper blade path may be at least about 20 millimeters, or at least 40 millimeters.
As shown in
In general, the controller 402 of the ADAS 400 is configured to operate the refuse vehicle 10 and/or its subsystems, attachments, assemblies, etc., according to various operation modes. As an ADAS, the controller 402 can provide route information, monitor a human driver (e.g., for weariness, concentration, etc.), provide alerts to a human driver, control the movement of the refuse vehicle 10 (e.g., lane assist, cruise control, emergency braking, autonomous route tracking, parking, etc.), and/or communicate with a remote system or other vehicles to act in concert with one or more other vehicles. The controller 402 may also be configured to assist an operator of the refuse vehicle 10 with vocational activities (e.g., refuse collection). In some embodiments, the controller 402 generates control signals for one or more controllable elements 410 to assist the vocational activity. The control signals may include controlling the lift assembly 40, compaction assembly, articulating collection arm, etc. of the refuse vehicle 10. In some embodiments, the controller 402 can use sensor data and/or control data in both ADAS actions and vocational activity assistance.
In some embodiments, the controller 402 is configured to determine an operation mode based on the sensor data and/or control data. In some embodiments, the controller 402 can filter sensor data through the vehicle control data to identify false events. The false events can be sensor-detected events that are due to one or more controllable elements 410 of the refuse vehicle 10 (e.g., the lift assembly 40). In some embodiments, the controller 402 is configured to determine the one or more sensors 414 from which sensor data should be obtained. Depending on the configuration of the refuse vehicle 10, a subset of sensors can be deactivated and a subset of sensors in a more preferable location can be activated. The controller 402 can be configured to determine the appropriate sensors based on sensor data and/or control data. In some embodiments, the controller 402 is configured to generate control signals for controllable elements 410 to control the movement of refuse vehicle 10 and/or its subsystems in response to receiving a user input, command, a request, etc. In some embodiments, the controller 402 is configured to generate control signals for one or more controllable elements 410 based on the sensor data and/or control data. In some embodiments, the controller 402 is configured to display different views via the user interface 416 based on the determined operation mode. The views can incorporate the sensor data and/or the control data relevant to the determined operation mode. The operation modes may include, for example, a collection mode, a forward mode, a reverse mode, a compaction mode, a dumping mode, and/or still other modes. In some embodiments, two or more modes may be active simultaneously. In some embodiments, once in a first operation mode, the controller 402 will not transition to a second operation mode until a set of conditions has first been met.
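Determining an operation mode from control data, as described above, can be sketched as a prioritized rule lookup. The field names and the particular rules below are illustrative assumptions; a production system would draw on many more signals.

```python
# Hedged sketch of operation-mode determination from control data;
# the dictionary keys and rules are illustrative assumptions.
def determine_mode(control_data):
    """Map a snapshot of control data to an operation mode."""
    if control_data.get("lift_position") == "raised":
        return "collection"
    if control_data.get("gear") == "reverse":
        return "reverse"
    if control_data.get("speed_mps", 0) > 0:
        return "forward"
    return "idle"
```

The rule ordering encodes the priority among modes; the requirement that a set of conditions be met before transitioning out of a mode could be layered on top as a guard around this function's result.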
Referring now to
In some embodiments, the method 2200 includes providing a refuse vehicle (e.g., the refuse vehicle 10) including an ADAS system (e.g., the ADAS 400) having one or more sensors and one or more controllable elements at step 2202. Controllable elements may be the same or similar to the controllable elements 410. In some embodiments, the controllable elements include a prime mover, steering components, power transmission or drive components, braking components, lift assemblies, electric actuators, hydraulic actuators, electric motors, systems, subsystems, assemblies, and/or any other components of the refuse vehicle 10 controllable by an operator or by the controller 402. In some embodiments, the provided sensors can be the same or similar to the sensors 414. In some embodiments, the sensors can include the camera system 500, the radar system 600, and the collision detection system 1600.
In some embodiments, the method 2200 includes obtaining data from the one or more sensors and the one or more controllable elements relating to a detected event at step 2204. In some embodiments, the data obtained includes sensor data and/or control data. The sensor data can include image data, proximity data, and/or other types of data. The control data can include the position, direction of movement, speed, and/or acceleration of the controllable elements 410. The control data may also include a list of past control signals provided to the controllable elements 410. In some embodiments, the data is obtained from the one or more sensors via an Ethernet bus. In some embodiments, the sensors 414, including the cameras 510, the hopper camera 522 (both visible light and thermal imaging sensors), the radar sensors 610, the forward camera 1602, and the forward radar 1604 can all be connected to the Ethernet bus for transmitting information to the controller 402. The Ethernet bus may be composed of copper using a coax line or differential twisted pairs. In some embodiments, the Ethernet bus is a fiber-optic line.
In some embodiments, the detected event is the presence of an obstacle. For example, when lift assembly 40 interfaces with carry can 200, sensor data from the forward radar 1604 may indicate the presence of an obstacle. The sensor data relating to this event may be provided to the controller 402. In some embodiments, the control data is the data received from one or more controllable elements at the time the sensor data indicated the presence of the obstacle. In some embodiments, the detected event may be based on a user input. For example, the detected event can be a movement of a joystick of the user interface 416. In some embodiments, all sensor data is filtered by the corresponding control data. For example, the controller 402 may constantly be comparing sensor data to control data to identify, based on the control data, instances where the sensor data includes false events.
In some embodiments, the method 2200 includes filtering the sensor data through the control data at step 2206. The controller 402 may be configured to filter the sensor data through the vehicle control data to identify, remove, and/or tag false events from the sensor data. False events may be instances of sensor data that appear to indicate one or more objects are present around the refuse vehicle 10, but actually are due to the refuse vehicle 10 itself and/or one or more of its components. The filter process can include using the control data to identify the position of one or more components of the refuse vehicle 10 and comparing that position to the detected object in the sensor data. The sensor data observations that align with the control data can be filtered out as false events. For example, the lift assembly 40 and/or the robotic arm 300 may interface with the carry can 60, 200 in a front, rear, or side of the refuse vehicle 10 (e.g., in front of the cab 16 or at a side or rear of the body 14). The forward radar 1604 (or another radar sensor 610 on the rear or side of the body 14) can detect the carry can 60, 200 and/or the robotic arm 300 as an object and provide sensor data to the controller 402 indicating the position, direction of movement, speed, and/or acceleration of the carry can 60, 200 and/or the robotic arm 300. The controller 402 can also receive control data indicating that the arms of the lift assembly 40 are lowered and/or that the robotic arm 300 is extended or retracted. If the sensor data is not compared to the control data, the controller 402 may analyze the sensor data and determine an object is present. If the sensor data is filtered through the control data, the controller 402 can compare the filtered sensor data and the control data and determine, at step 2208, that the sensor data is a false event due to the carry can 60, 200 and/or the robotic arm 300 being intentionally manipulated and that no external object is present.
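The filtering step above can be sketched as a position comparison: a detection is tagged a false event when it lies within a tolerance of a vehicle component whose position is known from control data. The 2D coordinates, tolerance value, and function shape below are illustrative assumptions, not the disclosed algorithm.

```python
# Hedged sketch of false-event filtering: detections that coincide
# with known component positions (from control data) are tagged false.
# Coordinates are (x, y) in meters; the 0.5 m tolerance is an assumption.
def filter_detections(detections, component_positions, tol=0.5):
    """Split radar detections into (actual_events, false_events)."""
    actual, false = [], []
    for det in detections:
        is_component = any(
            abs(det[0] - cx) <= tol and abs(det[1] - cy) <= tol
            for cx, cy in component_positions
        )
        (false if is_component else actual).append(det)
    return actual, false
```

Only the `actual` list would feed the downstream alert and control-signal steps; the `false` list corresponds to detections of the lift assembly, carry can, or robotic arm being intentionally manipulated.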
In some embodiments, if a false event is detected, the sensor data that the alert is based on may be removed, tagged, ignored, and/or adjusted by the controller 402. For example, during the filtering process, the controller 402 can tag all sensor data that is determined to be due to one or more components of refuse vehicle 10, based on the control data, as false event data, and not generate one or more control signals based on the false event data. In some embodiments, if the event is determined to be a false event, method 2200 proceeds directly to step 2216 and ends. In some embodiments, if the event is determined not to be a false event, method 2200 includes proceeding to step 2210.
In some embodiments, method 2200 includes generating, via a user interface, an alert based on the sensor data at step 2210. The alert may be a visual alert via a display (e.g., the instrument display 418, the console display 420, etc.), and/or an auditory alert via the alert devices 424. In some embodiments, the alert includes a recommended control action for a user to perform. For example, the ADAS 400 via the radar system 600 and radar sensors 610 may detect a vehicle in a blind spot of the refuse vehicle 10, and the controller 402 can generate an alert to a driver indicating the presence of the vehicle. In another example, the refuse vehicle 10 may be stopped, and the ADAS 400 senses fast approaching objects from the rear of the refuse vehicle 10. The controller 402 can generate visual, audible, and/or haptic alerts that are apparent from outside of the refuse vehicle 10 and/or the alerts themselves are external to the refuse vehicle 10 to alert those around the refuse vehicle 10 of the approaching objects. In some embodiments, the alerts are audible natural-language based alerts. Audible natural-language based alerts can explain with language (according to a user preference, for example) the content of the alert. For example, the alert may include an audible natural-language based alert saying “Vehicle in blind spot.” Natural-language based alerts allow a user to understand what an alert is for without any other supplemental information. In some embodiments, the alerts can indicate information about the refuse vehicle 10. For example, alerts may include a tire pressure of the refuse vehicle 10 while operating.
In some embodiments, the method 2200 includes determining if the alert is cleared at step 2212. For example, a user may clear an alert via a user input to the user interface 416. Alerts may also be cleared automatically by the controller 402. In some embodiments, the controller 402 automatically clears alerts if the underlying event that triggered the alert is no longer detected. For example, an alert of a car in a blind spot of the refuse vehicle 10 may persist so long as the car is in the blind spot. Once the car leaves the blind spot, the controller 402 may automatically clear the alert. In some embodiments, if the alert is cleared, method 2200 proceeds to step 2216 and ends. In some embodiments, if the alert is not cleared, method 2200 proceeds to step 2214.
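The alert lifecycle of steps 2210 and 2212 can be illustrated with a minimal sketch, assuming alerts are keyed by the event that triggered them; the `AlertManager` class and event identifiers are illustrative assumptions, not part of the disclosure. An alert persists until either the user clears it or the underlying event is no longer detected.

```python
class AlertManager:
    """Tracks active alerts; clears them manually or automatically."""

    def __init__(self):
        self.active = {}  # event id -> natural-language alert message

    def raise_alert(self, event_id, message):
        self.active[event_id] = message

    def update(self, detected_events, user_cleared=()):
        """Clear alerts whose event disappeared or that the user cleared."""
        for event_id in list(self.active):
            if event_id not in detected_events or event_id in user_cleared:
                del self.active[event_id]

manager = AlertManager()
manager.raise_alert("blind_spot_left", "Vehicle in blind spot")
manager.update(detected_events={"blind_spot_left"})  # car still there: alert persists
manager.update(detected_events=set())                # car left: alert auto-cleared
```

The same `update` call handles both clearing paths described above, so a user input and the disappearance of the underlying event are treated uniformly.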
In some embodiments, the method 2200 includes generating one or more control signals based on the sensor data and the control data at step 2214. The controller 402 can be configured to generate control signals based on the sensor data and the control data in response to an actual event (i.e., not a false event). Control signals may be commands to operate the refuse vehicle 10 and/or one or more components of the refuse vehicle 10. For example, the sensor data may indicate an object (e.g., a vehicle) ahead of the refuse vehicle 10, and the controller 402 may generate an alert indicating this event. The sensor data may indicate that the refuse vehicle 10 is traveling at a sufficient speed that it will collide with the object if the speed is not diminished. If the alert is not cleared before a time threshold (i.e., the point in time, determined by the controller 402, by which action must be taken to avoid a collision), the controller 402 can generate control signals to activate the brakes of the refuse vehicle 10 and prevent the collision. In some embodiments, the control signals may also control components of the refuse vehicle 10 including actuators, motors, lift assemblies, etc. In some embodiments, the method 2200 skips steps 2210 and 2212 and proceeds directly to generating one or more control signals at step 2218. The controller 402 can be configured to automatically generate one or more control signals in emergencies where there is not enough time to generate an alert and wait for it to be cleared. For example, the controller 402 may determine that a control action such as emergency braking should be taken immediately in order to avoid an accident.
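One way to picture the braking decision above is to model the time threshold as a time-to-collision budget. This is a hedged sketch only: the real controller 402 would derive the threshold from vehicle dynamics, and the function name, units, and default budget here are illustrative assumptions.

```python
def decide_action(distance_m, closing_speed_mps, alert_cleared,
                  time_threshold_s=2.0):
    """Return a control action from range data and the alert state.

    distance_m: range to the object ahead; closing_speed_mps: rate at
    which the gap shrinks; alert_cleared: whether the operator has
    acknowledged and resolved the alert.
    """
    if closing_speed_mps <= 0:
        return "monitor"  # not closing on the object
    time_to_collision = distance_m / closing_speed_mps
    if time_to_collision <= time_threshold_s and not alert_cleared:
        return "apply_brakes"  # alert not cleared before the threshold
    return "alert"  # still time for the operator to react
```

For instance, with 10 m of range closing at 10 m/s (one second to impact) and the alert still uncleared, the sketch commands braking; with 100 m of range it only alerts.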
Referring now to
In some embodiments, the method 2300 includes providing a refuse vehicle with one or more controllable elements, a user interface, and an ADAS having one or more sensors at step 2302. In some embodiments, the controllable elements can be the controllable elements 410, the user interface can be the user interface 416, and the ADAS can be the ADAS 400 with the one or more sensors 414. In some embodiments, the method 2300 includes obtaining sensor data from the sensors, control data from the controllable elements, and a user input from the user interface at step 2304. The user input may be the movement of an input device 422 of the user interface 416. For example, the user input can be a user pressing a joystick. In some embodiments, the control data includes position, speed, direction of travel, and/or acceleration of the refuse vehicle 10. In some embodiments, the sensor data includes image data and/or proximity data from one or more sensors 414 including the cameras 510 of the camera system 500, the radar sensors 610 of the radar system 600, and/or the forward camera 1602 and the forward radar 1604 of the collision detection system 1600.
In some embodiments, the method 2300 includes determining, at step 2306, an operation mode based on the data obtained at step 2304. The controller 402 can be configured to analyze the obtained data (i.e., sensor data, control data, user input(s), etc.) and determine from the obtained data an operation mode of the refuse vehicle 10. The operation modes may include, for example, a collection mode, a forward mode, a reverse mode, a compaction mode, a dumping mode, and/or still other modes. In some embodiments, the operation mode is based on a user input. For example, the controller 402 may automatically transition the refuse vehicle 10 into a collection mode when a driver presses a joystick of the user interface 416 that controls the lift assembly 40 or the robotic arm 300. In some embodiments, the operation mode is determined based on a combination of one or more of sensor data, control data, and user inputs. For example, a user input may indicate a transition to a collection mode, but control data subsequently indicating a speed of 25 miles per hour (MPH) may indicate a forward mode.
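The arbitration in the example above can be sketched as a small function. This is an illustrative assumption about priority, not the claimed logic: here control data showing road speed of 25 MPH or more overrides a joystick press requesting the collection mode.

```python
def determine_mode(user_input, speed_mph, current_mode="forward"):
    """Combine a user input with control data to pick an operation mode."""
    if speed_mph >= 25:
        return "forward"  # control data indicates road travel, overriding input
    if user_input == "joystick_pressed":
        return "collection"  # joystick controlling the lift requests collection
    return current_mode  # no trigger observed: keep the current mode
```

A joystick press at curbside speed yields the collection mode, while the same press at 30 MPH is resolved to the forward mode.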
In some embodiments, the controller 402 is provided a pre-defined list (e.g., lookup table) of operation modes and various requirements for activation. The list can include multiple ways of activating or indicating an operating mode to the controller 402. For example, a driver may input via the user interface 416 a command to enter a collection mode, a driver may audibly command the refuse vehicle 10 to enter the collection mode, or the controller 402 may automatically enter the collection mode when a refuse container is identified in a collection zone of the refuse vehicle 10 (e.g., by one of the cameras 510 or one of the radar sensors 610). In some embodiments, the controller 402 is configured to monitor the obtained data and learn when data and/or inputs can be reasonably associated with a collection mode. For example, the controller 402 can be configured to monitor a joystick of the user interface 416 that controls the lift assembly 40 or the robotic arm 300, and determine that each time the joystick is activated in a leftward motion, it is then followed by a command to activate the lift assembly 40 or the robotic arm 300. The controller 402 can determine that the refuse vehicle 10 is in a collection mode based on the activation of the lift assembly 40 or the robotic arm 300, and in some embodiments, the controller 402 can thereby associate leftward movement of the joystick with indicating the collection mode. In some embodiments, once in a first operation mode, the controller 402 will not transition to a second operation mode until a set of conditions has first been met. In some embodiments, two or more modes may be active simultaneously.
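A compact sketch of the lookup table and the learned-association behavior follows. The trigger names, the `ModeLearner` class, and the promotion threshold are hypothetical; the point is only that a mode can be reached by several triggers, and that a repeatedly observed input can be promoted into the table.

```python
from collections import Counter

# Pre-defined lookup table: each mode lists the signals that can activate it.
MODE_TRIGGERS = {
    "collection": {"ui:collect", "voice:collect", "camera:container_in_zone"},
    "compaction": {"ui:compact"},
}

def resolve_mode(observed_signals):
    """Return the first mode with a trigger among the observed signals."""
    for mode, triggers in MODE_TRIGGERS.items():
        if triggers & observed_signals:
            return mode
    return None

class ModeLearner:
    """Counts how often an input precedes lift activation; past a threshold,
    the input itself is promoted to a collection-mode trigger."""

    def __init__(self, threshold=3):
        self.counts = Counter()
        self.threshold = threshold

    def observe(self, user_input, lift_activated):
        if lift_activated:
            self.counts[user_input] += 1
            if self.counts[user_input] >= self.threshold:
                MODE_TRIGGERS["collection"].add(user_input)

learner = ModeLearner()
for _ in range(3):
    # Leftward joystick motion is repeatedly followed by lift activation.
    learner.observe("joystick:left", lift_activated=True)
# "joystick:left" now activates the collection mode on its own.
```

Keeping the learned trigger in the same table as the pre-defined ones means `resolve_mode` needs no special case for learned associations.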
In some embodiments, the method 2300 includes generating a view including sensor data and/or control data, at step 2308, based on the operation mode determined at step 2306. The views can be provided on one or more of the displays of the user interface 416. The controller 402 can be configured to provide different information to different displays of the user interface 416 based on the operation mode. The different information can include information from one or more separate systems of the refuse vehicle 10. The controller 402 can be configured to integrate the various systems into a single system, such as the ADAS 400, to provide centralized access and control, with improved contextual awareness, to an operator. The systems can include the camera system 500, the radar system 600, the collision detection system 1600, and the controllable elements 410 (i.e., lift assembly 40) of the refuse vehicle 10. In some embodiments, the view(s) generated by the controller 402 include information from one or more of the above systems. For example, the views can incorporate the sensor data and/or the control data relevant to the determined operation mode. A feed from the hopper camera 522 and a feed from the camera system 500 can be combined and provided to an operator via the same display. By integrating the various systems into a single control system, the controller 402 can provide enhanced control and monitoring ability to an operator. In some embodiments, data and/or feeds that would otherwise be permanently displayed on independent monitors are integrated into the ADAS 400 by the controller 402, allowing the controller 402 to control the various feeds and present them in a unified interface that displays the information necessary based on the determined operation mode.
In some embodiments, the controller 402 is configured with a predefined list of views corresponding to the operation mode. The list may also correlate views to vehicle type (i.e., a front-end loader may have different views available than a rear-end loader). The displays of the user interface 416 (i.e., the instrument display 418 and the console display 420) may be controlled independently and may display different information based on the same operation mode. For example, when a joystick of the user interface 416 is pressed, the controller 402 can determine that the refuse vehicle 10 is in a collection mode and automatically update the console display 420 of the user interface 416 to display image data from the hopper camera 522 and a curbside camera (e.g., one of the cameras 510 arranged on a side or a front of the refuse vehicle 10). This can aid the operator during a collection mode by providing a view of the lift arm and the surrounding area. The controller 402 can also automatically update the instrument display 418 of the user interface 416 based on the collection mode with the reverse image feed (i.e., from a rearward-facing camera 510 of the camera system 500) and/or the 360 composite view. This can be beneficial when the refuse vehicle 10 goes from house to house and needs to pull back onto a road. Each display can be provided a unique view based on the determined operation mode. In some embodiments, the views are only generated for the console display 420 of the user interface 416, and the instrument display 418 can permanently display material information such as a reverse camera video feed, speed, fuel level, battery charge, etc.
In some embodiments, the views are bound by a set of criteria, and the controller 402 can be configured to always display specific data based on the criteria no matter the operation mode and corresponding view. For example, rules/criteria can be applied requiring that the image data from the reverse camera always be displayed on the instrument display 418 regardless of the operation mode. Contextual information dependent on the operation mode can still be displayed around the reverse video feed on the instrument display 418.
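The predefined view list and the always-display rule can be sketched together. The feed names, display names, and the fallback to a forward-mode view for unknown modes are illustrative assumptions rather than the disclosed configuration.

```python
# Predefined list of views: operation mode -> per-display feed lists.
VIEW_TABLE = {
    "collection": {
        "console": ["hopper_camera", "curbside_camera"],
        "instrument": ["reverse_camera", "360_composite"],
    },
    "forward": {
        "console": ["360_composite"],
        "instrument": ["reverse_camera", "speed", "fuel_level"],
    },
}

def views_for(mode, always_on_instrument=("reverse_camera",)):
    """Return per-display feed lists, enforcing the always-display criteria."""
    base = VIEW_TABLE.get(mode, VIEW_TABLE["forward"])  # assumed fallback
    views = {display: list(feeds) for display, feeds in base.items()}
    for feed in always_on_instrument:
        if feed not in views["instrument"]:
            # The rule holds regardless of the operation mode.
            views["instrument"].insert(0, feed)
    return views
```

Because the rule is applied after the table lookup, a new mode added to `VIEW_TABLE` cannot accidentally drop the reverse camera from the instrument display.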
In some embodiments, the views persist until a new mode is determined. For example, once in collection mode, the refuse vehicle 10 can stay in collection mode until the vehicle reaches 25 MPH. The controller 402 can be configured to determine, based on the control data or the sensor data, the speed of refuse vehicle 10 and transition from the collection mode to another mode, for example, a forward mode.
Referring now to
In some embodiments, method 2400 includes providing a refuse vehicle including an ADAS with a first set of sensors at step 2402. The sensors can be the sensors 414 of the ADAS 400. In some embodiments, the first set of sensors is a subset of the sensors 414. In some embodiments, the first set of sensors is composed of sensors 414 that all meet certain criteria. For example, the first set of sensors may be all forward-looking sensors, all sensors that would be interfered with by a carry can 200, all cameras 510, all radar sensors 610, etc. In still other embodiments, the first set of sensors can be all sensors installed on the refuse vehicle 10.
In some embodiments, method 2400 includes providing a carry can including a second set of sensors and configured to interface with the refuse vehicle at step 2404. Referring now to
Referring back to
In some embodiments, the method 2400 includes activating the first set of sensors at step 2408. In the forward mode, in some embodiments, the carry can 200 is raised in a transportation position above the body 14 of the refuse vehicle 10. As explained above, in some embodiments, the first set of sensors can be all forward-facing radar sensors 610 of the ADAS 400. In this embodiment, the carry can 200 may not be interfering with the first set of sensors installed on the refuse vehicle 10 when in the forward mode, and the controller 402 can be configured to accordingly activate the first set of sensors for use in the ADAS 400. In some embodiments, the sensors are always active, and the controller 402 can be configured to analyze only the data obtained from the first set of sensors. In some embodiments, all sensors are active and the controller 402 is configured to weight the data obtained from the first set of sensors more heavily than the data from the second set of sensors.
In some embodiments, the method 2400 includes deactivating the second set of sensors at step 2410. In some embodiments, in forward mode with the carry can 200 in a transportation position above the refuse vehicle 10, the data from the second set of sensors on the carry can 200 may be not useful for control operations. For example, as described above with reference to
In some embodiments, the method 2400 includes obtaining sensor data from the activated sensors at step 2412. The obtained data may be provided to the ADAS 400. In some embodiments, the sensors are not physically deactivated, and data is obtained from both sets of sensors. The controller 402 may instead be configured to only analyze the data from one set based on the operation mode.
Referring back to step 2406, in some embodiments, if the controller 402 determines the operation mode is a collection mode, the method 2400 may proceed to step 2414. In some embodiments, step 2414 includes deactivating the first set of sensors. In a collection mode, the carry can 200 may be positioned in front of the refuse vehicle 10. The carry can 200 can interfere with the forward radar sensors 610 and provide the ADAS 400 with data indicating false events based on the detection of the carry can 200. In some embodiments, the first set of sensors may include all sensors whose operation is interfered with by the position of the carry can 200 (e.g., the forward-facing cameras 510, the forward-facing radar sensors 610, the collision detection system 1600, etc.). To maintain the operation of the ADAS 400, in some embodiments, the controller 402 can be configured to deactivate the first set of sensors, and activate the second set of sensors on the carry can 200 at step 2416. By deactivating the first set of sensors and activating the second set of sensors, the ADAS 400 can maintain a 360-degree view around the refuse vehicle 10 and avoid instances of interference. In some embodiments, the method 2400 finally includes obtaining sensor data from the activated sensors at step 2412.
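The mode-dependent sensor selection of method 2400 can be sketched in two forms: a hard switch between sensor sets, and the weighting variant in which every sensor stays on. Both functions, their names, and the weight values are illustrative assumptions.

```python
def select_sensor_sets(mode, vehicle_sensors, carry_can_sensors):
    """Pick active/inactive sensor sets for an operation mode.

    In the collection mode the carry can blocks the vehicle's forward
    sensors, so the carry can's own sensors take over; otherwise the
    vehicle's sensors are active.
    """
    if mode == "collection":
        return {"active": set(carry_can_sensors),
                "inactive": set(vehicle_sensors)}
    return {"active": set(vehicle_sensors),
            "inactive": set(carry_can_sensors)}

def weight_sensor_data(readings, active_sensors, w_active=1.0, w_inactive=0.0):
    """Alternative: leave all sensors on but down-weight the inactive set."""
    return {sensor: value * (w_active if sensor in active_sensors else w_inactive)
            for sensor, value in readings.items()}
```

With `w_inactive` set to a small nonzero value instead of 0.0, the weighting variant would preserve some signal from the masked sensors rather than ignoring them outright.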
While the method 2400 is shown illustrating the process for selecting sensors based on a forward mode and a collection mode, the method 2400 can also be applied to the selection of sensors for other operation modes of the refuse vehicle 10. The composition of the first set of sensors and the second set of sensors can also vary. It should be understood by a person of ordinary skill in the art in view of the present application that the sets of sensors can be determined by controller 402 based on the operation mode and vary accordingly.
As utilized herein with respect to numerical ranges, the terms “approximately,” “about,” “substantially,” and similar terms generally mean +/−10% of the disclosed values. When the terms “approximately,” “about,” “substantially,” and similar terms are applied to a structural feature (e.g., to describe its shape, size, orientation, direction, etc.), these terms are meant to cover minor variations in structure that may result from, for example, the manufacturing or assembly process and are intended to have a broad meaning in harmony with the common and accepted usage by those of ordinary skill in the art to which the subject matter of this disclosure pertains. Accordingly, these terms should be interpreted as indicating that insubstantial or inconsequential modifications or alterations of the subject matter described and claimed are considered to be within the scope of the disclosure as recited in the appended claims.
It should be noted that the term “exemplary” and variations thereof, as used herein to describe various embodiments, are intended to indicate that such embodiments are possible examples, representations, or illustrations of possible embodiments (and such terms are not intended to connote that such embodiments are necessarily extraordinary or superlative examples).
The term “coupled” and variations thereof, as used herein, means the joining of two members directly or indirectly to one another. Such joining may be stationary (e.g., permanent or fixed) or moveable (e.g., removable or releasable). Such joining may be achieved with the two members coupled directly to each other, with the two members coupled to each other using a separate intervening member and any additional intermediate members coupled with one another, or with the two members coupled to each other using an intervening member that is integrally formed as a single unitary body with one of the two members. If “coupled” or variations thereof are modified by an additional term (e.g., directly coupled), the generic definition of “coupled” provided above is modified by the plain language meaning of the additional term (e.g., “directly coupled” means the joining of two members without any separate intervening member), resulting in a narrower definition than the generic definition of “coupled” provided above. Such coupling may be mechanical, electrical, or fluidic.
References herein to the positions of elements (e.g., “top,” “bottom,” “above,” “below”) are merely used to describe the orientation of various elements in the FIGURES. It should be noted that the orientation of various elements may differ according to other exemplary embodiments, and that such variations are intended to be encompassed by the present disclosure.
The hardware and data processing components used to implement the various processes, operations, illustrative logics, logical blocks, modules and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose single- or multi-chip processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, or, any conventional processor, controller, microcontroller, or state machine. A processor also may be implemented as a combination of computing devices, such as a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. In some embodiments, particular processes and methods may be performed by circuitry that is specific to a given function. The memory (e.g., memory, memory unit, storage device) may include one or more devices (e.g., RAM, ROM, Flash memory, hard disk storage) for storing data and/or computer code for completing or facilitating the various processes, layers and modules described in the present disclosure. The memory may be or include volatile memory or non-volatile memory, and may include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present disclosure. According to an exemplary embodiment, the memory is communicably connected to the processor via a processing circuit and includes computer code for executing (e.g., by the processing circuit or the processor) the one or more processes described herein.
The present disclosure contemplates methods, systems and program products on any machine-readable media for accomplishing various operations. The embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Although the figures and description may illustrate a specific order of method steps, the order of such steps may differ from what is depicted and described, unless specified differently above. Also, two or more steps may be performed concurrently or with partial concurrence, unless specified differently above. Such variation may depend, for example, on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations of the described methods could be accomplished with standard programming techniques with rule-based logic and other logic to accomplish the various connection steps, processing steps, comparison steps, and decision steps.
It is important to note that the construction and arrangement of the refuse vehicle 10 and the systems and components thereof as shown in the various exemplary embodiments is illustrative only. Additionally, any element disclosed in one embodiment may be incorporated or utilized with any other embodiment disclosed herein. It should be appreciated that other elements of the various embodiments may be incorporated or utilized with any of the other embodiments disclosed herein.
This application claims the benefit of and priority to U.S. Provisional Application No. 63/280,899, filed Nov. 18, 2021, the entire disclosure of which is hereby incorporated by reference herein.
Number | Date | Country
---|---|---
63280899 | Nov 2021 | US