The present disclosure relates generally to vehicles. More specifically, the present disclosure relates to an alert system for a commercial vehicle.
One implementation of the present disclosure relates to a system for alerting a user of a vehicle. The system includes a wearable alert device and processing circuitry. The wearable alert device is configured to be worn by the user of the vehicle and includes at least one of a visual alert device configured to provide visual feedback to the user, an aural alert device configured to provide aural feedback to the user, or a haptic alert device configured to provide haptic feedback to the user. The processing circuitry is configured to obtain a current position of the user of the vehicle and determine an alert condition and a severity of the alert condition based on sensor data obtained from one or more sensors of the vehicle and the current position of the user. The processing circuitry is also configured to operate at least one of the visual alert device, the aural alert device, or the haptic alert device to provide visual feedback, aural feedback, or haptic feedback to the user to inform the user regarding the alert condition and the severity of the alert condition.
In one embodiment, the wearable alert device is a vest or an article of clothing. In some embodiments, the wearable alert device also includes a positioning device configured to report a position of the user to the processing circuitry.
In some embodiments, the processing circuitry is further configured to determine, based on the current position of the user of the vehicle, whether the user is currently within a zone of an implement of the vehicle or in a path of the implement of the vehicle, and determine the alert condition and the severity of the alert condition at least in part based on the determination of whether the user is currently within the zone of the implement of the vehicle or in the path of the implement of the vehicle.
In some embodiments, the implement of the vehicle comprises an automated side-loading arm, a front-end loader, a tailgate, a mixer chute, a ladder assembly, or a boom assembly.
In some embodiments, the alert condition comprises at least one of the following conditions: the user is located within a zone of an implement of the vehicle, the user is located in a path of the implement of the vehicle, the user is within a cab of the vehicle and traffic is oncoming, a rear-end collision is predicted to occur, or a side collision is predicted to occur.
In some embodiments, the processing circuitry is further configured to, upon determining the alert condition is of high severity, operate the visual alert device to provide visual feedback, operate the aural alert device to provide aural feedback, and operate the haptic alert device to provide haptic feedback to inform the user of the high severity of the alert condition. In some embodiments, the processing circuitry is further configured to, upon determining the alert condition is of medium severity, operate the visual alert device to provide visual feedback and operate the aural alert device to provide aural feedback to inform the user of the medium severity of the alert condition. In some embodiments, the processing circuitry is further configured to, upon determining the alert condition is of low severity, operate the visual alert device to provide visual feedback to inform the user of the low severity of the alert condition.
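By way of illustration only, the severity-based selection of alert devices described above can be sketched as a simple mapping. The function and severity names below are hypothetical and form no part of the disclosure.

```python
# Illustrative sketch: map an alert severity to the set of feedback devices
# to operate, per the high/medium/low scheme described above.
# All names are hypothetical and not part of the disclosure.

def devices_for_severity(severity: str) -> set:
    """Return the alert devices to operate for a given severity level."""
    if severity == "high":
        # High severity: visual, aural, and haptic feedback together.
        return {"visual", "aural", "haptic"}
    if severity == "medium":
        # Medium severity: visual and aural feedback.
        return {"visual", "aural"}
    if severity == "low":
        # Low severity: visual feedback only.
        return {"visual"}
    # No recognized alert condition: operate no devices.
    return set()
```

Other mappings (e.g., aural feedback only for low severity) are equally possible; the progressive stacking of modalities shown here simply mirrors the embodiment described above.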
Another implementation of the present disclosure relates to a method for alerting a user of a vehicle regarding an alert condition, the method comprising obtaining a current position of the user of the vehicle and determining an alert condition and a severity of the alert condition based on sensor data obtained from one or more sensors of the vehicle and the current position of the user. In some embodiments, the method includes operating at least one of a visual alert device to provide visual feedback, an aural alert device to provide aural feedback, or a haptic alert device to provide haptic feedback to the user to inform the user regarding the alert condition and the severity of the alert condition.
In some embodiments, the visual alert device, the aural alert device, or the haptic alert device is provided on a vest or article of clothing that is worn by the user. In some embodiments, the method further comprises reporting, by a positioning device, the current position of the user to processing circuitry.
In some embodiments, the method includes determining, based on the current position of the user of the vehicle, whether the user is currently within a zone of an implement of the vehicle or in a path of the implement of the vehicle, and determining the alert condition and the severity of the alert condition at least in part based on the determination of whether the user is currently within the zone of the implement of the vehicle or in the path of the implement of the vehicle.
In some embodiments, the implement of the vehicle comprises an automated side-loading arm, a front-end loader, a tailgate, a mixer chute, a ladder assembly, or a boom assembly.
In other embodiments, the alert condition comprises at least one of the following conditions: the user is located within a zone of an implement of the vehicle, the user is located in a path of the implement of the vehicle, the user is within a cab of the vehicle and traffic is oncoming, a rear-end collision is predicted to occur, or a side collision is predicted to occur.
In some embodiments, the method includes, upon determining the alert condition is of high severity, operating the visual alert device to provide visual feedback, operating the aural alert device to provide aural feedback, and operating the haptic alert device to provide haptic feedback to inform the user of the high severity of the alert condition. In some embodiments, the method includes, upon determining the alert condition is of medium severity, operating the visual alert device to provide visual feedback and operating the aural alert device to provide aural feedback to inform the user of the medium severity of the alert condition. In some embodiments, the method includes, upon determining the alert condition is of low severity, operating the visual alert device to provide visual feedback to inform the user of the low severity of the alert condition.
Another implementation of the present disclosure relates to a wearable alert device. In some embodiments, the wearable alert device comprises an article of clothing with at least one of a visual alert device provided on the article of clothing and configured to provide visual feedback to a user, an aural alert device provided on the article of clothing and configured to provide aural feedback to the user, or a haptic alert device provided on the article of clothing and configured to provide haptic feedback to the user. In some embodiments, the wearable alert device also includes processing circuitry configured to obtain a current position of the user, determine an alert condition and a severity of the alert condition based on sensor data obtained from one or more sensors of a vehicle and the current position of the user, and operate at least one of the visual alert device, the aural alert device, or the haptic alert device to provide visual feedback, aural feedback, or haptic feedback to the user to inform the user regarding the alert condition and the severity of the alert condition.
In some embodiments, the article of clothing is a vest. In some embodiments, the wearable alert device also comprises a positioning device configured to report a position of the user to the processing circuitry. In some embodiments, the processing circuitry is further configured to determine, based on the current position of the user of the vehicle, whether the user is currently within a zone of an implement of the vehicle or in a path of the implement of the vehicle. In other embodiments, the processing circuitry is configured to determine the alert condition and the severity of the alert condition at least in part based on the determination of whether the user is currently within the zone of the implement of the vehicle or in the path of the implement of the vehicle.
In some embodiments, the alert condition comprises at least one of the following conditions: the user is located within a zone of an implement of the vehicle, the user is located in a path of the implement of the vehicle, the user is within a cab of the vehicle and traffic is oncoming, a rear-end collision is predicted to occur, or a side collision is predicted to occur.
In some embodiments, the processing circuitry is further configured to, upon determining the alert condition is of high severity, operate the visual alert device to provide visual feedback, operate the aural alert device to provide aural feedback, and operate the haptic alert device to provide haptic feedback to inform the user of the high severity of the alert condition. In some embodiments, the processing circuitry is configured to, upon determining the alert condition is of medium severity, operate the visual alert device to provide visual feedback and operate the aural alert device to provide aural feedback to inform the user of the medium severity of the alert condition. In some embodiments, the processing circuitry is configured to, upon determining the alert condition is of low severity, operate the visual alert device to provide visual feedback to inform the user of the low severity of the alert condition.
This summary is illustrative only and is not intended to be in any way limiting. Other aspects, inventive features, and advantages of the devices or processes described herein will become apparent in the detailed description set forth herein, taken in conjunction with the accompanying figures, wherein like reference numerals refer to like elements.
Before turning to the figures, which illustrate certain exemplary embodiments in detail, it should be understood that the present disclosure is not limited to the details or methodology set forth in the description or illustrated in the figures. It should also be understood that the terminology used herein is for the purpose of description only and should not be regarded as limiting.
According to an exemplary embodiment, a vehicle includes a system for notifying an operator regarding an alert condition. The system includes a wearable alert device that can have the form of a vest and is worn by the user. The wearable alert device includes visual alert devices, aural alert devices, and haptic alert devices. The wearable alert device also includes a positioning device configured to report the user's position as the user or operator moves about the vehicle or is inside of the vehicle. An alert condition can be identified based on sensor data from sensors of the vehicle and based on the user's position. The wearable alert device provides any combination of visual, aural, or haptic feedback to notify the user regarding the alert condition and/or a severity of the alert condition.
Referring to
As shown in
In some embodiments, the front section 22 and the rear section 26 are configured as separate, discrete subframes (e.g., a front subframe and a rear subframe). In such embodiments, the front rail portion 30, the front rail portion 32, the rear rail portion 34, and the rear rail portion 36 are separate, discrete frame rails that are spaced apart from one another. In some embodiments, the front section 22 and the rear section 26 are each directly coupled to the middle section 24 such that the middle section 24 couples the front section 22 to the rear section 26. Accordingly, the middle section 24 may include a structural housing or frame. In other embodiments, the front section 22, the middle section 24, and the rear section 26 are coupled to one another by another component, such as a body of the vehicle 10.
In other embodiments, the front section 22, the middle section 24, and the rear section 26 are defined by a pair of frame rails that extend continuously along the entire length of the vehicle 10. In such an embodiment, the front rail portion 30 and the rear rail portion 34 would be front and rear portions of a first frame rail, and the front rail portion 32 and the rear rail portion 36 would be front and rear portions of a second frame rail. In such embodiments, the middle section 24 would include a center portion of each frame rail.
In some embodiments, the middle section 24 acts as a storage portion that includes one or more vehicle components. The middle section 24 may include an enclosure that contains one or more vehicle components and/or a frame that supports one or more vehicle components. By way of example, the middle section 24 may contain or include one or more electrical energy storage devices (e.g., batteries, capacitors, etc.). By way of another example, the middle section 24 may include fuel tanks. By way of yet another example, the middle section 24 may define a void space or storage volume that can be filled by a user.
A cabin, operator compartment, or body component, shown as cab 40, is coupled to a front end portion of the chassis 20 (e.g., the front section 22 of the chassis 20). Together, the chassis 20 and the cab 40 define a front end of the vehicle 10. The cab 40 extends above the chassis 20. The cab 40 includes an enclosure or main body that defines an interior volume, shown as cab interior 42, that is sized to contain one or more operators. The cab 40 also includes one or more doors 44 that facilitate selective access to the cab interior 42 from outside of the vehicle 10. The cab interior 42 contains one or more components that facilitate operation of the vehicle 10 by the operator. By way of example, the cab interior 42 may contain components that facilitate operator comfort (e.g., seats, seatbelts, etc.), user interface components that receive inputs from the operators (e.g., steering wheels, pedals, touch screens, switches, buttons, levers, etc.), and/or user interface components that provide information to the operators (e.g., lights, gauges, speakers, etc.). The user interface components within the cab 40 may facilitate operator control over the drive components of the vehicle 10 and/or over any implements of the vehicle 10.
The vehicle 10 further includes a series of axle assemblies, shown as front axle 50 and rear axles 52. As shown, the vehicle 10 includes one front axle 50 coupled to the front section 22 of the chassis 20 and two rear axles 52 each coupled to the rear section 26 of the chassis 20. In other embodiments, the vehicle 10 includes more or fewer axles. By way of example, the vehicle 10 may include a tag axle that may be raised or lowered to accommodate variations in weight being carried by the vehicle 10. The front axle 50 and the rear axles 52 each include a series of tractive elements (e.g., wheels, treads, etc.), shown as wheel and tire assemblies 54. The wheel and tire assemblies 54 are configured to engage a support surface (e.g., roads, the ground, etc.) to support and propel the vehicle 10. The front axle 50 and the rear axles 52 may include steering components (e.g., steering arms, steering actuators, etc.), suspension components (e.g., gas springs, dampeners, air springs, etc.), power transmission or drive components (e.g., differentials, drive shafts, etc.), braking components (e.g., brake actuators, brake pads, brake discs, brake drums, etc.), and/or other components that facilitate propulsion or support of the vehicle.
In some embodiments, the vehicle 10 is configured as an electric vehicle that is propelled by an electric powertrain system. Referring to
The batteries 60 may include one or more rechargeable batteries (e.g., lithium-ion batteries, nickel-metal hydride batteries, lithium-ion polymer batteries, lead-acid batteries, nickel-cadmium batteries, etc.). The batteries 60 may be charged by one or more sources of electrical energy onboard the vehicle 10 (e.g., solar panels, etc.) or separate from the vehicle 10 (e.g., connections to an electrical power grid, a wireless charging system, etc.). As shown, the drive motors 62 are positioned within the rear axles 52 (e.g., as part of a combined axle and motor assembly). In other embodiments, the drive motors 62 are otherwise positioned within the vehicle 10.
In other embodiments, the vehicle 10 is configured as a hybrid vehicle that is propelled by a hybrid powertrain system (e.g., a diesel/electric hybrid, gasoline/electric hybrid, natural gas/electric hybrid, etc.). According to an exemplary embodiment, the hybrid powertrain system may include a primary driver (e.g., an engine, a motor, etc.), an energy generation device (e.g., a generator, etc.), and/or an energy storage device (e.g., a battery, capacitors, ultra-capacitors, etc.) electrically coupled to the energy generation device. The primary driver may combust fuel (e.g., gasoline, diesel, etc.) to provide mechanical energy, which a transmission may receive and provide to the front axle 50 and/or the rear axles 52 to propel the vehicle 10. Additionally or alternatively, the primary driver may provide mechanical energy to the generator, which converts the mechanical energy into electrical energy. The electrical energy may be stored in the energy storage device (e.g., the batteries 60) in order to later be provided to a motive driver.
In yet other embodiments, the chassis 20 may further be configured to support non-hybrid powertrains. For example, the powertrain system may include a primary driver that is a compression-ignition internal combustion engine that utilizes diesel fuel.
Referring to
The application kit 80 may include various actuators to facilitate certain functions of the vehicle 10. By way of example, the application kit 80 may include hydraulic actuators (e.g., hydraulic cylinders, hydraulic motors, etc.), pneumatic actuators (e.g., pneumatic cylinders, pneumatic motors, etc.), and/or electrical actuators (e.g., electric motors, electric linear actuators, etc.). The application kit 80 may include components that facilitate operation of and/or control of these actuators. By way of example, the application kit 80 may include hydraulic or pneumatic components that form a hydraulic or pneumatic circuit (e.g., conduits, valves, pumps, compressors, gauges, reservoirs, accumulators, etc.). By way of another example, the application kit 80 may include electrical components (e.g., batteries, capacitors, voltage regulators, motor controllers, etc.). The actuators may be powered by components of the vehicle 10. By way of example, the actuators may be powered by the batteries 60, the drive motors 62, or the primary driver (e.g., through a power take-off).
The vehicle 10 generally extends longitudinally from a front side 86 to a rear side 88. The front side 86 is defined by the cab 40 and/or the chassis 20. The rear side 88 is defined by the application kit 80 and/or the chassis 20. The primary, forward direction of travel of the vehicle 10 is longitudinal, with the front side 86 being arranged forward of the rear side 88.
Referring now to
As shown in
As shown in
Referring now to
Referring still to
The grabber assembly 162 is movably coupled to a guide, shown as track 170, that extends vertically along a side of the refuse vehicle 100. Specifically, the main body 164 is slidably coupled to the track 170 such that the main body 164 is repositionable along a length of the track 170. An actuator (e.g., a hydraulic motor, an electric motor, etc.), shown as lift actuator 172, is configured to control movement of the grabber assembly 162 along the length of the track 170. In some embodiments, a bottom end portion of the track 170 is straight and substantially vertical such that the grabber assembly 162 raises or lowers a refuse container when moving along the bottom end portion of the track 170. In some embodiments, a top end portion of the track 170 is curved such that the grabber assembly 162 inverts a refuse container to dump refuse into the hopper volume 132 when moving along the top end portion of the track 170.
The lift assembly 160 further includes an actuator (e.g., a hydraulic cylinder, an electric linear actuator, etc.), shown as track actuator 174, that is configured to control lateral movement of the grabber assembly 162. By way of example, the track actuator 174 may be coupled to the chassis 20 and the track 170 such that the track actuator 174 moves the track 170 and the grabber assembly 162 laterally relative to the chassis 20. The track actuator 174 may facilitate repositioning the grabber assembly 162 to pick up and replace refuse containers that are spaced laterally outward from the refuse vehicle 100.
Referring now to
As shown in
The mixing drum 232 may be configured to receive a mixture, such as a concrete mixture (e.g., cementitious material, aggregate, sand, etc.), through the hopper 236. In some embodiments, the mixer truck 200 includes an injection system (e.g., a series of nozzles, hoses, and/or valves) including an injection valve that selectively fluidly couples a supply of fluid to the inner volume of the mixing drum 232. By way of example, the injection system may be used to inject water and/or chemicals (e.g., air entrainers, water reducers, set retarders, set accelerators, superplasticizers, corrosion inhibitors, coloring, calcium chloride, minerals, and/or other concrete additives, etc.) into the mixing drum 232. The injection valve may facilitate injecting water and/or chemicals from a fluid reservoir (e.g., a water tank, etc.) into the mixing drum 232, while preventing the mixture in the mixing drum 232 from exiting the mixing drum 232 through the injection system. In some embodiments, one or more mixing elements (e.g., fins, etc.) may be positioned in the interior of the mixing drum 232, and may be configured to agitate the mixture when the mixing drum 232 is rotated in a first direction (e.g., counterclockwise, clockwise, etc.), and drive the mixture out through the chute 238 when the mixing drum 232 is rotated in a second direction (e.g., clockwise, counterclockwise, etc.). In some embodiments, the chute 238 may also include an actuator such that the chute 238 may be selectively pivoted to position the chute 238 (e.g., vertically, laterally, etc.), for example at an angle at which the mixture is expelled from the mixing drum 232.
Referring now to
As shown in
As shown in
Referring now to
The application kit 80 includes a pump system 304 (e.g., an ultra-high-pressure pump system, etc.) positioned within one of the compartments 302 near the center of the ARFF truck 300. The application kit 80 further includes a water tank 310, an agent tank 312, and an implement or water turret, shown as monitor 314. The pump system 304 may include a high pressure pump and/or a low pressure pump, which may be fluidly coupled to the water tank 310 and/or the agent tank 312. The pump system 304 may be configured to pump water and/or fire suppressing agent from the water tank 310 and the agent tank 312, respectively, to the monitor 314. The monitor 314 may be selectively reoriented by an operator to adjust a direction of a stream of water and/or agent. As shown in
Referring now to
As shown in
As shown in
The boom assembly 354 further includes a second actuator, shown as upper lift cylinder 366. The upper boom 362 is pivotally coupled (e.g., pinned) to the upper end of the lower boom 360 at a joint or upper boom pivot point. The upper lift cylinder 366 (e.g., a pneumatic cylinder, an electric linear actuator, a hydraulic cylinder, etc.) is coupled to the upper boom 362. The upper lift cylinder 366 may be configured to extend and retract to actuate (e.g., lift, rotate, elevate, etc.) the upper boom 362, thereby raising and lowering a distal end of the upper boom 362.
Referring still to
The platform assembly 370 provides a platform configured to support one or more operators or users. In some embodiments, the platform assembly 370 may include accessories or tools configured for use by the operators. For example, the platform assembly 370 may include pneumatic tools (e.g., an impact wrench, airbrush, nail gun, ratchet, etc.), plasma cutters, welders, spotlights, etc. In some embodiments, the platform assembly 370 includes a control panel (e.g., a user interface, a removable or detachable control panel, etc.) configured to control operation of the boom lift 350 (e.g., the turntable 352, the boom assembly 354, etc.) from the platform assembly 370 or remotely. In other embodiments, the platform assembly 370 is omitted, and the boom lift 350 includes an accessory and/or tool (e.g., forklift forks, etc.) coupled to the distal end of the boom assembly 354.
Referring now to
As shown in
The lift assembly 404 may include a series of subassemblies, shown as scissor layers 420, each including a pair of inner members and a pair of outer members pivotally coupled to one another. The scissor layers 420 may be stacked atop one another in order to form the lift assembly 404, such that movement of one scissor layer 420 causes a similar movement in all of the other scissor layers 420. The scissor layers 420 extend between and couple the lift base 402 and an operator platform (e.g., the platform assembly 430). In some embodiments, scissor layers 420 may be added to, or removed from, the lift assembly 404 in order to increase, or decrease, the fully extended height of the lift assembly 404.
Referring still to
A distal or upper end of the lift assembly 404 is coupled to an operator platform, shown as platform assembly 430. The platform assembly 430 may perform similar functions to the platform assembly 370, such as supporting one or more operators, accessories, and/or tools. The platform assembly 430 may include a control panel to control operation of the scissor lift 400. The lift actuators 424 may be configured to actuate the lift assembly 404 to selectively reposition the platform assembly 430 between a lowered position (e.g., where the platform assembly 430 is proximate to the lift base 402) and a raised position (e.g., where the platform assembly 430 is at an elevated height relative to the lift base 402). Specifically, in some embodiments, extension of the lift actuators 424 moves the platform assembly 430 upward (e.g., extending the lift assembly 404), and retraction of the lift actuators 424 moves the platform assembly 430 downward (e.g., retracting the lift assembly 404). In other embodiments, extension of the lift actuators 424 retracts the lift assembly 404, and retraction of the lift actuators 424 extends the lift assembly 404.
Referring to
The wearable alert device 504 includes one or more light emitting diodes (LEDs), compact fluorescent (CFL) bulbs, glow strips, display screens, etc., shown as visual alert devices 510, according to some embodiments. The visual alert devices 510 can be configured to provide a visual or lighting alert to the operator 502 by varying intensity, color, pattern, etc., of the visual alert devices 510. For example, in response to different conditions or to indicate different alert severities, the visual alert devices 510 can provide green, yellow, or red colors to visually notify the operator 502 regarding an alert and a severity of the alert. The visual alert devices 510 may also flash or blink intermittently, strobe, or vary brightness or light emittance in a pattern to indicate an alert. In some embodiments, the visual alert devices 510 are configured to provide diffused light to provide a glow (e.g., a colored glow) to the operator 502 to notify the operator 502 regarding an alert. The visual alert devices 510 can be arranged in an array, along a specific portion of the wearable alert device 504, etc. In some embodiments, the visual alert devices 510 are arranged in multiple 1-dimensional or 2-dimensional arrays on the wearable alert device 504.
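By way of illustration only, the color-coded visual alerting described above can be sketched as follows. The pattern names and the mapping itself are hypothetical examples consistent with the green/yellow/red scheme, not limitations.

```python
# Illustrative sketch: choose a color and lighting pattern for visual alert
# devices based on alert severity. Values and names are hypothetical.

SEVERITY_COLORS = {"low": "green", "medium": "yellow", "high": "red"}
SEVERITY_PATTERNS = {"low": "steady", "medium": "blink", "high": "strobe"}

def visual_alert(severity: str) -> dict:
    """Return a color and pattern to apply to the visual alert devices."""
    # Unrecognized severities leave the devices off rather than guessing.
    color = SEVERITY_COLORS.get(severity, "off")
    pattern = SEVERITY_PATTERNS.get(severity, "off")
    return {"color": color, "pattern": pattern}
```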
The wearable alert device 504 also includes one or more speakers, sound emitters, electroacoustic transducers, tweeters, beepers, loudspeakers, woofers, sub-woofers, etc., shown as aural alert devices 506, according to some embodiments. The aural alert devices 506 are configured to provide an aural alert to the operator or wearer 502 to notify the operator 502 regarding an alert condition (e.g., a warning event), according to some embodiments. The aural alert devices 506 can provide aural alerts such as tones, siren sounds, spoken words or phrases, a horn sound, etc. The aural alert devices 506 can function in combination with the visual alert devices 510 to notify the operator 502 regarding the alert condition and/or the severity of the alert condition. In some embodiments, a decibel level or loudness of the sounds output by the aural alert devices 506 indicates the severity of the alert condition.
The wearable alert device 504 also includes haptic alert devices 514 that are configured to provide haptic feedback to the operator 502 to notify the operator 502 regarding the alert condition and/or the severity of the alert condition. In some embodiments, the haptic alert devices 514 are configured to vibrate, move, provide a force, accelerate, etc., to provide tactile or haptic feedback to the operator 502 regarding the alert condition or the severity of the alert condition. The haptic alert devices 514 can provide continuous haptic feedback, discrete haptic feedback, etc., to notify the operator 502. The haptic alert devices 514 can be disposed in different locations about the wearable alert device 504 (e.g., proximate the operator's 502 sternum, at the operator's 502 mid-section, at the operator's 502 shoulders, etc.).
Referring still to
Referring still to
Referring to
Referring particularly to
The awareness sensors 568 along the street side 82 of the vehicle 10 may be configured to monitor or detect the presence of objects, proximity of objects, motion of objects, etc., that are along the street side 82 of the vehicle 10. In some embodiments, the awareness sensors 568 along the street side 82 of the vehicle 10 are configured to monitor or detect the presence, motion, proximity, etc., of objects that are within a street side zone 90. Similarly, the awareness sensors 568 along the rear side 88 of the vehicle 10 can be configured to detect or monitor objects, motion of objects, proximity of objects, etc., that are within a rear zone 94. The awareness sensors 568 along the curb side 84 of the vehicle 10 can be configured to monitor or detect presence, motion, proximity, etc., of objects along the curb side 84 of the vehicle 10 (e.g., within zones 99, 98, and/or 96). The awareness sensors 568 along the front side 86 of the vehicle 10 can be configured to monitor or detect presence, motion, proximity, etc., of objects that are in front of the vehicle 10 (e.g., within a front zone 92). It should be understood that the areas of the front zone 92, the rear zone 94, the street side zone 90, and/or the zones 99, 98, and 96 are illustrative only, and the awareness sensors 568 should not be understood as only being capable of detecting presence, motion, or proximity of objects within these zones.
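Purely as an illustration, a zone check of the kind described above may be expressed as a rectangular bounds test in the vehicle's frame of reference. The function name and zone boundary values below are hypothetical assumptions.

```python
# Illustrative sketch: test whether a detected (x, y) position, expressed in
# the vehicle's frame of reference, lies within a rectangular detection zone.
# The zone boundary values are hypothetical.

def in_zone(position, zone) -> bool:
    """Return True if an (x, y) position lies within a rectangular zone."""
    x, y = position
    return (zone["x_min"] <= x <= zone["x_max"]
            and zone["y_min"] <= y <= zone["y_max"])

# Example: a hypothetical curb-side zone spanning 4 m longitudinally and
# extending 1 m to 3 m laterally outward from the vehicle.
zone_98 = {"x_min": 0.0, "x_max": 4.0, "y_min": 1.0, "y_max": 3.0}
```

In practice a zone need not be rectangular; a polygon or a radius around an implement could be tested in the same way.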
The position of the operator 502 can be determined or transmitted to the controller 512 or the controller 560 of the vehicle 10 based on data from the positioning device 522, and/or based on data obtained from the awareness sensors 568. In some embodiments, the controller 560 of the vehicle 10 is configured to use a triangulation technique to determine the location or current position of the operator 502 or the wearable alert device 504 relative to the vehicle 10.
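One possible triangulation technique, given only as an illustrative sketch, is to trilaterate the wearable alert device 504 from range measurements to three awareness sensors at known positions on the vehicle. The sensor positions, ranges, and function name below are hypothetical.

```python
# Illustrative sketch of trilateration: estimate the wearable device's (x, y)
# position from ranges to three sensors at known positions on the vehicle.
# Positions and ranges are hypothetical.

def trilaterate(anchors, distances):
    """Solve for (x, y) from three anchor positions and measured ranges."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = distances
    # Subtracting the first range equation from the other two linearizes the
    # circle equations into a 2x2 linear system A @ [x, y] = b.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # Zero when the three sensors are collinear.
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

With noisy ranges or more than three sensors, a least-squares variant of the same linearization would be used instead of an exact solve.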
When the operator 502 is detected as being within the zone 98, the controller 512 and/or the controller 560 of the vehicle 10 may determine that an alert condition is present. The controller 512 or the controller 560 may determine that the alert condition is present if the operator 502 is within the zone 98, or if the operator 502 is within the zone 98 while the ASL 570 is being requested to operate. In response to the alert condition being present (e.g., the operator 502 being within the zone 98), the controller 512 may operate the visual alert devices 510, the aural alert devices 506, or the haptic alert devices 514 to notify the operator 502 that the alert condition is present, and/or to provide a severity of the alert condition to the operator 502.
In some embodiments, the controller 512 operates the visual alert devices 510 to notify the operator 502 that the operator is currently within the zone 98. The ASL 570 may be communicatively coupled to and controlled by the controller 560 of the vehicle 10. If a command is sent to the controller 560 to operate the ASL 570, both the visual alert devices 510 and the haptic alert devices 514 may be operated to indicate that the severity of the alert condition has increased. Similarly, if the ASL 570 begins to operate, the visual alert devices 510, the haptic alert devices 514, and the aural alert devices 506 may be operated to indicate that the severity of the alert condition has increased. In some embodiments, operation of the ASL 570 is limited (e.g., by the controller 560) if the operator 502 is currently within the zone 98.
The alerts provided to the operator 502 may prompt the operator 502 to move out of the zone 98 (e.g., out of the way of the ASL 570) and into zone 99 or into zone 96. Once the operator 502 moves out of the zone 98, the ASL 570 may be operated to perform its intended function. In this way, the wearable alert device 504 can be used to prompt the operator 502 regarding a current alert condition and to prompt the operator 502 to move in order to terminate the alert condition.
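The escalation described above for the side-loading-arm zone forms a simple severity ladder: presence in the zone alone, a pending command, and actual arm motion each engage progressively more of the wearable devices. A minimal sketch follows; the severity labels, function name, and device names are illustrative assumptions, since the disclosure does not prescribe an implementation.

```python
def asl_zone_alert(operator_in_zone, asl_commanded, asl_moving):
    """Return (severity, devices) for the side-loading-arm scenario:
    presence in the zone lights the visual alert only, a pending
    operate command adds haptic feedback, and actual arm motion
    adds the aural alert as well."""
    if not operator_in_zone:
        return None, set()
    if asl_moving:
        return "high", {"visual", "haptic", "aural"}
    if asl_commanded:
        return "medium", {"visual", "haptic"}
    return "low", {"visual"}
```

Once the operator moves out of the zone, the function returns no devices to drive, matching the termination of the alert condition described above.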
Referring to
Referring to
If the vehicle 580 is predicted to be within range of the door of the cab 40 at a future point in time (e.g., predicted based on observed motion, speed, etc., as provided by the awareness sensors 568), the controller 512 may operate any of, or a combination of, the visual alert devices 510, the aural alert devices 506, or the haptic alert devices 514. In some embodiments, when the vehicle 580 is approaching but is at a first distance, the controller 512 operates the visual alert devices 510 only in order to notify the operator 502. If the vehicle 580 approaches and is closer and still predicted to be within range of the door of the cab 40 (if opened), the controller 512 may operate the visual alert devices 510 and the haptic alert devices 514. If the vehicle 580 is proximate and approaching the vehicle 10 (e.g., within the zone 90), and the operator 502 reaches to open the door of the cab 40 on the street side 82 of the cab 40, the controller 512 may operate the visual alert devices 510, the haptic alert devices 514, and the aural alert devices 506 to warn the operator 502 to not open the door of the cab 40. In some embodiments, the controller 560 of the vehicle 10 is configured to obtain image data from cameras within the cab 40, to monitor a door sensor, etc., to determine if the operator 502 is about to open the door of the cab 40. In some embodiments, the controller 560 is configured to limit opening of the door of the cab 40 (e.g., lock the doors) so that the operator 502 does not open the door into the oncoming vehicle 580. Once the alert condition (e.g., the vehicle 580) has passed, the controller 512 may operate the visual alert devices 510, the aural alert devices 506, and/or the haptic alert devices 514 to notify the operator 502 that the alert condition has passed and that it is safe to open the door of the cab 40. In this way, the wearable alert device 504 can facilitate providing alerts to the operator 502 for predicted alert conditions.
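The distance-based escalation for this door-opening scenario might be sketched as follows. The prediction horizon and the distance thresholds are purely illustrative assumptions; the disclosure specifies only that alerts escalate as the approaching vehicle gets closer and when the operator reaches for the door.

```python
def door_alert(distance_m, speed_mps, reaching_for_door,
               horizon_s=10.0, near_m=40.0, danger_m=15.0):
    """Choose wearable devices for the door-opening scenario: nothing
    if the approaching vehicle cannot reach the door within the
    prediction horizon, visual only at long range, visual plus haptic
    at closer range, and all three devices if the operator is reaching
    for the door while the vehicle is dangerously close."""
    if speed_mps <= 0.0:
        return set()  # stationary or receding: no predicted hazard
    time_to_arrival_s = distance_m / speed_mps
    if time_to_arrival_s > horizon_s:
        return set()  # not predicted within range of the door
    if reaching_for_door and distance_m < danger_m:
        return {"visual", "haptic", "aural"}
    if distance_m < near_m:
        return {"visual", "haptic"}
    return {"visual"}
```

An empty return set corresponds to the all-clear notification once the alert condition has passed.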
Referring to
It should be understood that while
Referring to
Referring to
Memory 520 (e.g., memory, memory unit, storage device, etc.) can include one or more devices (e.g., RAM, ROM, Flash memory, hard disk storage, etc.) for storing data and/or computer code for completing or facilitating the various processes, layers and modules described in the present application. Memory 520 can be or include volatile memory or non-volatile memory. Memory 520 can include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present application. According to some embodiments, memory 520 is communicably connected to processor 518 via processing circuitry 516 and includes computer code for executing (e.g., by processing circuitry 516 and/or processor 518) one or more processes described herein.
In some embodiments, controller 512 is implemented within a single computer (e.g., one server, one housing, etc.). In various other embodiments, the functionality of the controller 512 can be distributed across multiple servers or computers (e.g., that can exist in distributed locations).
Similarly, the controller 560 of the vehicle 10 includes processing circuitry 562, a processor 564, and memory 566. Processing circuitry 562 can be communicably connected to the communications interface such that processing circuitry 562 and the various components thereof can send and receive data via the communications interface. Processor 564 can be implemented as a general purpose processor, an application specific integrated circuit (ASIC), one or more field programmable gate arrays (FPGAs), a group of processing components, or other suitable electronic processing components.
Memory 566 (e.g., memory, memory unit, storage device, etc.) can include one or more devices (e.g., RAM, ROM, Flash memory, hard disk storage, etc.) for storing data and/or computer code for completing or facilitating the various processes, layers and modules described in the present application. Memory 566 can be or include volatile memory or non-volatile memory. Memory 566 can include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present application. According to some embodiments, memory 566 is communicably connected to processor 564 via processing circuitry 562 and includes computer code for executing (e.g., by processing circuitry 562 and/or processor 564) one or more processes described herein.
In some embodiments, controller 560 is implemented within a single computer (e.g., one server, one housing, etc.). In various other embodiments, the functionality of the controller 560 can be distributed across multiple servers or computers (e.g., that can exist in distributed locations).
The positioning device 522 may be configured to communicate with the awareness sensors 568 so that the awareness sensors 568 can provide the controller 560 with detection data indicating a current position of the operator 502 relative to the vehicle 10. The positioning device 522 may be configured to additionally or alternatively report its current position to the controller 512, which can transmit the position to the personal computer device 550 or the controller 560 via a wireless transceiver 578 (e.g., a wireless radio, a cellular dongle, an ultra-wide band transceiver, etc.) to a wireless transceiver 576 (e.g., a wireless radio, a cellular dongle, an ultra-wide band transceiver, etc.) of the controller 560 of the vehicle 10. If the wireless transceiver 578 is an ultra-wide band transceiver, the wireless transceiver 578 may not need to pair with the controller 512 of the wearable alert device 504, the controller 560, and/or the awareness sensors 568 of the vehicle 10. In some embodiments, the positioning device 522 includes its own wireless transceiver that is configured to communicate with the wireless transceiver 576 of the controller 560 of the vehicle 10 to report its position.
The vehicle 10 also includes one or more body/chassis sensors 572 configured to provide sensor data to the controller 560. The body/chassis sensors 572 can include any speed sensors of the vehicle 10, sensors that measure a current degree of deployment of any of the lift arm actuators 144, the ASL 570, the FEL 140, the tailgate 136, a compaction apparatus of the vehicle 10, a current position of the chute 238, a current position of the turntable 262, a current position or deployment of the ladder assembly 254, etc. In some embodiments, the controller 560 is configured to determine the current positions of any implements of the vehicle 10 (e.g., a current position of the ASL 570, a current position of the FEL 140, a current position of the tailgate 136, etc.). In some embodiments, the controller 560 is also configured to determine or predict a future position of any implement of the vehicle 10 (e.g., a future or predicted position of the ASL 570, a future or predicted position of the FEL 140, a future or predicted position of the tailgate 136, etc.) based on control signals that are provided to any controllable systems 575 of the vehicle 10. The controllable systems 575 can include any of the controllable elements (e.g., linear actuators, electric actuators, electric motors, hydraulics, pneumatics, etc.) of the ASL 570, the FEL 140, the tailgate 136, a compaction apparatus of the vehicle 10, the boom assembly 354, the lift assembly 404, the ladder assembly 254, the turntable 262, etc.
The controller 560, or more particularly, the processing circuitry 562, can use the position of the positioning device 522 (e.g., the position of the wearable alert device 504, the position of the operator 502, etc.), the detection data obtained from the awareness sensors 568, the sensor data obtained from the body/chassis sensors 572, and the control signals provided to the controllable systems 575 that indicate a predicted or future position of any implements of the vehicle 10 to determine if an alert condition is present and to determine a severity or magnitude of the alert condition.
In some embodiments, the controller 560 is configured to provide the determined or identified alert condition and/or the alert severity to the controller 512 of the wearable alert device 504. The controller 512 of the wearable alert device 504 is configured to use the alert condition and the alert severity of the alert condition to determine controls for the haptic alert devices 514, the aural alert devices 506, and the visual alert devices 510 (e.g., which of the haptic alert devices 514, the aural alert devices 506, or the visual alert devices 510 to operate, how to operate the haptic alert devices 514, the aural alert devices 506, or the visual alert devices 510, etc.).
In some embodiments, the alert conditions that are detected by the controller 560 include determining if the operator 502 is about to exit the cab 40 as a vehicle or oncoming traffic is detected by the awareness sensors 568 (e.g., as illustrated in
In some embodiments, the alert severities include high, medium, and low severities. In some embodiments, certain alert conditions such as expected collisions with the vehicle 10 have a high alert severity. In some embodiments, alert conditions such as when the operator 502 is about to exit the cab 40 and there is incoming traffic, may have a medium severity. In some embodiments, alert conditions such as when the operator 502 is within a zone or within a path of an implement of the vehicle 10 (e.g., within a path or a zone of the ASL 570, within a path or zone of the FEL 140, within a path or zone of the tailgate 136, etc.) may have low or medium alert severity. In some embodiments, the alert severity is a quantified value such as a value between 1 and 10 with 1 being the lowest alert severity and 10 being the highest severity.
In some embodiments, the controller 512 is configured to use the alert condition and the alert severity corresponding to the alert condition to determine how to control the haptic alert devices 514, the aural alert devices 506, and the visual alert devices 510. In some embodiments, which of the haptic alert devices 514, the aural alert devices 506, or the visual alert devices 510 are operated by the controller 512 is determined based on the alert severity of the alert condition. For example, high or medium alert severities may include operating the haptic alert devices 514 in combination with the aural alert devices 506 and the visual alert devices 510. In some embodiments, a low alert severity includes operating the visual alert devices 510 and/or the aural alert devices 506 without operating the haptic alert devices 514. In some embodiments, certain low severity alert conditions (such as when the operator 502 is within a zone or in a path of an implement of the vehicle 10 but the implement is not yet being requested to be operated) may be associated with only operating the visual alert devices 510.
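One possible encoding of this severity-to-device mapping is shown below; the function name, device names, and the use of a boolean flag for the "implement not yet requested" case are assumptions for illustration.

```python
def devices_for_alert(severity, implement_commanded=True):
    """Map an alert severity to the set of wearable devices to drive:
    high and medium severities engage all three device types; low
    severity omits haptic feedback, and a low-severity condition whose
    implement has not yet been commanded is visual-only."""
    if severity in ("high", "medium"):
        return {"visual", "aural", "haptic"}
    if severity == "low":
        return {"visual", "aural"} if implement_commanded else {"visual"}
    return set()  # no recognized severity: drive nothing
```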
In some embodiments, the controller 560 and/or the controller 512 are configured to adjust the alert severity as the alert condition develops in real-time. For example, if there is a low probability of a collision (e.g., the approaching vehicle is far away and may still stop in time), the alert severity may be low, but if the probability of the collision increases (e.g., the approaching vehicle approaches the vehicle 10 and does not slow down, increases in speed, etc.), the controller 560 may increase the alert severity (e.g., to high or medium), and the controller 512 can accordingly operate the haptic alert devices 514, the aural alert devices 506, and/or the visual alert devices 510 to notify the operator 502 that the alert severity has increased. In some embodiments, alert severity may increase based on operation of the vehicle 10, or more specifically, the controllable systems 575 of the vehicle 10 (e.g., the ASL 570, the FEL 140, the tailgate 136, etc.). For example, if the operator 502 is within the zone 98 but the ASL 570 is not yet being operated, the alert severity may be low. However, if the ASL 570 is about to begin to operate or is commanded to operate and the operator 502 is within the zone 98 or in a path of the ASL 570, the alert severity may be increased, updated, modified, etc. (e.g., by the controller 560) to medium or high. The controller 560 provides the updated alert severity of the alert condition to the controller 512, which adjusts operation of the haptic alert devices 514, the aural alert device 506, and/or the visual alert devices 510 to inform the operator 502 regarding the updated alert severity.
In some embodiments, the aural alerts provided by the aural alert devices 506 include spoken words or phrases to instruct the operator 502 to move (e.g., a spoken phrase of “Move out of the path of the ASL,” “Warning, please move,” “Please step away from my arm,” “Please move out of the way,” “Collision expected,” “Do not open the door,” “Oncoming traffic!,” etc.).
In some embodiments, the controller 560 is also configured to adjust operation of any of the controllable systems 575 based on the alert condition and/or the alert severity. For example, if the operator 502 is standing in the path of one of the implements (e.g., the application kit 80) of the vehicle 10 such as the ASL 570, the FEL 140, the tailgate 136, the chute 238, etc., and a command is sent to the controller 560 to operate the implement, the controller 560 may limit operation of the implement until the operator 502 has moved out of the zone or path of the implement. The operator 502 can be prompted to move out of the zone or path of the implement by operation of the haptic alert devices 514, the aural alert devices 506, and/or the visual alert devices 510 by the controller 512. In this way, the controller 512 and the controller 560 can operate cooperatively to both prompt the operator 502 regarding the alert condition and limit operation of the vehicle 10 or implements thereof until the alert condition is no longer present (e.g., until the operator 502 moves out of the way of the implement). In some embodiments, the controller 560 is also configured to actively operate the controllable systems 575 to mitigate a hazard. For example, if traffic is oncoming and the operator 502 is about to open the door of the cab 40 into the oncoming traffic, the controller 560 may operate a door lock of the controllable systems 575 to restrict or otherwise limit the operator 502 from opening the door into oncoming traffic, and keep the door locked until the oncoming traffic passes. In some embodiments, the controller 560 is configured to determine an alert condition if traffic is oncoming and the operator 502 is travelling towards the oncoming traffic.
For example, if the operator 502 is in front of the vehicle 10, and a car is oncoming along the street side 82 of the vehicle 10, the controller 560 can determine an alert condition is present if the operator 502 walks towards the street side 82 of the vehicle 10, based on the detection data provided by the awareness sensors 568. In some embodiments, the controller 512 operates the haptic alert devices 514, the aural alert devices 506, and/or the visual alert devices 510 based on the alert condition to notify the operator 502 to be careful due to the oncoming traffic along the street side 82 of the vehicle 10.
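The determination that the operator is travelling towards the oncoming traffic could, for example, reduce to a sign check on the component of the operator's velocity toward the street side. The 2-D vehicle-frame vectors and function name below are assumptions for illustration only.

```python
def walking_into_traffic(operator_velocity, toward_street_side, traffic_oncoming):
    """Flag an alert condition when the operator's motion has a
    positive component toward the street side of the vehicle while the
    awareness sensors report oncoming traffic. Both vectors are 2-D
    (x, y) tuples expressed in the vehicle-fixed frame."""
    vx, vy = operator_velocity
    sx, sy = toward_street_side
    return traffic_oncoming and (vx * sx + vy * sy) > 0.0
```

A dot product greater than zero means the operator is closing on the street side; motion parallel to or away from the street side raises no alert even when traffic is present.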
In some embodiments, the controller 512 is configured to store (e.g., in the memory 520) a table or database of all different possible alert conditions, and appropriate alert responses for each of the different alert conditions. In some embodiments, the table or database includes corresponding alert severities for each of the different alert conditions and appropriate alert responses for each of the alert conditions and the alert severities. For example, certain alert conditions may have different alert responses as the alert severity increases from low to medium to high, and the controller 512 can adjust operation of the haptic alert devices 514, the aural alert devices 506, and/or the visual alert devices 510 in real-time. In some embodiments, the controller 512 is configured to store the table or database as settings in the memory 520. In some embodiments, the controller 512 stores default or factory settings for operating the haptic alert devices 514, the aural alert devices 506, and/or the visual alert devices 510. In some embodiments, the settings for the operation of the wearable alert device 504 are customizable between different pre-determined modes of operation. For example, the wearable alert device 504 may be transitioned into a silent mode so that the aural alert devices 506 are not operated, and alerts are only provided to the operator 502 via the haptic alert devices 514 and the visual alert devices 510.
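The silent mode, and the locking of certain alert conditions against operator modification described later in this disclosure, might be handled as sketched below. The condition names, the contents of the locked set, and the function names are illustrative assumptions.

```python
# Conditions assumed locked against operator edits (illustrative only).
LOCKED_CONDITIONS = {"predicted rear collision", "predicted side collision"}

def apply_silent_mode(devices, silent_enabled):
    """In silent mode the aural devices are never driven; alerts fall
    back to the haptic and visual devices only."""
    return devices - {"aural"} if silent_enabled else set(devices)

def update_alert_response(settings, condition, devices):
    """Store an operator-edited response for one alert condition,
    refusing edits to conditions that are locked."""
    if condition in LOCKED_CONDITIONS:
        return False
    settings[condition] = set(devices)
    return True
```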
Referring still to
Referring still to
Referring to
The alert settings 1108 are shown displayed as a table that includes the different alert conditions and corresponding severity, and the corresponding visual alert, aural alert, and haptic alert, according to some embodiments. It should be understood that the alert conditions and corresponding alert actions shown in
The alert conditions are shown to include a predicted rear collision, a predicted side collision, oncoming traffic, several alert conditions for when the user or operator 502 is in a particular zone, and when no alerts are present. It should be understood that these alert conditions are not limiting. The predicted rear collision is shown having a high severity, and the corresponding alert actions include flashing red lights for the visual alert devices 510, providing a loud siren with the aural alert devices 506, and providing rapid discrete haptic feedback using the haptic alert devices 514. The predicted side collision is shown having a high severity, and the corresponding alert actions include flashing red lights for the visual alert devices 510, providing a loud siren with the aural alert devices 506, and providing rapid discrete haptic feedback using the haptic alert devices 514. The oncoming traffic alert condition is shown having a medium severity, and the corresponding alert actions include constant orange lights for the visual alert devices 510, and providing a beep tone with the aural alert devices 506. The first user in the zone alert condition is shown having a low severity, and the corresponding actions include constant yellow lighting provided via the visual alert devices 510. In some embodiments, the first user in the zone alert condition applies when the operator 502 is determined to be in a particular zone (e.g., the zone 98 as shown in
The third user in zone alert condition is shown having a high severity, and the corresponding alerts include providing a flashing red light visual alert using the visual alert devices 510, providing a loud siren using the aural alert devices 506, and providing rapid discrete haptic feedback using the haptic alert devices 514. The third user in zone alert condition may apply when the operator 502 is in the zone of the implement (e.g., in the zone 98 or path of the ASL 570, in the zone 92 or path of the FEL 140, in the zone 94 or path of the tailgate 136, etc.) and the implement is beginning to move.
When no alert condition is present, the controller 512 may operate the visual alert devices 510 to provide green lighting. In some embodiments, the controller 512 only provides the green lighting using the visual alert devices 510 for a certain amount of time after the alert condition has transitioned from a low, medium, or high severity alert condition to no alert condition.
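The alert-settings rows described above could be stored, for example, as a simple lookup structure like the one below. Only rows actually described in the preceding paragraphs are included; the condition keys are illustrative assumptions, and the second user-in-zone condition is omitted because its details are not given here.

```python
# Each row: (severity, visual alert, aural alert, haptic alert);
# None means that device type is not used for the condition.
ALERT_SETTINGS = {
    "predicted rear collision": ("high", "flashing red", "loud siren", "rapid discrete"),
    "predicted side collision": ("high", "flashing red", "loud siren", "rapid discrete"),
    "oncoming traffic": ("medium", "constant orange", "beep tone", None),
    "user in zone, implement idle": ("low", "constant yellow", None, None),
    "user in zone, implement moving": ("high", "flashing red", "loud siren", "rapid discrete"),
    "no alert": (None, "constant green", None, None),
}

def response_for(condition):
    """Look up the stored response for a detected alert condition."""
    return ALERT_SETTINGS.get(condition)
```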
In some embodiments, the operator 502 can select various alert responses (e.g., the responsive visual alerts, the responsive aural alerts, the responsive haptic alerts) and edit, modify, or update the alert responses. For example, the operator 502 may switch the visual alerts, the aural alerts, or the haptic alerts between different predetermined alerts. In some embodiments, one or more of the alert conditions (e.g., the rear collision, side collision, etc., alert conditions) are locked such that the operator 502 is limited from modifying the responsive alert actions. In some embodiments, the personal computer device 550 is configured to provide updated alert settings to the controller 512 when the operator 502 updates the alert settings 1108. In some embodiments, the alert settings 1108 are configured to display updated or different alert settings when the operator 502 enables or disables the silent mode 1104. In some embodiments, enabling the silent mode 1104 causes the personal computer device 550 to display the alert settings 1108 with the aural alerts grayed out.
Referring to
Process 1300 includes providing a wearable alert device including one or more visual alert devices, one or more aural alert devices, and/or one or more haptic alert devices (step 1302), according to some embodiments. In some embodiments, the visual alert devices include LEDs that are configured to emit one or more colors of light (e.g., red, green, blue, or any combination thereof) according to varying intensities, varying patterns, etc. In some embodiments, the aural alert devices include speakers, sirens, beepers, etc. In some embodiments, the haptic alert devices are vibrators or other devices configured to provide tactile or vibrational feedback to the user. The wearable alert device may be the wearable alert device 504. The wearable alert device can also include a positioning device or transmitter that is usable to determine a position of the user. The wearable device may have the form of a vest that is wearable by the user.
Process 1300 includes determining a position of an operator wearing the wearable alert device relative to a vehicle, the vehicle including an implement (step 1304), according to some embodiments. In some embodiments, the positioning device is configured to wirelessly communicate with a controller or different sensors of the vehicle to identify the position of the operator relative to the vehicle. The position of the operator wearing the wearable alert device relative to the vehicle can be provided to a controller of the vehicle and/or a controller of the wearable alert device. In some embodiments, step 1304 includes determining which side of the vehicle the user is on, how close the user is to a side of the vehicle, which of multiple zones surrounding the vehicle that the user is in, etc. In some embodiments, step 1304 is performed by the controller 560 or the controller 512.
Process 1300 includes obtaining sensor or operational data from one or more systems of the vehicle, and detection data from awareness sensors of the vehicle (step 1306), according to some embodiments. In some embodiments, the sensor data includes detection data obtained from one or more cameras, radar cameras, etc., such as the awareness sensors 568, which may detect different objects. In some embodiments, the sensor data includes operational data of any implement, side loading arm, boom arm, telehandler section, etc., indicating a current position, degree of extension, degree of deployment, etc., of the implement, side loading arm, boom arm, telehandler section, etc. In some embodiments, the sensor data includes a current position of the vehicle along a route. In some embodiments, the sensor data indicates environmental objects (e.g., stationary or moving) that surround the vehicle. In some embodiments, the detection data includes camera or image information provided by cameras that are mounted about the vehicle. In some embodiments, step 1306 is performed by the controller 560 of the vehicle 10, or by the controller 512 of the wearable alert device 504.
Process 1300 includes identifying an alert condition and a severity of the alert condition based on the position of the operator, the sensor data, and the detection data (step 1308), according to some embodiments. In some embodiments, the alert condition includes any of detecting if the operator is in a path of an implement of the vehicle, detecting if traffic is oncoming while the operator is in a cab of the vehicle, detecting if traffic is oncoming while the operator is moving to a street side of the vehicle, detecting if a collision is imminent, etc. In some embodiments, the severity of the alert condition is determined based on the type of alert condition that is present. In some embodiments, the detection data is used to determine or identify if a collision is imminent or if traffic is oncoming. In some embodiments, the position of the operator is used to determine the alert condition (e.g., to determine if the operator is standing in a path of the implement) and/or to determine the severity of the alert condition. In some embodiments, step 1308 is performed by the controller 560 or the controller 512.
Process 1300 includes operating any of the visual alert devices, the aural alert devices, and/or the haptic alert devices according to the alert condition and the severity to notify the operator regarding the alert condition and the severity (step 1310), according to some embodiments. In some embodiments, step 1310 includes operating the visual alert devices to provide a lighting alert according to a pattern, a certain color, a certain intensity, etc., to indicate the alert condition and the severity of the alert condition. In some embodiments, step 1310 includes operating the aural alert devices to provide an aural alert such as a beep, a siren, a spoken word or phrase, etc., to notify the operator regarding the alert condition and/or the severity of the alert condition. In some embodiments, step 1310 includes operating the haptic alert devices to provide haptic feedback to the operator to notify the operator regarding the alert condition and/or the severity of the alert condition. In some embodiments, step 1310 is performed by the controller 512 operating the visual alert device 510, the aural alert devices 506, and/or the haptic alert devices 514 based on the alert condition that is detected and the severity of the alert condition.
Process 1300 includes operating an alert system of the vehicle according to the alert condition, the severity, and settings of the wearable alert device (step 1312), according to some embodiments. In some embodiments, step 1312 includes operating a visual alert device or an aural alert device of the vehicle in combination with the wearable alert device. In some embodiments, step 1312 includes operating the alert system of the vehicle to compensate for a silent mode of the wearable alert device. In some embodiments, step 1312 is performed by the controller 560 of the vehicle 10 and the alert device 582 of the vehicle 10.
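Steps 1308 and 1310 of process 1300 could be combined into a single decision pass, sketched below with a simplified set of boolean inputs. The condition names, severity labels, and the reduction of the sensor and detection data to three flags are illustrative assumptions, not part of the disclosed process.

```python
def run_alert_cycle(operator_in_implement_zone, implement_commanded,
                    collision_predicted):
    """One pass through steps 1308-1310: identify the alert condition
    and its severity from fused inputs, then choose which wearable
    devices to operate."""
    # Step 1308: identify the alert condition and its severity.
    if collision_predicted:
        condition, severity = "predicted collision", "high"
    elif operator_in_implement_zone and implement_commanded:
        condition, severity = "operator in implement path", "medium"
    elif operator_in_implement_zone:
        condition, severity = "operator in implement path", "low"
    else:
        condition, severity = None, None
    # Step 1310: choose wearable devices according to the severity.
    device_map = {"high": {"visual", "aural", "haptic"},
                  "medium": {"visual", "haptic"},
                  "low": {"visual"}}
    return condition, severity, device_map.get(severity, set())
```

Step 1312 would then consume the same condition and severity to drive any alert system on the vehicle itself, e.g. to compensate for a wearable device in silent mode.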
Referring to
As utilized herein, the terms “approximately,” “about,” “substantially”, and similar terms are intended to have a broad meaning in harmony with the common and accepted usage by those of ordinary skill in the art to which the subject matter of this disclosure pertains. It should be understood by those of skill in the art who review this disclosure that these terms are intended to allow a description of certain features described and claimed without restricting the scope of these features to the precise numerical ranges provided. Accordingly, these terms should be interpreted as indicating that insubstantial or inconsequential modifications or alterations of the subject matter described and claimed are considered to be within the scope of the disclosure as recited in the appended claims.
It should be noted that the term “exemplary” and variations thereof, as used herein to describe various embodiments, are intended to indicate that such embodiments are possible examples, representations, or illustrations of possible embodiments (and such terms are not intended to connote that such embodiments are necessarily extraordinary or superlative examples).
The term “coupled” and variations thereof, as used herein, means the joining of two members directly or indirectly to one another. Such joining may be stationary (e.g., permanent or fixed) or moveable (e.g., removable or releasable). Such joining may be achieved with the two members coupled directly to each other, with the two members coupled to each other using a separate intervening member and any additional intermediate members coupled with one another, or with the two members coupled to each other using an intervening member that is integrally formed as a single unitary body with one of the two members. If “coupled” or variations thereof are modified by an additional term (e.g., directly coupled), the generic definition of “coupled” provided above is modified by the plain language meaning of the additional term (e.g., “directly coupled” means the joining of two members without any separate intervening member), resulting in a narrower definition than the generic definition of “coupled” provided above. Such coupling may be mechanical, electrical, or fluidic.
References herein to the positions of elements (e.g., “top,” “bottom,” “above,” “below”) are merely used to describe the orientation of various elements in the FIGURES. It should be noted that the orientation of various elements may differ according to other exemplary embodiments, and that such variations are intended to be encompassed by the present disclosure.
The hardware and data processing components used to implement the various processes, operations, illustrative logics, logical blocks, modules and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose single- or multi-chip processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, or any conventional processor, controller, microcontroller, or state machine. A processor also may be implemented as a combination of computing devices, such as a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. In some embodiments, particular processes and methods may be performed by circuitry that is specific to a given function. The memory (e.g., memory, memory unit, storage device) may include one or more devices (e.g., RAM, ROM, Flash memory, hard disk storage) for storing data and/or computer code for completing or facilitating the various processes, layers and modules described in the present disclosure. The memory may be or include volatile memory or non-volatile memory, and may include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present disclosure. According to an exemplary embodiment, the memory is communicably connected to the processor via a processing circuit and includes computer code for executing (e.g., by the processing circuit or the processor) the one or more processes described herein.
The present disclosure contemplates methods, systems, and program products on any machine-readable media for accomplishing various operations. The embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Although the figures and description may illustrate a specific order of method steps, the order of such steps may differ from what is depicted and described, unless specified differently above. Also, two or more steps may be performed concurrently or with partial concurrence, unless specified differently above. Such variation may depend, for example, on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations of the described methods could be accomplished with standard programming techniques with rule-based logic and other logic to accomplish the various connection steps, processing steps, comparison steps, and decision steps.
It is important to note that the construction and arrangement of the vehicle 10 and the systems and components thereof as shown in the various exemplary embodiments is illustrative only. Additionally, any element disclosed in one embodiment may be incorporated or utilized with any other embodiment disclosed herein. Although only one example of an element from one embodiment that can be incorporated or utilized in another embodiment has been described above, it should be appreciated that other elements of the various embodiments may be incorporated or utilized with any of the other embodiments disclosed herein.
This application claims the benefit of and priority to U.S. Provisional Application No. 63/325,667, filed on Mar. 31, 2022, the entire disclosure of which is hereby incorporated by reference herein.
| Number | Date | Country |
|---|---|---|
| 63325667 | Mar 2022 | US |