The present embodiments relate to a materials handling vehicle having a positioning assistance system that provides assistance to an operator that is driving the vehicle.
Known materials handling vehicles include a power unit, a mast assembly, and a platform assembly that includes a fork carriage assembly coupled to the mast assembly for vertical movement relative to the power unit.
In accordance with a first aspect of the disclosure, a materials handling vehicle is provided comprising: a power unit comprising a steered wheel and a steering device for generating a steer control signal; a load handling assembly coupled to the power unit; a controller located on the power unit for receiving the steer control signal; and a sensing device on the power unit coupled to the controller. The sensing device may monitor areas in front of and next to the vehicle. Data from the sensing device may be used by the controller to identify at least one of position information of the vehicle relative to a wall or rack or object information indicating that one or more objects are in front of or to the side of the vehicle. Based on the sensing device data, the controller may modify at least one of the following vehicle parameters: a maximum allowable turning angle or a steered-wheel-to-steering-device ratio.
The controller may modify the at least one of the maximum allowable turning angle or the steered-wheel-to-steering-device ratio when the position information indicates that the vehicle is positioned within a predefined distance from the wall or rack.
The controller may modify the steered-wheel-to-steering-device ratio from a larger ratio to a smaller ratio when the position information indicates that the vehicle is positioned within a predefined distance from the wall or rack.
The controller may modify the at least one of the maximum allowable turning angle or the steered-wheel-to-steering-device ratio when the position information indicates that the vehicle is positioned within a predefined distance from the wall or rack and the object information indicates that an object is in front of or to the side of the vehicle.
The controller may reduce the maximum allowable turning angle from a first maximum allowable turning angle to a second maximum allowable turning angle when the position information indicates that the vehicle is positioned within a predefined distance from the wall or rack and the object information indicates that an object is in front of or to the side of the vehicle, wherein the second maximum allowable turning angle is less than the first maximum allowable turning angle.
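The turning-angle and steer-ratio modifications described above may be illustrated with a short sketch. The function name, the distance threshold, and all angle and ratio values below are hypothetical examples chosen for illustration only; the disclosure does not prescribe any particular implementation.

```python
# Illustrative sketch of the steering-limit logic; all thresholds,
# angles, and ratios are hypothetical example values.

def steering_limits(distance_to_wall, object_detected,
                    predefined_distance=1.5,
                    first_max_angle=90.0, second_max_angle=30.0,
                    larger_ratio=2.0, smaller_ratio=1.0):
    """Return (maximum allowable turning angle in degrees,
    steered-wheel-to-steering-device ratio).

    distance_to_wall: lateral distance (m) from the vehicle to the wall or rack
    object_detected:  True if an object is in front of or to the side of the vehicle
    """
    max_angle = first_max_angle
    ratio = larger_ratio
    if distance_to_wall < predefined_distance:
        # Within the predefined distance: reduce the ratio so a given
        # steering-device input produces less steered-wheel movement.
        ratio = smaller_ratio
        if object_detected:
            # An object is also nearby: clamp the maximum turning angle.
            max_angle = second_max_angle
    return max_angle, ratio
```

For example, with the hypothetical defaults, a vehicle 1.0 m from a rack with an object detected would be limited to the smaller ratio and the second (reduced) maximum allowable turning angle.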
The materials handling vehicle may further comprise a light source device coupled to the controller. The light source device may be controlled by the controller to designate an area between the vehicle and the wall or rack as a limited operation area when the position information indicates that the vehicle is positioned within a predefined distance from the wall or rack located adjacent to the side of the vehicle. The light source device may designate the area as a limited operation area in a manner that can be observed by a person in the vicinity of the vehicle.
In accordance with a second aspect of the present disclosure, a method is provided for controlling a materials handling vehicle. The materials handling vehicle may comprise: a power unit comprising: a steered wheel, and a steering device for generating a steer control signal; a load handling assembly coupled to the power unit; a controller located on the power unit for receiving the steer control signal; and a sensing device on the power unit and coupled to the controller. The method may comprise: monitoring, via the sensing device, areas in front of and next to the vehicle; identifying, by the controller, using data from the sensing device, at least one of position information of the vehicle relative to a wall or rack or object information indicating that one or more objects are in front of or to the side of the vehicle; and modifying, by the controller, based on sensing device data, at least one of the following vehicle parameters: a maximum allowable turning angle or a steered-wheel-to-steering-device ratio.
The controller may modify the at least one of the maximum allowable turning angle or the steered-wheel-to-steering-device ratio when the position information indicates that the vehicle is positioned within a predefined distance from the wall or rack.
The controller may modify the steered-wheel-to-steering-device ratio from a larger ratio to a smaller ratio when the position information indicates that the vehicle is positioned within a predefined distance from the wall or rack.
The controller may modify the at least one of the maximum allowable turning angle or the steered-wheel-to-steering-device ratio when the position information indicates that the vehicle is positioned within a predefined distance from the wall or rack and the object information indicates that an object is in front of or to the side of the vehicle.
In accordance with a third aspect of the present disclosure, a materials handling vehicle is provided comprising: a power unit comprising: a steered wheel, and a steering device for generating a steer control signal; a load handling assembly coupled to the power unit; a controller located on the power unit for receiving the steer control signal; and a sensing device on the power unit and coupled to the controller. The sensing device may monitor areas in front of and next to the vehicle. Data from the sensing device may be used by the controller to identify at least one of position information of the vehicle relative to at least one wall or rack near which the vehicle is located or object information indicating that one or more objects are in front of or to the side of the vehicle. Based on sensing device data, the controller may modify at least one of the following vehicle parameters: a load handling assembly lift height, a maximum allowable turning angle or a steered-wheel-to-steering-device ratio.
The controller may modify the at least one of the load handling assembly lift height, the maximum allowable turning angle or the steered-wheel-to-steering-device ratio when the position information indicates that the vehicle is positioned within an aisle between a first wall or rack and a second wall or rack.
The controller may modify the at least one of the load handling assembly lift height, the maximum allowable turning angle or the steered-wheel-to-steering-device ratio when the position information indicates that the vehicle is positioned within an aisle between a first wall or rack and a second wall or rack and the object information indicates that an object is in front of or to the side of the power unit.
The controller may reduce the maximum allowable turning angle from a first maximum allowable turning angle to a second maximum allowable turning angle when the position information indicates that the vehicle is positioned within an aisle between the first wall or rack and the second wall or rack and the object information indicates that the object is in front of or to the side of the power unit, wherein the second maximum allowable turning angle is less than the first maximum allowable turning angle.
The load handling assembly may comprise a lift carriage. A lift height of the lift carriage may define the load handling assembly lift height. The controller may modify a maximum lift height of the lift carriage when the position information indicates that the vehicle is positioned within an aisle having a first aisle width.
The load handling assembly may comprise a lift carriage. A lift height of the lift carriage may define the load handling assembly lift height. The controller may modify the lift height of the lift carriage when the position information indicates that the vehicle is positioned within an aisle between a first wall or rack and a second wall or rack such that the lift carriage is moved to an intermediate height location.
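The aisle-dependent lift-height limit described above may be sketched as follows. The function name, the aisle-width threshold, and the height values are all hypothetical values chosen only to illustrate the idea that a narrower aisle may yield a lower permitted (intermediate) lift height.

```python
# Illustrative sketch: limit the lift-carriage height based on whether
# the vehicle is in an aisle and on the aisle width.  All numeric
# values are hypothetical examples.

def max_lift_height(in_aisle, aisle_width,
                    narrow_aisle_width=2.5,
                    full_height=10.0, intermediate_height=4.0):
    """Return the maximum permitted lift-carriage height (m).

    in_aisle:    True if position information places the vehicle in an aisle
    aisle_width: width (m) of the aisle between the first and second wall or rack
    """
    if in_aisle and aisle_width <= narrow_aisle_width:
        # In a narrow aisle, restrict the carriage to an intermediate height.
        return intermediate_height
    # Otherwise the full lift height remains available.
    return full_height
```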
The materials handling vehicle may further comprise a light source device coupled to the controller. The light source device may be controlled by the controller to designate an area between the vehicle and the wall or rack as a limited operation area when the position information indicates that the vehicle is positioned within a predefined distance from the wall or rack located adjacent to the side of the vehicle. The light source device may designate the area as a limited operation area in a manner that can be observed by a person in the vicinity of the vehicle.
In accordance with a fourth aspect of the present disclosure, a method is provided for controlling a materials handling vehicle. The materials handling vehicle may comprise: a power unit comprising: a steered wheel, and a steering device for generating a steer control signal; a load handling assembly coupled to the power unit; a controller located on the power unit for receiving the steer control signal; and a sensing device on the power unit and coupled to the controller. The method may comprise: monitoring, by the sensing device, areas in front of and next to the vehicle; identifying, by the controller, using data from the sensing device, at least one of position information of the vehicle relative to at least one wall or rack near which the vehicle is located or object information indicating that one or more objects are in front of or to the side of the vehicle; and modifying, by the controller, based on sensing device data, at least one of the following vehicle parameters: a load handling assembly lift height, a maximum allowable turning angle or a steered-wheel-to-steering-device ratio.
The controller may modify the at least one of the load handling assembly lift height, the maximum allowable turning angle or the steered-wheel-to-steering-device ratio when the position information indicates that the vehicle is positioned within an aisle between a first wall or rack and a second wall or rack.
The load handling assembly may comprise a lift carriage. A lift height of the lift carriage may define the load handling assembly lift height. The controller may modify a maximum lift height of the lift carriage when the position information indicates that the vehicle is positioned within an aisle having a first aisle width.
The load handling assembly may comprise a lift carriage. A lift height of the lift carriage may define the load handling assembly lift height. The controller may modify the lift height of the lift carriage when the position information indicates that the vehicle is positioned within an aisle between a first wall or rack and a second wall or rack such that the lift carriage is moved to an intermediate height location.
In accordance with a fifth aspect of the present disclosure, a materials handling vehicle is provided comprising: a power unit comprising a left side, a right side and an operator station; a load handling assembly coupled to the power unit and comprising a lift carriage; a controller located on the power unit; and a sensing device on the power unit and coupled to the controller. The sensing device may monitor areas in front of and next to the power unit, wherein data from the sensing device may be used by the controller to identify position information of the power unit relative to a wall or rack near which the power unit is located. A sensing system may be provided that detects that an operator has exited the operator station of the vehicle and whether the operator exited the operator station from a first exit on the left side of the power unit or from a second exit on the right side of the power unit. When the position information indicates that one of the left or the right side of the power unit is positioned within a predefined distance from a wall or rack and the operator has exited the operator station from the one of the first or the second exit located on that side, the controller may modify at least one of the following vehicle parameters: vehicle traction control, operation of the lift carriage or remote control operation of the vehicle.
In accordance with a sixth aspect of the present disclosure, a method is provided for controlling a materials handling vehicle. The materials handling vehicle may comprise: a power unit comprising a left side, a right side and an operator station; a load handling assembly coupled to the power unit and comprising a lift carriage; a controller located on the power unit; and a sensing device on the power unit and coupled to the controller. The method may comprise: monitoring, by the sensing device, areas in front of and next to the power unit; identifying, by the controller, using data from the sensing device, position information of the power unit relative to a wall or rack near which the power unit is located; detecting, by a sensing system, that an operator has exited the operator station of the vehicle and whether the operator exited the operator station from a first exit on the left side of the power unit or from a second exit on the right side of the power unit; and modifying, by the controller, when (i) the position information indicates that one of the left or the right side of the power unit is positioned within a predefined distance from a wall or rack and (ii) the sensing system has detected that the operator has exited the operator station from the one of the first or the second exit located on that side, at least one of the following vehicle parameters: vehicle traction control, operation of the lift carriage or remote control operation of the vehicle.
The following text sets forth a broad description of numerous different embodiments of the present disclosure. The description is to be construed as exemplary only and does not describe every possible embodiment since describing every possible embodiment would be impractical, if not impossible, and it will be understood that any feature, characteristic, component, composition, ingredient, product, step or methodology described herein can be deleted, combined with or substituted for, in whole or part, any other feature, characteristic, component, composition, ingredient, product, step or methodology described herein. It should be understood that multiple combinations of the embodiments described and shown are contemplated and that a particular focus on one embodiment does not preclude its inclusion in a combination of other described embodiments. Numerous alternative embodiments could also be implemented, using either current technology or technology developed after the filing date of this patent, which would still fall within the scope of the claims. All publications and patents cited herein are incorporated herein by reference.
Referring now to the drawings, and particularly to
The illustrated power unit 14 comprises a step-through operator station 20 dividing a first end section of the power unit 14 (opposite the forks 16) from a second end section (proximate the forks 16). The operator station 20 includes a platform 21 upon which an operator may stand to drive the vehicle 10 and/or to provide a position from which the operator may operate various included features of the vehicle 10. The operator manually controls traveling functions of the vehicle 10 using operator controls 24 provided in the operator station 20.
The power unit 14 further comprises at least one steered wheel 108. The vehicle 10 comprises a steer-by-wire system 80 for effecting angular movement of the steered wheel 108. The steer-by-wire system 80 comprises a control handle 90 forming part of the operator controls 24, a steer motor 114 and the steered wheel 108, see
Presence sensors 22 (see
The vehicle 10 illustrated in
According to one embodiment shown in
The remote control device 32 is manually operable by an operator, e.g., by pressing a button or other control, to cause the remote control device 32 to wirelessly transmit at least a first type signal designating a travel request to a vehicle 10 that is paired to the remote control device 32. The travel request is a command that requests the vehicle 10 to travel, as will be described in greater detail herein. Although the remote control device 32 is illustrated in
The vehicle 10 also comprises one or more contactless obstacle sensors 40, which are provided about the vehicle 10, e.g., towards the first end section of the power unit 14 as shown in
The obstacle sensors 40 may comprise any suitable proximity detection technology, such as ultrasonic sensors, image capture devices, infrared sensors, laser scanner sensors, etc., which are capable of detecting the presence of objects/obstacles or are capable of generating signals that can be analyzed to detect the presence of objects/obstacles within the predefined detection zone(s). In the exemplary embodiment illustrated in
The first obstacle detector 42 may comprise a sweeping or scanning laser sensor capable of detecting objects, for example, in first, second, and third zones Z1, Z2, Z3 (also referred to herein as scan zones or detection zones), which first, second, and third zones Z1, Z2, Z3 may comprise planar zones, see
The second obstacle detectors 44A and 44B may comprise point laser sensors that are capable of detecting objects between one or more of the zones Z1, Z2, Z3 of the first obstacle detector 42 and the vehicle 10, i.e., underneath one or more of the zones Z1, Z2, Z3, as illustrated in
Additional sensor configurations and/or detection zones may be used.
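As an aid to the zone description above, the classification of a detected object into one of the scan zones Z1, Z2, Z3 may be sketched as a simple distance comparison. The zone boundary distances below are hypothetical; the disclosure does not specify particular zone dimensions.

```python
# Illustrative sketch: classify a detected object into scan zone
# Z1, Z2 or Z3 by its distance from the vehicle.  Zone boundary
# values (in metres) are hypothetical examples.

def zone_for_distance(d, z1_limit=1.0, z2_limit=2.0, z3_limit=3.0):
    """Return 'Z1', 'Z2' or 'Z3' for an object at distance d (m),
    or None if the object lies beyond all detection zones."""
    if d <= z1_limit:
        return "Z1"
    if d <= z2_limit:
        return "Z2"
    if d <= z3_limit:
        return "Z3"
    return None
```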
Referring to
Thus, the controller 103 may define, at least in part, a data processing system suitable for storing and/or executing program code and may include at least one processor coupled directly or indirectly to memory elements, e.g., through a system bus or other suitable connection. The memory elements can include local memory employed during actual execution of the program code, memory that is integrated into a microcontroller or application specific integrated circuit (ASIC), a programmable gate array or other reconfigurable processing device, etc.
The response implemented by the controller 103 upon receipt of wirelessly transmitted commands, e.g., commands sent via a wireless transmitter 178 of the remote control device 32 (to be discussed below) to the receiver 102 on the vehicle 10, may comprise one or more actions, or inaction, depending upon the logic that is being implemented. Positive actions may comprise controlling, adjusting or otherwise affecting one or more components of the vehicle 10. The controller 103 may also receive information from other inputs 104, e.g., from sources such as the presence sensors 22, the obstacle sensors 40, switches, load sensors, encoders and other devices/features available to the vehicle 10 to determine appropriate action in response to the received commands from the remote control device 32. The sensors 22, 40, etc. may be coupled to the controller 103 via the inputs 104 or via a suitable truck network, such as a controller area network (CAN) bus 110.
In an exemplary arrangement, the remote control device 32 is operative to wirelessly transmit a control signal that represents a first type signal such as a travel command to the receiver 102 on the vehicle 10. The travel command is also referred to herein as a “travel signal”, “travel request” or “go signal”. The travel request is used to initiate a request to the vehicle 10 to travel, e.g., for as long as the travel signal is received by the receiver 102 and/or sent by the remote control device 32, by a predetermined amount, e.g., to cause the vehicle 10 to advance or jog in a first direction by a limited travel distance, or for a limited time. The first direction may be defined, for example, by movement of the vehicle 10 in a power unit 14 first, i.e., forks 16 to the back, direction. However, other directions of travel may alternatively be defined. Moreover, the vehicle 10 may be controlled to travel in a generally straight direction or along a previously determined heading. Correspondingly, the limited travel distance may be specified by an approximate travel distance, travel time or other measure.
Thus, a first type signal received by the receiver 102 is communicated to the controller 103. If the controller 103 determines that the travel signal is a valid travel signal and that the current vehicle conditions are appropriate, the controller 103 sends a signal to the appropriate control configuration of the vehicle 10 to advance and then stop the vehicle 10. Stopping the vehicle 10 may be implemented, for example, by either allowing the vehicle 10 to coast to a stop or by initiating a brake operation to cause the vehicle 10 to brake to a stop.
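The controller's handling of a first type (travel) signal may be sketched as below. The function and flag names are hypothetical, and the sketch only mirrors the sequence just described: validate the signal and current vehicle conditions, then advance and stop, either by braking or by coasting.

```python
# Illustrative sketch of the controller's response to a travel signal.
# Names and the action strings are hypothetical.

def handle_travel_signal(valid_signal, conditions_ok, brake_to_stop=True):
    """Return the sequence of actions the controller would issue for a
    received first type (travel) signal.

    valid_signal:  True if the travel signal is determined to be valid
    conditions_ok: True if current vehicle conditions are appropriate
    brake_to_stop: choose a braked stop; otherwise coast to a stop
    """
    if not (valid_signal and conditions_ok):
        # Invalid signal or inappropriate conditions: take no action.
        return ["ignore"]
    actions = ["advance"]
    actions.append("brake_to_stop" if brake_to_stop else "coast_to_stop")
    return actions
```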
As an example, the controller 103 may be communicably coupled to a traction control system, illustrated as a traction motor controller 106 of the vehicle 10. The traction motor controller 106 is coupled to a traction motor 107 that drives the at least one steered wheel 108 of the vehicle 10. The controller 103 may communicate with the traction motor controller 106 so as to accelerate, decelerate, adjust and/or otherwise limit the speed of the vehicle 10 in response to receiving a travel request from the remote control device 32. As noted above, the controller 103 may also be communicably coupled to the steer controller 112, which is coupled to the steer motor 114 that steers at least one steered wheel 108 of the vehicle 10. In this regard, the vehicle 10 may be controlled by the controller 103 to travel an intended path or maintain an intended heading in response to receiving a travel request from the remote control device 32.
As yet another illustrative example, the controller 103 may be communicably coupled to a brake controller 116 that controls vehicle brakes 117 to decelerate, stop or otherwise control the speed of the vehicle 10 in response to receiving a travel request from the remote control device 32. Still further, the controller 103 may be communicably coupled to other vehicle features, such as main contactors 118, and/or other outputs 119 associated with the vehicle 10, where applicable, to implement desired actions in response to implementing remote travel functionality.
According to various embodiments, the controller 103 may communicate with the receiver 102 and with the traction motor controller 106 to operate the vehicle 10 under remote control in response to receiving travel commands from the associated remote control device 32. Moreover, the controller 103 may be configured to perform various actions if the vehicle 10 is traveling under remote control in response to a travel request and an obstacle is detected in one or more of the detection zone(s) Z1, Z2, Z3. In this regard, when a travel signal is received by the controller 103 from the remote control device 32, any number of factors may be considered by the controller 103 to determine whether the received travel signal should be acted upon to initiate and/or sustain movement of the vehicle 10.
Correspondingly, if the vehicle 10 is moving in response to a command received by the remote control device 32, the controller 103 may dynamically alter, control, adjust or otherwise affect the remote control operation, e.g., by stopping the vehicle 10, changing the steer angle of the vehicle 10, or taking other actions. Thus, the particular vehicle features, the state/condition of one or more vehicle features, vehicle environment, etc., may influence the manner in which the controller 103 responds to travel requests from the remote control device 32.
The controller 103 may refuse to acknowledge a received travel request depending upon predetermined condition(s), e.g., that relate to environmental or operational factor(s). For example, the controller 103 may disregard an otherwise valid travel request based upon information obtained from one or more of the sensors 22, 40. As an illustration, according to various embodiments, the controller 103 may optionally consider factors such as whether an operator is on the vehicle 10 when determining whether to respond to a travel command from the remote control device 32. As noted above, the vehicle 10 may comprise at least one presence sensor 22 for detecting whether an operator is positioned on the vehicle 10. In this regard, the controller 103 may be further configured to respond to a travel request to operate the vehicle 10 under remote control when the presence sensor(s) 22 designate that no operator is on the vehicle 10. Thus, in this implementation, the vehicle 10 cannot be operated in response to wireless commands from the remote control device 32 unless the operator is physically off of the vehicle 10. Similarly, if the obstacle sensors 40 detect that an object, including the operator, is adjacent and/or proximate to the vehicle 10, the controller 103 may refuse to acknowledge a travel request from the remote control device 32. Thus, in an exemplary implementation, an operator must be located within a limited range of the vehicle 10, e.g., close enough to the vehicle 10 to be in wireless communication range (which may be limited to set a maximum distance of the operator from the vehicle 10). Other arrangements may alternatively be implemented.
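The acceptance checks described in this paragraph may be combined into a single predicate, sketched below. The parameter names are hypothetical; the sketch assumes the exemplary implementation in which a travel request is honored only when no operator is on the vehicle, no object (including the operator) is adjacent or proximate to it, and the operator remains within wireless communication range.

```python
# Illustrative sketch of the travel-request acceptance checks.
# Parameter names are hypothetical.

def accept_remote_travel(operator_on_vehicle, obstacle_adjacent,
                         in_wireless_range):
    """Return True if a wireless travel request should be acknowledged.

    operator_on_vehicle: presence sensor(s) 22 detect an operator on board
    obstacle_adjacent:   obstacle sensors 40 detect an object adjacent/
                         proximate to the vehicle
    in_wireless_range:   the remote control device is within communication
                         range of the vehicle
    """
    return (not operator_on_vehicle) and (not obstacle_adjacent) \
        and in_wireless_range
```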
Any other number of reasonable conditions, factors, parameters or other considerations may also/alternatively be implemented by the controller 103 to interpret and take action in response to received signals from the transmitter 178.
Upon acknowledgement of a travel request, the controller 103 interacts with the traction motor controller 106, e.g., directly or indirectly, e.g., via a bus such as the CAN bus 110 if utilized, to advance the vehicle 10. Depending upon the particular implementation, the controller 103 may interact with the traction motor controller 106 and optionally, the steer controller 112, to advance the vehicle 10 for as long as a travel control signal is received. Alternatively, the controller 103 may interact with the traction motor controller 106 and optionally, the steer controller 112, to advance the vehicle 10 for a period of time or for a predetermined distance in response to the detection and maintained actuation of a travel control on the remote control device 32. Still further, the controller 103 may be configured to “time out” and stop the travel of the vehicle 10 based upon a predetermined event, such as exceeding a predetermined time period or travel distance regardless of the detection of maintained actuation of a corresponding control on the remote control device 32.
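The "time out" behavior described above may be sketched as a stop condition that fires when the travel control signal is no longer received, or when a predetermined time period or travel distance is exceeded regardless of maintained actuation. The limit values below are hypothetical.

```python
# Illustrative sketch of the travel time-out logic.  The time and
# distance limits are hypothetical example values.

def should_stop(elapsed_time, distance_traveled, signal_active,
                max_time=10.0, max_distance=5.0):
    """Return True if remotely controlled travel should be stopped.

    elapsed_time:      seconds since travel began
    distance_traveled: metres travelled since travel began
    signal_active:     True while the travel control signal is received
    """
    return (not signal_active) \
        or elapsed_time > max_time \
        or distance_traveled > max_distance
```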
The remote control device 32 may also be operative to transmit a second type signal, such as a “stop signal”, designating that the vehicle 10 should brake and/or otherwise come to rest. The second type signal may also be implied, e.g., after implementing a “travel” command, e.g., after the vehicle 10 has traveled a predetermined distance, traveled for a predetermined time, etc., under remote control in response to the travel command. If the controller 103 determines that a wirelessly received signal is a stop signal, the controller 103 sends a signal to the traction motor controller 106, the brake controller 116 and/or other truck component to bring the vehicle 10 to a rest. As an alternative to a stop signal, the second type signal may comprise a “coast signal” or a “controlled deceleration signal” designating that the vehicle 10 should coast, eventually slowing to rest.
The time that it takes to bring the vehicle 10 to a complete rest may vary, depending for example, upon the intended application, the environmental conditions, the capabilities of the particular vehicle 10, the load on the vehicle 10 and other similar factors. For example, after completing an appropriate jog movement, it may be desirable to allow the vehicle 10 to “coast” some distance before coming to rest so that the vehicle 10 stops slowly. This may be achieved by utilizing regenerative braking to slow the vehicle 10 to a stop. Alternatively, a braking operation may be applied after a predetermined delay time to allow a predetermined range of additional travel to the vehicle 10 after the initiation of the stop operation. It may also be desirable to bring the vehicle 10 to a relatively quicker stop, e.g., if an object is detected in the travel path of the vehicle 10 or if an immediate stop is desired after a successful jog operation. For example, the controller 103 may apply predetermined torque to the braking operation. Under such conditions, the controller 103 may instruct the brake controller 116 to apply the brakes 117 to stop the vehicle 10.
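The choice between a gentle stop and a relatively quicker braked stop described above may be sketched as follows. The function and return-value names are hypothetical labels for the two strategies.

```python
# Illustrative sketch of the stop-strategy selection.  The strategy
# labels are hypothetical names for the behaviours described in the text.

def stop_strategy(obstacle_in_path, immediate_stop_requested=False):
    """Choose how to bring the vehicle to rest.

    obstacle_in_path:          an object is detected in the travel path
    immediate_stop_requested:  an immediate stop is desired, e.g., after
                               a successful jog operation
    """
    if obstacle_in_path or immediate_stop_requested:
        # Relatively quick stop: instruct the brake controller to apply
        # the brakes, optionally with a predetermined torque.
        return "apply_brakes"
    # Gentle stop: slow the vehicle via regenerative braking / coasting.
    return "regenerative_coast"
```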
A pairing system 34 can utilize, for example, a close range system to wirelessly communicate with a compatible close range system on the wireless remote control device 32. Using the pairing system 34, a vehicle 10 and wireless remote control device 32 can be "paired" such that the vehicle 10 will transmit messages to, and receive messages from, only its paired wireless remote control device 32. The pairing system 34 includes components that physically implement the communication method (e.g., Bluetooth, NFC, BLE, Wi-Fi, etc.) used to send messages and includes components that programmatically exchange information in an agreed upon protocol to establish and maintain a pairing. Thus, the pairing system 34 includes a device that can execute programmable instructions to implement a predetermined algorithm and protocol to accomplish pairing operations.
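The message filtering that a pairing provides may be sketched minimally as below. The class and method names are hypothetical, and the physical transport and pairing protocol (Bluetooth, NFC, etc.) are deliberately out of scope.

```python
# Minimal illustrative sketch of pairing-based message filtering.
# Class/method names are hypothetical; the transport layer is omitted.

class PairingSystem:
    def __init__(self):
        self.paired_id = None  # no remote paired yet

    def pair(self, remote_id):
        # Record the remote's identity during a close-range exchange.
        self.paired_id = remote_id

    def accept(self, message):
        # Act only on messages originating from the paired remote device.
        return self.paired_id is not None \
            and message.get("source") == self.paired_id
```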
With reference now to
The light source device 200 may comprise a light controller 1202 and one or more light sources 204 coupled to the light controller 1202, wherein the one or more light sources 204 may be located on the vehicle 10 and may comprise visible lasers, light bars, projectors, etc., which light sources 204 may project visible indicia on the floor adjacent to the vehicle 10 on the left side LS and/or right side RS, and also optionally in front of and/or behind the vehicle 10. See
When both of the distances D1 and D2 are concurrently greater than or equal to the predetermined distance, the controller 103 may activate the one or more light sources 204 such that they concurrently designate first and second areas on opposed sides of the vehicle 10 as non-limited operation areas. Also, when both of the distances D1 and D2 are concurrently less than the predetermined distance, the controller 103 may activate the one or more light sources 204 such that they concurrently designate the first and second areas on opposed sides of the vehicle 10 as limited operation areas. Additionally, when one of the distances D1 or D2 is greater than or equal to the predetermined distance, and the other of the distances D1 or D2 is concurrently less than the predetermined distance, the controller 103 may activate the one or more light sources 204 via the light controller 1202 such that the one or more light sources 204 concurrently designate one of the first and second areas as a limited operation area and the other of the first and second areas as a non-limited operation area.
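The three cases just described reduce to an independent per-side comparison against the predetermined distance, which may be sketched as follows. The function name and the 1.5 m threshold are hypothetical examples.

```python
# Illustrative sketch of the limited / non-limited area designation
# on the two opposed sides of the vehicle.  The threshold value is a
# hypothetical example.

def designate_areas(d1, d2, predetermined_distance=1.5):
    """Return the designation for the first and second areas on opposed
    sides of the vehicle.

    d1, d2: lateral distances (m) from each side of the vehicle to the
            nearest wall, rack or other object on that side
    """
    first = "limited" if d1 < predetermined_distance else "non-limited"
    second = "limited" if d2 < predetermined_distance else "non-limited"
    return first, second
```

Each side is designated independently, so the mixed case (one limited, one non-limited area) falls out of the same comparison.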
According to an embodiment, the one or more light sources 204 may designate a limited operation area using a first indicia 206 (see
The one or more light sources 204 and separate light sources 205 may be located anywhere on the vehicle 10, such as on the power unit 14, for example, and are preferably located where they can illuminate at least a portion of the floor between the load handling assembly 12 and the object 202 and between the power unit/operator station 14/20 and the object 202.
In embodiments, the controller 103 will only actuate the one or more light sources 204 and separate light sources 205 to illuminate the applicable indicia if the vehicle 10 is determined to be in an aisle 210. In such an embodiment, the light sources 204 and separate light sources 205 will not be activated while the vehicle 10 is in a location other than in an aisle 210. The vehicle 10 may be determined to be in an aisle 210, for example, by the controller 103 using sensor data from the obstacle sensor(s) 40, by a warehouse management system (WMS) that communicates with the vehicle 10, and/or using positional data of the vehicle 10, etc.
This embodiment provides an operator or other person in the vicinity of the vehicle 10 with a suggestion as to where they might not want to walk (limited operation area), in addition to a suggestion where they may want to walk (non-limited operation area). When the device 200 is located on the vehicle 10, the device 200 moves with the vehicle 10, which is beneficial in that there will be no limited operation area when the vehicle 10 is not in the vicinity. In other words, an area may only become a limited operation area when a vehicle 10 is present and is located close to the object 202, e.g., the wall or rack.
Referring now to
Referring again to
According to embodiments, if the vehicle 10 is positioned within a predefined distance from an object 202, e.g., a wall or rack, that is located adjacent to the side of the vehicle 10 from which an operator exited the vehicle 10, as determined by the sensing system 250, at least one function of the vehicle 10 may be modified by the controller 103, e.g., disabled, limited, or activated. The predefined distance is measured in the lateral direction LD between the vehicle 10 and the object 202. The predefined distance may be the same as, similar to, or different than the predetermined distance discussed above. This embodiment could be used along with the light source(s) 204, such that when stepping out of the vehicle, the operator will know whether they are stepping into a limited operation zone or a non-limited operation zone, i.e., based on the first or second indicia 206 or 208 illuminated on the floor adjacent to the vehicle 10. Hence, the light source(s) 204 could designate an area to the left side LS or right side RS of the vehicle 10 as a limited operation area when the vehicle is positioned within a predefined distance from an object 202, wherein the predefined distance may be the same as the predetermined distance discussed above.
The function(s) of the vehicle that are modified by the controller 103 may be, for example: traction control/traveling movement of the vehicle, e.g., the maximum allowable speed of the vehicle 10 may be limited or the traction control of the vehicle 10 may be disabled; functions of the load handling assembly, e.g., lift and/or lower may be limited or disabled; remote control functionality of the vehicle 10 via the remote control device 32, which may be disabled; or a vehicle alert system, which may be activated, e.g., to initiate an alarm.
As noted above, the data from the operator presence sensors 22 may additionally be used to determine that an operator has exited the vehicle 10. In this regard, the system 250 is additionally capable of detecting a situation wherein, for example, the operator has moved one foot out of the vehicle 10, but the other foot is still inside the vehicle 10, i.e., one of the light curtain sensors 252A or 252B detected a pass through (e.g., the operator's foot/leg passing through), but the operator presence sensors 22 still detect the presence of the operator on the platform 21. In this situation, the aforementioned function(s) of the vehicle may or may not be disabled by the controller 103, and/or the vehicle 10 may issue an alarm or other warning for the operator to move their foot/leg back into the operator station 20. Alternative measures may also be taken, such as, for example stopping the vehicle 10 until the operator returns their foot/leg into the operator station 20.
With reference now to
This embodiment could also be used with a vehicle that includes only a single exit. That is, if a single-exit vehicle is positioned within the predefined distance from a boundary object (e.g., a wall or rack) that is located adjacent to the side of the vehicle having the exit, at least one function of the vehicle 10 may be disabled as described herein.
This embodiment could also be used with a vehicle that includes two exits, but where only one of the exits would include a light curtain sensor. This configuration could be used, for example, where, while driving in an aisle, the vehicle will always be located closer to one side of the aisle than the other, e.g., a situation where the vehicle always drives along the left or right side of the aisle. In this case, only the exit corresponding to the side of the aisle that the vehicle drives along may include a light curtain sensor.
Turning now to
The position information may be used by the controller 103 to determine if the vehicle 10 is located in an aisle 210. For example, the vehicle 10 may be determined to be located in an aisle 210 if the distance D1 between the vehicle 10 and the first rack 202A, plus the distance D2 between the vehicle 10 and the second rack 202B, plus the width of the vehicle 10 are equal to or within a predefined range to a known width of the aisle 210 (if the distances D1 and D2 were to be measured from the longitudinal axis LA of the vehicle 10 to the respective racks 202A, 202B, as opposed to being measured from the left and right sides LS, RS of the vehicle 10, the width of the vehicle 10 would be taken out of this equation).
The position information may also be used by the controller 103 to determine if the vehicle 10 is located in a desired position within an aisle 210. For example, if the distances from the vehicle 10 to the first and second racks 202A, 202B are equal or within a predetermined tolerance, the vehicle 10 may be determined to be located in the center of the aisle 210. Or, if the distance from the vehicle 10 to one of the first rack 202A or the second rack 202B is equal to or within a predetermined tolerance to a predefined hugging distance (to be discussed below), and, optionally, if the operator is determined by the controller 103 not to be present on the vehicle 10 (e.g., via information from the sensing system 250), it may be determined that the vehicle 10 is in hugging mode (to be described below), or is in the proper position to begin hugging mode.
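The aisle-width check and the hugging-readiness check described above can be sketched as follows (an illustrative Python sketch; the function names, units, and tolerance values are assumptions, not the disclosed implementation):

```python
def in_aisle(d1, d2, vehicle_width, known_aisle_width, tolerance):
    """Aisle test: D1 + D2 + vehicle width should match the known aisle width.

    Distances are assumed measured from the left/right sides of the
    vehicle, so the vehicle width is added back in, as in the passage.
    """
    return abs((d1 + d2 + vehicle_width) - known_aisle_width) <= tolerance


def ready_for_hugging(d1, d2, hugging_distance, tolerance, operator_present):
    """Hugging-mode readiness: either side distance is at or within a
    predetermined tolerance of the predefined hugging distance and,
    optionally, no operator is present on the vehicle."""
    near_hug = (abs(d1 - hugging_distance) <= tolerance
                or abs(d2 - hugging_distance) <= tolerance)
    return near_hug and not operator_present
```

If D1 and D2 were instead measured from the longitudinal axis of the vehicle, the `vehicle_width` term would simply be dropped from the first function.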
The position information of the vehicle 10 relative to the boundary object(s) can be used by the controller 103 to modify at least one vehicle parameter. Exemplary vehicle parameters that can be modified in this way include: a maximum allowable travel speed (e.g., based on the position information, the maximum allowable travel speed can be reduced from a normal maximum allowable travel speed to a reduced maximum allowable travel speed or increased from the reduced maximum allowable travel speed to the normal maximum allowable travel speed); a maximum allowable turning angle (e.g., based on the position information, the maximum allowable turning angle can be reduced from a normal maximum allowable turning angle to a reduced maximum allowable turning angle or increased from the reduced maximum allowable turning angle to the normal maximum allowable turning angle); a steered-wheel-to-steering-device ratio; one or more vehicle lights (e.g., based on the position information, one or more lights on the vehicle 10 can be switched on or off); a lifting function of the load handling assembly (e.g., based on the position information, lifting/lowering function(s) of the load handling assembly 12 can be adjusted, such as lift/lower speed or a maximum lift height, and/or the load handling assembly 12 may be automatically raised or lowered to a desired height); indicia used to indicate that the vehicle is located in a particular area (e.g., based on the position information, the first, second, or third indicia 206, 208, 209 may be switched on or off); and/or, based on the position information, an alert may be given to indicate the presence of the vehicle 10 in an aisle 210, such as an audible alert, visual alert, alert on a display screen (e.g., the touchscreen TS), etc.
As noted above, the controller 103 receives the steer control signal from the control handle position sensor 100A, which senses the angular position of the control handle 90 within the angular range of approximately +/−60 degrees in the illustrated embodiment. Since a current steer control signal corresponds to a current position of the control handle 90 falling within the range of about +/−60 degrees and the steered wheel 108 is capable of rotating through an angular range of +/−90 degrees, the controller 103 converts the current control handle position, as indicated by the steer control signal, to a corresponding desired angular position of the steered wheel 108 by multiplying the current control handle position by a steered-wheel-to-steering-device ratio, such as 90/60 or 1.5/1.0, e.g., an angular position of the control handle 90 of +60 degrees equals a desired angular position of the steered wheel 108 of +90 degrees. For example, if the angular position of the control handle 90 is +60 degrees, the controller 103 multiplies +60 degrees by the ratio of 1.5/1.0 to determine a desired angular position of the steered wheel 108 equal to +90 degrees and generates a corresponding steer actuation signal to the steer controller 112.
The steered-wheel-to-steering-device ratio may equal 60/60 or 1.0/1.0. For example, if the angular position of the control handle 90 is +60 degrees, the controller 103 may multiply +60 degrees by the ratio of 1.0/1.0 to determine a desired angular position of the steered wheel 108 equal to +60 degrees.
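The ratio conversion in the two examples above can be sketched as follows (an illustrative Python sketch; the function name and the clamp to the maximum allowable turning angle are assumptions drawn from the surrounding passages, not a disclosed implementation):

```python
def desired_steered_wheel_angle(handle_angle_deg, ratio=1.5,
                                max_wheel_angle_deg=90.0):
    """Convert a control-handle angle to a desired steered-wheel angle.

    The default ratio of 1.5 reflects the 90/60 example; a ratio of
    1.0 reproduces the 60/60 example. The result is clamped to the
    (possibly reduced) maximum allowable turning angle.
    """
    wheel = handle_angle_deg * ratio
    return max(-max_wheel_angle_deg, min(max_wheel_angle_deg, wheel))
```

With the 1.5/1.0 ratio, a +60 degree handle position yields a +90 degree desired wheel position; with the 1.0/1.0 ratio, the same handle position yields +60 degrees.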
The controller 103 may modify at least one of a maximum allowable turning angle of the steered wheel 108 or the steered-wheel-to-steering-device ratio when the position information indicates that the vehicle 10 is positioned within a predefined distance from an object 202, such as a wall or a rack that is located adjacent to the side of the vehicle 10. The controller 103 may modify at least one of the maximum allowable turning angle of the steered wheel 108 or the steered-wheel-to-steering-device ratio independent of whether the vehicle is being manually or remotely controlled by an operator. It is also contemplated that the controller 103 may only modify at least one of the maximum allowable turning angle of the steered wheel 108 or the steered-wheel-to-steering-device ratio when an operator is determined to be not present in the operator station 20, e.g., as determined by the sensing system 250, or when an operator is remotely controlling the vehicle 10 with a remote control device 32 that is paired to the vehicle 10.
The predefined distance, as noted above, is measured in the lateral direction LD between the vehicle 10 and the object 202. The predefined distance may be the same as, similar to, or different than the predetermined distance (defined such that when the distance D1 or D2 is less than the predetermined distance, the area corresponding to that distance D1 or D2 may be an area not sufficiently large to receive an operator or person while also maintaining a minimum clearance distance between the operator or person and the boundary object 202) discussed above. For example, the controller 103 may reduce the maximum allowable turning angle for the steered wheel 108 from a first maximum allowable turning angle to a second maximum allowable turning angle when the position information indicates that the vehicle 10 is positioned within the predefined distance from the wall or rack, wherein the second maximum allowable turning angle is less than the first maximum allowable turning angle. In
In a further example, the controller 103 may modify at least one of a maximum allowable turning angle of the steered wheel 108 or the steered-wheel-to-steering-device ratio when the position information, sensed by the sensing device 300, indicates that the vehicle 10 is positioned within a predefined distance from an object 202, such as a wall or a rack that is located adjacent to the side of the vehicle 10, and object information, also sensed by the sensing device 300, indicates that a further object is in front of or to the side of the vehicle 10, i.e., within a sensing range of the sensing device 300. For example, the controller 103 may reduce the maximum allowable turning angle from a first maximum allowable turning angle to a second maximum allowable turning angle when the position information indicates that the vehicle 10 is positioned within the predefined distance from the wall or rack and the object information indicates that a further object is in front of or to the side of the vehicle, wherein the second maximum allowable turning angle is less than the first maximum allowable turning angle. In
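The combined condition in this further example can be sketched as follows (an illustrative Python sketch; the function name and the particular reduced values are assumptions, since the passage only requires the second limits to be smaller than the first):

```python
def select_turning_limits(within_predefined_distance, object_detected,
                          normal_angle=90.0, reduced_angle=45.0,
                          normal_ratio=1.5, reduced_ratio=1.0):
    """Select the maximum allowable turning angle and the
    steered-wheel-to-steering-device ratio.

    The reduced limits apply only when both conditions hold: the
    vehicle is within the predefined distance from the wall or rack,
    AND a further object is in front of or to the side of the vehicle.
    """
    if within_predefined_distance and object_detected:
        return reduced_angle, reduced_ratio
    return normal_angle, normal_ratio
```

Dropping the `object_detected` term from the conjunction recovers the position-only example of the preceding paragraphs.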
In
In yet another example, the controller 103 may modify at least one of a load handling assembly lift height, a maximum allowable turning angle of the steered wheel 108 or the steered-wheel-to-steering-device ratio when the position information indicates that the vehicle 10 is positioned within an aisle. The lift height of the lift carriage may define the load handling assembly lift height. For example, the controller 103 may reduce a maximum lift height to which the fork carriage 226 and forks 216 may be raised, i.e., a maximum lift height of the lift carriage, once the sensing device 300 senses and the controller 103 determines that the vehicle 10 is located within an aisle. In a further embodiment, the controller 103 may reduce the maximum lift height to which the fork carriage 226 and forks 216 may be raised only when the sensing device 300 senses and the controller 103 determines that the vehicle 10 is located within an aisle having a designated or predefined aisle width (also referred to herein as "a first aisle width") or a width equal to or less than the predefined or first aisle width. It is noted that some freezers have a low ceiling and also have very narrow aisles. Hence, when the controller 103 determines that the vehicle 10 is moving through a narrow aisle having the designated or first aisle width, the controller 103 will limit the height to which the fork carriage 226 and forks 216 can be elevated to a lower maximum lift height to avoid contacting the ceiling. In a further example, the controller 103 may reduce the maximum allowable turning angle for the steered wheel 108 from a first maximum allowable turning angle to a second maximum allowable turning angle when the position information indicates that the vehicle 10 is located within an aisle, wherein the second maximum allowable turning angle is less than the first maximum allowable turning angle.
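The narrow-aisle lift-height limit described above can be sketched as follows (an illustrative Python sketch; the function name, parameter names, and example heights are assumptions for illustration only):

```python
def max_lift_height(vehicle_in_aisle, aisle_width, first_aisle_width,
                    normal_height, reduced_height):
    """Select the maximum lift height of the lift carriage.

    The reduced maximum applies only when the vehicle is in an aisle
    whose width is at or below the designated ("first") aisle width,
    e.g., a narrow freezer aisle with a low ceiling.
    """
    if vehicle_in_aisle and aisle_width <= first_aisle_width:
        return reduced_height
    return normal_height
```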
It is still further contemplated that the controller 103 may modify the maximum allowable turning angle of the steered wheel 108 to a reduced value concurrently with the sensing device 300 first sensing that the vehicle 10 is located within an aisle and may also modify, i.e., return, the maximum allowable turning angle of the steered wheel 108 to its higher value as soon as the sensing device 300 senses that the vehicle 10 is no longer located within an aisle. It is also contemplated that when the controller 103 determines that the vehicle 10 is located within an aisle, the steered-wheel-to-steering-device ratio may be changed from a larger ratio (e.g., 1.5/1.0) to a smaller ratio (e.g., 1.0/1.0) to make the steering of the steered wheel 108 less sensitive.
In a still further example, the controller 103 may modify at least one of a load handling assembly lift height, a maximum allowable turning angle of the steered wheel 108 or the steered-wheel-to-steering-device ratio when the position information indicates that the vehicle 10 is positioned within an aisle, and object information, sensed by the sensing device 300, indicates that a further object is in front of or to the side of the vehicle 10. For example, the controller 103 may reduce a maximum height to which the fork carriage 226 and forks 216 may be raised, once the sensing device 300 senses and the controller 103 determines that the vehicle 10 is located within an aisle and a further object is in front of or to the side of the vehicle 10. Further, the controller 103 may reduce the maximum allowable turning angle from a first maximum allowable turning angle to a second maximum allowable turning angle when the position information indicates that the vehicle 10 is positioned within an aisle and the object information indicates that a further object is in front of or to the side of the vehicle, wherein the second maximum allowable turning angle is less than the first maximum allowable turning angle. It is also contemplated that the steered-wheel-to-steering-device ratio may be changed from a larger ratio to a smaller ratio when the vehicle enters an aisle and an object is detected in front of or to the side of the vehicle 10 to make the steering of the steered wheel 108 less sensitive.
In another example, when the position information indicates that the vehicle 10 is located within an aisle, the controller 103 may modify the load handling assembly lift height by moving the lift carriage 226 to an intermediate height. Thus, when the controller 103 determines that the vehicle has entered an aisle, the controller 103 will automatically raise the lift carriage 226 to an intermediate height such that an operator, when picking items, does not have to bend over to place the items on the forks 216 located in a lower position, i.e., near a surface on which the vehicle is traveling. The intermediate height may be dependent on circumstances in which the vehicle 10 is being operated. For example, the intermediate height may be dependent on the aisle in which the vehicle 10 is currently being operated and/or the operator that is currently operating the vehicle 10. When the vehicle 10 is being operated in an aisle wherein the items to be placed on the forks 216 are large, it may be advantageous for the intermediate height to be preconfigured to a lower position than when the vehicle 10 is being operated in an aisle wherein the items to be placed on the forks 216 are small. Similarly, when the vehicle 10 is being operated by a short operator, it may be advantageous for the intermediate height to be preconfigured to a lower position than when the vehicle 10 is being operated by a tall operator.
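The intermediate-height selection described above can be sketched as follows (an illustrative Python sketch; the offsets and the coarse "large"/"small" and "short"/"tall" categories are invented for illustration, since the passage only states that the height may depend on the aisle's item size and the operator):

```python
def intermediate_height(base_height, aisle_item_size, operator_height):
    """Pick a preconfigured intermediate lift-carriage height for picking.

    Larger items and shorter operators both favor a lower preconfigured
    position, per the passage above. Heights in meters (an assumption).
    """
    height = base_height
    if aisle_item_size == "large":
        height -= 0.15  # large items: preconfigure a lower position
    if operator_height == "short":
        height -= 0.10  # shorter operators: also a lower position
    return round(height, 2)
```

In practice such values would be preconfigured per aisle and per operator profile rather than computed from two categorical inputs.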
Moreover, in the case where the position information is used by the controller 103 to modify multiple vehicle parameters, select one(s) of the vehicle parameters may be modified only situationally. For example, one or more of the vehicle parameters may be modified by the controller 103 only when an operator is determined to be not present in the operator station 20, e.g., as determined by the sensing system 250. As another example, one or more of the vehicle parameters may be modified by the controller 103 only when an operator is remotely controlling the vehicle 10 with a remote control device 32 that is paired to the vehicle 10.
With reference now to
At step 334, the vehicle 10 is determined to be located in the center of the aisle 210 by the controller 103 using the position information from the sensing device 300, when the distance D1 from the vehicle 10 to the first rack 202A is equal to or within a predetermined tolerance to the distance D2 from the vehicle 10 to the second rack 202B. The vehicle 10 is then moved within the aisle 210 at step 336, e.g., by the operator using the operator controls 24 or the remote control device 32. In its new location in the aisle 210, at step 338 the vehicle 10 is determined by the controller 103 to be located at or within a predetermined tolerance to a predefined hugging distance (the hugging distance is explained in more detail below) from one of the first or the second rack 202A or 202B. At step 340 the operator is determined to have exited the operator station 20, e.g., by the controller 103 using information from the sensing system 250. Based on the vehicle 10 being at or within a predetermined tolerance to the predefined hugging distance from one of the first or the second rack 202A or 202B, and based on the operator having exited the vehicle 10, the vehicle 10 is determined by the controller 103 to be in or ready to enter hugging mode at step 342. It is noted that, while the predefined hugging distance may be set such that the vehicle 10 is maintained in the center of the aisle 210 while the vehicle 10 is hugging an object, this exemplary embodiment assumes that the predefined hugging distance is set such that the vehicle 10 will be located closer to one of the first or the second rack 202A or 202B than the other.
In accordance with another embodiment, the system 8 may further include a positioning assistance system 350, as shown in
The assistance provided by the positioning assistance system 350 may comprise at least one of audible, tactile, or visual cue(s) to indicate at least one of a spacing from the vehicle 10 to at least one boundary object, e.g., a distance, such as a lateral distance, from the vehicle 10 to a boundary object, and/or a heading of the vehicle 10 with respect to the boundary object. In this regard, the positioning assistance system 350 comprises a cue device 352 for implementing the audible, tactile, and/or visual cues. For example, the audible, tactile, and/or visual cues may be actuated to indicate that: the vehicle is located at a distance that is equal to or greater than a desired distance from the boundary object; the vehicle is located at a proper heading with respect to the boundary object; the vehicle is not located at a distance that is equal to or greater than the desired distance from the boundary object; and/or the vehicle is not located at the proper heading with respect to the boundary object. Cues for indicating different information may be distinguishable from one another so as to relay to the operator the meaning of the cue.
In
Moving on to
Turning now to
With reference now to
Referring finally to
One or more of the cues 360, 362, 364, 366, 368 may be distinguishable from one or more of the others. For example, if the cues 360, 362, 364, 366, 368 are visual cues, they may be different colors or shapes, have different illumination patterns (e.g., blinking, changing intensity, or size), etc.
Only once the vehicle 10 is located at a position where the distances D1 and D2 are equal to or greater than the desired distance from both racks 202A, 202B, and the vehicle 10 is oriented at the proper heading, the vehicle 10 is able to be remotely controlled by an operator using the remote control device 32, i.e., the vehicle 10 is not able to be remotely controlled by an operator using the remote control device 32 unless the vehicle 10 is located at a position where the distances D1 and D2 are equal to or greater than the desired distance from both racks 202A, 202B, and the vehicle 10 is oriented at the proper heading. As noted above, the positioning assistance system 350 may issue an audible, tactile, and/or visual fifth cue 368 to indicate this positioning of the vehicle 10. While being remotely controlled, the vehicle may be capable of being operated in hugging mode, where the vehicle 10 moves down the aisle 210 and hugs one of the first or second racks 202A or 202B, wherein while hugging the rack 202A or 202B, the vehicle 10 maintains the predefined hugging distance from the rack 202A or 202B being hugged.
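The remote-control enablement condition stated above can be sketched as follows (an illustrative Python sketch; the function name and boolean heading input are assumptions, with the heading check reduced to a flag for brevity):

```python
def remote_control_enabled(d1, d2, desired_distance, proper_heading):
    """Remote control is permitted only when BOTH side distances are
    equal to or greater than the desired distance from the racks AND
    the vehicle is oriented at the proper heading."""
    return (d1 >= desired_distance
            and d2 >= desired_distance
            and proper_heading)
```

When this condition first becomes true, the fifth cue 368 would be issued to signal that remote operation via the remote control device 32 is available.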
With reference to
At step 372, the operator OP is driving the vehicle 10 into the aisle 210, e.g., using the operator controls 24 on the vehicle 10. At step 374, it is determined that the vehicle 10 is located inside of the desired distance from the first rack 202A, and a first cue 360 is issued by the positioning assistance system 350. At step 376, the operator OP responds to the first cue 360 by steering the vehicle 10 away from the first rack 202A, and once the vehicle 10 is located outside of the desired distance from the first rack 202A, the first cue 360 is turned off.
At step 378, it is determined that the vehicle 10 is not located at a proper heading with respect to the second rack 202B, and a second cue 362 is issued by the positioning assistance system 350. At step 380, the operator OP responds to the second cue 362 by steering the vehicle 10 relative to the boundary object, e.g., away from the second rack 202B in the embodiment shown, and once the vehicle 10 is within the range that defines the proper heading of the vehicle 10 with respect to the second rack 202B, the second cue 362 is turned off.
At step 382, it is determined that the vehicle 10 is located inside of the desired distance from the second rack 202B, and a third cue 364 is issued by the positioning assistance system 350. At step 384, the operator OP responds to the third cue 364 by steering the vehicle 10 away from the second rack 202B, and once the vehicle 10 is located outside of the desired distance from the second rack 202B, the third cue 364 is turned off.
After turning away from the second rack 202B, at step 386, it is determined that the vehicle 10 is not located at a proper heading with respect to the first rack 202A, and a fourth cue 366 is issued by the positioning assistance system 350. At step 388, the operator OP responds to the fourth cue 366 by steering the vehicle 10 away from the first rack 202A, and once the vehicle 10 is within the range that defines the proper heading of the vehicle 10 with respect to the first rack 202A, the fourth cue 366 is turned off.
At step 390, it is determined that the vehicle 10 is located at a position equal to or greater than the desired distance from both racks 202A, 202B, and the vehicle 10 is oriented at the proper heading with respect to the first and second racks 202A, 202B. With both of these criteria being met, a fifth cue 368 is issued by the positioning assistance system 350, indicating that the vehicle 10 is in a position where it is able to be operated remotely by the operator OP using the remote control device 32.
With reference now to
A cart 500 for use with a materials handling vehicle according to another embodiment is shown in
The cart 500 further includes rollers 508, which enable picked items to be rolled toward the backwall 506 of the cart 500 as the items are picked by the operator and placed on the cart 500.
Each of the features of the cart 500, i.e., the sidewalls 504 and backwall 506, the width Wc, and the rollers 508, in addition to the shorter forks 16 described in
With reference now to
The detection system 600 could be mounted in the vicinity of the designated area, such as at the top of a rack, or on a wall or ceiling of the facility. Alternatively, the detection system 600 could be incorporated into the vehicles 10A, 10B themselves. For example, the vehicles 10A and 10B could know the location of each other, either by direct communication between the vehicles 10A, 10B, or through communication with the warehouse management system (WMS).
This embodiment may be particularly beneficial in a facility where space is limited, such as a facility having narrow aisles (e.g., where two vehicles would not fit side by side in the aisle), and also where vehicles are being controlled remotely, e.g., via wireless remote control devices 32 such as those disclosed herein.
The various features, aspects, and embodiments described herein can be used in any combination(s) with one another, or on their own.
Having thus described embodiments in detail, it will be apparent that modifications and variations are possible without departing from the scope of the appended claims.
This application is a divisional of U.S. patent application Ser. No. 17/249,000, filed Feb. 17, 2021 and entitled “MODIFY VEHICLE PARAMETER BASED ON VEHICLE POSITION INFORMATION”, which claims the benefit of U.S. Provisional Patent Application Ser. No. 62/979,916, filed Feb. 21, 2020, entitled “REMOTELY CONTROLLED MATERIALS HANDLING VEHICLE,” the entire disclosures of both of which are hereby incorporated by reference herein.
102012010876 | Nov 2012 | DE |
102012006536 | Oct 2013 | DE |
102013100191 | Jul 2014 | DE |
102016101404 | Feb 2017 | DE |
2261129 | Dec 2010 | EP |
2266885 | Dec 2010 | EP |
2123596 | Oct 2012 | EP |
2711880 | Mar 2014 | EP |
1834922 | May 2014 | EP |
2754584 | Jul 2014 | EP |
2468678 | Mar 2015 | EP |
2678748 | Apr 2015 | EP |
2708490 | Apr 2015 | EP |
2722687 | Apr 2015 | EP |
2910497 | Aug 2015 | EP |
2840556 | Aug 2016 | EP |
2840564 | Aug 2016 | EP |
3173294 | May 2017 | EP |
3118706 | Mar 2018 | EP |
3118707 | Mar 2018 | EP |
3118708 | Mar 2018 | EP |
3118153 | Apr 2018 | EP |
3312131 | Apr 2018 | EP |
3118152 | Dec 2018 | EP |
3434522 | Jan 2019 | EP |
3434640 | Jan 2019 | EP |
3454158 | Mar 2019 | EP |
3456681 | Mar 2019 | EP |
3469305 | Apr 2019 | EP |
3511287 | Jul 2019 | EP |
3514103 | Jul 2019 | EP |
3587339 | Jan 2020 | EP |
3620423 | Mar 2020 | EP |
2897017 | Jul 2020 | EP |
1421722 | Jan 1976 | GB |
2568752 | May 2019 | GB |
2017193928 | Oct 2017 | JP |
100907490 | Jul 2009 | KR |
101547564 | Aug 2015 | KR |
20181035502 | Dec 2018 | KR |
199640533 | Dec 1996 | WO |
2004090830 | Oct 2004 | WO |
2010021515 | Feb 2010 | WO |
2015121818 | Aug 2015 | WO |
2016029246 | Mar 2016 | WO |
2017036750 | Mar 2017 | WO |
2018127681 | Jul 2018 | WO |
2018156652 | Aug 2018 | WO |
2018172165 | Sep 2018 | WO |
2018187341 | Oct 2018 | WO |
2019112427 | Jun 2019 | WO |
2019120856 | Jun 2019 | WO |
2019179768 | Sep 2019 | WO |
Entry |
---|
Noel, Jempson; Non-Final Office Action dated Apr. 1, 2024; U.S. Appl. No. 17/248,997; United States Patent and Trademark Office; Alexandria, Virginia. |
Tian, Shaoxu; Second Chinese Office Action dated Dec. 28, 2023; Chinese Patent Application No. 202180008054.0; CNIPA; Beijing, China. |
Tian, Shaoxu; Second Chinese Office Action dated Dec. 27, 2023; Chinese Patent Application No. 202180007939.9; CNIPA; Beijing, China. |
Tian, Shaoxu; Notice of Grant dated Apr. 29, 2024; Chinese Application No. 202180007939.9; CNIPA; Beijing, China. |
“Red Zone Forklift Danger Zone Warning Light”; ForkLift Training Sys; YouTube Video; published Mar. 16, 2016; https://www.youtube.com/watch?v=c%tMCJ2YOYw. |
Cisco-Eagle; “Pedestrian Safety in Forklift Operations: An Introduction”; 2016; downloaded from https://ppsa.memberclicks.net/assets/ConferencePresentations/2016/Monday/1-%20alex%20gandall_ciscoeagle.pdf. |
Theos, Sebastian; “Lighting Floor on Sides of Material Handling Vehicle to Indicate Limited or Non-Limited Area”; Related U.S. Appl. No. 17/248,997, filed Feb. 17, 2021. |
Theos, Sebastian; “Position Assistance System for a Materials Handling Vehicle”; Related U.S. Appl. No. 17/248,998, filed Feb. 17, 2021. |
Elokon; “ELOshield driver-assistance system”; retrieved on Apr. 22, 2021 from https://www.elokon.com/en-US/material-handling/eloshield-vehicle-pedestrian-proximity-detection.html. |
Mallard Manufacturing; “Full & Split Roller Pallet Flow Rack”; retrieved on Apr. 22, 2021 from https://www.mallardmfg.com/full-split-roller-pallet-flow-rack/. |
Stamatia Epifanis; International Search Report and Written Opinion; International Application No. PCT/US2021/018284; Jun. 7, 2021; Rijswijk, Netherlands. |
Stamatia Epifanis; International Search Report and Written Opinion; International Application No. PCT/US2021/018291; Jun. 10, 2021; Rijswijk, Netherlands. |
Ana Rodriguez; International Search Report and Written Opinion; International Application No. PCT/US2021/018285; Jun. 29, 2021; Rijswijk, Netherlands. |
International Preliminary Report on Patentability dated Aug. 23, 2022; International Application No. PCT/US2021/018284; The International Bureau of WIPO; Geneva, Switzerland. |
International Preliminary Report on Patentability dated Aug. 23, 2022; International Application No. PCT/US2021/018285; The International Bureau of WIPO; Geneva, Switzerland. |
International Preliminary Report on Patentability dated Aug. 23, 2022; International Application No. PCT/US2021/018291; The International Bureau of WIPO; Geneva, Switzerland. |
Office Action dated Dec. 20, 2022; U.S. Appl. No. 17/248,998; United States Patent and Trademark Office; Alexandria, Virginia. |
Zhang, Yufeng; Final Office Action Dated May 9, 2023; U.S. Appl. No. 17/248,998, United States Patent and Trademark Office; Alexandria, Virginia. |
Tian, Shaoxu; First Office Action dated May 29, 2023; Chinese Application No. 202180008054.0; China National Intellectual Property Administration; Beijing, China. |
Tian, Shaoxu; First Office Action dated May 30, 2023; Chinese Application No. 202180007939.9; China National Intellectual Property Administration; Beijing, China. |
Lau, Albert; Official Action dated Sep. 18, 2023; Canadian Application No. 3,163,133; CIPO; Quebec, Canada. |
Noel, Jempson; Non-Final Office Action dated Sep. 29, 2023; U.S. Appl. No. 17/248,997; United States Patent and Trademark Office; Alexandria, Virginia. |
Restriction Requirement dated Dec. 8, 2022; U.S. Appl. No. 17/249,000; United States Patent and Trademark Office; Alexandria, Virginia. |
Nguyen, Tan Quang; Office Action dated Jan. 31, 2023; U.S. Appl. No. 17/249,000; United States Patent and Trademark Office; Alexandria, Virginia. |
Notice of Allowance dated May 30, 2023; U.S. Appl. No. 17/249,000; United States Patent and Trademark Office; Alexandria, Virginia. |
Zhang, Yufeng; Notice of Allowance; U.S. Appl. No. 17/248,998; Aug. 16, 2023; United States Patent and Trademark Office; Alexandria, Virginia. |
Bandaranayake, Sajith; Office Action dated Oct. 6, 2023; Canadian Application No. 3,163,140; CIPO; Quebec, Canada. |
Breton, Eric E.; Office Action dated Oct. 16, 2023; Canadian Application No. 3,163,146; CIPO; Quebec, Canada. |
Tian, Shaoxu; Notice of Grant of Invention Patent Right dated Jul. 24, 2024; Chinese Patent Application No. 202180008054.0; China National Intellectual Property Administration; Beijing, China. |
Theos, Sebastian; Related Continuation U.S. Appl. No. 18/820,593 entitled “Modify Vehicle Parameter Based On Vehicle Position Information”; filed on Aug. 30, 2024; United States Patent and Trademark Office; Alexandria, Virginia. |
Breton, Eric E.; Canadian Office Action dated Aug. 19, 2024; Canadian Application No. 3,163,146; Canadian Intellectual Property Office; Quebec, Canada. |
Number | Date | Country | |
---|---|---|---|
20230391593 A1 | Dec 2023 | US |
Number | Date | Country | |
---|---|---|---|
62979916 | Feb 2020 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17249000 | Feb 2021 | US |
Child | 18450657 | US |