The present invention relates to electronic systems for use in an industrial vehicle that interacts with and presents information to a vehicle operator via a graphical user interface.
Industrial vehicles, such as forklift trucks and other materials handling trucks, are often equipped with a user interface that allows a vehicle operator to perform a variety of functions, such as accessing and viewing information programmed into the truck, entering new information, and viewing images from onboard cameras. When entering or accessing information, the operator may be required to scroll or click through large amounts of information across multiple screens or scroll through numerous options within a menu. In addition, operators working in cold environments, such as freezers, typically must wear gloves, which increases the difficulty of navigating through multiple screens and menus.
Various aspects and embodiments of the present disclosure address technical problems associated with the need for an operator of a materials handling vehicle to spend excess time scrolling, clicking, or reviewing a large amount of information to locate needed information for viewing on a vehicle user interface screen during operation of the vehicle. The present disclosure provides a first technical solution which involves detecting activation of an icon corresponding to a widget and, in response to detecting activation of the one icon, automatically moving the corresponding widget to a designated widget space for operator use. Hence, an operator need not manually search through multiple widgets to find and move the desired widget to the screen display, as the desired widget is automatically moved to the screen upon activation of the corresponding icon. Another technical solution involves detecting activation of an icon corresponding to a widget and, in response to detecting the activation of the one icon, allowing a first menu portion of the one widget to be displayed. Hence, an operator may access a menu portion of the one widget when needed and desired upon activation of the corresponding icon, and inadvertent access to or appearance of the menu portion is prevented when the corresponding icon is not activated. A further technical solution involves changing a state of a portion of a widget, such as an outline of the widget, upon a vehicle function being completed, e.g., a carriage assembly reaching a desired height, which is advantageous as it provides the operator with quick and clear confirmation that the vehicle function has been successfully executed.
Yet another technical solution involves detecting activation of an icon corresponding to a widget and, in response, moving the widget to a predefined widget space, moving the widget from the predefined widget space in response to an operator command to move the widget away from the widget space, and automatically moving the widget back to the predefined widget space in response to a command related to a vehicle operation. Such a solution provides a flexible user interface: the operator may move the widget corresponding to an activated icon away from the predefined widget space when the operator wishes to view another widget for additional information, yet the widget automatically returns to the predefined widget space in response to a command related to a vehicle operation. This saves the operator time, as the operator need not manually look for and move the widget corresponding to the activated icon back to the predefined widget space. Other technical problems and corresponding solutions are set out herein.
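The move-and-return behavior summarized above can be sketched as a small state model. This is purely an illustrative sketch under assumed names; the disclosure does not specify an implementation, and every identifier here (`WidgetScreen`, `activate_icon`, `operator_scroll`, `vehicle_command`) is hypothetical.

```python
class WidgetScreen:
    """Toy model of the widget-space behavior described above.

    All names are illustrative. Space index 0 plays the role of the
    predefined (designated) widget space.
    """

    PREDEFINED_SPACE = 0

    def __init__(self, widgets):
        self.spaces = list(widgets)  # widgets ordered by widget space
        self.pinned = None           # widget tied to the activated icon

    def activate_icon(self, widget):
        # Icon activation: automatically move the widget to the predefined space.
        self.pinned = widget
        self._move_to_front(widget)

    def operator_scroll(self, widget):
        # Operator command: temporarily view another widget in that space.
        self._move_to_front(widget)

    def vehicle_command(self):
        # Command related to a vehicle operation (e.g., lifting or lowering
        # the carriage assembly): the pinned widget automatically returns.
        if self.pinned is not None:
            self._move_to_front(self.pinned)

    def _move_to_front(self, widget):
        if widget in self.spaces:
            self.spaces.remove(widget)
        self.spaces.insert(self.PREDEFINED_SPACE, widget)
```

Under this model, activating the icon pins its widget in the predefined space, an operator scroll displaces it, and any subsequent vehicle-operation command restores it without further operator effort.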
In accordance with a first aspect of the present disclosure, a processing device comprising a graphical user interface in an industrial vehicle is provided. The processing device comprises a screen display, such as a touch screen display that receives gesture commands from a vehicle operator, memory storing executable instructions, and a processor in communication with the memory. The processor when executing the executable instructions defines a plurality of widgets, in which each widget comprises a visual representation of a current state of an associated function of the industrial vehicle, controls the display of or causes to be displayed a subset of the plurality of widgets on a portion of the screen display defining a plurality of widget spaces, and controls the display of or causes to be displayed an icon tray or icon row on the screen display comprising one or more icons, in which at least one of the one or more icons corresponds to a respective one of the plurality of widgets.
The processor when executing the executable instructions in an example embodiment defines the icon tray as a separate portion of the screen display from the plurality of widget spaces, the icon tray being spaced apart from the plurality of widget spaces. The processor when executing the executable instructions may lock one of the plurality of widgets in position in a locked widget space upon activation of an icon corresponding to the one widget. The widget may be spaced away from its corresponding icon. The processor when executing the executable instructions may detect the activation of the icon corresponding to the one widget, and in response to detecting the activation, automatically move the one widget to the locked widget space and shift the remaining one or more widgets in the subset to the one or more remaining widget spaces. The processor when executing the executable instructions may shift a position of one or more of the widgets of the subset on the touch screen display following detection of a gesture command on the touch screen display.
The processor when executing the executable instructions may control or cause display of a first menu associated with one of the plurality of widgets when the one widget is displayed in one of the plurality of widget spaces on the screen display and a first menu portion of the one widget is activated by the vehicle operator. In some particular embodiments, the first menu may comprise a list, a sidebar, or a scroll wheel, in which a display of options in the first menu may be altered by one of a tap gesture, a swipe gesture, a slide gesture, or a rotating gesture on the touch screen display and in which each of the options within the first menu may be color-coded with a different color. In other particular embodiments, the first menu portion of the one widget may be activated by the vehicle operator touching or selecting the first menu portion. In further particular embodiments, the processor when executing the executable instructions may define a plurality of sub-menus, each sub-menu corresponding to a particular option within the first menu, in which one sub-menu may be displayed on the screen display after the corresponding option within the first menu has been selected and a sub-menu portion of the one widget is activated.
The processor when executing the executable instructions may further color code at least a portion of the one sub-menu using a same color associated with the corresponding option within the first menu. In some embodiments, one or more of the first menu or the sub-menus may be displayed within the one widget. In other embodiments, one or more of the first menu or the sub-menus may be displayed in a separate window that is temporarily superimposed over one or more of the widget spaces. In further embodiments, the processor when executing the executable instructions may define the one widget as a rack height select (RHS) widget, the RHS widget comprising a workspace zone menu defining the first menu, in which the workspace zone menu comprises a plurality of workspace zones, each workspace zone having a corresponding sub-menu comprising a plurality of stored rack heights associated with the workspace zone. It is also contemplated that the first menu may comprise parameters or categories other than the zone. For example, the first menu may comprise a listing of racks designated by type, name and/or number. In some particular embodiments, at least a portion of a visual depiction of each workspace zone comprises a different color, and at least a portion of a visual depiction of each corresponding sub-menu comprises a same color as the associated workspace zone.
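One way to picture the zone menu and its color-coded sub-menus described above is a simple lookup structure in which each sub-menu inherits the color of its workspace zone. This is a minimal sketch only; the zone names, colors, and rack heights below are invented examples, not values from the disclosure.

```python
# Illustrative data model for the workspace-zone menu (the first menu) and its
# rack-height sub-menus; all zone names, colors, and heights are hypothetical.
WORKSPACE_ZONES = {
    "Zone A": {"color": "blue",  "rack_heights_mm": [1200, 2400, 3600]},
    "Zone B": {"color": "green", "rack_heights_mm": [1500, 3000]},
}

def submenu_for(zone):
    """Build the sub-menu for one zone, color-coded with the zone's own color."""
    entry = WORKSPACE_ZONES[zone]
    return {"color": entry["color"], "options": entry["rack_heights_mm"]}
```

Because the sub-menu simply reuses the zone's color entry, the operator sees the same color on the zone option and on its stored rack heights, matching the color-coding behavior described above.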
The processor when executing the executable instructions may define one of the plurality of widgets as a rack height select (RHS) widget comprising a workspace zone selection portion defining a first menu portion, a rack height selection portion defining a sub-menu portion, and a load presence indicator. In some particular embodiments, the processor when executing the executable instructions may control or cause display of the RHS widget in one of the widget spaces, detect a selection of a particular workspace zone and a particular stored rack height related to the particular workspace zone, in which after the selection of the particular workspace zone and the particular stored rack height, the workspace zone selection portion comprises an identifier of the particular workspace zone selected, the rack height selection portion comprises an identifier of the particular stored rack height selected, and the load presence indicator comprises a visual indication of a presence or an absence of a detected load. In other particular embodiments, the processor when executing the executable instructions may override the indication of the absence of a detected load upon activation of the load presence indicator by the vehicle operator.
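The RHS widget state just described, including the operator override of an "absent load" indication, can be sketched as a small data object. All field and method names here are assumptions for illustration; the disclosure defines behavior, not an implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RHSWidget:
    """Sketch of the RHS widget state described above (names illustrative)."""
    zone: Optional[str] = None            # selected workspace zone identifier
    rack_height_mm: Optional[int] = None  # selected stored rack height
    load_detected: bool = False           # sensed load presence
    operator_override: bool = False       # set when operator taps the indicator

    def select(self, zone, rack_height_mm):
        # Selection of a workspace zone and a stored rack height within it.
        self.zone, self.rack_height_mm = zone, rack_height_mm

    def load_indicator(self):
        # The operator may override an "absent" indication by activating
        # the load presence indicator.
        if self.load_detected or self.operator_override:
            return "load present"
        return "no load detected"
```

After a zone and height are selected, the widget carries identifiers for both, and the indicator reflects either the sensed load or the operator's override.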
In some embodiments, the processing device may further comprise a vehicle network system connecting the processor to at least one vehicle network bus, in which the processor extracts a current position of a carriage assembly and a current sensed load weight. The processor when executing the executable instructions may define one of the plurality of widgets as a capacity data monitoring (CDM) widget comprising a visual representation of the current position of the carriage assembly and the current sensed load weight.
The processing device may further comprise a vehicle operator control section comprising one or more physical input control elements, in which the one or more physical input control elements are used to make selections on the screen display. In some particular embodiments, the one or more physical input control elements may comprise at least one of a five-button control, a rotary control knob, a trigger switch on a multifunction control handle, or a trigger switch on an armrest.
The processor when executing the executable instructions may determine whether a speed of the vehicle is below a threshold speed, and change one or more of the widgets of the subset on the touch screen display following detection of a gesture command on the touch screen display only if the speed of the vehicle is below the threshold speed.
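The speed-gated gesture handling above amounts to a simple guard condition. The sketch below assumes a numeric threshold; the disclosure specifies no particular value, so the constant and function names are hypothetical.

```python
THRESHOLD_SPEED_KMH = 1.0  # hypothetical value; the disclosure sets no number

def handle_widget_gesture(gesture, vehicle_speed_kmh):
    """Apply a widget-changing gesture only while below the threshold speed."""
    if vehicle_speed_kmh >= THRESHOLD_SPEED_KMH:
        return "ignored"           # vehicle moving too fast; gesture not applied
    return f"applied:{gesture}"
```

Gating widget changes on vehicle speed keeps the operator from interacting with the display while the truck is traveling at speed.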
The processor when executing the executable instructions may move one of the plurality of widgets to a predefined widget space upon activation of an icon corresponding to the one widget.
In accordance with a second aspect of the present disclosure, a processing device comprising a graphical user interface is provided. The processing device comprises a screen display, memory storing executable instructions, and a processor in communication with the memory. The processor when executing the executable instructions defines a plurality of widgets, in which each widget comprises a visual representation of a current state of an associated function, controls or causes display of a subset of the plurality of widgets on a portion of the screen display defining a plurality of widget spaces, controls or causes display of an icon tray on the screen display comprising one or more icons, in which at least one of the one or more icons corresponds to a respective one of the plurality of widgets, detects activation of the one of the one or more icons corresponding to the one widget, and in response to detecting the activation of the one icon, locks the respective one widget in position in one of the widget spaces.
The processor when executing the executable instructions may, in response to detecting the activation of the one icon, automatically move the one widget to the locked widget space and shift the remaining one or more widgets in the subset to the one or more remaining widget spaces.
In accordance with a third aspect of the present disclosure, a processing device comprising a graphical user interface in an industrial vehicle is provided. The processing device comprises a screen display, memory storing executable instructions, and a processor in communication with the memory. The processor when executing the executable instructions defines one or more widgets each comprising a visual representation of a current state of an associated function of the industrial vehicle, controls or causes display of at least one of the one or more widgets on a portion of the screen display defining one or more widget spaces, controls or causes display of an icon tray on the screen display comprising one or more icons, in which at least one of the one or more icons corresponds to a respective one of the one or more widgets, detects activation of the one icon corresponding to the one widget, in response to detecting the activation of the one icon, allows a first menu portion of the one widget to be displayed, and controls or causes display of a first menu associated with the one widget.
In an embodiment, the processor when executing the executable instructions may, in response to detecting the activation of the one icon, allow a first menu portion of the one widget to be activated, detect activation of the first menu portion, and, in response to detecting the activation of the first menu portion, control or cause display of the first menu associated with the one widget.
The processor when executing the executable instructions may, further in response to detecting the activation of the one icon, lock the one widget in position in a first widget space on the screen display.
In accordance with a fourth aspect of the present invention, a processing device comprising a graphical user interface in an industrial vehicle is provided. The processing device comprises a screen display, memory storing executable instructions, and a processor in communication with the memory. The processor when executing the executable instructions defines one or more widgets, each widget comprising a visual representation of a current state of an associated function of the industrial vehicle, and controls or causes display of a rack height select (RHS) widget on a portion of the screen display defining one or more widget spaces, in which the RHS widget comprises a portion that changes state upon a related vehicle function being completed, e.g., a carriage assembly reaching a desired height. The outline of the RHS widget, defining the portion, may become darker, wider, or both darker and wider upon a related vehicle function being completed, e.g., a carriage assembly reaching a desired height.
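The outline state change on completion can be sketched as a function of carriage position. The tolerance, styles, and names below are illustrative assumptions; the disclosure only requires that the outline become darker and/or wider once the desired height is reached.

```python
def rhs_outline(current_height_mm, target_height_mm, tolerance_mm=10):
    """Outline style for the RHS widget: darker and wider once the carriage
    assembly has reached the desired height (all values illustrative)."""
    if abs(current_height_mm - target_height_mm) <= tolerance_mm:
        return {"shade": "dark", "width_px": 4}   # vehicle function completed
    return {"shade": "light", "width_px": 2}      # carriage still moving
```

The abrupt visual change gives the operator quick, glanceable confirmation that the lift command has finished executing.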
In accordance with a fifth aspect of the present invention, a processing device comprising a graphical user interface in an industrial vehicle is provided. The processing device comprises a screen display, memory storing executable instructions, and a processor in communication with the memory. The processor when executing the executable instructions defines a plurality of widgets, in which each widget comprises a visual representation of a current state of an associated function of the industrial vehicle, controls or causes display of a subset of the plurality of widgets on a portion of the screen display defining a plurality of widget spaces, controls or causes display of an icon tray on the screen display comprising one or more icons, in which at least one of the one or more icons corresponds to a respective one of the plurality of widgets, detects activation of the one of the one or more icons corresponding to the one widget. The processor when executing the executable instructions, in response to detecting the activation of the one icon, moves the respective one widget to a predefined widget space, moves the respective one widget from the predefined widget space in response to an operator command, and moves the one widget back to the predefined widget space in response to a command related to a vehicle operation.
The command related to a vehicle operation may comprise one of a command to activate a traction motor to effect vehicle movement or a command to lift or lower a carriage assembly.
While the specification concludes with claims particularly pointing out and distinctly claiming the present invention, it is believed that the present invention will be better understood from the following description in conjunction with the accompanying Drawing Figures, in which like reference numerals identify like elements, and wherein:
In the following detailed description of the preferred embodiments, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration, and not by way of limitation, specific preferred embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized and that changes may be made without departing from the spirit and scope of the present invention.
With reference to
The vehicle 100 further comprises a load handling assembly 140, which generally comprises a mast assembly 142 and a carriage assembly 144. The mast assembly 142 is positioned between the outriggers 180A, 180B and may comprise, for example, a fixed mast member 146 affixed to the frame 114 and nested first and second movable mast members 148, 150. It is noted that the vehicle 100 may comprise additional or fewer movable mast members than the two members 148, 150 shown in
A battery (not shown), which is housed in a compartment within the frame 114, supplies power to a traction motor (not shown) that is connected to the second wheel 120 and to one or more hydraulic motors (not shown). The hydraulic motor(s) supply power to several different systems, such as one or more hydraulic cylinders (not shown) for effecting generally vertical movement of the movable mast members 148, 150 relative to the fixed mast member 146 and generally vertical movement of the carriage assembly 144 relative to the second movable mast member 150 of the mast assembly 142, as shown by arrow A in
An operator's compartment 122 is located within the main body 112 for receiving an operator driving or operating the vehicle 100. The operator's compartment 122 comprises a variety of control elements including one or more handles, knobs, levers, switches, buttons, sliders, encoders, and combinations thereof, along with one or more devices that display information to the operator and/or receive operator input. For example, a tiller knob 124 is provided within the operator's compartment 122 for controlling steering of the vehicle 100. An armrest 170 located adjacent to an operator seat 128 comprises a control panel 126 for receiving input from the operator. In the embodiment shown in
In the embodiment shown in
In
In some embodiments, the display unit 151 may be mounted, for example, on one of the support structures 132A, 132B. Some vehicles 100, such as those designed to operate in cold storage, may include an enclosed cabin (not shown) comprising the operator's compartment 122, and the display unit 151 may be mounted elsewhere in the operator's compartment 122, such as on one or more additional support structures (not shown). In other embodiments, the display unit 151 may comprise a separate or standalone device, such as a tablet or laptop computer. In addition, although the rotary control knob 162 is depicted in
Turning now to
The processing devices 202 may comprise any device capable of communicating over the respective networks 204. In certain contexts and roles, the processing device 202 is intended to be mobile (e.g., a hardware-based processing device 202 provided on the vehicles 100). In this regard, the vehicles 100 include a processing device 202 that may communicate wirelessly to the network 204 to carry out the features described herein. Under such circumstances, the vehicles 100 may wirelessly communicate through one or more access points 210 to a corresponding networking component 206. The vehicles 100 may also be equipped with WiFi, cellular, or other suitable technology that allows the processing device 202 on the vehicles 100 to communicate directly with a remote device (e.g., over the network(s) 204).
The illustrative computer system 200 also comprises a hardware server 212 (e.g., a web server, a file server, and/or other processing device) that supports an analysis engine 214 and one or more corresponding data sources (designated generally by reference numeral 216). The analysis engine 214 and data sources 216 may provide resources to one or more of the processing devices 202, including the processing devices 202 installed on the vehicles 100.
With reference to
The processing device 202 illustrated in
In some embodiments, the processing device 202 is connected to a transceiver 222 for wireless communication. Although a single transceiver 222 is illustrated in
The processing device 202 also comprises data processing circuitry (illustrated generally as the control module 226) having a processor (μP) coupled to a memory for implementing executable instructions, including the relevant processes, or aspects thereof, as set out and described more fully herein. The control module 226 may also comprise other necessary processing circuitry and software, such as for implementing a display engine, camera processing engine, data processing engine(s), etc. In this regard, the control module 226 may comprise additional support circuitry, e.g., video ports, camera ports, input/output ports, etc. Moreover, the memory may comprise memory that stores processing instructions, as well as memory for data storage, e.g., to implement one or more databases, data stores, registers, arrays, etc. Additionally, the control module 226 implements processes such as operator login, pre-use inspection checklists, data monitoring, and other features, examples of which are described more fully in U.S. Pat. No. 8,060,400, the entirety of which is hereby incorporated by reference herein.
The processing device 202 may also optionally comprise vehicle power enabling circuitry 228 to selectively enable or disable the vehicle 100 and/or to selectively enable or disable select components or functions of the vehicle 100. In some embodiments, the vehicle power enabling circuitry 228 may partially or fully enable the vehicle 100 for operation, e.g., depending upon a proper operator login, a particular vehicle condition, etc. For example, the vehicle power enabling circuitry 228 may selectively provide power to components via a suitable power connection (not shown) or otherwise command certain vehicle components not to respond to vehicle operator control via vehicle messaging, e.g., across one or more vehicle communication busses.
Still further, the processing device 202 comprises a monitoring input/output (I/O) module 230 to communicate via wired or wireless connection between the control module 226 and one or more peripheral devices mounted to or otherwise associated with the vehicle 100, such as one or more cameras, sensors, meters, encoders, switches, etc. (not separately labeled; collectively represented by reference numeral 232). The monitoring I/O module 230 may optionally be connected to other devices, e.g., third party devices 234, such as one or more RFID scanners, displays, meters, bar code scanners, cameras, or other devices to convey information to the control module 226.
The processing device 202 is coupled to and/or communicates with other vehicle system components via a suitable vehicle network system 236. The vehicle network system 236 may comprise at least one wired or wireless network, bus, or other communications capability or combination thereof that allows electronic components of the vehicle 100 to communicate with each other. As an example, the vehicle network system 236 may comprise a controller area network (CAN) bus, ZigBee, Bluetooth®, Local Interconnect Network (LIN), time-triggered data-bus protocol (TTP), RS422 bus, Ethernet, universal serial bus (USB), other suitable communications technology, or combinations thereof.
As will be described more fully herein, utilization of the vehicle network system 236 enables seamless integration of the components of the vehicle 100 with the processing device 202, and in particular, the control module 226. By way of example, the vehicle network system 236 enables communication between the control module 226 and a fob (via a fob reader 240), a keypad, a card reader, or any other suitable device for receiving operator login identification, as well as one or more native vehicle components, such as a vehicle control module, controllers (e.g., traction controller, hydraulics controller, etc.), modules, devices, bus-enabled sensors, displays, lights, light bars, sound generating devices, headsets, microphones, haptic devices, etc. (designated generally by reference numeral 238). The control module 226 may also facilitate the communication of information from any electronic peripheral devices 232 or third party devices 234 associated with the vehicle 100 (e.g., via the monitoring I/O module 230) that integrate with and communicate over the vehicle network system 236.
Referring now to
The display unit 151 comprises a housing 304 having a front face 306 defining a display section 308 comprising the screen display 152 and a vehicle operator control section 310. The screen display 152 within the display section 308 may comprise, for example, an LCD screen, a light emitting diode (LED) screen, a plasma screen, etc. The screen display 152 may comprise any known technology, e.g., a touch screen display, so as to receive and respond to gesture commands, e.g., implemented by the operator directly touching or tapping the touch screen display 152, pressing against or releasing from the touch screen display 152, swiping, sliding, or rotating a finger along or across the touch screen display 152, and performing other touch gesture functions or combinations thereof. The terms “gesture command” and “touch gesture command” also include gesture commands that do not require direct physical contact with the screen display 152 such as when an operator moves a finger adjacent to but spaced a small distance from the touch screen display 152 in a swiping, sliding, rotating or other motion.
The vehicle operator control section 310 may comprise one or more physical input control elements, such as buttons, switches, sliders, encoders, knobs, etc., that are used to receive operator input, e.g., making selections on the touch screen display 152. One or more multifunction control handles, keypads, keyboards (not shown), or combinations thereof may be provided in place of the vehicle operator control section 310. As shown in
Referring generally to
With reference to
In embodiments in which the screen display 152 comprises a touch screen, the GUI controller module 402 receives and processes touch gesture commands when the operator touches the touch screen display 152, such as touch, tap, press, release, swipe, scroll, etc. Received touch gesture commands may comprise, for example, a first touch gesture command implemented as an up swipe gesture command, a second touch gesture command implemented as a right swipe gesture command, a third touch gesture command implemented as a left swipe gesture command, a fourth touch gesture command implemented as a down swipe gesture command, and a fifth touch gesture command implemented as a select gesture command (e.g., pressing and releasing, tapping, etc.).
In other embodiments, the GUI controller module 402 receives and processes operator input from one or more of the control elements in the vehicle operator control section 310 of the display unit 151 (
In this regard, the control module 226 (
The control module 226 may similarly map operator commands associated with the rotary control knob 162, 164F. For example, the control module 226 maps rotation of the rotary control knob 162, 164F to the left and operation of the left control to a same (second) graphical user interface command. The control module 226 maps rotation of the rotary control knob 162, 164F to the right and operation of the right control to a same (third) graphical user interface command. The control module 226 may map depression of the rotary control knob 162, 164F and operation of the select control to a same (fifth) graphical user interface command.
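This mapping of physical controls and touch gestures onto the same graphical user interface commands can be realized as a single lookup table, as sketched below. The raw input names are invented for illustration; only the up, down, and select commands are shown to keep the sketch short.

```python
# Redundant input mapping: each physical control and its corresponding touch
# gesture resolve to the same GUI command. Input names are hypothetical.
INPUT_TO_COMMAND = {
    "up_swipe": "up",       "up_button": "up",
    "down_swipe": "down",   "down_button": "down",
    "tap": "select",        "knob_press": "select", "select_button": "select",
}

def to_gui_command(raw_input):
    """Resolve a raw operator input to its graphical user interface command."""
    return INPUT_TO_COMMAND.get(raw_input)
```

Because every input funnels through one table, downstream menu-navigation code never needs to know whether the operator used a gesture, a button, or the knob.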
The up and down commands or controls may be used to navigate vertically, e.g., up and down within various menus provided in the screen display 152 of the display unit 151 (
The redundancy of the commands and controls generated by touching the touch screen display 152, and using the corresponding control elements (e.g., buttons 164A-164E in
The GUI controller module 402 also facilitates customization of the user interaction experience. For example, the GUI controller module 402 communicates with a user management module 404 and a system management module 406. The user management module 404 may store personalized settings that are passed from the control module 226 (
The GUI controller module 402 further communicates with a vehicle management module 408. The vehicle management module 408 stores and controls information about the specific vehicle 100 on which the processing device 202 (
The GUI controller module 402 still further communicates with a language format module 410, which may be used to set a preferred language for the display of text on the screen display 152 (
The GUI controller module 402 further communicates with a message system module 414. The message system module 414 may control the messaging that is presented to the operator, as well as the manner in which the messaging is presented to the operator. For example, a message may be displayed across a portion of the screen display 152, e.g., across a bottom third, across one widget space (606, 608 in
In accordance with aspects of the present disclosure, the screen display 152 may be utilized to display one or more widgets, each of which is defined by an application program forming part of the dashboard module 416 that provides a visual representation on the screen display 152. In an embodiment, computer instructions are provided in the form of an application program stored in memory that instructs the processor of the control module 226 what a particular widget looks like, how it behaves and how it responds to operator actions and/or vehicle-related information. The visual representation provides information to the operator and allows the operator to interface with the control module 226. For example, widgets may provide visual representations of a current state of one or more associated vehicle features, functions, or operations (e.g., a battery charge, a current vehicle speed, etc.) and/or one or more ancillary conditions (e.g., environmental condition such as the current time). In an exemplary embodiment, widgets may be used to represent the current state of the vehicle speed, fork height, load weight, battery charge, clock, stop watch, odometer, trip meter, hour meter, time, and date.
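A widget in the sense described above, an application-program object that holds the current state of an associated function and renders a visual representation of it, can be sketched minimally as follows. The class and method names are assumptions for illustration only.

```python
class Widget:
    """Minimal widget model: holds the current state of an associated vehicle
    function and produces a visual representation of it (names illustrative)."""

    def __init__(self, name, value=None):
        self.name = name
        self.value = value

    def update(self, value):
        # Called as live vehicle-related information arrives.
        self.value = value

    def render(self):
        # Visual representation shown on the screen display.
        return f"{self.name}: {self.value}"

# Example: a vehicle-speed widget refreshed with live data.
speed_widget = Widget("vehicle speed", "0 km/h")
speed_widget.update("6 km/h")
```

The same update/render pattern would apply to each of the listed widgets: fork height, load weight, battery charge, clock, odometer, and so on.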
In this regard, the widgets represent “live” or real-time data. With reference to
By way of example, by continually data logging operator-based performance and/or vehicle operation data, one or more of the widgets may provide a dashboard view of key vehicle and/or operator performance measures. In this regard, the overall data provided in a widget need not be limited to data collected by or stored in a specific vehicle. In some embodiments, one or more of the widgets may reflect all of the relevant vehicle data associated with the logged in operator, regardless of which vehicle the operator is currently operating. In other embodiments, one or more of the widgets may tie into third party databases to display other information, such as operational information, messages, information from a warehouse management system, feeds (such as from news, sports, and weather), etc. Thus, the processing device 202 is communicably connected to a communications device (e.g., the transceiver 222) such that the processing device 202 receives from a remote server (e.g., the server 212), information that is not extracted from the vehicle 100.
With reference to
Referring now to
As shown in
One status tray, e.g., the first status tray 604A, or a portion thereof may be used to display information such as one or more identifiers related to the operator, the vehicle, the vehicle owner, etc. One status tray, e.g., the second status tray 604B, or a portion thereof may comprise an icon row or an icon tray that is used to dock a predetermined number of system status icons (730 in
An optional widget position indicator 610 may be utilized to illustrate the number and position of the displayed widgets within the array 500. In the embodiment shown, the widget position indicator 610 comprises circles, but in other embodiments (not shown) the widget position indicator 610 may comprise another shape, e.g., squares, triangles, etc. A number of circles 610(1) . . . 610(N) may correspond to a number of widgets available within the array 500, see
With reference to
As shown in
Each icon 730 corresponds to a current state of an associated vehicle feature, function, or operation or an ancillary condition. For example, the icons 730 depicted in
In some embodiments, at least one of the icons 730 corresponds to a respective one of the widgets. The corresponding widget may be displayed in one of the widget spaces 606, 608, or the corresponding widget may be available in the array 500 (
In further embodiments, one or more of the icons 730 may appear only when a particular condition is satisfied or occurs. For example, the messaging icon 730D may appear in the second status tray 604B only upon receipt of a new message, and a maintenance icon (not shown) may appear only upon receipt of an indication of a problem with a vehicle component or system. In yet further embodiments, one or more of the icons 730 may be removed from the second status tray 604B when a particular condition is satisfied or occurs.
The performance icon 730C may be used to set a vehicle mode (e.g., training, economy, or full performance mode).
In some embodiments, selection or activation of one of the icons locks the corresponding widget into place on the display screen 600 in a designated or “locked” widget space. As used herein, “activation” is intended to comprise touching, tapping, clicking, or otherwise selecting a portion of the display screen where the icon is located using one or more touch gestures and/or one or more physical control elements, such as the physical control elements found in the vehicle operator control section 310 (
However, the widget corresponding to the activated icon may be located in one of the other widget spaces or may be off the display screen 600. In some embodiments, the widget corresponding to the activated icon may not be in the array 500 (
For example, with reference to
One or more characteristics of a visual appearance of the activated icon may be altered upon activation. For example, as shown in
In addition, one or more characteristics of the widget position indicator 610 may be altered to indicate that a widget has been locked into place in the locked widget space. For example, as shown in
Prior to activation of an icon and locking of the corresponding widget into the locked widget space, the operator may scroll through the widgets using one or more touch gestures and/or one or more physical control elements, as described herein, and the widgets in both widget spaces will change as the operator cycles through the array 500 (
In some embodiments, activation of an icon may move the corresponding widget to a predefined widget space but does not lock the widget in place. For example, activation of the RHS icon 730A may cause the RHS widget 760 to move into a predefined widget space, e.g., the first widget space 606 as shown in
In all embodiments, movement of the corresponding widget to a locked or a predefined widget space on the display screen 600 in response to a particular operator command may save time for the operator and help to increase productivity, as there is no need for the operator to manually search for the appropriate widget and/or move the widget back onto the display screen 600 if the operator has navigated away from the widget. Thus, the processing device 202 disclosed herein, as implemented, for example, in the display unit 151, provides a smart and flexible user interface that ensures that the operator receives the most relevant information at the correct time with the least operator input.
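By way of illustration only, the icon-activation behavior described above (moving the corresponding widget into a designated widget space and locking it there) might be sketched as follows. The `Dashboard` class, its method names, and the widget names are assumptions introduced for this sketch and do not appear in the disclosure:

```python
# Minimal sketch (assumed names) of icon activation moving a widget
# into a designated widget space and locking it there.

class Dashboard:
    def __init__(self, widgets, visible_count=2):
        self.widgets = list(widgets)        # array of available widgets
        self.visible_count = visible_count  # number of widget spaces
        self.locked = {}                    # widget space index -> widget

    def activate_icon(self, widget_name, space=0):
        """Lock the widget corresponding to the activated icon into `space`."""
        if widget_name not in self.widgets:
            # Widget was not in the array: add it so it can be displayed.
            self.widgets.append(widget_name)
        self.locked[space] = widget_name

    def visible(self, scroll_index=0):
        """Widgets currently on screen; locked spaces ignore scrolling."""
        unlocked = [w for w in self.widgets if w not in self.locked.values()]
        shown, u = [], 0
        for space in range(self.visible_count):
            if space in self.locked:
                shown.append(self.locked[space])
            else:
                shown.append(unlocked[(scroll_index + u) % len(unlocked)])
                u += 1
        return shown
```

In this sketch, scrolling cycles only the unlocked widget space, consistent with the described behavior in which the locked widget remains in place while the operator scrolls through the remaining widgets.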
In additional embodiments, upon movement of a widget into a predetermined widget space (by scrolling, by activation of the corresponding icon, etc.), a message (not shown) related to the widget may optionally be displayed. If the predetermined widget space is, for example, the first widget space 606, the message may be temporarily superimposed over the second widget space 608 and may appear only when a predefined condition is met. For example, if a battery condition widget (not shown) is moved into the first widget space 606 and the battery charge is below a certain level, a message, e.g., “Low Battery,” may appear to alert the operator that the battery may need to be changed soon. In addition, if the operator moves the speedometer widget 750 into the first widget space 606, a message, e.g., “Speed Too High,” may appear if the operator is exceeding a speed limit.
In further embodiments, the control module 226, which is communicably coupled to one or more vehicle system modules via the vehicle network system 236 (
In some particular embodiments, the control module 226 extracts from a traction control module (not shown), directly or via a memory or current vehicle state lookup table, an indication as to whether the traction control is engaged. If the current operating state of the traction control module indicates that the traction controls are engaged, the control module 226 causes the display screen to “snap” back to a designated “home” position, such as the first two widgets in the array 500 (
In other particular embodiments, the control module 226 extracts from a hydraulic valve control module (not shown) an indication as to whether the forks 156A, 156B (
In yet further embodiments, the control module 226 may use the extracted data related to the current vehicle state to selectively disable operation of one or more portions of the display unit 151. The display screen 600 may continue to display the current state of one or more vehicle features, functions, or operations, but the touch layer may be fully or partially disabled such that the display screen 600 is unresponsive to touch gesture commands. The control module 226 may also optionally disable one or more of the control elements in the vehicle operator control section 310 (
In some particular embodiments, if the current operating state of the traction control module indicates that the traction controls are engaged, as described herein, the control module 226 may lock the display screen 600 so that the operator cannot scroll through other widgets or otherwise leave the home position.
In other particular embodiments, the control module 226 extracts a speed of the vehicle 100 based upon information received from the vehicle network bus, e.g., a vehicle network system 236 (
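One non-limiting way the speed-based disabling of the touch layer could be realized is sketched below. The specific threshold value and the delayed re-enable interval are invented for illustration and are not taken from the disclosure:

```python
# Illustrative sketch: disable touch input while vehicle speed exceeds
# a threshold; re-enable after the speed has stayed below the threshold
# for a short delay. Both constants are assumptions.

SPEED_LIMIT = 3.0     # assumed speed threshold (mph)
REENABLE_DELAY = 2.0  # assumed seconds below threshold before touch returns

class TouchGate:
    def __init__(self):
        self.touch_enabled = True
        self._below_since = None  # time the speed first dropped below limit

    def update(self, speed, now):
        """Called with each speed reading from the vehicle network bus."""
        if speed > SPEED_LIMIT:
            self.touch_enabled = False
            self._below_since = None
        else:
            if self._below_since is None:
                self._below_since = now
            if now - self._below_since >= REENABLE_DELAY:
                self.touch_enabled = True
```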
In yet further particular embodiments, the display of the icons and/or widgets on the display screen 600 may be customized based on static vehicle information, such as a vehicle type (e.g., forklift vs. stock picker), a vehicle model, etc., and/or one or more operator-based metrics, such as a current level of completion of a task (e.g., percentage of picks per shift), an operator skill or performance level, a level of correct vehicle operation or environmental behaviors, etc. For example, less skilled operators may benefit from the constant display of the icons and/or widgets corresponding to a steer wheel/travel direction 730B and a vehicle speed 750, while more skilled operators may wish to monitor different vehicle operations and systems. These features help to ensure that the display screen 600 presents each individual vehicle operator with the relevant and useful information.
With reference to
As shown in
Data related to the detected load weight and the current fork height, tilt, and/or centering may be obtained as described herein and provided to the CDM widget 740 for display. For example, the processor of the control module 226 is in communication with one or more vehicle control modules, sensors, etc. (e.g., 232), across the vehicle network system 236, via the monitoring I/O module 230, or a combination thereof (
As shown in
The RHS widget 760 may comprise a first menu portion 761, a sub-menu portion 762, and a pallet presence indicator 763, as shown in
The operator may access the first menu 764 by activating the first menu portion 761 using one or more touch gestures and/or the one or more control elements in the vehicle operator control section 310 (
As shown in
In some embodiments, the options contained in the first menu 764 (also referred to herein as a workspace zone menu) comprise a list of available workspace zones. As described herein, one or more workspace zones may be stored in a memory of the vehicle 100. Each zone may correspond to, for example, a particular work site, warehouse, room, or other workspace, or area or portion thereof. The zones may be customized by a vehicle owner or other end user based on the various zone(s) in which the vehicle 100 will be used. For example, the number of available zones may be customized, and each zone may be assigned a zone identifier, e.g., a name (e.g., “Stacker Pallets” in
In other embodiments (not shown), the options listed in the first menu 764 may comprise parameters or categories other than the zone. In one particular embodiment, the options may comprise a listing of racks designated by type, name, and/or number. For example, a first menu may comprise a listing of racks such as: Fixed Rack #1; Portable Rack #1; Fixed Rack #2; Portable Rack #2. Each rack will have corresponding programmed rack heights and may be independent of a zone or location of the rack. In another particular embodiment, the options may comprise a job type, e.g., pickup or put away.
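As a non-limiting sketch, the first-menu options described above (workspace zones or rack designations) and their corresponding sets of programmed rack heights could be backed by a simple mapping. Every zone name, rack name, and height value below is invented for illustration:

```python
# Assumed data shape: each first-menu option (a workspace zone or a
# rack designation) maps to its own set of programmed rack heights.
PROGRAMMED_HEIGHTS = {
    "Stacker Pallets":  [48, 96, 144, 192, 240],
    "Freezer":          [50, 100, 150, 200],
    "Fixed Rack #1":    [40, 80, 120],
    "Portable Rack #1": [36, 72],
}

def sub_menu_options(first_menu_selection):
    """Rack heights offered in the sub-menu for the selected option."""
    return PROGRAMMED_HEIGHTS[first_menu_selection]
```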
With reference to the embodiment shown in
Following selection of the desired option in the workspace zone menu 764, the display screen 600 reverts back to a display of the RHS widget 760 with the new selected workspace zone. For example, if the operator selects “Freezer” in the workspace zone menu 764 shown in
In addition, as shown in
With reference to
The rack height identifier 762a may comprise information related to a currently displayed rack height, such as a name (“Height 3”), a number, a color, or other identifying feature or combination thereof. As shown in
When the first menu provides a listing of rack designations, the additional options available for selection in the sub-menu portion may comprise a plurality of programmed rack heights. Each rack designation in the first menu may have a corresponding set of one or more programmed rack heights in the sub-menu portion. For example, Fixed Rack #1 will have a first set of programmed rack heights and Fixed Rack #2 will have a second set of programmed rack heights, wherein the first and second sets may be different.
In some embodiments, the rack height selection portion 762 displays information related to the last rack height selected by the operator. In other embodiments, the rack height selection portion 762 displays information related to a default rack height, e.g., a next higher or lower available rack height based on a current position of the fork carriage assembly 144 (
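The default-rack-height behavior described above (the next higher or lower programmed height relative to the current carriage position) might be sketched as follows; the function name and the example heights are assumptions:

```python
# Sketch: choose the default rack height as the next programmed height
# above (when lifting) or below (when lowering) the current position.
import bisect

def default_rack_height(heights, current, lifting=True):
    """Return the next programmed height in the travel direction, or
    None if no further programmed height exists in that direction."""
    heights = sorted(heights)
    if lifting:
        i = bisect.bisect_right(heights, current)
        return heights[i] if i < len(heights) else None
    i = bisect.bisect_left(heights, current)
    return heights[i - 1] if i > 0 else None
```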
In the illustrated embodiment, the operator may select a programmed rack height via the sidebar 762b using one or more touch gestures and/or one or more physical control elements in the vehicle operator control section 310 (
In one embodiment, the operator may select the desired rack height using one or more touch gestures. For example, the operator may scroll through the tabs in the sidebar 762b, such that when each tab is touched, information regarding that tab's corresponding rack height is displayed in the rack height identifier 762a. Thus, an operator may touch a tab in the sidebar 762b corresponding to the desired rack height or swipe a finger along the tabs and select the tab corresponding to the desired rack height. Releasing touch of a selected tab in the sidebar 762b causes the corresponding programmed rack height to be selected. As shown in
In other embodiments, the rack height identifier 762a may comprise a scroll wheel that allows the operator to scroll through the available programmed rack heights by swiping or sliding his finger up or down along the text displayed in the rack height identifier 762a. The scroll wheel may wrap around and repeat when the operator reaches the last option at the top or bottom of the list. The scroll wheel defines a sub-menu providing a listing of programmed rack heights corresponding to the workspace zone designated in the first menu portion 761, which, in
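The wrap-around scroll wheel described above can be illustrated with simple modular indexing; this is a sketch only, with assumed option names:

```python
# Sketch of wrap-around scrolling: moving past either end of the list
# of programmed rack heights repeats from the other end.

def scroll_selection(options, index, delta):
    """Return the option selected after scrolling by `delta` steps."""
    return options[(index + delta) % len(options)]
```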
In further embodiments, the operator may use one or more physical control elements located in the vehicle operator control section 310 (
In yet further embodiments, the operator may use one or more physical control elements located in the control panel 126 (
In yet further embodiments, a trigger switch is provided on a multifunction control handle and when the RHS icon 730A is activated but no programmed height is selected, the display screen 600 may display the RHS widget 760. During lifting or lowering of the carriage assembly 144 via the multifunction control handle, the height shown on the display screen will automatically change to a next available programmed rack height. As the carriage assembly 144 is moving, the operator may select the next available programmed rack height, and the carriage assembly 144 will stop at the selected rack height. For example, following activation of the RHS icon 730A′ and selection of the “Stacker Pallets” zone, the operator begins a lifting operation without first choosing a programmed rack height. During the continuous lifting operation and while the carriage assembly 144 is between racks, the operator actuates the trigger switch (not shown) when the operator wishes for the carriage assembly 144 to stop at the next available programmed rack height, and the carriage assembly 144 will stop at that next available programmed rack height, e.g., the fifth programmed height in
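The trigger-switch behavior described above, in which the carriage stops at the next available programmed rack height once the operator actuates the trigger during a continuous lift, might be simulated as follows. The function, its parameters, and the height values are illustrative assumptions:

```python
# Hedged simulation of the trigger-switch stop: the carriage rises from
# `start`; once the position passes `trigger_at` (the point at which the
# operator actuates the trigger), it stops at the next programmed height.

def lift_with_trigger(heights, start, trigger_at, step=1):
    """Return the height at which the carriage stops."""
    heights = sorted(heights)
    pos = start
    triggered = False
    while pos <= max(heights):
        if pos >= trigger_at:
            triggered = True        # operator has requested a stop
        if triggered and pos in heights:
            return pos              # stop at next programmed height
        pos += step
    return pos  # reached the top without a stop request
```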
In all embodiments, a visual appearance of one or more portions of the visual depiction of the first menu 764, the first menu portion 761 and the options contained therein, and/or the sub-menu portion 762 (including one or more of the rack height identifier 762a and the sidebar 762b) may be altered to indicate selection of a particular option, e.g., a workspace zone, and/or a particular additional option, e.g., a rack height. In some embodiments, each option within the first menu 764 may be color-coded with a different color, and one or more of the items displayed in the first menu portion 761 and/or the sub-menu portion 762 may comprise a same color associated with the corresponding option in the first menu 764.
For example, as shown in
In some embodiments, the visual appearance of one or more portions of the CDM widget 740 and/or the RHS widget 760 may also change to indicate that the carriage assembly 144 (
In addition, in some embodiments, a display of a portion of the CDM widget 740 and/or the RHS widget 760 may change in real time as the carriage assembly 144 raises or lowers. Movement of the carriage assembly 144 may be indicated by a corresponding upward or downward movement of the forks 744 and the pointer 749 along the scale 742 and by a corresponding increase or decrease in the numerical indication 747 of the rack height in the CDM widget 740. In addition, if a programmed rack height has not been selected by a user prior to movement of the carriage assembly 144, the information displayed in the rack height selection portion 762 may change as the forks approach each programmed rack height. With reference to
The real-time display feature may be particularly helpful in embodiments in which the operator selects a programmed rack height during a lifting or lowering operation. For example, during lifting and lowering operations, the information displayed in the rack height selection portion 762 of the RHS widget 760 indicates the next available programmed rack height so that the operator may, for example, actuate the trigger switch (not shown) to select the upcoming programmed rack height. The operator may also use the location of the forks 744 along the scale 742 and the numerical indication 747 shown in the CDM widget 740 to gauge the current position of the carriage assembly 144 and the proximity to the next programmed rack height.
As illustrated herein, the rack height selection feature may be used in conjunction with the zone selection feature, but those of skill in the art will appreciate that the two features may be used independently. Combined use of the two features helps to eliminate confusion between similar, but slightly different, programmed rack heights that may exist in different workspace zones. For example, different zones in a large warehouse may comprise rack heights that are only inches apart. In the absence of zones, it may be difficult for the operator to easily determine whether the forks have been raised to the correct height. Combined use of the two features also reduces the number of programmed rack heights through which the operator must search. For example, a vehicle 100 that is used in several locations may store a large number of programmed rack heights. Without zones, the operator must search through all of the available rack heights, which adds time and difficulty to the selection process and decreases operator productivity, particularly in environments requiring gloved operation. For embodiments where a trigger switch is provided and used, defining corresponding programmed heights for separate workspace zones makes use of the trigger switch during a lifting operation more practical, as the operator is presented only with the programmed heights corresponding to the selected workspace zone in which the operator is working.
The pallet presence indicator 763 will now be described in more detail. As shown in
As shown in
In
However, some loads (typically <500 pounds) may be too light for automatic detection by the one or more pressure sensors, causing the control module 226 (
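One plausible handling of this situation is sketched below. The 500-pound figure comes from the passage above; the detection threshold constant and the manual override parameter are assumptions introduced for illustration only:

```python
# Hedged sketch of pallet presence detection: loads below the sensor's
# automatic-detection floor can still be indicated via an operator
# override (an assumed mechanism, not confirmed by the disclosure).

SENSOR_MIN_WEIGHT = 500  # lbs; approximate automatic-detection floor

def pallet_present(sensed_weight, manual_override=False):
    """True when the sensor detects a load or the operator overrides."""
    return manual_override or sensed_weight >= SENSOR_MIN_WEIGHT
```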
As shown in
With reference to
When no load is detected or the detected load requires no lift height restrictions, both areas 742a, 742b of the scale 742 may comprise a uniform, default color, e.g., green (not shown), to provide a highly visible indication to the operator that all lift heights are within the lift capacity of the vehicle 100. In some embodiments, the CDM widget 740 may comprise an indicator (not shown) representing a percentage of capacity, e.g., an indication that the carriage assembly 144 has been raised to 80% of the determined maximum lift height.
Upon detection of a load requiring a lift height restriction, the control module 226 (
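The color-coded scale behavior described in the surrounding passages might be sketched as follows. The capacity figure and the choice of yellow for the restricted region are assumptions for illustration; the passages above confirm only the uniform green (no restriction) and uniform red (over capacity) cases:

```python
# Sketch of the color-coded lift scale: uniform green when no lift
# height restriction applies, a split scale when the load restricts the
# maximum lift height, and uniform red when the load exceeds capacity.

MAX_CAPACITY = 4000  # lbs; assumed maximum lift capacity of the vehicle

def scale_colors(load_weight, max_lift_height, full_height):
    """Return (lower area, upper area) colors for the scale 742."""
    if load_weight > MAX_CAPACITY:
        return ("red", "red")      # load should not be lifted at all
    if max_lift_height >= full_height:
        return ("green", "green")  # all lift heights permitted
    return ("green", "yellow")     # restricted above max_lift_height
```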
When the detected load weight exceeds a maximum lift capacity of the vehicle 100, the entire scale 742, including areas 742a and 742b, may comprise a different uniform color, e.g., red (not shown), to provide a highly visible indication to the operator that the current load should not be lifted to any height. In this situation, the control module 226 may allow very limited movement of the vehicle 100, e.g., operation at a speed below a certain threshold or over a predetermined distance, and may optionally completely disable operation of the vehicle 100. In some embodiments, a color-coded message (not shown) may be displayed on the display screen 600 to notify or warn the operator that, for example, a determined maximum lift height for the detected load has been exceeded, the detected load exceeds a determined maximum lift capacity of the vehicle, and/or that the forks 156A (
In addition to, or in place of, the use of one or more touch gestures or physical control elements in the vehicle operator control section 310 (
For example, receipt of the verbal command ACTIVATE RHS ICON or ACTIVATE RHS WIDGET may activate the RHS icon 730A and move the RHS widget 760 (
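By way of a non-limiting sketch, recognized verbal commands such as those above could be dispatched to display actions through a simple lookup; the normalization step and the handler mapping are assumptions:

```python
# Sketch: map a recognized verbal command to a display action. The
# recognizer itself is outside the scope of this illustration.

def handle_voice_command(command, actions):
    """Normalize a recognized phrase and invoke the matching action,
    returning the action's result, or None if the phrase is unknown."""
    key = " ".join(command.upper().split())  # case/whitespace tolerant
    if key in actions:
        return actions[key]()
    return None
```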
Referring now to
This list of peripherals is presented by way of illustration, and is not intended to be limiting. Other peripheral devices may be suitably integrated into the computer system 800. The memory 820, storage 860, removable media insertable into the removable media storage 870, or combinations thereof may be used to implement the methods, configurations, interfaces and other aspects set out and described herein.
The microprocessor(s) 810 control operation of the exemplary computer system 800. Moreover, one or more of the microprocessor(s) 810 execute computer readable code that instructs the microprocessor(s) 810 to implement the methods and processes herein. The computer readable code may be stored, for instance, in the memory 820, storage 860, removable media storage device(s) 870, or other suitable tangible storage medium accessible by the microprocessor(s) 810. The memory 820 may also function as a working memory, e.g., to store data, an operating system, etc.
The methods and processes herein may be implemented as a machine-executable method executed on a computer system, e.g., one or more general or particular computing devices such as the processing devices 202 of
Computer program code for carrying out operations for any aspect or embodiment of the present disclosure may be written in any combination of one or more programming languages. The program code may execute fully on the computer system 800, or partially on the computer system 800 and partially on a remote computer. In the latter scenario, the remote computer may be connected to the computer system 800 through any type of network connection, e.g., using the network adapter 890 of the computer system 800. In implementing computer aspects of the present disclosure, any combination of computer-readable media may be utilized. The computer-readable medium may be a computer readable signal medium, a computer-readable storage medium, or a combination thereof. Moreover, a computer-readable storage medium may be implemented in practice as one or more distinct mediums.
A computer-readable storage medium is a tangible device/hardware that may retain and store a program (instructions) for use by or in connection with an instruction execution system, apparatus, or device, e.g., a computer or other processing device set out more fully herein. Notably, a computer-readable storage medium does not encompass a computer-readable signal medium. Thus, a computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves through a transmission media. Specific examples of the computer-readable storage medium may include, but are not limited to, the following: a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), Flash memory, or any suitable combination of the foregoing. In particular, a computer-readable storage medium comprises computer-readable hardware such as a computer-readable storage device, e.g., memory. As used herein, a computer-readable storage device and computer-readable hardware are physical, tangible implementations that are non-transitory.
By non-transitory, it is meant that, unlike a transitory propagating signal per se, which will naturally cease to exist, the contents of the computer-readable storage device or computer-readable hardware that define the claimed subject matter persist until acted upon by an external action. For instance, program code loaded into random access memory (RAM) is deemed non-transitory in that the content will persist until acted upon, e.g., by removing power, by overwriting, deleting, modifying, etc. Moreover, since hardware comprises physical element(s) or component(s) of a corresponding computer system, hardware does not encompass software, per se. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The description of the present disclosure has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the embodiments in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention.
Having thus described the invention of the present application in detail and by reference to embodiments thereof, it will be apparent that modifications and variations are possible without departing from the scope of the invention defined in the appended claims.
This application claims the benefit of U.S. Provisional Application No. 62/425,099, filed Nov. 22, 2016, which is hereby incorporated by reference in its entirety. This application is related to U.S. patent application Ser. No. 15/210,049, entitled “PROCESSING DEVICE HAVING A GRAPHICAL USER INTERFACE FOR INDUSTRIAL VEHICLE,” by Anthony T. Castaneda, et al., filed on Jul. 14, 2016, which claims the benefit of U.S. Provisional Patent Application Ser. No. 62/193,840, filed on Jul. 17, 2015, both of which are hereby incorporated by reference in their entirety. This application is also related to the following applications, all of which are filed concurrently herewith: U.S. patent application Ser. No. 15/815,788, entitled “USER INTERFACE DEVICE FOR INDUSTRIAL VEHICLE,” by Jonathan Ochenas, et al.; U.S. patent application Ser. No. 15/815,801, entitled “USER INTERFACE DEVICE FOR INDUSTRIAL VEHICLE,” by Jonathan Ochenas, et al.; and U.S. patent application Ser. No. 15/815,810, entitled “USER INTERFACE DEVICE FOR INDUSTRIAL VEHICLE,” by Jonathan Ochenas, et al., all of which are hereby incorporated by reference in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
3010595 | Stone | Nov 1961 | A |
3319816 | Christenson | May 1967 | A |
3410433 | Brown | Nov 1968 | A |
3542161 | Ulinski | Nov 1970 | A |
3854820 | Hansen | Dec 1974 | A |
3937339 | Geis et al. | Feb 1976 | A |
4062269 | Chichester et al. | Dec 1977 | A |
4074794 | Scholl | Feb 1978 | A |
4122957 | Allen et al. | Oct 1978 | A |
4130183 | Tjörnemark | Dec 1978 | A |
4162869 | Hitomi et al. | Jul 1979 | A |
4212375 | Peterson et al. | Jul 1980 | A |
4235308 | Davis | Nov 1980 | A |
4279328 | Ahlbom | Jul 1981 | A |
4411582 | Nakada | Oct 1983 | A |
4439102 | Allen | Mar 1984 | A |
4491918 | Yuki et al. | Jan 1985 | A |
4499541 | Yuki et al. | Feb 1985 | A |
4509127 | Yuki et al. | Apr 1985 | A |
4511974 | Nakane et al. | Apr 1985 | A |
4517645 | Yuki et al. | May 1985 | A |
4520443 | Yuki et al. | May 1985 | A |
4547844 | Adams | Oct 1985 | A |
4598797 | Schultz | Jul 1986 | A |
4612623 | Bazarnik | Sep 1986 | A |
4634332 | Kamide et al. | Jan 1987 | A |
4708577 | Fratzke | Nov 1987 | A |
4782920 | Gaibler et al. | Nov 1988 | A |
4957408 | Ohkura | Sep 1990 | A |
5006829 | Miyamoto | Apr 1991 | A |
5011358 | Andersen et al. | Apr 1991 | A |
5056437 | Maddock | Oct 1991 | A |
5088879 | Ranly | Feb 1992 | A |
5208753 | Acuff | May 1993 | A |
5224815 | Abels et al. | Jul 1993 | A |
5238086 | Aoki et al. | Aug 1993 | A |
5555957 | Dreher et al. | Sep 1996 | A |
5586620 | Dammeyer et al. | Dec 1996 | A |
5704051 | Lane et al. | Dec 1997 | A |
5734377 | Fukuzaki | Mar 1998 | A |
5749696 | Johnson | May 1998 | A |
5791440 | Lonzinski et al. | Aug 1998 | A |
5880684 | Diekhans et al. | Mar 1999 | A |
5890086 | Wellman et al. | Mar 1999 | A |
5956255 | Flamme | Sep 1999 | A |
5994650 | Eriksson et al. | Nov 1999 | A |
5995001 | Wellman et al. | Nov 1999 | A |
6005299 | Hengst | Dec 1999 | A |
6039141 | Denny | Mar 2000 | A |
6049813 | Danielson et al. | Apr 2000 | A |
6073069 | Kim | Jun 2000 | A |
6100476 | Adamietz et al. | Aug 2000 | A |
6128007 | Seybold | Oct 2000 | A |
6128553 | Gordon et al. | Oct 2000 | A |
6138795 | Kamiya | Oct 2000 | A |
6164415 | Takeuchi et al. | Dec 2000 | A |
6209913 | Ishikawa et al. | Apr 2001 | B1 |
6282464 | Obradovich | Aug 2001 | B1 |
6331866 | Eisenberg | Dec 2001 | B1 |
6343237 | Rossow et al. | Jan 2002 | B1 |
6345694 | Volker | Feb 2002 | B1 |
6369717 | Damiani et al. | Apr 2002 | B1 |
6429773 | Schuyler | Aug 2002 | B1 |
6437701 | Muller | Aug 2002 | B1 |
6494527 | Bischoff | Dec 2002 | B1 |
6539289 | Ogino et al. | Mar 2003 | B2 |
6600418 | Francis et al. | Jul 2003 | B2 |
6640114 | Bae | Oct 2003 | B2 |
6648581 | Gibson | Nov 2003 | B2 |
6667726 | Damiani et al. | Dec 2003 | B1 |
6686911 | Levin et al. | Feb 2004 | B1 |
6724403 | Santoro et al. | Apr 2004 | B1 |
6817824 | Winkler | Nov 2004 | B2 |
7010404 | Ichijo et al. | Mar 2006 | B2 |
7028264 | Santoro et al. | Apr 2006 | B2 |
7089098 | Rogg et al. | Aug 2006 | B2 |
7154480 | Iesaka | Dec 2006 | B2 |
7165643 | Bozem et al. | Jan 2007 | B2 |
D537374 | Smiley | Feb 2007 | S |
7172050 | Amamiya | Feb 2007 | B2 |
7192236 | Upmeyer | Mar 2007 | B1 |
7216024 | Abels et al. | May 2007 | B1 |
7219769 | Yamanouchi et al. | May 2007 | B2 |
7225413 | Kuenzner et al. | May 2007 | B1 |
7237203 | Kuenzner | Jun 2007 | B1 |
7274970 | Schuchard | Sep 2007 | B2 |
7287625 | Harris | Oct 2007 | B1 |
7322444 | Allerding et al. | Jan 2008 | B2 |
7360175 | Gardner et al. | Apr 2008 | B2 |
7372473 | Venolia | May 2008 | B2 |
7376907 | Santoro et al. | May 2008 | B2 |
7415352 | Olcott | Aug 2008 | B2 |
7418670 | Goldsmith | Aug 2008 | B2 |
7477268 | Venolia | Jan 2009 | B2 |
7595722 | Heimermann et al. | Sep 2009 | B2 |
7599776 | Sonderegger et al. | Oct 2009 | B2 |
7612673 | Onderko et al. | Nov 2009 | B2 |
7672768 | Narisawa et al. | Mar 2010 | B2 |
7683771 | Loeb | Mar 2010 | B1 |
7706947 | Bozem et al. | Apr 2010 | B2 |
7806470 | Steege et al. | Oct 2010 | B2 |
7822513 | Wulff | Oct 2010 | B2 |
7857090 | Ruhter et al. | Dec 2010 | B2 |
7872587 | Hindryckx et al. | Jan 2011 | B2 |
7896358 | Hoff | Mar 2011 | B2 |
7909561 | Addleman et al. | Mar 2011 | B2 |
7922899 | Vasta et al. | Apr 2011 | B2 |
7987431 | Santoro et al. | Jul 2011 | B2 |
7992686 | McCabe | Aug 2011 | B2 |
8001483 | de Souza et al. | Aug 2011 | B2 |
8055405 | Baginski et al. | Nov 2011 | B2 |
8083034 | Bordwell et al. | Dec 2011 | B2 |
8108090 | Bauer | Jan 2012 | B2 |
8125457 | Lawson et al. | Feb 2012 | B2 |
8201097 | Kondo et al. | Jun 2012 | B2 |
8207841 | Watson et al. | Jun 2012 | B2 |
8230976 | Baldini | Jul 2012 | B2 |
8239251 | Wellman | Aug 2012 | B2 |
8265836 | Yamada et al. | Sep 2012 | B2 |
8340873 | Finley et al. | Dec 2012 | B2 |
8362893 | Ishikawa | Jan 2013 | B2 |
8443943 | McCabe et al. | May 2013 | B2 |
8482534 | Pryor | Jul 2013 | B2 |
8515629 | Medwin et al. | Aug 2013 | B2 |
8521373 | Behncke et al. | Aug 2013 | B2 |
8536996 | Watson et al. | Sep 2013 | B2 |
8549432 | Warner | Oct 2013 | B2 |
8565913 | Emanuel et al. | Oct 2013 | B2 |
8583314 | de Oliveira et al. | Nov 2013 | B2 |
8627073 | Kherani et al. | Jan 2014 | B2 |
8632082 | Lantz et al. | Jan 2014 | B2 |
8649964 | Kizaki | Feb 2014 | B2 |
8682401 | Ebner et al. | Mar 2014 | B2 |
8694194 | Waltz et al. | Apr 2014 | B2 |
8701044 | Kolletzki | Apr 2014 | B2 |
8706920 | Fleizach et al. | Apr 2014 | B2 |
8713467 | Goldenberg et al. | Apr 2014 | B1 |
8731785 | McCabe et al. | May 2014 | B2 |
8756002 | Sathish | Jun 2014 | B2 |
8763759 | Viereck et al. | Jul 2014 | B2 |
8781642 | Tarasinski et al. | Jul 2014 | B2 |
8799799 | Cervelli et al. | Aug 2014 | B1 |
8811265 | Horvath | Aug 2014 | B2 |
8836545 | Eckstein et al. | Sep 2014 | B2 |
8849510 | Tanaka | Sep 2014 | B2 |
8892241 | Weiss | Nov 2014 | B2 |
8892294 | Waltz et al. | Nov 2014 | B2 |
8907778 | Wäller et al. | Dec 2014 | B2 |
8977441 | Grimes et al. | Mar 2015 | B2 |
9002626 | Waltz et al. | Apr 2015 | B2 |
9008856 | Ricci et al. | Apr 2015 | B2 |
9025827 | Holeva et al. | May 2015 | B2 |
9057221 | Warr | Jun 2015 | B2 |
9075468 | Becker et al. | Jul 2015 | B2 |
9080319 | Oates, Jr. et al. | Jul 2015 | B2 |
9128575 | Lee | Sep 2015 | B2 |
9160854 | Daddi et al. | Oct 2015 | B1 |
9181965 | Pirotais | Nov 2015 | B2 |
9235553 | Fitch et al. | Jan 2016 | B2 |
9278839 | Gilbride et al. | Mar 2016 | B2 |
9361000 | Furue et al. | Jun 2016 | B2 |
9434585 | Gilbride et al. | Sep 2016 | B2 |
9448692 | Mierau et al. | Sep 2016 | B1 |
9575628 | Meegan et al. | Feb 2017 | B2 |
9658738 | Park et al. | May 2017 | B1 |
9723457 | Brahmi et al. | Aug 2017 | B2 |
9740304 | Chandel et al. | Aug 2017 | B2 |
9760644 | Khvostichenko et al. | Sep 2017 | B2 |
9792013 | Fleizach et al. | Oct 2017 | B2 |
9952703 | Hoen et al. | Apr 2018 | B2 |
10073708 | Kardamilas | Sep 2018 | B2 |
10138101 | Svensson et al. | Nov 2018 | B2 |
10282088 | Kim et al. | May 2019 | B2 |
20020070852 | Trauner et al. | Jun 2002 | A1 |
20020084887 | Arshad et al. | Jul 2002 | A1 |
20030205433 | Hagman | Nov 2003 | A1 |
20040031649 | Schiebel et al. | Feb 2004 | A1 |
20040150674 | Takahashi et al. | Aug 2004 | A1 |
20040200644 | Paine et al. | Oct 2004 | A1 |
20040233234 | Chaudhry et al. | Nov 2004 | A1 |
20040249538 | Osaki et al. | Dec 2004 | A1 |
20050102081 | Patterson | May 2005 | A1 |
20050113944 | Santarossa | May 2005 | A1 |
20050172239 | Liu et al. | Aug 2005 | A1 |
20060015818 | Chaudhri et al. | Jan 2006 | A1 |
20060182582 | Sharpton | Aug 2006 | A1 |
20060224945 | Khan et al. | Oct 2006 | A1 |
20070007080 | Manthey et al. | Jan 2007 | A1 |
20070111672 | Saintoyant et al. | May 2007 | A1 |
20070210901 | Ahrens et al. | Sep 2007 | A1 |
20070213869 | Bandringa et al. | Sep 2007 | A1 |
20070233304 | Baginski et al. | Oct 2007 | A1 |
20070236475 | Wherry | Oct 2007 | A1 |
20080015955 | Ehrman et al. | Jan 2008 | A1 |
20080055273 | Forstall | Mar 2008 | A1 |
20080067005 | Hagman | Mar 2008 | A1 |
20080154712 | Wellman | Jun 2008 | A1 |
20080211779 | Pryor | Sep 2008 | A1 |
20080244414 | Marcoullier et al. | Oct 2008 | A1 |
20090057065 | Akaki et al. | Mar 2009 | A1 |
20090059004 | Bochicchio | Mar 2009 | A1 |
20090101447 | Durham et al. | Apr 2009 | A1 |
20090125850 | Karstens | May 2009 | A1 |
20090236183 | Bordwell et al. | Sep 2009 | A1 |
20090265059 | Medwin et al. | Oct 2009 | A1 |
20090267921 | Pryor | Oct 2009 | A1 |
20090271778 | Mandyam et al. | Oct 2009 | A1 |
20090326717 | Vaherto | Dec 2009 | A1 |
20100005419 | Miichi et al. | Jan 2010 | A1 |
20100023865 | Fulker et al. | Jan 2010 | A1 |
20100039247 | Ziegler et al. | Feb 2010 | A1 |
20100100512 | Brodin et al. | Apr 2010 | A1 |
20100223332 | Maxemchuk et al. | Sep 2010 | A1 |
20100277438 | Kawashima et al. | Nov 2010 | A1 |
20110088979 | Bandringa et al. | Apr 2011 | A1 |
20110106294 | Bebbington | May 2011 | A1 |
20110119614 | Powell et al. | May 2011 | A1 |
20110234389 | Mellin | Sep 2011 | A1 |
20110238259 | Bai et al. | Sep 2011 | A1 |
20110252369 | Chaudhri | Oct 2011 | A1 |
20110320978 | Horodezky et al. | Dec 2011 | A1 |
20120012425 | Hayase et al. | Jan 2012 | A1 |
20120053754 | Pease et al. | Mar 2012 | A1 |
20120096979 | Trujillo Linke | Apr 2012 | A1 |
20120110493 | Cabral | May 2012 | A1 |
20120229394 | Ehrl et al. | Sep 2012 | A1 |
20120229493 | Kim et al. | Sep 2012 | A1 |
20120235804 | Gilbride et al. | Sep 2012 | A1 |
20120256843 | Epple et al. | Oct 2012 | A1 |
20120275892 | Allerding et al. | Nov 2012 | A1 |
20120284658 | Hirvonen | Nov 2012 | A1 |
20120284673 | Lamb et al. | Nov 2012 | A1 |
20130004282 | Grimes et al. | Jan 2013 | A1 |
20130050131 | Lee et al. | Feb 2013 | A1 |
20130075203 | Sayles | Mar 2013 | A1 |
20130081716 | Pirotais | Apr 2013 | A1 |
20130093685 | Kalu et al. | Apr 2013 | A1 |
20130101173 | Holeva et al. | Apr 2013 | A1 |
20130110329 | Kinoshita et al. | May 2013 | A1 |
20130111410 | Okada et al. | May 2013 | A1 |
20130132246 | Amin et al. | May 2013 | A1 |
20130145360 | Ricci | Jun 2013 | A1 |
20130152003 | Ricci et al. | Jun 2013 | A1 |
20130166146 | Tanaka | Jun 2013 | A1 |
20130169549 | Seymour et al. | Jul 2013 | A1 |
20130176223 | Lee et al. | Jul 2013 | A1 |
20130194228 | Tuzar | Aug 2013 | A1 |
20130205258 | Ecker et al. | Aug 2013 | A1 |
20130241720 | Ricci et al. | Sep 2013 | A1 |
20130254675 | De Andrade et al. | Sep 2013 | A1 |
20130285949 | Manabe et al. | Oct 2013 | A1 |
20130290887 | Sun et al. | Oct 2013 | A1 |
20130305354 | King et al. | Nov 2013 | A1 |
20140068477 | Roh | Mar 2014 | A1 |
20140081429 | Miles et al. | Mar 2014 | A1 |
20140082565 | Suzuki | Mar 2014 | A1 |
20140088827 | Yashiro | Mar 2014 | A1 |
20140114530 | Fitch et al. | Apr 2014 | A1 |
20140123072 | Bhowmick et al. | May 2014 | A1 |
20140133906 | Frelich et al. | May 2014 | A1 |
20140139354 | Miyazaki | May 2014 | A1 |
20140173516 | Hwang et al. | Jun 2014 | A1 |
20140188576 | de Oliveira et al. | Jul 2014 | A1 |
20140236389 | Higgins et al. | Aug 2014 | A1 |
20140258908 | Miyoshi | Sep 2014 | A1 |
20140278621 | Medwin et al. | Sep 2014 | A1 |
20140302774 | Burke et al. | Oct 2014 | A1 |
20140320293 | Hunter, Jr. et al. | Oct 2014 | A1 |
20140380243 | Furue | Dec 2014 | A1 |
20150062017 | Barabas | Mar 2015 | A1 |
20150064668 | Manci et al. | Mar 2015 | A1 |
20150113462 | Chen et al. | Apr 2015 | A1 |
20150130712 | Hirai | May 2015 | A1 |
20150169077 | Lee | Jun 2015 | A1 |
20150175397 | Lynn et al. | Jun 2015 | A1 |
20150177362 | Gutierrez | Jun 2015 | A1 |
20150225218 | Strand | Aug 2015 | A1 |
20150226560 | Chandrasekar et al. | Aug 2015 | A1 |
20150243167 | Stählin | Aug 2015 | A1 |
20150268746 | Cuddihy et al. | Sep 2015 | A1 |
20150298549 | Tamura | Oct 2015 | A1 |
20160012707 | McKinley et al. | Jan 2016 | A1 |
20160026838 | Gillet et al. | Jan 2016 | A1 |
20160041803 | Markov et al. | Feb 2016 | A1 |
20160054849 | Steiger | Feb 2016 | A1 |
20160077688 | Shim | Mar 2016 | A1 |
20160082960 | Slaton | Mar 2016 | A1 |
20160088146 | Ying | Mar 2016 | A1 |
20160196041 | Lavoie | Jul 2016 | A1 |
20160306503 | Youtsey | Oct 2016 | A1 |
20160313875 | Williams et al. | Oct 2016 | A1 |
20160347248 | Manci et al. | Dec 2016 | A1 |
20160374002 | Tuluca | Dec 2016 | A1 |
20170017392 | Castaneda | Jan 2017 | A1 |
20170024058 | Aubry | Jan 2017 | A1 |
20170091704 | Wolf et al. | Mar 2017 | A1 |
20170120723 | Sura | May 2017 | A1 |
20170178536 | Manci et al. | Jun 2017 | A1 |
20170249745 | Fiala | Aug 2017 | A1 |
20180107320 | Im et al. | Apr 2018 | A1 |
20180126507 | Rivers et al. | May 2018 | A1 |
Number | Date | Country |
---|---|---|
103631423 | Mar 2014 | CN |
10144751 | Mar 2003 | DE |
10259704 | Aug 2003 | DE |
10131839 | Feb 2004 | DE |
102005022476 | Nov 2006 | DE |
102007023774 | Nov 2008 | DE |
102008027695 | Oct 2009 | DE |
102009032492 | Jan 2011 | DE |
102010005034 | Jul 2011 | DE |
102010055971 | Jun 2012 | DE |
102011012415 | Aug 2012 | DE |
102011012416 | Aug 2012 | DE |
102011018520 | Sep 2012 | DE |
102011018802 | Oct 2012 | DE |
102011103029 | Dec 2012 | DE |
102011103214 | Dec 2012 | DE |
102012204694 | Sep 2013 | DE |
102013006412 | Oct 2014 | DE |
102014113555 | Mar 2016 | DE |
102015107260 | May 2016 | DE |
102016117013 | Mar 2018 | DE |
0416171 | Mar 1991 | EP |
0376206 | Aug 1995 | EP |
0712062 | Mar 2001 | EP |
0812799 | Dec 2001 | EP |
1203743 | Aug 2005 | EP |
1247686 | May 2006 | EP |
1468958 | Mar 2007 | EP |
1179466 | Apr 2007 | EP |
1447374 | Mar 2008 | EP |
1553044 | May 2008 | EP |
1604942 | Aug 2008 | EP |
1714822 | Jan 2009 | EP |
2272788 | Jan 2011 | EP |
1350668 | May 2012 | EP |
2123596 | Oct 2012 | EP |
2439165 | Oct 2012 | EP |
2511677 | Oct 2012 | EP |
2512163 | Oct 2012 | EP |
2518009 | Oct 2012 | EP |
2527288 | Aug 2013 | EP |
2631760 | Aug 2013 | EP |
2412661 | Jan 2014 | EP |
2518000 | Jan 2014 | EP |
2649820 | Nov 2014 | EP |
2799388 | Nov 2014 | EP |
2647591 | Dec 2014 | EP |
2470465 | Mar 2015 | EP |
2653429 | Mar 2015 | EP |
2653430 | Mar 2015 | EP |
2848437 | Mar 2015 | EP |
1655263 | May 2015 | EP |
2172413 | Jun 2015 | EP |
2886507 | Jun 2015 | EP |
2574589 | Jul 2015 | EP |
2889253 | Jul 2015 | EP |
2889258 | Jul 2015 | EP |
2924551 | Sep 2015 | EP |
2930603 | Oct 2015 | EP |
2848575 | Mar 2016 | EP |
2916505 | Oct 2016 | EP |
2889254 | Apr 2017 | EP |
2889255 | Apr 2017 | EP |
2889256 | Apr 2017 | EP |
2338720 | Jul 2017 | EP |
2518003 | Oct 2017 | EP |
2993155 | Nov 2017 | EP |
3023382 | Jan 2018 | EP |
2975982 | Dec 2012 | FR |
2975981 | May 2016 | FR |
1387670 | Mar 1975 | GB |
2352521 | Jan 2001 | GB |
2360500 | Oct 2003 | GB |
2460326 | Dec 2009 | GB |
2437629 | May 2010 | GB |
63192398 | Dec 1988 | JP |
7002496 | Jan 1995 | JP |
H07242400 | Sep 1995 | JP |
8091794 | Apr 1996 | JP |
3166413 | May 2001 | JP |
2003246598 | Sep 2003 | JP |
2004083273 | Mar 2004 | JP |
3572318 | Sep 2004 | JP |
2005126017 | May 2005 | JP |
2010006601 | Jan 2010 | JP |
2012214155 | Nov 2012 | JP |
5109964 | Dec 2012 | JP |
2013091471 | May 2013 | JP |
5278997 | Sep 2013 | JP |
6135698 | May 2017 | JP |
1992004693 | Mar 1993 | WO |
1998034812 | Aug 1998 | WO |
2002048955 | Jun 2002 | WO |
2008083982 | Jul 2008 | WO |
2009026663 | Mar 2009 | WO |
2009091639 | Jul 2009 | WO |
2011033015 | Mar 2011 | WO |
2012110207 | Aug 2012 | WO |
2013074899 | May 2013 | WO |
2013158079 | Oct 2013 | WO |
2014120248 | Aug 2014 | WO |
2015151619 | Oct 2015 | WO |
2017015046 | Jan 2017 | WO |
Entry |
---|
“Mazda3 Navigation System-Information Guide”, Maple Shade Mazda; https://www.youtube.com/watch?v=CzSW38Uu_5s; published on Oct. 9, 2013. |
“The Tech Inside, Episode 3: 2014 Mazda 6”; PhoneDog; https://www.youtube.com/watch?v=odpnSuUefNg; published on Aug. 1, 2013. |
“2013 and 2014 CX-5 Navigation System Tutorial”; Don Mealey's Sport Mazda; https://www.youtube.com/watch?v=y9v1dvxsfDU; published on Jul. 27, 2013. |
“How-To-Use: 2013 Mazda CX-9 Navigation Tutorial Video”; RamseyMazdaNJ; https://www.youtube.com/watch?v=P584CUo8Hno; published on Jul. 10, 2013. |
“Mazda CX-5—Navigation System-Information Guide”; Maple Shade Mazda; https://www.youtube.com/watch?v=JLTMeOalaaM; published on Jul. 23, 2013. |
“2014 Ford Fiesta MyFord Touch Infotainment Review”; Alex on Autos; https://www.youtube.com/watch?v=p6FM6mwfLGU; published Dec. 4, 2013. |
“Navigating Infotainment With a Navigation System”; AutoHow2; https://www.youtube.com/watch?v=zuQ-ZKeu6Fk&feature=youtube; published on Nov. 4, 2013. |
“Take a Tour of the Latest BMW iDrive system”; bmwopenroad; https://www.youtube.com/watch?v=XdnOjevfWIE; published on Jul. 30, 2010. |
“BMW X5 touchscreen”; Naessenselectronics; https://www.youtube.com/watch?v=VNReXZFKZI4; published on Jul. 11, 2012. |
Joe Bruzek; “Mazda Turns Up Connectivity With New System”; Cars.com; published on Nov. 14, 2013; https://www.cars.com/articles/2013/11/mazda-turns-up-connectivity-with-new-system/; downloaded on Sep. 12, 2018. |
“Using Infotainment with a Navigation System—Buick LaCrosse” video transcription; AutoHow2—Informative Automotive Videos; http://www.autohow2.com/video/using-infotainment-with-navigation-buick-lacrosse; downloaded on Sep. 12, 2018. |
U.S. Appl. No. 15/815,788; entitled “User Interface Device for Industrial Vehicle,” filed Nov. 17, 2017 by Jonathan Ochenas et al. |
U.S. Appl. No. 15/815,801; entitled “User Interface Device for Industrial Vehicle,” filed Nov. 17, 2017 by Jonathan Ochenas et al. |
U.S. Appl. No. 15/815,810; entitled “User Interface Device for Industrial Vehicle,” filed Nov. 17, 2017 by Jonathan Ochenas et al. |
“INTELLIVIEW™ IV 10.4″ Color Touchscreen Display”; Farm with Precision with New Holland; http://newhollandrochester.com/wp-content/uploads/pdf-front/145865207194354078.pdf. |
“Winmate's Military Grade Rugged Console Panel PC and Display”; Jun. 25, 2014; http://www.army-technology.com/contractors/computers/winmate-communication/pressreleases/presswinmates-military-grade-rugged-console-panel-pc-and-display. |
“Jungheinrich Presents New Reach Truck”; Mar. 19, 2013; Jungheinrich; http://www.jungheinrich.com/en/press/article/nl/310-jungheinrich-presents-new-reach-truck/. |
“Linde Safety Pilot: A technical breakthrough”; Industrial Vehicle Technology International report; Apr. 25, 2014; http://www.ivtinternational.com/news.php?NewsID=58313. |
Kaiser, Tiffany; “Microsoft Brings Live Tiles to Infotainment Systems with ‘Windows in the Car’ Concept”; Apr. 7, 2014; http://www.dailytech.com/microsoft+brings+live+tiles+to+infotainment+systems+with+windows+in+the+car+concept/article34667.htm. |
Barba, Michelangelo; International Search Report and Written Opinion of the International Searching Authority; International Application No. PCT/US2017/062137; dated Feb. 26, 2018; European Patent Office; Rijswijk, Netherlands. |
Bengtsson, Johan; International Search Report and Written Opinion of the International Search Authority; International Application No. PCT/US2016/042230; dated Oct. 10, 2016; European Patent Office; Rijswijk, Netherlands. |
Barba, Michelangelo; International Search Report and Written Opinion of the International Searching Authority; International Application No. PCT/US2017/062140; dated Feb. 26, 2018; European Patent Office; Rijswijk, Netherlands. |
Barba, Michelangelo; International Search Report and Written Opinion of the International Searching Authority; International Application No. PCT/US2017/062145; dated Feb. 26, 2018; European Patent Office; Rijswijk, Netherlands. |
Barba, Michelangelo; International Search Report and Written Opinion of the International Searching Authority; International Application No. PCT/US2017/062130; dated Mar. 1, 2018; European Patent Office; Rijswijk, Netherlands. |
Intermec Vehicle-Mount Computers Deliver Next Generation Technology to the Forklift, Boosting Productivity and Performance; Intermec; News Release; Jan. 2, 2012; Everett, Washington; https://www.intermec.com/about_us/newsroom/press_releases/2012-02-01-CV41-CV61-Forklift-Vehicle-Mount-Computers.aspx. |
Operator Manual for Crown RM 6000 Series Truck, Document No. PF18574 Rev. 5/17, © 2010, four pages; Crown Equipment Corporation. |
“Mastering ease of use with Topcon System 350,” Topcon Precision Agriculture; https://www.youtube.com/watch?v=far-XW8qKMY; published on Oct. 19, 2012. |
“Topcon X30 and System 350,” Topcon Precision Agriculture; https://www.youtube.com/watch?v=mBa_Xk7HjU; published on Nov. 17, 2011. |
“Topcon demonstrates iPad-like X30 console,” Topcon Precision Agriculture; https://www.youtube.com/watch?v=jISTF8e6UTA; published on Oct. 2, 2011. |
“Fendt Variotronic,” https://www.youtube.com/watch?v=EEYAnEzennA; published on May 4, 2010. |
“Fendt touchscreen overview,” https://www.youtube.com/watch?v=idPm92i3cY0; published on Feb. 3, 2013. |
“Fendt Teach in,” https://www.youtube.com/watch?v=b16uS4SnDEs; published on May 10, 2013. |
“Next Generation of Winmate G-Win Series with Intel's® latest quad-core Bay Trail processor,” http://www.army-technology.com/contractors/computers/winmate-communication/pressreleases/pressnext-generation-g-win-series; published May 7, 2014. |
U.S. Appl. No. 16/562,881; entitled “Processing Device Having a Graphical User Interface for Industrial Vehicle;” filed Sep. 6, 2019 with inventors Anthony T. Castaneda et al. |
Bengtsson, Johan, EPC Official Action; European Patent Application No. 16742538.8; dated Jul. 10, 2019; European Patent Office; Munich, Germany. |
Levy, Amy; Final Office Action; U.S. Appl. No. 15/210,049; dated Nov. 30, 2018; U.S. Patent and Trademark Office; Alexandria, VA. |
Android Developers Blog, “Touch Mode,” Dec. 1, 2008, available at https://android-developers.googleblog.com/2008/12/touch-mode.html. |
Android Developers, “Optimizing Navigation for TV,” Apr. 9, 2012, available at https://stuff.mit.edu/afs/sipb/project/android/docs/training/tv/optimizing-navigation-tv.html. |
Android Developers, “View—Android SDK,” Jul. 11, 2012, available at http://tool.oschina.net/uploads/apidocs/android/reference/android/view/View.html. |
Brochure, “Fendt Variotronic,” Variotronic/1.0-EN/06-11/4.5-E, document creation date Jun. 8, 2011, 24 pages. |
Operator Manual, Fendt™ “VarioDoc—VarioGuide,” Nov. 2011, selected pages (16 pages total). |
Operator's Manual, Fendt “VarioDoc,” Nov. 2015, 128 pages. |
Operator's Manual, Fendt “VarioGuide: VarioGuide Novatel and VarioGuide Trimble,” Nov. 2015, 158 pages. |
Apps4Android, “Implementing Accessibility on Android,” Apr. 18, 2012, available at http://www.apps4android.org/?p=3628. |
“iEFIS Panel User Manual,” Jul. 9, 2012, 44 pages. |
Karlson, A.K. and Bederson, B.B., “Direct Versus Indirect Input Methods for One-Handed Touchscreen Mobile Computing,” Human-Computer Interaction Lab, University of Maryland, Apr. 2007, 10 pages. |
Wyttenbach, Joel; Youtube video; “Fendt 3 point hitch”; published Feb. 1, 2013; https://www.youtube.com/watch?v=Vvdw7InNpWE. |
Youtube video; “Aero-TV: Big, Bright and Beautiful—MGL iEFIS Challenger 10.4″ Display System”; Aero-News Network; published Nov. 21, 2013; https://www.youtube.com/watch?v=-qZIW4a36ak. |
Youtube video; Honeywell; “Bendix King KSN770 FMS Safety Display”; Avweb; published Sep. 10, 2013; https://www.youtube.com/watch?v=iYMZPoGGtms. |
Youtube video; Pretorian Technologies Ltd.; “iOS 7 Switch Control—An Introduction”; published Sep. 17, 2013; https://www.youtube.com/watch?v=SnDA2pbBsTQ. |
Ogasawara, Todd; Youtube video; “T-Mobile G1 Trackball Demo”; Oct. 22, 2008; https://www.youtube.com/watch?v=Tq3IwVszW4o. |
Wyttenbach, Joel; Youtube video; “Driving a Fendt”; published Feb. 2, 2013; https://www.youtube.com/watch?v=tyX5UPYWFR8&t=7s. |
Internet video; Fendt.TV; “Operating the new Varioterminal”; Sep. 4, 2010; https://www.fendt.tv/en/home/operating-the-new-varioterminal_1312.aspx. |
Internet video; Fendt.TV; “Fendt Variotronic”; Nov. 10, 2013; https://www.fendt.tv/en/home/fendt-variotronic-the-leading-edge-through-integration_1612.aspx. |
Levy, Amy; Office Action; U.S. Appl. No. 15/210,049; dated Aug. 7, 2018; U.S. Patent and Trademark Office; Alexandria, VA. |
Leggett, Andrea C.; Office Action; U.S. Appl. No. 15/815,810; dated Jun. 14, 2019; United States Patent and Trademark Office; Alexandria, Virginia. |
Abou El Seoud, Mohamed; Office Action; U.S. Appl. No. 15/815,788; dated Jun. 20, 2019; United States Patent and Trademark Office; Alexandria, Virginia. |
Levy, Amy M.; Notice of Allowance; U.S. Appl. No. 15/210,049; dated May 16, 2019; U.S. Patent and Trademark Office; Alexandria, VA. |
Song, Daeho D.; Office Action; U.S. Appl. No. 15/815,801; dated Dec. 23, 2019; United States Patent and Trademark Office; Alexandria, Virginia. |
Leggett, Andrea C.; Office Action; U.S. Appl. No. 15/815,810; dated Dec. 30, 2019; United States Patent and Trademark Office; Alexandria, Virginia. |
Abou El Seoud, Mohamed; Final Office Action; U.S. Appl. No. 15/815,788; dated Nov. 14, 2019; United States Patent and Trademark Office; Alexandria, Virginia. |
Leggett, Andrea C.; Notice of Allowance and Fees Due; U.S. Appl. No. 15/815,810; dated Apr. 17, 2020; United States Patent and Trademark Office; Alexandria, Virginia. |
Song, Daeho D.; Office Action; U.S. Appl. No. 15/815,801; dated Apr. 22, 2020; United States Patent and Trademark Office; Alexandria, Virginia. |
Abou El Seoud, Mohamed; Office Action; U.S. Appl. No. 15/815,788; dated May 14, 2020; United States Patent and Trademark Office; Alexandria, Virginia. |
Bengtsson, Johan; Communication pursuant to Article 94(3); European Application No. 16742538.8; dated Mar. 16, 2020; European Patent Office; Berlin, Germany. |
Song, Daeho D.; Advisory Action; U.S. Appl. No. 15/815,801; dated Jul. 14, 2020; United States Patent and Trademark Office; Alexandria, Virginia. |
Weng, Pei Yong; Office Action; U.S. Appl. No. 16/562,881; dated Jul. 10, 2020; United States Patent and Trademark Office; Alexandria, Virginia. |
Final Office Action dated Oct. 1, 2020; U.S. Appl. No. 15/815,788; United States Patent and Trademark Office; Alexandria, Virginia. |
Communication Pursuant to Article 94(3) EPC dated Oct. 8, 2020; European Patent Application No. 17812130; European Patent Office; Berlin, Germany. |
Communication Under Rule 71(3) EPC dated Oct. 7, 2020; European Patent Application No. 17811772; European Patent Office; Berlin, Germany; p. 8. |
Mouton, Benjamin; Communication pursuant to Article 94(3); European Application No. 17817363.9; dated Nov. 11, 2020; European Patent Office; Berlin, Germany. |
Communication pursuant to Article 94(3) EPC dated Feb. 23, 2021; European Application No. 17825310; European Patent Office; Berlin, Germany. |
Number | Date | Country | |
---|---|---|---|
20180143731 A1 | May 2018 | US |
Number | Date | Country | |
---|---|---|---|
62425099 | Nov 2016 | US |