The present application claims priority under 35 U.S.C. § 119 to United Kingdom Patent Application No. 2201421.1, filed Feb. 3, 2022. The entire disclosure of United Kingdom Patent Application No. 2201421.1 is hereby incorporated by reference.
Embodiments of the present invention relate to systems and methods for assisted synchronization of agricultural machine operations. More particularly, embodiments of the present invention relate to systems and methods for assisted synchronization of machine movement during transfer of crop material from one machine to another.
Combine harvesters are used in agricultural production to cut or pick up crops such as wheat, corn, beans and milo from a field and process the crop to remove grain from stalks, leaves and other material other than grain (MOG). Processing the crop involves gathering the crop into a crop processor, threshing the crop to loosen the grain from the MOG, separating the grain from the MOG and cleaning the grain. The combine harvester stores the clean grain in a clean grain tank and discharges the MOG from the harvester onto the field. The cleaned grain remains in the clean grain tank until it is transferred out of the tank through an unload conveyor into a receiving vehicle, such as a grain truck or a grain wagon pulled by a tractor.
To avoid frequent stops during a harvesting operation it is common to unload the grain from the combine harvester while it is in motion harvesting crop. Unloading the harvester while it is in motion requires a receiving vehicle to drive alongside the combine harvester during the unload operation. This requires the operator driving the receiving vehicle to align a grain bin of the receiving vehicle with the spout of an unload conveyor of the combine for the duration of the unload operation. Aligning the two vehicles in this manner is laborious for the operator of the receiving vehicle and, in some situations, can be particularly challenging. Some circumstances may limit the operator's visibility, such as excessive dust in the air around the receiving vehicle or nighttime operation. Furthermore, if the receiving vehicle has a large or elongated grain bin, such as a large grain cart or a grain truck, it is desirable to shift the position of the grain bin relative to the spout during the unload operation to evenly fill the grain bin and avoid spilling grain. The operator of the receiving vehicle cannot see into the bin of the receiving vehicle from the operator's cabin and, therefore, must estimate the fill pattern of the receiving vehicle during the fill process and shift the position of the grain bin accordingly to try to fill the receiving vehicle evenly.
Forage harvesters also process crop but function differently from combine harvesters. Rather than separating grain from MOG, forage harvesters chop the entire plant, including grain and MOG, into small pieces for storage and feeding to livestock. Forage harvesters do not store the processed crop onboard during the harvest operation, but rather blow the crop material through a discharge chute directly into a receiving vehicle, such as a silage wagon pulled by a tractor. Thus, a receiving vehicle must closely follow the forage harvester during the entire harvest operation. This presents challenges similar to those discussed above in relation to the combine harvester.
The above section provides background information related to the present disclosure which is not necessarily prior art.
A system according to a first embodiment comprises an agricultural harvester including a crop processor for reducing crop material to processed crop and an unload conveyor for transferring processed crop out of the agricultural harvester. The system further comprises one or more sensors for generating data indicating a fill level of processed crop within a grain bin of a receiving vehicle proximate the agricultural harvester, and a camera positioned to capture images of the receiving vehicle and configured to generate image data, the image data including image data of at least a portion of the receiving vehicle.
A controller is configured for receiving the data from the one or more sensors; determining, using the data from the one or more sensors, the fill level of the processed crop in the grain bin of the receiving vehicle; receiving the image data from the camera; and generating a graphical indicator of the fill level, the graphical indicator including a graphical depiction of the receiving vehicle from the image data and a graphical marker superimposed over at least a portion of the graphical depiction of the receiving vehicle. A graphical user interface is in communication with the controller, the graphical user interface being configured to present the graphical indicator of the fill level to a user.
In some embodiments, the graphical marker includes a bar superimposed over the graphical depiction of the receiving vehicle, a height of the bar indicating the fill level of the receiving vehicle. Alternatively, the graphical marker may include a plurality of bars superimposed over the graphical depiction of the receiving vehicle, wherein a height of each of the plurality of bars indicates the fill level of the receiving vehicle at a different location in the receiving vehicle.
In some embodiments, the graphical marker includes a visual indicator of at least one of a grain bin of the receiving vehicle, an unload conveyor of the agricultural harvester, and a stream of crop flowing between the agricultural harvester and the receiving vehicle.
In some embodiments, the image data generated by the camera further includes image data of at least a portion of an unload conveyor of the agricultural harvester, and the graphical indicator generated by the controller further includes a depiction of at least a portion of the unload conveyor such that the graphical indicator includes the graphical depiction of the receiving vehicle from the image data, the graphical marker superimposed over at least a portion of the graphical depiction of the receiving vehicle, and the depiction of at least a portion of the unload conveyor.
A method according to another embodiment comprises using one or more sensors to generate data indicating a fill level of processed crop within a grain bin of a receiving vehicle proximate an agricultural harvester, the agricultural harvester including a crop processor for reducing crop material to processed crop and an unload conveyor for transferring processed crop out of the agricultural harvester. The method further comprises using a camera to generate image data, the image data including image data of at least a portion of the receiving vehicle; using a controller to determine, using the data from the one or more sensors, the fill level of the processed crop in the grain bin of the receiving vehicle; using the controller to generate a graphical indicator of the fill level, the graphical indicator including a graphical depiction of the receiving vehicle from the image data and a graphical marker superimposed over at least a portion of the graphical depiction of the receiving vehicle; and presenting the graphical indicator of the fill level on a graphical user interface, the graphical user interface being in communication with the controller and receiving the graphical indicator from the controller.
Embodiments of the present invention are described in detail below with reference to the attached drawing figures.
The drawing figures do not limit the present invention to the specific embodiments disclosed and described herein. The drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the invention.
The following detailed description of embodiments of the invention references the accompanying drawings. The embodiments are intended to describe aspects of the invention in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments can be utilized and changes can be made without departing from the spirit and scope of the invention as defined by the claims. The following description is, therefore, not to be taken in a limiting sense. Further, it will be appreciated that the claims are not necessarily limited to the particular embodiments set out in this description.
In this description, references to “one embodiment”, “an embodiment”, or “embodiments” mean that the feature or features being referred to are included in at least one embodiment of the technology. Separate references to “one embodiment”, “an embodiment”, or “embodiments” in this description do not necessarily refer to the same embodiment and are also not mutually exclusive unless so stated and/or except as will be readily apparent to those skilled in the art from the description. For example, a feature, structure, act, etc. described in one embodiment may also be included in other embodiments, but is not necessarily included. Thus, the present technology can include a variety of combinations and/or integrations of the embodiments described herein.
When elements or components are referred to herein as being “connected” or “coupled,” the elements or components may be directly connected or coupled together or one or more intervening elements or components may also be present. In contrast, when elements or components are referred to as being “directly connected” or “directly coupled,” there are no intervening elements or components present.
Given the challenges of synchronizing the operation of harvesters and receiving vehicles during unload operations, as explained above, it is desirable to assist machine operators in manually controlling the machines to maintain the desired relative positions of the two machines. One method of assisting operation of at least one of the machines in this way involves using one or more sensors to detect a fill level and/or fill distribution of crop within a grain bin of a receiving vehicle and generating a graphical user interface indicating the fill level and/or the fill distribution of the crop within the grain bin of the receiving vehicle.
The graphical user interface includes a graphical depiction of the receiving vehicle with a graphical marker superimposed over at least a portion of the graphical depiction of the receiving vehicle, wherein the marker indicates the fill level. The marker may include a single bar whose height indicates a fill level, or may include multiple bars wherein the height of each bar indicates a fill level at a different location within the grain bin of the receiving vehicle. Using multiple bars to indicate the fill level at multiple distinct locations provides a visual indicator of the distribution of crop within the receiving vehicle. This graphical depiction of the receiving vehicle along with the graphical marker enables the operator of the receiving vehicle to immediately see the distribution of crop in the receiving vehicle and adjust the position of the receiving vehicle relative to the harvester, if necessary. The system updates the graphical user interface in real time or nearly in real time, thus addressing the operator's challenge of staying aware of the fill level and fill distribution of the receiving vehicle during operations in which crop is transferred from the harvester to the receiving vehicle.
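By way of illustration only, the following sketch shows one way such per-zone fill bars might be superimposed on a camera image using the OpenCV library. The bin rectangle, zone layout, colors and blending factor are assumptions made for the example and are not part of the disclosed design.

```python
# Illustrative sketch (not the disclosed implementation): overlay per-zone
# fill-level bars on a camera image of the receiving vehicle using OpenCV.
import cv2

def draw_fill_bars(frame, bin_box, fill_fractions):
    """Superimpose one bar per bin zone; bar height encodes fill level.

    frame          -- BGR image from the camera (numpy array)
    bin_box        -- (x, y, w, h) pixel rectangle of the grain bin (assumed known)
    fill_fractions -- fill level per zone, each in [0.0, 1.0], front to rear
    """
    x, y, w, h = bin_box
    overlay = frame.copy()
    zone_w = w // len(fill_fractions)
    for i, frac in enumerate(fill_fractions):
        bar_h = int(h * min(max(frac, 0.0), 1.0))
        x0 = x + i * zone_w
        # Bar grows upward from the bin floor; green normally, red when nearly full.
        color = (0, 200, 0) if frac < 0.9 else (0, 0, 255)
        cv2.rectangle(overlay, (x0, y + h - bar_h), (x0 + zone_w - 2, y + h),
                      color, thickness=-1)
    # Blend so the vehicle remains visible beneath the bars.
    return cv2.addWeighted(overlay, 0.4, frame, 0.6, 0)
```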
A system according to a first embodiment of the invention comprises an agricultural harvester, one or more sensors, a camera, a controller and a graphical user interface. The agricultural harvester includes a crop processor for reducing crop material to processed crop and an unload conveyor for transferring processed crop out of the agricultural harvester. The one or more sensors generate data indicating a fill level of processed crop within the grain bin of a receiving vehicle proximate the agricultural harvester. The camera is positioned to capture images of the receiving vehicle and is configured to generate image data, the image data including image data of at least a portion of the receiving vehicle. The controller is configured to receive data from the one or more sensors, determine, using the data from the one or more sensors, the fill level of the processed crop in the grain bin of the receiving vehicle, receive the image data from the camera, and generate a graphical indicator of the fill level. The graphical indicator includes a graphical depiction of the receiving vehicle from the image data and a graphical marker superimposed over at least a portion of the graphical depiction of the receiving vehicle. The graphical user interface is in communication with the controller and is configured to present the graphical indicator of the fill level to a user.
Turning now to the drawing figures, an agricultural harvester 10 includes a crop processor for reducing crop material to processed crop.
The processor threshes the grain, separates the grain from the MOG, cleans the grain and stores the grain in a clean grain tank 20. Thus, the processor reduces crop material (plants or portions of plants cut or picked up from the field) to processed crop (grain). An unload conveyor 22 transfers grain from the clean grain tank 20 to a receiving vehicle or other receptacle using one or more augers, belts or similar mechanisms to move grain out of the clean grain tank 20, through the unload conveyor 22 and out a spout 24 positioned at an end of the unload conveyor 22 distal the body 18 of the harvester 10. The unload conveyor 22 is illustrated in a stowed position in the drawing figures.
An operator cabin 26 includes a seat and a user interface for enabling an operator to control various aspects of the harvester 10. The user interface includes mechanical components, electronic components, or both, such as knobs, switches, levers, buttons and dials, as well as electronic touchscreen displays that both present information to the operator in graphical form and receive information from the operator. The harvester 10 includes a camera 28 mounted on an exterior surface 30 of the combine body 18. The camera 28 is configured and positioned for capturing images of an area proximate the agricultural harvester 10 and generates image data, as explained below.
The harvester 10 further includes an electromagnetic detecting and ranging module 32 configured and positioned for detecting at least one of a fill level and a distribution of processed crop in the receiving vehicle.
The unload synchronization assistance system 44 assists an operator of the receiving vehicle, an operator of the harvester 10, or both during the grain unloading process by providing the operator with information about the relative positions of the receiving vehicle and the harvester 10 as well as a fill level, a distribution pattern, or both of crop material inside the receiving vehicle. The system 44 broadly includes a controller 46, the camera 28, the electromagnetic detecting and ranging module 32, a wireless transceiver 48 and a portable electronic device 50 including a graphical user interface portion 52.
The controller 46 is a computing device and includes one or more integrated circuits programmed or configured to implement the functions described herein. By way of example, the controller 46 may be a digital controller and may include one or more general purpose microprocessors or microcontrollers, programmable logic devices, application specific integrated circuits or other computing devices. The controller 46 may include multiple computing components, such as electronic control units, placed in various different locations on the harvester. The controller 46 may also include one or more discrete and/or analog circuit components operating in conjunction with the one or more integrated circuits or computing components. Furthermore, the controller 46 may include or have access to one or more memory elements operable to store executable instructions, data, or both. The one or more memory elements store data and preferably include a non-volatile storage medium such as solid-state, optical or magnetic technology.
The wireless transceiver 48 is configured to communicate with the portable electronic device 50 using wireless communications technology. The wireless transceiver 48 may be configured to communicate according to one or more wireless communications protocols or standards, such as one or more protocols based on the IEEE 802.11 family of standards (“Wi-Fi”), the Bluetooth wireless communications standard, and/or a 433 MHz wireless communications protocol. Alternatively or additionally, the wireless transceiver 48 may be configured to communicate according to one or more proprietary or non-standardized wireless communication technologies or protocols, such as proprietary wireless communications protocols using 2.4 GHz or 5 GHz radio signals.
The camera 28 is positioned and configured for capturing images of objects that are proximate the harvester 10. The camera 28 is located on the exterior side surface 30 of the body 18 of the harvester 10, as explained above, and has a field of view extending outwardly from the side surface 30 and with a center of the field of view being perpendicular or approximately perpendicular to the longitudinal axis of the harvester 10. In this configuration the camera's field of view corresponds to an area in which a receiving vehicle is located during crop transfer operations.
A diagram of certain components of the camera 28 is provided in the drawing figures.
The electromagnetic detecting and ranging module 32 uses reflected electromagnetic waves to generate a digital representation of objects within a field of view of the module. More particularly, the module 32 includes an emitter for emitting electromagnetic waves and a sensor for detecting reflected waves. Data generated by the sensor includes such information as an angle and a distance for each data point, indicating a point in space where the wave encountered, and reflected off of, an external object in the module's field of view. Thus, the digital representations generated by the module 32 include distances to and relative locations of objects and surfaces within the field of view. Technologies that may be used in the module 32 include LiDAR and RADAR.
Light detecting and ranging (LiDAR) is a method for measuring distances (ranging) by illuminating the target with laser light and measuring the reflection with a sensor. Differences in laser return times and wavelengths can then be used to make digital three-dimensional or two-dimensional representations of the area scanned. LiDAR may use ultraviolet, visible, or near infrared light to image objects and can target a wide range of materials, including metallic and non-metallic objects.
Radio detecting and ranging (RADAR) is a detection system that uses radio waves to determine the range, angle, and/or velocity of objects. A RADAR system includes a transmitter producing electromagnetic waves in the radio or microwave domains, a transmitting antenna, a receiving antenna (often the same antenna is used for transmitting and receiving) and a receiver and processor to determine properties of the object(s) within the scan zone of the system. Radio waves (pulsed or continuous) from the transmitter reflect off the object and return to the receiver, giving information about the object's location, direction of travel and speed.
The electromagnetic detecting and ranging module 32 collects data that defines a digital representation of the area within the field of view of the module 32 and communicates that data to the controller 46. The data collected by the module 32 includes location information for each of a plurality of points making up a point cloud. The location information is relative to the module 32 and may include a set of two-dimensional Cartesian coordinates, such as X and Y coordinates of the point relative to the module 32; a set of three-dimensional Cartesian coordinates such as X, Y and Z coordinates; a set of polar coordinates such as a radial coordinate (r) indicating a distance from the module 32 and an angular coordinate (θ) indicating an angle from a reference direction; a set of spherical coordinates such as a radial coordinate (r) indicating a distance of the point from the module 32, a polar angle coordinate (θ) measured from a fixed zenith direction, and an azimuthal angle coordinate (φ) of its orthogonal projection on a reference plane that passes through the origin and is orthogonal to the zenith, measured from a fixed reference direction on that plane; or a set of cylindrical coordinates such as a distance (r) to the point from a reference axis (typically corresponding to a location of the module 32), a direction (φ) from the reference axis, and a distance (Z) from a reference plane that is perpendicular to the reference axis.
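By way of a worked illustration, the following sketch converts points reported in the polar and spherical conventions described above into Cartesian coordinates. The function names are illustrative only; the conversions themselves follow the conventions stated in the preceding paragraph.

```python
# Sketch: converting the module's range data into Cartesian coordinates,
# following the conventions in the text (polar: r, theta from a reference
# direction; spherical: r, polar angle theta from the zenith, azimuth phi).
import math

def polar_to_cartesian(r, theta):
    """2-D polar (distance r, angle theta from the reference direction) -> (x, y)."""
    return r * math.cos(theta), r * math.sin(theta)

def spherical_to_cartesian(r, theta, phi):
    """Spherical (r, polar angle theta, azimuthal angle phi) -> (x, y, z)."""
    x = r * math.sin(theta) * math.cos(phi)
    y = r * math.sin(theta) * math.sin(phi)
    z = r * math.cos(theta)
    return x, y, z
```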
The portable electronic device 50 includes a user interface 52 for presenting graphical representations of the relative locations of the agricultural harvester 10 and the receiving vehicle, as well as the fill level, fill pattern, or both of crop material in the receiving vehicle. In the illustrated embodiment the portable electronic device 50 is a tablet computer, but it will be appreciated by those skilled in the art that it could be a smartphone or similar device capable of communicating wirelessly with the transceiver 48 to receive a wireless signal including location information and crop information and generating the graphical representations on the user interface 52. The portable electronic device 50 is discussed in greater detail below.
It will be appreciated that, for simplicity, certain elements and components of the system 44 have been omitted from the present discussion and from the accompanying diagrams.
The first marker 70 is located at or near an upper front corner of the grain cart 36 and the second marker 72 is located at or near an upper rear corner of the grain cart 36. The markers 70, 72 contain a predetermined visual pattern or design that is included in images captured by the camera 28 and used by the one or more computing devices to recognize the markers 70, 72. The controller searches for and recognizes the markers 70, 72 and uses the location and size of the markers to determine information about the grain cart 36 including the location of the grain cart 36 relative to the agricultural harvester 10, the orientation of the grain cart 36 relative to the agricultural harvester 10 and the size of the grain cart 36.
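By way of illustration only, the sketch below shows one possible marker-recognition step. The disclosure does not specify the marker pattern; ArUco-style fiducial tags and OpenCV's aruco module (available in opencv-contrib-python, with an API that differs slightly across OpenCV versions; this follows OpenCV 4.7 and later) are assumed here purely for the example.

```python
# Sketch: recognizing fiducial markers such as 70, 72 in a camera frame.
# ArUco tags are an assumption for illustration, not the disclosed pattern.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def find_markers(frame):
    """Return {marker_id: 4x2 array of corner pixel coordinates} for markers in frame."""
    corners, ids, _rejected = detector.detectMarkers(frame)
    if ids is None:
        return {}
    return {int(i): c.reshape(4, 2) for i, c in zip(ids.flatten(), corners)}
```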
The controller uses the size of the markers 70, 72 and the location of the markers 70, 72 in the image captured by the camera 28 to determine the location of the grain cart 36. The size of the markers 70, 72 in the image, such as the number of pixels corresponding to the width, the height and/or the area of the markers 70, 72, is used to determine a distance of each marker 70, 72 from the camera 28. Given that the actual size of the markers 70, 72 is fixed and known, the distance of each marker can be correlated with the size of the marker in the image by, for example, using a lookup table to assign a distance to each marker size in the image. The controller uses the distance of the markers 70, 72 from the camera 28 to determine the lateral separation of the grain cart 36 from the harvester 10 or, in other words, the distance between the harvester 10 and the grain cart 36 along the direction 42.
The controller also uses the locations of the markers 70, 72 in the image to determine whether the grain cart 36 is behind, in front of or even with the unload conveyor 22 or, in other words, the position of the grain cart 36 relative to the unload conveyor 22 along the direction 40.
The first marker 70 is located at or near a top front corner of the grain cart 36 and the second marker 72 is located at or near a top rear corner of the grain cart 36. This enables the controller to determine the size of the grain cart 36 using the distance of the markers 70, 72 from the camera 28 (determined by the size of the markers) and the distance between the markers 70, 72 (determined by the size and separation of the markers in the image). Both the location and size of the grain cart 36 may be used to generate a graphical representation of the relative positions of the unload conveyor 22 and the grain cart 36.
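By way of illustration, the following sketch captures the geometric reasoning of the preceding paragraphs using a simple pinhole-camera model in place of the lookup table. The focal length, marker size and image-center values are assumed calibration constants, not values from the disclosure.

```python
# Sketch of the marker-geometry calculations described above (pinhole model).
F_PX = 1400.0          # camera focal length in pixels (assumed calibration)
MARKER_SIZE_M = 0.30   # actual marker edge length in metres (assumed)
CX = 960.0             # image centre column in pixels (assumed)

def marker_range(pixel_size):
    """Camera-to-marker distance from the marker's apparent size in pixels.
    A lookup table indexed by pixel size, as in the text, works equally well."""
    return F_PX * MARKER_SIZE_M / pixel_size

def fore_aft_offset(u, rng):
    """Offset along travel direction 40 of a marker seen at image column u."""
    return (u - CX) * rng / F_PX

def cart_extent(front_marker, rear_marker):
    """Approximate bin length from the two corner markers.
    Each argument is a (pixel_size, image_column) pair."""
    xs = [fore_aft_offset(u, marker_range(s)) for s, u in (front_marker, rear_marker)]
    return abs(xs[0] - xs[1])
```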
The markers 70, 72, 74, 76, 78, 80 may be permanently affixed to the grain cart 36 (or other receiving vehicle) or may be temporarily attached thereto using bolts, clamps, magnets or other fasteners. An advantage to temporarily attaching the markers to the receiving vehicle is that they can be quickly and easily removed from one receiving vehicle and attached to another.
The electromagnetic detecting and ranging module 32 is located at or near an end of the unload conveyor 22 corresponding to the spout 24 and distal the body 18 of the harvester 10. The module 32 includes a scanner positioned to scan an area extending downwardly from the end of the unload conveyor 22, the scan area being perpendicular or approximately perpendicular to a longitudinal axis of the unload conveyor 22. This scan area includes an area inside the grain bin 38 of the receiving vehicle when the grain bin 38 is positioned below the spout 24 of the unload conveyor 22.
The module 32 includes a scanner that generates a plurality of data points within the plane corresponding to the scan area 61, each data point including a distance value corresponding to a distance from the module 32. The controller processes the data from the module 32 to identify patterns. A series of data points generated by the module 32 when the grain bin of the receiving vehicle is empty is illustrated in the drawing figures.
The controller 46 uses the data generated by the module 32 to determine the fill level of the grain cart 36, the distribution of grain (or other processed crop material) within the grain cart 36, or both. To determine the fill level of the grain cart 36 the controller identifies data points 96 corresponding to grain (versus data points corresponding to walls or the floor of the grain bin), determines a fill height for each of the data points corresponding to grain, and then averages the fill heights of the data points corresponding to grain to generate an average fill level of the grain bin. The fill heights of the various data points correspond to the distribution of grain.
To identify data points corresponding to grain the controller uses patterns in the data, receiving vehicle location information generated using image data from the camera 28, or both. The controller uses patterns in the data by identifying patterns corresponding to certain parts of the grain bin such as a front wall (for example, pattern 92), rear wall (for example, pattern 94) and floor (for example, pattern 88), or a combination of two or more of these features.
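By way of illustration only, the sketch below reduces the fill-level logic described above to simple geometric thresholds. Scan points are assumed to have already been converted to along-bin coordinates (as in the earlier coordinate sketch), the wall and floor patterns (92, 94 and 88) are simplified to a known bin extent and empty-floor range, and the noise threshold is an assumption.

```python
# Sketch of the fill-level computation: classify scan points, convert range
# to fill height, and average per zone to obtain the grain distribution.
def fill_levels(points, floor_range, front_x, rear_x, n_zones=4):
    """points: (x, rng) pairs from module 32, x along the bin, rng = distance
    from the module. floor_range: module-to-floor distance when empty (assumed
    known). Returns the average fill height per zone, front to rear."""
    zone_w = (rear_x - front_x) / n_zones
    zones = [[] for _ in range(n_zones)]
    for x, rng in points:
        if not (front_x < x < rear_x):
            continue                      # outside the bin: wall or ground returns
        height = floor_range - rng        # grain surface sits above the bin floor
        if height > 0.05:                 # ignore near-floor noise (~5 cm, assumed)
            idx = min(int((x - front_x) / zone_w), n_zones - 1)
            zones[idx].append(height)
    return [sum(z) / len(z) if z else 0.0 for z in zones]
```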
The graphical representation is presented as part of the graphical user interface portion 52 on the portable electronic device 50.
During a harvest operation, the controller 46 continuously or periodically receives image data from the camera 28 and uses the image data to detect the presence of the markers 70, 72 by detecting the patterns associated with each marker. Once the controller 46 has identified the markers 70, 72, it determines the location of the receiving vehicle relative to the harvester 10 using the size and location of the markers 70, 72 as explained above. The controller 46 may also use the size and location of the markers 70, 72 to determine the size of the receiving vehicle, the orientation of the receiving vehicle relative to the harvester 10, or both. The controller 46 also collects and communicates image data of the receiving vehicle to the portable electronic device 50 for presentation as part of the graphical user interface, as explained above.
The controller 46 also uses data from the electromagnetic detecting and ranging module 32 to determine a fill level, distribution pattern, or both of crop material in the receiving vehicle as explained above. The controller 46 communicates the location information and the crop information to the portable electronic device 50 via the wireless communications link using the transceiver 48.
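By way of illustration only, the following sketch shows one plausible message format for this communication. The disclosure does not specify an encoding; JSON datagrams over a UDP socket, and the address and field names used here, are assumptions made for the example.

```python
# Sketch (assumed format): sending location and crop information from
# controller 46 to portable device 50 over the wireless link.
import json
import socket

def send_update(sock, lateral_m, fore_aft_m, zone_fills):
    """Serialize relative position and per-zone fill levels as one JSON datagram."""
    payload = {
        "lateral_offset_m": lateral_m,    # separation along direction 42
        "fore_aft_offset_m": fore_aft_m,  # position along direction 40
        "fill_fractions": zone_fills,     # per-zone fill levels, 0.0 to 1.0
    }
    sock.sendto(json.dumps(payload).encode("utf-8"), ("192.168.4.255", 9000))

# Usage: sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
#        send_update(sock, 4.2, -0.5, [0.8, 0.6, 0.3, 0.1])
```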
The portable electronic device 50 uses the data from the electromagnetic detecting and ranging module 32 and the image data from the camera 28 to generate the graphical indicator of the fill level of the receiving vehicle, as explained above.
In addition to illustrating the fill level and/or distribution of crop in the receiving vehicle, the graphical indicator may also illustrate, in an intuitive way, the relative positions of the unload conveyor 22 and the grain bin of the receiving vehicle. The graphical representation is presented on the graphical user interface 52 of the portable electronic device 50, thereby allowing the operator to see the position of the grain bin relative to the unload conveyor and the fill level and/or fill pattern of crop material in the grain bin, and to steer the tractor so that the grain bin of the receiving vehicle is located beneath the spout of the unload conveyor. This relieves the operator(s) of the need to try to look backward to see the position of the unload conveyor while also watching the field ahead of the machine. The graphical representation has the further advantage of enabling the operator(s) to see the relative positions of the machines even in situations with limited visibility outside the operator cabin.
A schematic diagram of certain components of a portable electronic device 200 is provided in the drawing figures.
In another embodiment of the invention the controller 46 uses image data generated by the camera 28 to identify the receiving vehicle within the field of view of the camera without the use of fiducial markers, such as markers 70-80 described above. In this embodiment, the system 44 includes software with one or more machine learning algorithms for determining whether an image includes a depiction of a receiving vehicle. The machine learning algorithms are trained to identify the receiving vehicle as well as portions of the receiving vehicle, as explained below. If the image captured by the camera 28 includes a receiving vehicle, the controller 46 identifies a portion of the receiving vehicle that receives the grain (or other crop material), such as the grain bin 38.
After the controller detects the presence of the grain cart in the image it then identifies the grain bin 38 of the grain cart 36 by identifying and excluding from the area of interest various components of the grain cart 36. The controller uses one or more machine learning algorithms to attempt to identify wheels, an unload auger and a hitch of the grain cart 36 in the image, as depicted in block 160. If the controller cannot identify these components in the image then the image probably does not depict a receiving vehicle, and the controller presents the image to the operator without marking the image, as depicted in block 162.
If the controller identifies the components in step 160, it adjusts the area of interest 182 to exclude those components, as depicted in blocks 164, 166 and 168. Adjusting the area of interest 182 to exclude the wheels results in an area of interest as depicted in image 174.
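By way of illustration, the sketch below shows one way the area-of-interest adjustment of blocks 164, 166 and 168 might be carried out, by cropping detected component boxes away from the vehicle box. The box format and the cropping heuristics are assumptions made for the example.

```python
# Sketch: shrink the area of interest by excluding detected components
# (wheels, unload auger, hitch) from the vehicle bounding box.
def shrink_interest_area(vehicle_box, component_boxes):
    """Boxes are (x0, y0, x1, y1) in pixels. Components are assumed to sit at
    the bottom (wheels, hitch) or at a side (unload auger) of the vehicle box."""
    x0, y0, x1, y1 = vehicle_box
    for cx0, cy0, cx1, cy1 in component_boxes:
        if cy0 > (y0 + y1) / 2:        # lower half: wheels or hitch -> raise floor
            y1 = min(y1, cy0)
        elif cx0 > (x0 + x1) / 2:      # right side: auger -> pull in right edge
            x1 = min(x1, cx0)
        else:                          # left side component -> pull in left edge
            x0 = max(x0, cx1)
    return (x0, y0, x1, y1)            # remaining box approximates the grain bin
```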
As indicated above, the software includes one or more machine learning algorithms for determining whether image data includes a depiction of a receiving vehicle and, if so, attributes about the receiving vehicle. As used herein, a machine learning algorithm is an algorithm that enables a computer to learn from experience. The concept of experience in this regard is typically represented as a dataset of historic events, and learning involves identifying and extracting useful patterns from such a dataset. A machine learning algorithm takes a dataset as input and returns a model that encodes the patterns the algorithm extracted (or “learned”) from the data.
The controller uses the machine learning algorithm to learn to identify receiving vehicles depicted in images captured by the camera 28 by analyzing many different images of different types of receiving vehicles, including different models of grain carts. This process may involve analyzing thousands, tens of thousands or hundreds of thousands of images and developing a model that takes as an input an image of a receiving vehicle captured by the camera 28 and returns values indicating a particular type of receiving vehicle. As explained below, dimensions such as length and height for each type of receiving vehicle are known and matched to the particular receiving vehicle identified by the model. The one or more computing devices use the actual dimensions of the receiving vehicle, the size of the receiving vehicle in the image and the position of the receiving vehicle in the image to determine a location of the receiving vehicle relative to the harvester 10.
The one or more machine learning algorithms preferably include a deep learning algorithm and, in particular, a deep learning algorithm involving a convolutional neural network. Convolutional neural networks are a class of deep learning neural networks well-suited for analyzing image data. A convolutional neural network includes an input layer, an output layer and a number of hidden layers. The hidden layers typically include a series of convolutional layers, each applying a set of learned filters to its input. The activation function is commonly a rectified linear unit (ReLU), and the convolutional layers are typically followed by additional layers such as pooling layers, fully connected layers and normalization layers. These layers are referred to as hidden layers because their inputs and outputs are not directly exposed at the network's input or output.
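By way of illustration only, the following sketch defines a small convolutional network of the kind described, using the PyTorch library. The layer sizes, input resolution and class count are illustrative assumptions, not trained or disclosed values.

```python
# Minimal sketch: stacked convolution + ReLU + pooling layers feeding fully
# connected layers that score each known receiving-vehicle type.
import torch.nn as nn

class VehicleClassifier(nn.Module):
    def __init__(self, n_types=5):           # e.g. known grain cart / truck models
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                  # 3x128x128 -> 16x64x64
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                  # -> 32x32x32
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 32 * 32, 128), nn.ReLU(),
            nn.Linear(128, n_types),          # one score per vehicle type
        )

    def forward(self, x):                     # x: batch of 3x128x128 images
        return self.classifier(self.features(x))
```

As described above, the predicted type can then index a table of known dimensions (length, height), which the controller combines with the vehicle's apparent size and position in the image to estimate its location relative to the harvester 10.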
Another embodiment of the invention includes an unload synchronization assistance system 200 that includes a wired connection to a console 202 with a graphical user interface 204, as illustrated in the drawing figures.
Although the invention has been described with reference to the preferred embodiment illustrated in the attached drawing figures, it is noted that equivalents may be employed and substitutions made herein without departing from the scope of the invention as recited in the claims.
The claims at the end of this patent application are not intended to be construed under 35 U.S.C. § 112(f) unless traditional means-plus-function language is expressly recited, such as “means for” or “step for” language being explicitly recited in the claim(s).
Having thus described the preferred embodiment of the invention, what is claimed as new and desired to be protected by Letters Patent includes the following: