Method and system for tracking the delivery of an object to a specific location

Information

  • Patent Grant
  • Patent Number
    10,572,704
  • Date Filed
    Tuesday, April 23, 2019
  • Date Issued
    Tuesday, February 25, 2020
Abstract
System and method for collecting object identification data from a plurality of objects that interact with a vehicle during operation of the vehicle, where the vehicle interacts with specific objects at specific geographical positions. An identification sensor is attached to a particular object that is to be delivered to a specific location. A record for the object is generated, the record including the identification of the object, the position of the vehicle when the interaction between the object and the vehicle occurs, and the time of the interaction. The record also includes a specific target location for delivery of the object. Exemplary interactions include loading/unloading an object from the vehicle. The record may also include additional data about a parameter of the object (such as the object's weight, volume, or temperature). An alert is sent to the driver of the vehicle when the vehicle approaches a location that is adjacent to the target delivery location of the object.
Description
BACKGROUND
Description of the Related Art

As the cost of sensors, communications systems and navigational systems has dropped, operators of commercial and fleet vehicles now have the ability to collect a tremendous amount of data about the vehicles that they operate, including geographical position data collected during the operation of the vehicle.


Vehicle fleet operators often operate vehicles along predefined and generally invariant routes. For example, buses frequently operate on predefined routes, according to a predefined time schedule (for example, along a route that is geographically, as well as temporally, defined). Migrating route data from one software platform to another can be a tedious task.


It would be desirable to provide such fleet operators with additional tools for moving data between different software platforms, and for collecting and analyzing data (such as Global Positioning System (GPS) data, as well as other route related data) collected from vehicles traversing a predefined route.


BRIEF SUMMARY

The following embodiments and aspects thereof are described and illustrated in conjunction with systems, tools and methods which are meant to be exemplary and illustrative, not limiting in scope. In various embodiments, one or more of the above-described problems have been reduced or eliminated, while other embodiments are directed to other improvements.


One concept disclosed herein is the collection of object identification data during the operation of a vehicle, where the vehicle interacts with the object at a definable geographical position. An identification sensor is coupled to a geographical position sensor, and whenever an object is identified a record is generated, the record including the identification of the object, the position of the vehicle when the interaction between the object and the vehicle occurs, and the time of the interaction. Exemplary (but not limiting) objects that are identified include passengers, containers (such as pallets, packages, boxes, envelopes), and documents. Many different types of interactions are possible, including, but not limited to, loading an object (such as a parcel, document, or container) into the vehicle, unloading an object (such as a parcel, document, or container) from the vehicle, boarding a passenger (the object) onto the vehicle, unloading a passenger (once again, the passenger being the object) from the vehicle, transferring a bulk material (such as a solid, liquid or compressed gas) from the vehicle into a specific container (the container being the object), and/or transferring a bulk material (such as a solid, liquid or compressed gas) from a specific container (the container being the object) to the vehicle. The record may also include additional data about a parameter of the object (for example, in some embodiments, it will be useful to include the object's weight in the record, or the weight/volume of a material being transferred to or from the vehicle to a specific container). Such a data record is referred to herein and in the claims that follow as object identification (ID) and location data, and/or object ID encoded position data (encoded in the sense that the object data is combined with the position data). In some embodiments, the object ID and location data is stored at the vehicle for transfer to a remote computing device at a later time, and in other embodiments, the object ID and location data is wirelessly transmitted to a remote computing device during operation of the vehicle. The term “object identification data” is intended to refer to data that identifies an object with which a vehicle interacts. For example, for a passenger, object identification data can include the passenger's name, or a passenger number (or an alphanumeric code or other type of code) that uniquely identifies an individual. For other objects, the object identification data is generally a numeric or alphanumeric code that uniquely identifies the object.
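
By way of a non-limiting illustration only, the following Python sketch shows one possible in-memory representation of such a record; the ObjectRecord class and all field names are assumptions introduced here for clarity, not part of the disclosed system.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ObjectRecord:
    """One unit of object ID encoded position data: the identity of the
    object, where and when the vehicle interacted with it, and optional
    additional parameters of the object."""
    object_id: str                        # unique token value (bar code, RFID, etc.)
    latitude: float                       # vehicle position at the interaction
    longitude: float
    timestamp: float                      # time of the interaction (epoch seconds)
    interaction: str = "load"             # e.g. "load", "unload", "board", "exit"
    weight_kg: Optional[float] = None     # optional additional object data
    volume_l: Optional[float] = None
    temp_c: Optional[float] = None
    target_location: Optional[Tuple[float, float]] = None  # delivery target, if any
```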


Broadly speaking, position data from the vehicle is collected as the vehicle travels to the plurality of different locations, the position data identifying a specific geographical location of the vehicle at a specific point in time (thus, the vehicle position data is time indexed). Time indexed object identification data is collected as the vehicle interacts with objects at various locations visited by the vehicle. In some embodiments, the vehicle traverses a generally invariant route (such as a bus route), while in other embodiments the vehicle traverses a variable route (such as the route of a parcel delivery vehicle). In an exemplary embodiment, the time indexing function is implemented by the geographical position sensing system. Periodically, the geographical position sensing system generates a record that documents the current time and current geographical position of the vehicle. Whenever the identification sensor identifies an object, the identification data is sent to the geographical position sensing system, which either appends the identification data to the most current record, or generates a new record that documents the identity of the object, the current time and the current geographical position, thereby generating the object ID and location data. It should thus be recognized that the set of location data collected by the geographical position sensing system during operation of the vehicle will also include object identification data at those points in time at which the vehicle interacts with an object that has been tagged in some way with a unique identifier that can be detected by the object identification sensor. Exemplary tags or tokens include optical codes (such as bar codes and other optically recognizable codes), radio frequency identification (RFID) tags, and magnetic tags/magnetic strips. It should be understood that the set of location data collected by the geographical position sensing system during operation of the vehicle (which at some time points includes only location data, and at other time points includes location data and object identification data) is collectively referred to herein as the object ID and location data. Such object ID and location data are conveyed to a remote computing device for storage/processing, either in real-time (i.e., while the vehicle is being operated, such that the vehicle requires a transmitter to convey the data to the remote computing device) or at some point after the vehicle has traversed a route and collected the different types of data (the position data and the object identification data). The term real-time is not intended to imply the data is transmitted instantaneously; rather, the data is collected over a relatively short period of time (over a period of seconds or minutes), and transmitted to the remote computing device on an ongoing basis, as opposed to storing the data at the vehicle for an extended period of time (hours or days), and transmitting an extended data set to the remote computing device after the data set has been collected. Transmitting the object ID and location data at a later time, rather than in real time, is encompassed by the concepts disclosed herein, although real-time data transmission is likely to be popular with users.
Note that transferring the object ID and location data at a later time can be achieved without requiring the vehicle to include a wireless transmitter (i.e., the object ID and location data can be transferred via a hardwire connection to either the remote computing device or an intermediate data collection device that is coupled to the vehicle to extract the object ID and location data, which is then conveyed to the remote computing device).
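
A minimal sketch of the append-or-create behavior described above is given below, assuming the caller supplies the current position from the GPS unit; whether a detection joins the most current record or starts a new one is resolved here with an assumed time window, one of several reasonable designs.

```python
import time

records = []           # accumulating set of object ID and location data
APPEND_WINDOW_S = 5.0  # assumed: detections within this window share a record

def log_position(lat, lon):
    """Periodic record documenting only the current time and position."""
    records.append({"time": time.time(), "lat": lat, "lon": lon, "object_ids": []})

def log_detection(object_id, lat, lon):
    """Attach a detected object to the most current record, or start a new one."""
    now = time.time()
    if records and now - records[-1]["time"] <= APPEND_WINDOW_S:
        records[-1]["object_ids"].append(object_id)   # append to current record
    else:
        records.append({"time": now, "lat": lat, "lon": lon,
                        "object_ids": [object_id]})   # new record for the event
```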


With respect to the remote computing device, in a preferred but not limiting embodiment, the time indexed object ID and location data are available in a networked computing environment. In at least one embodiment, the object ID and location data are stored by a company offering data management services to its clients, and clients can access the object ID and location data for each of their vehicles.


The object ID and location data will have a number of uses. In the context of objects being passengers, the object ID and location data can be used by school bus operators to provide parents with data about when and where their children entered and exited a school bus. The object ID and location data can also be used to alert drivers when students attempt to get off the bus at some location other than their normal stop. The object ID and location data can be used to provide proof of delivery (or pick up) of parcels, documents, and other objects. Historical object ID and location data for generally invariant routes (such as refuse collection routes and school bus routes) can be used to train new drivers, where historical object ID and location data is loaded onto the vehicle before the route is traversed, and that data is used to alert the driver of what objects (such as refuse containers or students) are associated with specific geographical locations in the route.


In addition to being implemented as a method, the concepts disclosed herein can also be implemented as a nontransitory memory medium storing machine instructions that when executed by a processor implement the method, and by a system for implementing the method. In such a system, the basic elements include a vehicle that is to be operated by a vehicle operator, a position data collection unit (such as a GPS tracking device), an object identification sensor (such as a token reader), a data link (which can be integrated into the GPS unit), and a remote computing device. In general, the remote computing device can be implemented by a computing system employed by an entity operating a fleet of vehicles. Entities that operate vehicle fleets can thus use such computing systems to track and process data relating to their vehicle fleet. It should be recognized that these basic elements can be combined in many different configurations to achieve the exemplary method discussed above. Thus, the details provided herein are intended to be exemplary, and not limiting on the scope of the concepts disclosed herein.


Identification of objects can be accomplished by using a reader to scan a token attached to the object. Exemplary tokens include optical codes (such as bar codes), radio frequency identification (RFID) tags, and magnetic strips. Readers can be handheld devices, or when appropriate can be attached to the vehicle. For example, RFID tag readers could be attached to the vehicle proximate a door used to load or unload the vehicle, to automatically interrogate each RFID tagged item loaded onto or unloaded from the vehicle. Generally, it will be preferable to record both loading and unloading of an object, although the concepts disclosed herein encompass embodiments where data relating to only loading or unloading is collected. Where the object is a person (i.e., a passenger), the person will be issued a token to be carried with them as they enter (or exit) the vehicle. In some cases, it may be desirable to identify a person that interacts with the vehicle even if the person is not a passenger (or is not entering or exiting the vehicle). Such a person might be tasked with delivering something to the vehicle or servicing the vehicle.


With respect to identifying passengers, a reader can be used to read a token (such as a ticket or rider pass) when a person enters or exits a vehicle. Generally it will be preferable to record both entry and exit, although the concepts disclosed herein encompass embodiments where data relating to only entry or exit is determined. In an exemplary but not limiting embodiment, a magnetic card reader is used to scan passengers' cards as they enter or exit a vehicle. A particularly useful application of this type of object ID and position data tracking is to enable school bus operators to collect ridership data about students, tracking where and when students enter and exit a school bus. Such historical data can be used for training purposes whenever a driver is assigned a new route, as the historical data can be used to teach the driver which children get on and off at a particular stop. Once such historical data has been collected, if desired, the data can be used to prevent children from getting off at an incorrect stop (the token reader will automatically check the historical data, and if a child attempts to get off at a stop that is not part of the historical data for that child, an alert can be issued to the driver).
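
One possible form of the wrong-stop check described above is sketched below; the historical_stops mapping (token ID to the set of stops previously recorded for that student) and the alert callback are assumed representations, not a prescribed format.

```python
def check_exit(student_id, current_stop, historical_stops, alert):
    """Issue a driver alert if a student exits at a stop not found in
    that student's historical object ID and location data."""
    known = historical_stops.get(student_id, set())
    if known and current_stop not in known:
        alert(f"Student {student_id} is exiting at unexpected stop {current_stop}")

# usage: check_exit("S1042", "stop-17", {"S1042": {"stop-3", "stop-9"}}, print)
```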


While the above noted method is preferably implemented by a processor (such as a computing device executing machine instructions to implement the specific functions noted above), note that such a method can also be implemented using a custom circuit (such as an application specific integrated circuit).


In addition to object identification data (i.e., data that uniquely identifies the object), many different types of object data can be collected. The following types of additional object data are intended to be exemplary, rather than limiting. Time indexing can be achieved by including a time stamp with the object data as the data is collected by the object identification sensor, or the time stamp can be provided by the position sensing system, generally as discussed above.


A first type of additional object data that can be collected during operation of the vehicle is a weight of the object. An exemplary embodiment of a vehicle collecting object ID and location data that includes weight is a refuse truck. In this embodiment, each refuse container serviced by the vehicle is tagged with a token that is detected by the identification sensor as the contents of the refuse container are loaded into the vehicle. In an exemplary but not limiting embodiment, the loading arms include an identification sensor that reads a token labeling each container as the containers are manipulated by the loading arms. The loading arms are also equipped with weight sensors that determine the weight of the refuse emptied from the container. Thus, the object ID and location data in this embodiment can be used to identify when a container was emptied, where the container was located when it was emptied, and how much refuse was removed. That data is collected automatically, and can be used to provide proof of service, and the weight function may be used for billing purposes if the client is to be billed by weight. Recycling containers can be tracked and weighed in a similar manner. Historical data about containers and position can be used for training purposes whenever a new driver is assigned to an existing route, as the historical data can be used to teach the new driver what containers are usually serviced at a particular location.
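
As a sketch of how the weight data described above might be derived and applied, the following assumes records carrying object_id and weight_kg fields as in the earlier record sketch; the function names and the billing model (a flat rate per kilogram) are illustrative assumptions.

```python
def refuse_weight(full_kg, empty_kg):
    """Weight of refuse emptied from a container, computed from the
    loading-arm weight sensor readings before and after the dump."""
    return max(full_kg - empty_kg, 0.0)

def bill_by_weight(records, container_id, rate_per_kg):
    """Total charge for one container over a set of object ID encoded
    position data records that carry a 'weight_kg' field."""
    total = sum(r["weight_kg"] for r in records
                if r["object_id"] == container_id and r.get("weight_kg"))
    return total * rate_per_kg
```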


A second type of additional object data that can be collected during operation of the vehicle is volume. An exemplary embodiment of a vehicle collecting object ID and location data that includes volume is a liquid fuel or compressed gas delivery truck. In this embodiment, each fuel or gas container serviced by the vehicle is tagged with a token that is detected by the identification sensor as the contents of the truck are offloaded into the container. In an exemplary but not limiting embodiment, the connector used to fluidly couple the vehicle to the container includes an identification sensor that reads a token labeling each container. The identification sensor is coupled to a flow sensor or tank level sensor in the vehicle, which keeps track of how much product is delivered. That volume data, as well as the container identification data, is sent to the vehicle's geographical position sensing system as the container is filled. Thus, the object ID and location data in this embodiment can be used to identify when a container was filled, where the container was located when it was filled, and how much volume of product was delivered by the vehicle. That data is collected automatically, and can be used to provide proof of service, and the volume function may be used for billing purposes if the client is to be billed by volume. It should be noted that such liquid or compressed gas deliveries can also be tracked by weight. Related embodiments utilize data input devices to enable vehicle operators to manually enter container identifications and product weights/volumes into a processor or computing device that combines the weight/volume data and container ID data with the vehicle position data to generate the object ID and location data.
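
The text leaves the flow-sensor interface unspecified; the sketch below assumes the sensor yields periodic (time, flow-rate) samples and estimates the delivered volume by trapezoidal integration, which is one plausible way such a sensor could keep track of how much product is delivered.

```python
def delivered_volume(samples):
    """Estimate volume delivered from periodic flow-sensor samples,
    each a (time_s, flow_l_per_s) pair, via trapezoidal integration."""
    volume = 0.0
    for (t0, f0), (t1, f1) in zip(samples, samples[1:]):
        volume += 0.5 * (f0 + f1) * (t1 - t0)
    return volume

# usage: delivered_volume([(0, 2.0), (10, 2.0), (20, 1.0)])  # -> 35.0 liters
```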


A third type of additional object data that can be collected during operation of the vehicle is object temperature. An exemplary embodiment of a vehicle collecting object ID and location data that includes temperature is a produce delivery truck. In an exemplary but not limiting embodiment, the temperature of each produce container delivered by the vehicle is measured as the container is loaded or unloaded from the vehicle. That temperature data, as well as the container identification data, is sent to the vehicle's geographical position sensing system as the container is loaded or unloaded. Thus, the object ID and location data in this embodiment can be used to identify when a container was loaded and/or unloaded, where the container/vehicle was located when the container was loaded and/or unloaded, and the temperature of the container. That data is collected, and can be used to provide proof of service, and the temperature function may be used for quality assurance purposes if the client asserts that poor product quality was caused by improper temperature conditions in transit. Related embodiments simply measure the temperature of the cargo area of the vehicle, rather than measuring the temperature of each container.
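
A temperature audit of the kind suggested above might look like the following sketch, which assumes records carrying an optional temp_c field as in the earlier record sketch; the allowed range would come from the shipper's requirements.

```python
def temperature_violations(records, min_c, max_c):
    """Return records whose measured container temperature fell outside
    the allowed range, for use in quality-assurance disputes."""
    return [r for r in records
            if r.get("temp_c") is not None and not (min_c <= r["temp_c"] <= max_c)]
```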


This Summary has been provided to introduce a few concepts in a simplified form that are further described in detail below in the Description. However, this Summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.


In addition to the exemplary aspects and embodiments described above, further aspects and embodiments will become apparent by reference to the drawings and by study of the following detailed descriptions.





BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS

Exemplary embodiments are illustrated in referenced drawings. It is intended that the embodiments and figures disclosed herein are to be considered illustrative rather than restrictive.



FIG. 1 is a high level logic diagram showing exemplary overall method steps implemented in accordance with the concepts disclosed herein to collect time indexed object ID encoded position data;



FIG. 2 is an exemplary functional block diagram showing the basic functional components used to implement the method steps of FIG. 1;



FIG. 3 is a flow chart showing method steps implemented in an exemplary embodiment in which time indexed object ID encoded position data is analyzed to determine at least one parameter of an interaction between a uniquely labeled object and a vehicle;



FIG. 4 is a functional block diagram of an exemplary computing device that can be employed to implement some of the method steps disclosed herein;



FIG. 5 is a flow chart showing method steps implemented in an exemplary embodiment in which time indexed object ID encoded position data is used to help an operator of a vehicle manage interactions between the vehicle and objects at specified locations;



FIG. 6 is an exemplary functional block diagram showing the basic functional components used to implement the method steps of FIG. 5;



FIG. 7 schematically illustrates a school bus modified to implement the concepts disclosed herein, to collect and use object ID encoded position data;



FIG. 8 schematically illustrates a delivery truck modified to implement the concepts disclosed herein, to collect and use object ID encoded position data;



FIG. 9 schematically illustrates a refuse truck modified to implement the concepts disclosed herein, to collect and use object ID encoded position data;



FIG. 10 schematically illustrates a fuel truck modified to implement the concepts disclosed herein, to collect and use object ID encoded position data; and



FIG. 11 is a functional block diagram showing the basic functional components used to implement a handheld identification sensor, which can be used by a vehicle operator to collect object identification data.





DETAILED DESCRIPTION

For the purpose of promoting an understanding of the principles of the disclosure, reference will now be made to the embodiments illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. Any alterations and further modifications in the described embodiments, and any further applications of the principles of the disclosure as described herein are contemplated as would normally occur to one skilled in the art to which the disclosure relates.



FIG. 1 is a high level flow chart showing the overall method steps implemented in accordance with one aspect of the concepts disclosed herein, to collect object ID and location data (otherwise referred to herein as object ID encoded position data). In a block 10, a vehicle is equipped with geographical position sensors (such as a GPS unit), so that geographical position data can be collected when the vehicle is being operated. In a block 12, the vehicle is equipped with an object identification sensor capable of uniquely identifying labeled objects that interact with the vehicle during operation of the vehicle. In general, the object identification sensor detects or reads a token attached to the object (or carried by the object, where the object is a person). Exemplary, but not limiting, object identification sensors include bar code readers, optical code readers reading optical codes other than simple bar codes, RFID tag readers, and magnetically encoded data readers. Other technologies that enable tokens or unique object labels to be identified are encompassed by the concepts disclosed herein. The object identification sensor can be integrated into the vehicle, integrated into a portion of the vehicle that interacts with the object, or provided as a hand held unit to be operated by the vehicle operator, as well as any combinations thereof. In a block 14, objects that interact with the vehicle are labeled with a token, such as a bar code (or some other type of optical code), an RFID tag, or a magnetically encoded token (such as a magnetic strip used in credit cards). Preferably, the tokens are both unique and relatively inexpensive, such that large numbers of objects can be labeled, without undue expense. The objects can include objects that will be loaded onto or unloaded from the vehicle, such as containers (including but not limited to parcels, packages, boxes, barrels, and drums), pallets, mail, letters, documents, and people (who will generally carry a token). The objects can also include objects disposed at locations the vehicle will visit, such as recycling containers, refuse containers, and bulk material storage containers (including but not limited to fuel storage tanks and compressed gas storage tanks).


In a block 16, location data (such as GPS data, recognizing that other location tracking systems are known, and the term GPS is intended to be exemplary of a position tracking system, and not limiting) is collected while the vehicle is in operation. The location data is time indexed, meaning that the location data being collected is the location of the vehicle at a particular point in time. While the vehicle is in operation, and when the object identification sensor detects a labeled object (as indicated by a decision block 18), the object ID data is added to the time indexed GPS data, as indicated by a block 20. In some embodiments, the object identification sensor is always enabled, and detection of labeled objects occurs automatically when the labeled object and the object identification sensor are proximate (or in the case of a magnetic card reader type sensor, when the card is swiped through the reader). In other embodiments, such as with a hand held object identification sensor, the object identification sensor must be enabled by the vehicle operator, and detection of labeled objects occurs when the vehicle operator brings the labeled object and the object identification sensor into proximity of one another.



FIG. 2 is a schematic block diagram of exemplary functional components that can be employed to implement the method steps of FIG. 1. The components include a GPS unit 22, a transmitter 24 (which may also have a corresponding receiver, not shown, or may be some other data link), an object ID sensor 26, and a remote computing device 28 (generally as described above). It should be recognized that many GPS units are available that already incorporate a transmitter, such that a separate transmitter may not be required. It should be understood that the concepts disclosed herein can be used with other types of geographical position sensors/systems, and the use of the term GPS is intended to be exemplary, rather than limiting. It should be understood that GPS unit 22 includes a processor that can accept object ID data from object ID sensor 26, and combine the object ID data with the GPS data, to generate the object ID encoded position data. While not specifically shown, it should be understood that a separate processor (i.e., a processor separate from the GPS unit) can be used to combine the object ID data with the GPS data to generate the object ID encoded position data before the object ID encoded position data is transmitted to the remote computing device with transmitter/data link 24.


Referring once again to FIG. 2, note that power components have not been specifically shown, although it should be understood that such components will be utilized to provide electrical power to the GPS, ID sensor, data link, and remote computer.



FIG. 3 is a high level flow chart showing the overall method steps implemented in accordance with another exemplary embodiment for using object ID encoded position data collected during operation of a vehicle equipped with an object ID sensor and a GPS sensor. In a block 30, the object ID encoded position data is collected, generally as discussed above in connection with FIG. 1. In a block 32, the object ID encoded position data is transferred from the vehicle to a remote computing device via a data link (such as a hard wired data link, a wireless data link, or a portable memory media). As generally discussed above, other object data (such as weight, volume and/or temperature) can also be added to the object ID encoded position data. In a block 34, the remote computing device (or some other computing device that the object ID encoded position data is transferred to, or some other computing device that is provided access to the object ID encoded position data) is used to determine at least one characteristic of an interaction between a labeled object and the vehicle. One such characteristic that can be determined is to identify at what time the vehicle and a specific labeled object interacted, by searching the object ID encoded position data as a function of the specific object. Another such characteristic that can be determined is to identify at what location the vehicle and a specific labeled object interacted, by searching the object ID encoded position data as a function of the specific object. Yet another such characteristic that can be determined is to identify any labeled object that interacted with the vehicle at a specific location, by searching the object ID encoded position data as a function of the specific location. The object ID encoded position data includes time, location, and object identity as minimum elements, and the artisan of ordinary skill will recognize that many different analyses of the object ID encoded position data can be performed by defining one or more of those minimum elements as fixed or variable search parameters. As noted above, in some embodiments the object ID encoded position data will also include additional object data (exemplary types of additional object data include weight, volume, and temperature), and where the object ID encoded position data includes such additional object data, additional search queries of the object ID encoded position data are possible. For example, a billing function could be implemented where the weight or volume associated with a specific object is retrieved from the object ID encoded position data and used to bill a client.
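
The searches described above, with one or more of time, location, and object identity fixed, could be expressed as simple queries over the collected records; in the sketch below the record layout (dicts with object_id, time, lat, lon keys) and the coordinate tolerance are assumptions.

```python
def interactions_for_object(records, object_id):
    """When and where did the vehicle interact with a specific object?"""
    return [(r["time"], r["lat"], r["lon"])
            for r in records if r["object_id"] == object_id]

def objects_at_location(records, lat, lon, tol=1e-4):
    """Which labeled objects interacted with the vehicle at a location?"""
    return [r["object_id"] for r in records
            if abs(r["lat"] - lat) <= tol and abs(r["lon"] - lon) <= tol]
```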


In general, analysis of the object ID encoded position data will be carried out by a remote computing device. The remote computing device in at least one embodiment comprises a computing system controlled or accessed by the fleet operator. The remote computing device can be operating in a networked environment, and in some cases, may be operated by a third party under contract with the fleet operator to perform such services. FIG. 4 schematically illustrates an exemplary computing system 250 suitable for use in implementing the method of FIG. 3 (i.e., for executing block 34 of FIG. 3). Exemplary computing system 250 includes a processing unit 254 that is functionally coupled to an input device 252 and to an output device 262, e.g., a display (which can be used to output a result to a user, although such a result can also be stored). Processing unit 254 comprises, for example, a central processing unit (CPU) 258 that executes machine instructions for carrying out an analysis of the object ID encoded position data, generally as discussed above. The machine instructions implement functions generally consistent with those described above with respect to block 34 of FIG. 3. CPUs suitable for this purpose are available, for example, from Intel Corporation, AMD Corporation, Motorola Corporation, and other sources, as will be well known to those of ordinary skill in this art.


Also included in processing unit 254 are a random access memory (RAM) 256 and non-volatile memory 260, which can include read only memory (ROM) and may include some form of memory storage, such as a hard drive, optical disk (and drive), etc. These memory devices are bi-directionally coupled to CPU 258. Such storage devices are well known in the art. Machine instructions and data are temporarily loaded into RAM 256 from non-volatile memory 260. Also stored in the non-volatile memory are operating system software and ancillary software. While not separately shown, it will be understood that a generally conventional power supply will be included to provide electrical power at voltage and current levels appropriate to energize computing system 250.


Input device 252 can be any device or mechanism that facilitates user input into the operating environment, including, but not limited to, one or more of a mouse or other pointing device, a keyboard, a microphone, a modem, or other input device. In general, the input device will be used to initially configure computing system 250, to achieve the desired processing (i.e., analysis of the object ID encoded position data). Configuration of computing system 250 to achieve the desired processing includes the steps of loading appropriate processing software into non-volatile memory 260, and launching the processing application (e.g., loading the processing software into RAM 256 for execution by the CPU) so that the processing application is ready for use. Output device 262 generally includes any device that produces output information, but will most typically comprise a monitor or computer display designed for human visual perception of output. Use of a conventional computer keyboard for input device 252 and a computer display for output device 262 should be considered as exemplary, rather than as limiting on the scope of this system. Data link 264 is configured to enable object ID encoded position data to be input into computing system 250 for subsequent analysis. Those of ordinary skill in the art will readily recognize that many types of data links can be implemented, including, but not limited to, universal serial bus (USB) ports, parallel ports, serial ports, inputs configured to couple with portable memory storage devices, FireWire ports, infrared data ports, wireless data communication such as Wi-Fi and Bluetooth™, network connections via Ethernet ports, and other connections that employ the Internet.


It should be recognized that processors can be implemented as general purpose processors, where the functions implemented by the processor are changeable or customizable using machine instructions (i.e., software). Processors can also be implemented as customized hardware circuits, where the functions implemented are fixed by the design of the circuit (such processors are sometimes referred to as application specific integrated circuits). The flexibility of software controlled processors often results in software based processors being selected over hardware based processors, although it should be understood that the concepts disclosed herein can be implemented using both software based processors and hardware based processors.



FIG. 5 is a high level logic diagram showing exemplary overall method steps implemented in accordance with the concepts disclosed herein, and summarized in the Brief Summary above, to utilize object ID encoded position data to facilitate planned interactions between a vehicle and objects at specific geographical positions. Such a technique can be used to enhance such interactions, as well as to train new operators to understand vehicle/object interactions over predefined routes (such as school bus routes, refuse collection routes, and product delivery routes; such routes being exemplary, and not limiting). In a block 36, object ID encoded position data is provided, the object ID encoded position data correlating a specific object with a specific location (and if desired, to a specific time, although time is not a required element in this embodiment). The object ID encoded position data that is provided can be data collected by the vehicle generally as described above in connection with FIG. 1, or can be generated by combining predefined object ID data and position data together (for example, a dispatcher managing a school bus or delivery route could compile the data and ensure the data is provided to the vehicle). As discussed in greater detail below, the provided object ID encoded position data is stored in a memory accessible by the vehicle (or a processor associated with the vehicle) during operation of the vehicle. In a block 38, the current location of the vehicle is monitored (using a GPS unit or equivalent device). In a block 40, the current position of the vehicle is compared to the provided object ID encoded position data, and an indication (such as a display or audible alert, noting that such indications are exemplary, rather than limiting) is provided to the operator of the vehicle whenever the object ID encoded position data indicates that an interaction with a specific object is to occur at the current location of the vehicle. The artisan of ordinary skill will recognize that the indication can be provided as soon as the vehicle approaches an interaction location specified in the object ID encoded position data, to provide the operator of the vehicle reasonable advance notice. The indication will minimally identify the specific object that will interact with the vehicle at the specified location, and may include additional details as necessary to facilitate the interaction. For example, if the interaction is the delivery of a bulk material to a storage tank, the storage tank being the specified object, instructions as to a quantity of bulk material to deliver, or detailed instructions regarding material transfer or accessing the storage tank can be provided.
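
One plausible implementation of the approach-notice logic in block 40 is sketched below: the vehicle's current position is compared against stored interaction locations using great-circle distance, with an assumed alert radius; the planned-interaction record layout and the notify callback are likewise assumptions.

```python
from math import radians, sin, cos, asin, sqrt

ALERT_RADIUS_M = 150.0   # assumed advance-notice distance

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371000.0 * asin(sqrt(a))

def check_upcoming_interactions(lat, lon, planned, notify):
    """Alert the operator for every planned interaction whose stored
    location is within the alert radius of the vehicle's current position."""
    for p in planned:   # p: {"object_id", "lat", "lon", "details"}
        if haversine_m(lat, lon, p["lat"], p["lon"]) <= ALERT_RADIUS_M:
            notify(f"Approaching {p['object_id']}: {p.get('details', '')}")
```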



FIG. 6 is a functional block diagram of exemplary functional components included in a vehicle employed to implement the method steps of FIG. 5. A vehicle implementing the method includes a GPS unit 42 (which, in at least some embodiments, includes a transmitter so that object ID encoded position data collected by the vehicle during its present operational state can be generated and conveyed to a remote computing device, generally as described in connection with FIG. 1, although it should be recognized that a GPS unit without a transmitter can be coupled with a transmitter or other data link to achieve similar functionality; as well as recognizing that the vehicle could be configured to only use object ID encoded position data stored in a memory 48 to facilitate interactions with objects, as opposed to collecting object ID encoded position data during the current operation cycle of the vehicle as well). GPS unit 42 is coupled to processor 44 (noting that processor 44 may be part of the GPS unit itself, as opposed to a separate device). Processor 44 is also logically coupled to memory 48 (in which object ID encoded position data defining specific locations where interactions with specific objects are expected are stored), as well as a display 46 (or other output device, such as a speaker) used to alert the vehicle operator that the vehicle is approaching or has reached a geographical position where the object ID encoded position data stored in memory 48 indicates an interaction between the vehicle and a specific labeled object is to occur.


As discussed above, the expected interaction can encompass different interactions between the vehicle and a labeled object, including but not limited to picking up a passenger (where the passenger is the labeled object, or rather carries with them a token that can be read by the identification sensor and thereby uniquely identifies them), dropping off a passenger (where the passenger is the labeled object, or rather carries with them a token that can be read by the identification sensor and thereby uniquely identifies them), picking up an object (such as a parcel, package, container, letter, or document), delivering an object (such as a parcel, package, container, letter, or document), and servicing an object (such as a container or piece of equipment) disposed at the specified location. In particular, servicing an object includes, but is not limited to, removing refuse from a labeled container, removing recyclables from a labeled container, removing refuse from a container at a location that is labeled (i.e., the token is attached to a location where the container is disposed, as opposed to being attached to the container itself), removing recyclables from a container at a location that is labeled (i.e., the token is attached to a location where the container is disposed, as opposed to being attached to the container itself), transferring a bulk material (such as a solid material, a liquid, or a compressed gas) to a labeled container, transferring a bulk material (such as a solid material, a liquid, or a compressed gas) to a container at a location that is labeled (i.e., the token is attached to a location where the container is disposed, as opposed to being attached to the container itself), transferring a bulk solid material to a location that is labeled (i.e., the token is attached to a location, and there is no container, the bulk solid material simply being delivered to the location), and having the vehicle operator perform a service call on a piece of equipment or a structure at the specified location, where either or both the object being serviced or the location is labeled with a token. Those of ordinary skill in the art will recognize that the servicing of structures and/or equipment encompasses services performed by skilled tradesmen, including, but not limited to, plumbers, electricians, carpenters, technicians specializing in servicing specific types of equipment (including but not limited to computers, heating and ventilation equipment, construction equipment, and vehicles), and technicians responsible for other types of repair and maintenance functions.


In an exemplary, but not limiting embodiment, display 46 is used to inform the vehicle operator that the vehicle is approaching or has arrived at a location where an interaction between the vehicle and a labeled object (or labeled location, as noted above) is expected. The display will minimally identify the object, and in some embodiments can be used to provide more detailed information about the interaction. For example, where the interaction is a service call, details about the specific service required may be provided (i.e., replace a faulty component in a piece of equipment, or perform a specific type of scheduled maintenance on a piece of equipment, such services being exemplary and not limiting).


A dashed block 50 around GPS 42, processor 44, and display 46 is intended to indicate that in some embodiments, those three elements will be combined into a single device. It should be recognized that the concepts disclosed herein encompass the use of individual devices to implement each of GPS 42, processor 44, and display 46, as well as embodiments where the functions of one or more of GPS 42, processor 44, and display 46 (and memory 48) are implemented by a common device.


Referring once again to FIG. 6, data link and power components have not been specifically shown, although it should be understood that such components will be utilized to provide electrical power to the GPS, processor, display, and memory, and some type of data link will be used to load the previously generated object ID encoded position data into the memory.



FIG. 7 schematically illustrates a school bus modified to implement the concepts disclosed herein, to collect and/or use object ID encoded position data, where the object interacting with the vehicle is a student (i.e., a passenger) carrying a token that can be detected by the object identification sensor. Where the bus is configured to collect object ID encoded position data during operation of the bus, then the bus will include the functional elements discussed in connection with FIG. 2 (except the remote computing device, which of course is remote from the bus). Where the bus is configured to use previously generated object ID encoded position data to facilitate transportation of students (such as training a new driver to understand which students get on and off at what stop), generally as discussed above in connection with FIG. 5, then the bus will include the functional elements discussed in connection with FIG. 6. It should be recognized that the concepts disclosed herein encompass buses that perform both the methods discussed above in connection with FIGS. 1 and 5, and such buses will include the functional components of both FIGS. 2 and 6 (again, except for the remote computing device).


Referring to FIG. 7, as shown, bus 52 is configured to implement the method of FIG. 1 (i.e., to collect object ID encoded position data), and thus bus 52 includes a GPS unit 54 and an object ID sensor 56. Not specifically shown are the data link and processor elements of FIG. 2, which, as discussed above, can be implemented by a GPS unit including such elements. As shown, ID sensor 56 is disposed proximate a door 58, so that ID sensor 56 can detect tokens carried by students as they board and exit the bus. Exemplary (but not limiting) tokens include RFID tags (which can be read automatically) and ID cards including a magnetic strip or optical data (which require the child to swipe the ID card through a reader as they enter the bus). If the bus is equipped with other doors that are used to board or disembark students, another ID sensor can be positioned at the other door. In an exemplary embodiment, once the boarding or disembarkation of students has occurred, the object ID encoded position data can be conveyed to the remote computing device of FIG. 2, and the remote computing device can make the boarding/disembarkation information available to school administrators or parents, either through a website accessible to the administrator/parent, or by sending the information in an email, a text message, a voice message, or an instant message. In an exemplary (but not limiting) embodiment, administrators (that is, a first class of users) would have access to boarding/disembarkation data about all students, whereas parents (that is, a second class of users) would only be able to access such data about their child (that is, a subset of passengers). Such boarding/disembarkation data would be generated from the object ID encoded position data collected at the vehicle, and would define the location and time a specific student boarded and/or exited the bus.
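
The class-based access described above might be enforced with a filter of the following kind; the event and user representations (role and children fields) are assumptions introduced for illustration.

```python
def visible_events(events, user):
    """Filter boarding/disembarkation events by user class: an
    administrator sees every student, a parent only their own children."""
    if user["role"] == "administrator":
        return list(events)
    return [e for e in events if e["student_id"] in user["children"]]
```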


As noted above, the concepts disclosed herein also encompass bus 52 being configured to implement the method of FIG. 5 (i.e., to use previously generated object ID encoded position data to help the bus driver recognize what students should be boarding/exiting the bus at which stops). In such an embodiment, bus 52 will require GPS unit 54 to track the current position of the bus, so a GPS processor or other processor can compare the current position of the bus with the previously generated object ID encoded position data, stored in a memory 48 as shown in FIG. 6. Bus 52 will also need to include the display/output device of FIG. 6, to provide a mechanism to inform the driver which students are associated with a specific bus stop.


If the bus configured to implement the method of FIG. 5 (using previously generated object ID encoded position data to help the bus driver recognize what students should be boarding/exiting the bus at which stops) is not intended to also implement the method of FIG. 1 (collecting object ID encoded position data while the bus is being currently operated), then the ID sensor and data link to the remote computing device shown in FIG. 2 are not required. Customers employing this technology will likely desire the flexibility of being able to perform both the method of FIG. 1 (collecting object ID encoded position data while the bus is being currently operated) and the method of FIG. 5 (using previously generated object ID encoded position data to help the bus driver recognize what students should be boarding/exiting the bus at which stops), and such buses will need to employ the components of both FIGS. 2 and 6 (i.e., the GPS unit, the ID sensor, the processor to combine the GPS data with the object ID data to generate the object ID encoded position data (which may be part of the GPS unit), the data link to convey the object ID encoded position data to the remote computing device, the memory storing the previously generated object ID encoded position data used to alert the driver which students are associated with which bus stops, the processor to monitor the current position of the bus and produce an indication/alert when the current position of the bus corresponds to a location correlated to one of the students (note the same processor combining the current GPS data with the object ID data can be used, or a separate processor can be used), and the display (or other output) used to alert the driver that the bus is at or approaching a location at which a particular student will get on or off).



FIG. 8 schematically illustrates a delivery truck modified to implement the concepts disclosed herein, to collect and/or use object ID encoded position data, where the object interacting with the vehicle is some type of cargo (including but not limited to a package, a document, an item of mail, a product, and a piece of equipment) including a token that can be detected by the object identification sensor. Once again, the delivery truck can be configured to implement one or both of the method of FIG. 1 (collecting object ID encoded position data while the delivery vehicle is delivering or picking up cargo) and the method of FIG. 5 (where the operator of the vehicle is using previously generated object ID encoded position data to help the delivery driver to deliver or pick up labeled cargo at specified locations). Where the delivery truck is configured to implement the method of FIG. 1, then the delivery truck will require the functional elements discussed in connection with FIG. 2 (except the remote computing device, which of course is remote from the delivery vehicle). Where the delivery vehicle is configured to use previously generated object ID encoded position data to facilitate delivery or pick up of cargo at specific locations (generally as discussed above in connection with FIG. 5), then the delivery vehicle will include the functional elements discussed in connection with FIG. 6. It should be recognized that the concepts disclosed herein encompass delivery vehicles that perform both the methods discussed above in connection with FIGS. 1 and 5, and such delivery vehicles will include the functional components of both FIGS. 2 and 6 (again, except for the remote computing device).


Referring to FIG. 8, as shown, delivery truck 60 is configured to implement the method of FIG. 1 (i.e., to collect object ID encoded position data about cargo being delivered or picked up), and thus delivery truck 60 includes a GPS unit 62 and one or more object ID sensors 64. Not specifically shown are the data link and processor elements of FIG. 2, which as discussed above can be implemented by a GPS unit including such elements. As shown, ID sensors 64 are disposed proximate a side door 66 and a rear door 68, so that ID sensors 64 can detect tokens attached to cargo 70 being picked up or delivered. As discussed above, many types of token/sensor combinations can be employed. In at least one embodiment, tokens are RFID tags that can automatically be read as the cargo passes through one of doors 66 and 68 (noting that some delivery vehicles have more or fewer doors, and the specific location of the door(s) can vary). This automatic sensing function should reduce the time required for loading and unloading, by eliminating any manual involvement in the object ID sensing function. Optical codes can also be employed, but the person loading/unloading the cargo would need to ensure the optical code can be scanned by the ID sensor (much in the way a grocery checker must ensure that products are read by the bar code scanner at checkout). As will be discussed below, the concepts disclosed herein also encompass the use of a handheld ID sensor 72, which though functional requires more effort on the part of the cargo handler.


If desired, a temperature sensor 69 can be included in the cargo area of the delivery truck, to measure the ambient temperature of the cargo area. The temperature measurement represents additional object data that will be combined with the object ID and the time and GPS data to generate the object ID encoded position data. The temperature sensor, if present, is configured to communicate its data to the GPS unit, or the processor responsible for combining the object ID data, the temperature data, and the GPS data together to generate the time indexed object ID encoded position data. The temperature data may be important for temperature sensitive cargo, and collecting such data and combining it with the object ID encoded position data will enable the delivery service to prove to the shipper that the cargo was maintained in the correct temperature controlled environment during transit. In a related embodiment, the temperature sensor can be incorporated into the object, and the temperature data can be manually entered into the GPS unit/processor during delivery, or acquired using a hand held sensor that logically communicates that data to the GPS unit/processor for incorporation into the object ID encoded position data.


In an exemplary embodiment, once the loading or unloading of cargo has occurred, the object ID encoded position data can be conveyed to the remote computing device of FIG. 2, and the remote computing device can make the loading/unloading information available to one or more of the delivery service, the cargo shipper, and the cargo recipient, either through a website accessible to the parties, or by sending the information in an email, a text message, a voice message, or an instant message. In an exemplary (but not limiting) embodiment, the delivery service would have access to pick up/delivery data for all cargo, whereas shippers/recipients would only be able to access such data about their cargo. Such pick up/delivery data would be generated from the object ID encoded position data collected at the vehicle, and would define the location and time an item of cargo was loaded or unloaded from the delivery vehicle. This data can be used to assure shippers/recipients that their cargo was picked up/delivered, and may be used by the delivery service to bill their clients.


As noted above, the concepts disclosed herein also encompass delivery truck 60 being configured to implement the method of FIG. 5 (i.e., to use previously generated object ID encoded position data to help the delivery driver recognize what cargo should be loaded/unloaded from the delivery vehicle at which locations). In such an embodiment, delivery truck 60 will require GPS unit 62 (to track the current position of the vehicle, so a GPS processor or other processor can compare the current position of the vehicle with the previously generated object ID encoded position data, stored in a memory 48 as shown in FIG. 6). Delivery truck 60 will also need to include the display/output device of FIG. 6, to provide a mechanism to inform the driver which cargo is associated with a specific delivery or pick up location.


If delivery truck 60 is configured to implement the method of FIG. 5 (using previously generated object ID encoded position data to help the driver recognize what cargo is loaded/unloaded at what location) and is not intended to also implement the method of FIG. 1 (collecting object ID encoded position data while the delivery vehicle is being currently operated), then the ID sensor and data link to the remote computing device shown in FIG. 2 are not required. Customers employing this technology will likely desire the flexibility of being able to perform both the method of FIG. 1 (collecting object ID encoded position data while the delivery truck is being currently operated) and the method of FIG. 5 (using previously generated object ID encoded position data to help the delivery driver recognize what cargo should be loaded/unloaded from the vehicle at what location), and such delivery vehicles will need to employ the components of both FIGS. 2 and 6 (i.e., the GPS unit, the ID sensor, the processor to combine the GPS data with the object ID data to generate the object ID encoded position data (which may be part of the GPS unit), the data link to convey the object ID encoded position data to the remote computing device, the memory storing the previously generated object ID encoded position data used to alert the driver which cargo is associated with which location, the processor to monitor the current position of the delivery vehicle and produce an indication/alert when the current position of the delivery vehicle corresponds to a location correlated to an item of cargo (note the same processor combining the current GPS data with the object ID data can be used, or a separate processor can be used), and the display (or other output) used to alert the driver that the delivery vehicle is at or approaching a location at which an item of cargo will be delivered or collected).


As noted above, when the method of FIG. 1 is being implemented in the context of cargo shipping, instead of equipping the vehicle with ID sensors 64 proximate vehicle doors 66 or 68 to automatically collect object ID data from the cargo, handheld ID sensor 72 can be used manually by a cargo handler (such as the driver) when loading or unloading the cargo. The handheld sensor must at some point be logically coupled with GPS unit 62 so that the object ID encoded position data can be generated; this can be achieved using a physical connection or a wireless data link. This embodiment may be less expensive (providing a handheld unit may be more cost effective than adding ID sensors to the doors), but it reduces efficiency by requiring the cargo handler to perform an additional function.
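

If the handheld unit only couples to GPS unit 62 after the fact, one simple way to generate the object ID encoded position data is to pair each scan with the nearest-in-time fix from the vehicle's GPS log. The sketch below is illustrative only and assumes both devices keep clocks that agree to within a few seconds.

    def pair_scans_with_track(scans, gps_track):
        """Pair each handheld scan with the nearest-in-time vehicle GPS fix.

        scans:     list of (object_id, scan_time) tuples from the handheld unit
        gps_track: list of (fix_time, lat, lon) tuples logged by the vehicle GPS
        """
        paired = []
        for object_id, scan_time in scans:
            fix = min(gps_track, key=lambda f: abs(f[0] - scan_time))
            paired.append({"object_id": object_id, "time": scan_time,
                           "lat": fix[1], "lon": fix[2]})
        return paired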



FIG. 9 schematically illustrates a refuse truck (or recycling truck) modified to implement the concepts disclosed herein, to collect and/or use object ID encoded position data, where the object interacting with the vehicle is a refuse or recycling container whose contents are transferred from the container to the refuse truck. In at least one embodiment, the token read by the ID sensor is attached to the container itself, although it should be understood that the concepts disclosed herein encompass embodiments in which the token detected by the ID sensor is attached to some other physical object or structure at the location where the container is stored. Once again, the refuse truck can be configured to implement one or both of the method of FIG. 1 (collecting object ID encoded position data while the refuse truck is collecting refuse or recyclables) and the method of FIG. 5 (where the operator of the vehicle uses previously generated object ID encoded position data to help the refuse truck driver recognize which containers at which locations need to be emptied).


Where the refuse truck is configured to implement the method of FIG. 1, the refuse truck will require the functional elements discussed in connection with FIG. 2 (except the remote computing device, which of course is remote from the refuse truck). Where the refuse truck is configured to use previously generated object ID encoded position data to facilitate collection of refuse/recyclables from specific containers at specific locations (generally as discussed above in connection with FIG. 5), the refuse truck will include the functional elements discussed in connection with FIG. 6. It should be recognized that the concepts disclosed herein encompass refuse trucks that perform both of the methods discussed above in connection with FIGS. 1 and 5, and such refuse trucks will include the functional components of both FIGS. 2 and 6 (again, except for the remote computing device).


Referring to FIG. 9, as shown, refuse truck 74 is configured to implement the method of FIG. 1 (i.e., to collect object ID encoded position data about containers from which recyclables or refuse are collected), and thus refuse truck 74 includes a GPS unit 76 and object ID sensors 78, which as shown are disposed on a container manipulator 86 (which lifts and rotates a container 82, such that the refuse falls into a cargo area 88). Note that the container manipulator need not be mounted on the front of the vehicle, as other container manipulator positions (such as at the sides or rear of the vehicle) are known. Not specifically shown are the data link and processor elements of FIG. 2, which as discussed above can be implemented by a GPS unit including such elements. The position of ID sensors 78 is such that the ID sensors can detect a token 80 that uniquely identifies container 82 automatically as container manipulator 86 engages the container. As discussed above, many types of token/sensor combinations can be employed. In at least one embodiment, the tokens are RFID tags that can be read automatically as container manipulator 86 engages the container. This automatic sensing function should reduce the time required for identifying the object, by eliminating any manual involvement in the object ID sensing function. Optical codes can also be employed, but such optical codes can become obscured by dirt and grime, and may be less suitable for this application. As noted above, other embodiments encompassed by the concepts herein will place the token on a structure or object near the container, rather than on the container itself, and in such embodiments the ID sensor may be positioned differently. The concepts disclosed herein also encompass embodiments in which the vehicle operator uses a handheld ID sensor to read a token, which, though functional, requires more effort on the part of the operator. If desired, a weight sensor 84 can be included on container manipulator 86, to measure the full weight and emptied weight of the container, enabling the weight of the refuse unloaded from the container to be measured (note that such a weight sensor could also be included in cargo area 88, to measure the weight of the material in the cargo area before and after a container is emptied). The weight measurement represents additional object data that will be combined with the object ID and the time and GPS data, to generate the object ID encoded position data.
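

Because weight sensor 84 reads the container both full and emptied, the net refuse weight is simply the difference between the two readings. The following minimal sketch (Python, with illustrative field names that are assumptions rather than part of the disclosure) shows how that difference could be folded into the object ID encoded position data record.

    def refuse_weight_record(container_id, gps_fix, full_kg, empty_kg, pickup_time):
        """Build an object ID encoded position record including net refuse weight."""
        if full_kg < empty_kg:
            raise ValueError("full weight cannot be less than emptied weight")
        return {
            "object_id": container_id,
            "lat": gps_fix[0],
            "lon": gps_fix[1],
            "time": pickup_time,
            "net_weight_kg": full_kg - empty_kg,  # refuse transferred to the truck
        }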


In an exemplary embodiment, once the container has been emptied, the object ID encoded position data can be conveyed to the remote computing device of FIG. 2, and the remote computing device can make the container emptying information available to one or more of the refuse removal service and the client, either through a website accessible to the parties, or by sending the information in an email, a text message, a voice message, or an instant message. In an exemplary (but not limiting) embodiment, the refuse removal service would have access to pickup data for all containers, whereas clients would only be able to access such data about their own containers. Such pickup data would be generated from the object ID encoded position data collected at the vehicle, and would define the location and time at which a container was emptied, and the weight of the material removed, if the weight data was collected. This data can be used to assure clients that their refuse was picked up, and may be used by the refuse removal service to bill its clients.


As noted above, the concepts disclosed herein also encompass refuse truck 74 being configured to implement the method of FIG. 5 (i.e., to use previously generated object ID encoded position data to help the driver recognize what containers should be emptied at which locations). In such an embodiment, refuse truck 74 will require GPS unit 76 (to track the current position of the vehicle, so a GPS processor or other processor can compare the current position of the vehicle with the previously generated object ID encoded position data, stored in a memory 48 as shown in FIG. 6). Refuse truck 74 will also need to include the display/output device of FIG. 6, to provide a mechanism to inform the driver what containers are associated with a specific service location.


If refuse truck 74 is configured to implement the method of FIG. 5 (using previously generated object ID encoded position data to help the driver recognize what containers are emptied at what location) and is not intended to also implement the method of FIG. 1 (collecting object ID encoded position data while the refuse truck is currently being operated), then the ID sensor and data link to the remote computing device shown in FIG. 2 are not required. Customers employing this technology will likely desire the flexibility of being able to perform both the method of FIG. 1 and the method of FIG. 5, and such refuse trucks will need to employ the components of both FIGS. 2 and 6: the GPS unit; the ID sensor; the processor to combine the GPS data with the object ID data to generate the object ID encoded position data (which may be part of the GPS unit); the data link to convey the object ID encoded position data to the remote computing device; the memory storing the previously generated object ID encoded position data used to alert the driver about which containers are associated with which locations; the processor to monitor the current position of the vehicle and produce an indication/alert when the current position of the refuse truck corresponds to a location correlated to a labeled container (the same processor that combines the current GPS data with the object ID data can be used, or a separate processor can be used); and the display (or other output) used to alert the driver that the refuse truck is at or approaching a location at which a container will be emptied.



FIG. 10 schematically illustrates a tanker truck modified to implement the concepts disclosed herein, to collect and/or use object ID encoded position data, where the object interacting with the vehicle is a bulk material storage container (such as a home heating oil storage tank, a propane gas fuel storage tank, or a compressed gas storage tank, understanding that such types of storage tanks are intended to be exemplary, rather than limiting), and where a product or bulk material is transferred from the tanker truck to the storage tank. In at least one embodiment, the token read by the ID sensor is attached to the storage tank itself, although it should be understood that the concepts disclosed herein encompass embodiments in which the token detected by the ID sensor is attached to some other physical object or structure at the location where the storage tank is situated. Once again, the tanker truck can be configured to implement one or both of the method of FIG. 1 (collecting object ID encoded position data while the tanker truck is out delivering bulk materials to storage tanks) and the method of FIG. 5 (where the operator of the tanker truck uses previously generated object ID encoded position data to help the tanker truck driver recognize which storage tanks at which locations need to be filled).


Where the tanker truck is configured to implement the method of FIG. 1, the tanker truck will require the functional elements discussed in connection with FIG. 2 (except the remote computing device, which of course is remote from the tanker truck). Where the tanker truck is configured to use previously generated object ID encoded position data to facilitate delivery of a bulk material to specific storage tanks at specific locations (generally as discussed above in connection with FIG. 5), the tanker truck will include the functional elements discussed in connection with FIG. 6. It should be recognized that the concepts disclosed herein encompass tanker trucks that perform both of the methods discussed above in connection with FIGS. 1 and 5, and such tanker trucks will include the functional components of both FIGS. 2 and 6 (again, except for the remote computing device).


Referring to FIG. 10, as shown, tanker truck 90 is configured to implement the method of FIG. 1 (i.e., to collect object ID encoded position data about storage tanks to which the tanker truck is delivering a bulk material), and thus tanker truck 90 includes a GPS unit 92 and an object ID sensor 94, which as shown is disposed on a distal end of a product delivery hose 96 (which is used to transfer the bulk material from the tanker truck to a storage tank 100, which is labeled with a token 98). Note that, as shown, the ID sensor (disposed on the distal end of the delivery hose) and the token on the storage tank (disposed proximate an inlet 102 to the storage tank) will be in close proximity to each other when the delivery hose is coupled to the tank inlet. Thus, the position of ID sensor 94 is such that the ID sensor can detect token 98 (which uniquely identifies storage tank 100) automatically as the delivery hose engages the tank inlet. As discussed above, many types of token/sensor combinations can be employed. In at least one embodiment, the tokens are RFID tags that can be read automatically as the delivery hose engages the tank inlet. Because the bulk material may be flammable, care needs to be taken that the ID sensor/token interaction does not generate a spark or static electricity that could act as an ignition source; for bulk transfer of flammable material, grounding is routinely performed to minimize such risks. This automatic sensing function should reduce the time required for identifying the object, by eliminating any manual involvement in the object ID sensing function. Optical codes can also be employed, but such optical codes can become obscured by dirt and grime, and may be less suitable for this application. As noted above, other embodiments encompassed by the concepts herein will place the token on a structure or object near the storage tank, rather than on the tank itself, and in such embodiments the ID sensor may be positioned differently. The concepts disclosed herein also encompass embodiments in which the tanker truck operator uses a handheld ID sensor to read a token, which, though functional, requires more effort on the part of the operator.


If desired, a volume delivery sensor 104 can be included on the tanker truck, to measure the volume of bulk material being delivered. The volume measurement represents additional object data that will be combined with the object ID and the time and GPS data, to generate the object ID encoded position data. Referring to FIG. 10 and its relationship to the elements of FIG. 2, FIG. 10 does not specifically show the data link and processor elements of FIG. 2, which, as discussed above, can be implemented by a GPS unit including such elements.


In an exemplary embodiment, once the storage tank has been filled, the object ID encoded position data can be conveyed to the remote computing device of FIG. 2, and the remote computing device can make the tank filling information available to one or more of the tanker truck service and the client, either through a website accessible to the parties, or by sending the information in an email, a text message, a voice message, or an instant message. In an exemplary (but not limiting) embodiment, the tanker truck service would have access to filling data for all storage tanks, whereas clients would only be able to access such data about their own storage tanks. Such tank filling data would be generated from the object ID encoded position data collected at the tanker truck, and would define the location and time at which a storage tank was filled, and the volume of the material transferred, if the volume data was collected. This data can be used to assure clients that their tank was filled, and may be used by the tanker truck service to bill its clients.
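

Billing from such records reduces to summing the delivered volumes for a client's tanks. The following is a minimal sketch under stated assumptions: the per-liter price and the 'volume_l' field name are purely illustrative.

    def invoice_total(client_records, price_per_liter=0.85):
        """Sum a client's tank fillings into an invoice amount.

        client_records: iterable of record dicts, each with a 'volume_l' field
        holding the volume measured by the delivery sensor when that storage
        tank was filled.
        """
        return sum(rec["volume_l"] for rec in client_records) * price_per_liter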


As noted above, the concepts disclosed herein also encompass tanker truck 90 being configured to implement the method of FIG. 5 (i.e., to use previously generated object ID encoded position data to help the driver recognize what storage tanks at which locations should be filled). In such an embodiment, tanker truck 90 will require GPS unit 92 (to track the current position of the vehicle, so a GPS processor or other processor can compare the current position of the vehicle with the previously generated object ID encoded position data, stored in a memory 48 as shown in FIG. 6). Tanker truck 90 will also need to include the display/output device of FIG. 6, to provide a mechanism to inform the driver what storage tanks are associated with a specific service location.


If tanker truck 90 is configured to implement the method of FIG. 5 (using previously generated object ID encoded position data to help the driver recognize what storage tanks are filled at which location) and is not intended to also implement the method of FIG. 1 (collecting object ID encoded position data while the tanker truck is being currently operated), then the ID sensor and data link to the remote computing device shown in FIG. 2 are not required. Customers employing this technology will likely desire the flexibility of being able to perform both the method of FIG. 1 (collecting object ID encoded position data while the tanker truck is being currently operated) and the method of FIG. 5 (using previously generated object ID encoded position data to help the tanker truck driver recognize what storage tanks should be filled at which location), and such tanker trucks will need to employ the components of both FIGS. 2 and 6 (i.e., the GPS unit, the ID sensor, the processor to combine the GPS data with the object ID data to generate the object ID encoded position data (which may be part of the GPS unit), the data link to convey the object ID encoded position data to the remote computing device, the memory storing the previously generated object ID encoded position data used to alert the driver about which storage tanks are associated with which locations, the processor to monitor the current position of the vehicle and produce an indication/alert when the current position of the tanker truck corresponds to a location correlated to a labeled storage tank (note the same processor combining the current GPS data with the object ID data can be used, or a separate processor can be used), and the display (or other output) used to alert the driver that the tanker truck is at or approaching a location at which a storage tank will be filled).


With respect to any of the embodiments of FIGS. 7-10, it should be recognized that the relative location of the GPS unit in the vehicle as shown in the figure is intended to be exemplary, rather than limiting.



FIG. 11 is a functional block diagram showing the basic functional components used to implement a handheld identification sensor, which can be used by a vehicle operator to collect object identification data, as shown in FIG. 8. A handheld ID sensor 110 includes a plurality of functional components disposed on or inside a housing 112. A central processing unit (CPU) 120 comprises the controller for portable ID sensor 110 and is coupled bi-directionally to a memory 116 that includes both random access memory (RAM) and read only memory (ROM). Memory 116 is used for storing data in RAM and machine instructions in ROM that control the functionality of CPU 120 when executed by it. CPU 120 is also coupled to ID sensor 114, and is configured to receive operator input from user controls 122. In addition, CPU 120 provides text and graphics to display 124 for prompts and other messages, and for menu items and options from which the operator can select using controls 122.
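

A toy object model of these blocks (Python, purely illustrative; the method names and collaborator interfaces are assumptions, not part of the disclosure) shows how the CPU ties the sensor, memory, display, and controls together.

    class PortableIdSensor:
        """Toy model of the FIG. 11 handheld unit; the CPU loop links the blocks."""

        def __init__(self, id_sensor, display, user_controls):
            self.id_sensor = id_sensor        # reads tokens (e.g., RFID tags)
            self.display = display            # shows prompts and menus
            self.user_controls = user_controls  # operator input
            self.memory = []                  # RAM: object IDs collected so far

        def scan_once(self):
            """Prompt the operator, read a token, and store the result in memory."""
            self.display.show("Present token to sensor")
            object_id = self.id_sensor.read()
            self.memory.append(object_id)
            self.display.show(f"Scanned {object_id}")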


After the operator has used portable ID sensor 110 to identify each labeled object, the operator can transmit the object ID data that have been collected to the vehicle GPS unit, or to the processor that will combine the object ID data with the GPS data to generate the object ID encoded position data, using a data link 118 (in an exemplary embodiment, the data link employs an RF transmission, though hardwired and other wireless types of data links can be used).
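

One way data link 118 could be realized in software (a sketch only; the address, port, and JSON wire format are assumptions, since the disclosure requires only some hardwired or wireless link) is to serialize the collected IDs and push them to the vehicle unit in one batch.

    import json
    import socket

    def transmit_scans(scans, host="192.168.0.10", port=5000):
        """Send collected object IDs to the vehicle unit as one JSON payload.

        scans: list of object IDs (or scan records) held in the handheld's memory.
        The host/port and wire format here are illustrative assumptions.
        """
        payload = json.dumps({"scans": scans}).encode("utf-8")
        with socket.create_connection((host, port), timeout=5) as conn:
            conn.sendall(payload)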


As noted above, the tokens that are affixed to the objects to be identified can be of several different types, depending upon the type of sensor 114 that is included on portable ID sensor 110. In a preferred form of one of the concepts disclosed herein, the token employed is an RFID tag that is attached with a fastener or an appropriate adhesive to the object (or is carried by a passenger, or is attached to a location proximate the object, generally as discussed above). One type of RFID tag that is suitable for this purpose is the WORLDTAG™ token that is sold by Sokymat Corporation. This tag is excited by an RF transmission from portable ID sensor 110 via an antenna in sensor 114. In response to the excitation energy received, the RFID tag modifies the RF energy that is received from the antenna in sensor 114 in a manner that specifically identifies the object associated with the RFID tag, and the modified signal is detected by sensor 114.


An alternative type of token that can also be used in one of the concepts disclosed herein is an IBUTTON™ computer chip, which is armored in a stainless steel housing and is readily affixed to an object or location. The IBUTTON chip is programmed with JAVA™ instructions to provide a recognition signal when interrogated by a signal received from a nearby transmitter, such as from an antenna in sensor 114. The signal produced by the IBUTTON chip is received by sensor 114, which determines the identity of the object associated with the token. This type of token is less desirable because it is more expensive, although the program instructions that it executes can provide greater functionality.


Yet another type of token that might be used is an optical bar code, in which a sequence of lines of varying width or other optical patterns encodes data in light reflected from the bar code tag. The encoded reflected light is received by an optical detector in sensor 114, which reads the pattern. Bar code technology is well understood by those of ordinary skill in the art and is readily adapted for identifying a particular object and its location. One drawback to the use of a bar code tag as a token is that the bar code can be covered with dirt or grime that must be cleaned before the sequence of bar code lines or other pattern can be properly read. If the bar code is applied to a plasticized adhesive strip, it can readily be mounted to any surface and then easily cleaned with a rag or other appropriate material.


Yet another type of token usable in one of the concepts disclosed herein is a magnetic strip in which a varying magnetic flux encodes data identifying the particular object associated with the token. Such magnetic strips are often used in access cards that are read by readers mounted adjacent to doors or in an elevator that provides access to a building. In this aspect of the concepts disclosed herein, however, the magnetic flux reader comprises sensor 114. The data encoded on such a token are readily read as the portable device is brought into proximity of the varying magnetic flux encoded strip comprising the token.


As yet another alternative, an active token can be employed that conforms to the BLUETOOTH™ specification for short distance data transfer between computing devices using an RF signal.


In at least one embodiment, the interaction between the vehicle and a labeled object is an inspection of the object. The vehicle is used to convey the inspector to the labeled object. A sensor attached to the vehicle or in a handheld device is used to collect the object ID data, which is combined with the position data collected by the vehicle to generate the object ID encoded position data, which can be used to verify that the inspector was proximate the specific object at a specific time. Objects that can be labeled for inspection include, but are not limited to, buildings, bridges, utility vaults, traffic signals, traffic signs, cell phone towers, transformers, pipelines, utility poles, and construction equipment.
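

Verifying such an inspection amounts to searching the generated records for the object's ID near the scheduled time. The sketch below is illustrative Python under stated assumptions: the record field names and the one-hour window are assumptions, not values from the disclosure.

    def verify_inspection(records, object_id, due_time, window_s=3600):
        """True if a record shows the inspector proximate the object near due_time.

        records: dicts with 'object_id' and 'time' (epoch seconds), as generated
        when the vehicle-mounted or handheld sensor read the object's token.
        """
        return any(rec["object_id"] == object_id
                   and abs(rec["time"] - due_time) <= window_s
                   for rec in records)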


In at least one embodiment, the interaction between the vehicle and a labeled object does not include loading the object onto the vehicle or removing the object from the vehicle. Thus, in the claims that follow, it should be recognized that support exists for a negative claim limitation to that effect. FIGS. 9 and 10 relate to embodiments where material is loaded from a container into the vehicle, or from the vehicle into the container, without the container (the labeled object) being loaded onto or being removed from the vehicle. Another type of interaction between a labeled object and the vehicle includes the vehicle being used to transport an inspector to the location of a labeled object that requires an inspection. In such an embodiment, inspection of the object will not result in the object being loaded onto or being removed from the vehicle. In each of these interactions, object ID data is collected from the object, which is combined with the position data collected by the vehicle to generate the object ID encoded position data.


In at least one embodiment, the interaction between the vehicle and a labeled object involves loading or unloading the object from a cargo area in the vehicle that is not suitable for passengers. Thus, in the claims that follow, it should be recognized that support exists for a negative claim limitation to that effect. FIG. 8 relates to an embodiment where objects are loaded or unloaded from a non-passenger cargo area.


Although the concepts disclosed herein have been described in connection with the preferred form of practicing them and modifications thereto, those of ordinary skill in the art will understand that many other modifications can be made thereto within the scope of the claims that follow. Accordingly, it is not intended that the scope of these concepts in any way be limited by the above description, but instead be determined entirely by reference to the claims that follow.


While a number of exemplary aspects and embodiments have been discussed above, those possessed of skill in the art will recognize certain modifications, permutations, additions and sub-combinations thereof. It is therefore intended that the following appended claims and claims hereafter introduced are interpreted to include all such modifications, permutations, additions and sub-combinations as are within their true spirit and scope.


The various embodiments described above can be combined to provide further embodiments. All of the U.S. patents, U.S. patent application publications, U.S. patent applications, foreign patents, foreign patent applications and non-patent publications referred to in this specification and/or listed in the Application Data Sheet are incorporated herein by reference, in their entirety. Aspects of the embodiments can be modified, if necessary to employ concepts of the various patents, applications and publications to provide yet further embodiments.


These and other changes can be made to the embodiments in light of the above-detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.

Claims
  • 1. A method, comprising: tracking a current location of a vehicle during transit along a predefined driving route; outputting on a display device an indication to a driver of the vehicle that the vehicle is approaching a specific, predefined target delivery location along the predefined driving route prior to the vehicle arriving at the specific, predefined target delivery location; outputting on the display device an indication to the driver of the vehicle of a list of specific, predefined objects that are to be unloaded from the vehicle at the specific, predefined target delivery location prior to the vehicle arriving at the specific, predefined target delivery location; stopping the vehicle upon the vehicle reaching the specific, predefined target delivery location; starting the unloading of the predefined objects at the specific, predefined target delivery location; and sending a notification alert to the driver of the vehicle if an object in the vehicle begins to unload that is not on the list of specific, predefined objects to be unloaded at the specific, predefined target delivery location.
  • 2. The method according to claim 1 wherein the specific, predefined target delivery location is a bus stop.
  • 3. The method according to claim 1 wherein the vehicle is a bus and the objects to be unloaded are passengers who are scheduled to be on the bus.
  • 4. The method according to claim 1, further including: storing in a memory associated with the display device a list of the specific, predefined target delivery locations prior to the vehicle starting to transit along the predefined driving route; and storing in the memory associated with the display device the list of the specific objects that are to be unloaded from the vehicle at the specific, predefined target delivery location prior to the vehicle starting to transit along the predefined driving route.
  • 5. The method according to claim 1, wherein the step of sending a notification alert to the driver of the vehicle if an object in the vehicle begins to unload that is not on the list of specific, predefined objects to be unloaded at the predefined target delivery location is based solely on the current location of the vehicle that is compared to the list of specific, predefined objects to be unloaded at the specific, predefined target location.
  • 6. A method, comprising: storing in a computer memory a predefined driving route for a bus and predefined bus stops along the route; storing in the memory a list of people who are approved to exit at each predefined bus stop; tracking a current location of the bus during transit along the predefined driving route; outputting on a display device an indication to a driver of the bus that the bus is approaching a specific, predefined bus stop along the predefined driving route prior to the bus arriving at the specific, predefined bus stop; outputting on the display device an indication to the driver of the bus of a list of people who are authorized to exit the bus at the specific, predefined bus stop prior to the bus arriving at the specific, predefined bus stop; stopping the bus upon the bus reaching the specific, predefined bus stop; starting the exiting of people on the bus at the specific, predefined bus stop; comparing, via a computer processor, the names of the people exiting at the specific, predefined bus stop with the list of people authorized to exit at the specific, predefined bus stop; and sending a notification alert to the driver of the bus if a person begins to exit who is not on the list of people authorized to exit at the specific, predefined bus stop, based solely on the person beginning to exit not being on the list of people who are approved to exit at the specific, predefined bus stop.
  • 7. The method according to claim 6 wherein storing in the memory a list of people who are approved to exit at each predefined bus stop occurs prior to the bus starting to drive on the predefined route.
  • 8. The method according to claim 6, further including: sensing an RFID chip that is associated with each person as they exit from the bus.
  • 9. The method according to claim 6, further including: sensing an RFID chip that is associated with each person as they enter the bus.
  • 10. The method according to claim 9 wherein the list of people who are approved to exit at each predefined bus stop is modified after the bus starts to drive on the predefined route based on who has entered the bus.
  • 11. The method according to claim 6 wherein the list of people who are approved to exit at each predefined bus stop is created prior to the bus starting to drive on the predefined route and is not modified while the bus is driving on the predefined route.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of application Ser. No. 15/980,548, filed on May 15, 2018, which itself is a continuation of application Ser. No. 15/235,853, filed on Aug. 12, 2016, which itself is a continuation of application Ser. No. 15/083,208, filed Mar. 28, 2016, which itself is a continuation of application Ser. No. 12/942,874, filed Nov. 9, 2010, the benefit of the filing dates of each of which is hereby claimed under 35 U.S.C. § 120.

US Referenced Citations (316)
Number Name Date Kind
3573620 Ashley et al. Apr 1971 A
3990067 Van Dusen et al. Nov 1976 A
4009375 White et al. Feb 1977 A
4025791 Lennington et al. May 1977 A
4092718 Wendt May 1978 A
4258421 Juhasz et al. Mar 1981 A
4263945 Van Ness Apr 1981 A
4325057 Bishop Apr 1982 A
4469149 Walkey et al. Sep 1984 A
4602127 Neely et al. Jul 1986 A
4651157 Gray et al. Mar 1987 A
4658371 Walsh et al. Apr 1987 A
4713661 Boone et al. Dec 1987 A
4763356 Day, Jr. et al. Aug 1988 A
4799162 Shinkawa et al. Jan 1989 A
4804937 Barbiaux et al. Feb 1989 A
4814711 Olsen et al. Mar 1989 A
4846233 Fockens Jul 1989 A
4897792 Hosoi Jan 1990 A
4934419 Lamont et al. Jun 1990 A
4935195 Palusamy et al. Jun 1990 A
5007786 Bingman Apr 1991 A
5014206 Scribner et al. May 1991 A
5021780 Fabiano et al. Jun 1991 A
5058044 Stewart et al. Oct 1991 A
5068656 Sutherland Nov 1991 A
5072380 Randelman et al. Dec 1991 A
5119894 Crawford et al. Jun 1992 A
5120942 Holland et al. Jun 1992 A
5128651 Heckart Jul 1992 A
5204819 Ryan Apr 1993 A
5206643 Eckelt Apr 1993 A
5209312 Jensen May 1993 A
5223844 Mansell et al. Jun 1993 A
5230393 Mezey Jul 1993 A
5243323 Rogers Sep 1993 A
5247160 Zicker Sep 1993 A
5304744 Jensen Apr 1994 A
5321629 Shirata et al. Jun 1994 A
5337003 Carmichael et al. Aug 1994 A
5359522 Ryan Oct 1994 A
5394136 Lammers et al. Feb 1995 A
5399844 Holland Mar 1995 A
5400020 Jones et al. Mar 1995 A
5442553 Parrillo Aug 1995 A
5459304 Eisenmann Oct 1995 A
5459660 Berra Oct 1995 A
5479479 Braitberg et al. Dec 1995 A
5488352 Jasper Jan 1996 A
5499182 Ousborne Mar 1996 A
5541845 Klein Jul 1996 A
5546305 Kondo Aug 1996 A
5557254 Johnson et al. Sep 1996 A
5557268 Hughes et al. Sep 1996 A
5572192 Berube Nov 1996 A
5585552 Heuston et al. Dec 1996 A
5594650 Shah et al. Jan 1997 A
5596501 Comer et al. Jan 1997 A
5600323 Boschini Feb 1997 A
5610596 Petitclerc Mar 1997 A
5623258 Dorfman Apr 1997 A
5623260 Jones Apr 1997 A
5629678 Gargano et al. May 1997 A
5657010 Jones Aug 1997 A
5668543 Jones Sep 1997 A
5671158 Fournier et al. Sep 1997 A
5680328 Skorupski et al. Oct 1997 A
5719771 Buck et al. Feb 1998 A
5731893 Dominique Mar 1998 A
5732074 Spaur et al. Mar 1998 A
5742915 Stafford Apr 1998 A
5745049 Akiyama et al. Apr 1998 A
5758299 Sandborg et al. May 1998 A
5758300 Abe May 1998 A
5781871 Mezger et al. Jul 1998 A
5794164 Beckert et al. Aug 1998 A
5808565 Matta et al. Sep 1998 A
5809437 Breed Sep 1998 A
5815071 Doyle Sep 1998 A
5835871 Smith et al. Nov 1998 A
5837945 Cornwell et al. Nov 1998 A
5838251 Brinkmeyer et al. Nov 1998 A
5839112 Schreitmueller et al. Nov 1998 A
5867404 Bryan Feb 1999 A
5874891 Lowe Feb 1999 A
5884202 Arjomand Mar 1999 A
5890061 Timm et al. Mar 1999 A
5890520 Johnson, Jr. Apr 1999 A
5913180 Ryan Jun 1999 A
5922037 Potts Jul 1999 A
5923572 Pollock Jul 1999 A
5942753 Dell Aug 1999 A
5956259 Hartsell, Jr. et al. Sep 1999 A
5995898 Tuttle Nov 1999 A
6006159 Schmier et al. Dec 1999 A
6009355 Obradovich et al. Dec 1999 A
6009363 Beckert et al. Dec 1999 A
6016795 Ohki Jan 2000 A
6024142 Bates Feb 2000 A
6025776 Matsuura Feb 2000 A
6043661 Gutierrez Mar 2000 A
6054950 Fontana Apr 2000 A
6061614 Carrender et al. May 2000 A
6064299 Lesesky et al. May 2000 A
6070156 Hartsell, Jr. May 2000 A
6078255 Dividock et al. Jun 2000 A
6084870 Wooten et al. Jul 2000 A
6092021 Ehlbeck et al. Jul 2000 A
6107915 Reavell et al. Aug 2000 A
6107917 Carrender et al. Aug 2000 A
6112152 Tuttle Aug 2000 A
6127947 Uchida et al. Oct 2000 A
6128551 Davis et al. Oct 2000 A
6128959 McGovern et al. Oct 2000 A
6169938 Hartsell, Jr. Jan 2001 B1
6169943 Simon et al. Jan 2001 B1
6199099 Gershman et al. Mar 2001 B1
6202008 Beckert et al. Mar 2001 B1
6208948 Klingler et al. Mar 2001 B1
6236911 Kruger May 2001 B1
6240365 Bunn May 2001 B1
6253129 Jenkins et al. Jun 2001 B1
6256579 Tanimoto Jul 2001 B1
6259358 Fjordbotten Jul 2001 B1
6263273 Henneken et al. Jul 2001 B1
6263276 Yokoyama et al. Jul 2001 B1
6278936 Jones Aug 2001 B1
6285953 Harrison et al. Sep 2001 B1
6295492 Lang et al. Sep 2001 B1
6313760 Jones Nov 2001 B1
6313791 Klanke Nov 2001 B1
6330499 Chou et al. Dec 2001 B1
6339745 Novik Jan 2002 B1
6362730 Razavi et al. Mar 2002 B2
6370454 Moore Apr 2002 B1
6374176 Schmier et al. Apr 2002 B1
6396413 Hines et al. May 2002 B2
6411203 Lesesky et al. Jun 2002 B1
6411891 Jones Jun 2002 B1
6417760 Mabuchi et al. Jul 2002 B1
6438472 Tano et al. Aug 2002 B1
6450411 Rash et al. Sep 2002 B1
6456039 Lauper et al. Sep 2002 B1
6502030 Hilleary Dec 2002 B2
6505106 Lawrence et al. Jan 2003 B1
6507810 Razavi et al. Jan 2003 B2
6529723 Bentley Mar 2003 B1
6529808 Diem Mar 2003 B1
6539296 Diaz et al. Mar 2003 B2
6587768 Chene et al. Jul 2003 B2
6594579 Lowrey et al. Jul 2003 B1
6594621 Meeker Jul 2003 B1
6597973 Barich et al. Jul 2003 B1
6604033 Banet et al. Aug 2003 B1
6608554 Lesesky et al. Aug 2003 B2
6609082 Wagner Aug 2003 B2
6611740 Lowrey et al. Aug 2003 B2
6614392 Howard Sep 2003 B2
6616036 Streicher et al. Sep 2003 B2
6621452 Knockeart et al. Sep 2003 B2
6636790 Lightner et al. Oct 2003 B1
6664897 Pape et al. Dec 2003 B2
6671646 Manegold et al. Dec 2003 B2
6680694 Knockeart et al. Jan 2004 B1
6683542 Jones Jan 2004 B1
6700506 Winkler et al. Mar 2004 B1
6708113 Von Gerlach et al. Mar 2004 B1
6714859 Jones Mar 2004 B2
6727818 Wildman et al. Apr 2004 B1
6732031 Lightner et al. May 2004 B1
6732032 Banet et al. May 2004 B1
6744352 Lesesky et al. Jun 2004 B2
6754183 Razavi et al. Jun 2004 B1
6768994 Howard et al. Jul 2004 B1
6801841 Tabe Oct 2004 B2
6804606 Jones Oct 2004 B2
6804626 Manegold et al. Oct 2004 B2
6816762 Hensey et al. Nov 2004 B2
6834259 Markwitz et al. Dec 2004 B1
6856820 Kolls Feb 2005 B1
6876642 Adams et al. Apr 2005 B1
6879894 Lightner et al. Apr 2005 B1
6880390 Emord Apr 2005 B2
6894617 Richman May 2005 B2
6899151 Latka et al. May 2005 B1
6904359 Jones Jun 2005 B2
6909947 Douros et al. Jun 2005 B2
6919804 Cook et al. Jul 2005 B1
6924750 Flick Aug 2005 B2
6928348 Lightner et al. Aug 2005 B1
6946953 Lesesky et al. Sep 2005 B2
6952645 Jones Oct 2005 B1
6954689 Hanson et al. Oct 2005 B2
6957133 Hunt et al. Oct 2005 B1
6958701 Storkamp et al. Oct 2005 B1
6972668 Schauble Dec 2005 B2
6988033 Lowrey et al. Jan 2006 B1
7022018 Koga Apr 2006 B2
7027955 Markwitz et al. Apr 2006 B2
7048185 Hart May 2006 B2
7068301 Thompson Jun 2006 B2
7080778 Kressin et al. Jul 2006 B1
7103460 Breed Sep 2006 B1
7113127 Banet et al. Sep 2006 B1
7117121 Brinton et al. Oct 2006 B2
7155199 Zalewski et al. Dec 2006 B2
7171372 Daniel et al. Jan 2007 B2
7174243 Lightner et al. Feb 2007 B1
7174277 Vock et al. Feb 2007 B2
7225065 Hunt et al. May 2007 B1
7228211 Lowrey et al. Jun 2007 B1
7254516 Case, Jr. et al. Aug 2007 B2
7343252 Wiens Mar 2008 B2
7362229 Brinton et al. Apr 2008 B2
7447574 Washicko et al. Nov 2008 B1
7477968 Lowrey et al. Jan 2009 B1
7480551 Lowrey et al. Jan 2009 B1
7523159 Williams et al. Apr 2009 B1
7532962 Lowrey et al. May 2009 B1
7532963 Lowrey et al. May 2009 B1
7564375 Brinton et al. Jul 2009 B2
7596437 Hunt et al. Sep 2009 B1
7604169 Demere Oct 2009 B2
7627546 Moser et al. Dec 2009 B2
7640185 Giordano et al. Dec 2009 B1
7650210 Breed Jan 2010 B2
7672756 Breed Mar 2010 B2
7672763 Hunt et al. Mar 2010 B1
7680595 Brinton et al. Mar 2010 B2
7778752 Hunt et al. Aug 2010 B1
7783507 Schick et al. Aug 2010 B2
8378854 Siris Feb 2013 B1
9858462 McQuade et al. Jan 2018 B2
10013592 McQuade et al. Jul 2018 B2
20010037174 Dickerson Nov 2001 A1
20010047283 Melick et al. Nov 2001 A1
20010053983 Reichwein et al. Dec 2001 A1
20020016655 Joao Feb 2002 A1
20020022979 Whipp et al. Feb 2002 A1
20020022984 Daniel et al. Feb 2002 A1
20020030595 Kasik Mar 2002 A1
20020049054 O'Connor et al. Apr 2002 A1
20020057212 Hamilton et al. May 2002 A1
20020065698 Schick et al. May 2002 A1
20020069017 Schmier et al. Jun 2002 A1
20020070882 Jones Jun 2002 A1
20020089434 Ghazarian Jul 2002 A1
20020099500 Schmier et al. Jul 2002 A1
20020104013 Ghazarian Aug 2002 A1
20020107833 Kerkinni Aug 2002 A1
20020107873 Winkler et al. Aug 2002 A1
20020111725 Burge Aug 2002 A1
20020122583 Thompson Sep 2002 A1
20020133273 Lowrey et al. Sep 2002 A1
20020133275 Thibault Sep 2002 A1
20020147610 Tabe Oct 2002 A1
20020150050 Nathanson Oct 2002 A1
20020156558 Hanson et al. Oct 2002 A1
20020163449 Flick Nov 2002 A1
20020178147 Arroyo et al. Nov 2002 A1
20020188593 Moser et al. Dec 2002 A1
20030004644 Farmer Jan 2003 A1
20030014166 Chinigo et al. Jan 2003 A1
20030030550 Talbot Feb 2003 A1
20030033061 Chen et al. Feb 2003 A1
20030109973 Hensey et al. Jun 2003 A1
20030120745 Katagishi et al. Jun 2003 A1
20030137435 Haddad et al. Jul 2003 A1
20030183697 Porter Oct 2003 A1
20030206133 Cheng Nov 2003 A1
20030233190 Jones Dec 2003 A1
20040009819 Koga Jan 2004 A1
20040066315 Szulanski Apr 2004 A1
20040083054 Jones Apr 2004 A1
20040236596 Chowdhary et al. Nov 2004 A1
20050038572 Krupowicz Feb 2005 A1
20050156759 Aota et al. Jul 2005 A1
20050203683 Olsen et al. Sep 2005 A1
20050273250 Hamilton et al. Dec 2005 A1
20060006228 Poulter Jan 2006 A1
20060011721 Olsen, III et al. Jan 2006 A1
20060055564 Olsen et al. Mar 2006 A1
20060095277 Noonan et al. May 2006 A1
20060097896 Jones May 2006 A1
20060145837 Horton et al. Jul 2006 A1
20060202030 Kressin et al. Sep 2006 A1
20060208075 Kressin et al. Sep 2006 A1
20060208087 Kressin et al. Sep 2006 A1
20060232406 Filibeck Oct 2006 A1
20060280582 Kouri Dec 2006 A1
20070040672 Chinigo Feb 2007 A1
20070050193 Larson Mar 2007 A1
20070069947 Banet et al. Mar 2007 A1
20070172340 Curotto et al. Jul 2007 A1
20070179709 Doyle Aug 2007 A1
20070262878 Maruca et al. Nov 2007 A1
20080140253 Brown Jun 2008 A1
20080154489 Kaneda et al. Jun 2008 A1
20080154712 Wellman Jun 2008 A1
20080314263 Martin Dec 2008 A1
20080319665 Berkobin et al. Dec 2008 A1
20090069999 Bos Mar 2009 A1
20090177350 Williams et al. Jul 2009 A1
20090222200 Link, II et al. Sep 2009 A1
20100088127 Betancourt et al. Apr 2010 A1
20100185479 Brinton et al. Jul 2010 A1
20100278620 Rimsa Nov 2010 A1
20110068954 McQuade et al. Mar 2011 A1
20110116899 Dickens May 2011 A1
20110316689 Reyes et al. Dec 2011 A1
20160210370 McQuade et al. Jul 2016 A1
20160350567 McQuade et al. Dec 2016 A1
20170351890 McQuade et al. Dec 2017 A1
20180260595 McQuade et al. Sep 2018 A1
20180314865 McQuade et al. Nov 2018 A1
20190042816 McQuade et al. Feb 2019 A1
Foreign Referenced Citations (20)
Number Date Country
2138378 Nov 1994 CA
2 326 892 Oct 1999 CA
2 388 572 May 2001 CA
0 755 039 Jan 1997 EP
0 814 447 Dec 1997 EP
0 926 020 Jun 1999 EP
1 067 498 Jan 2001 EP
1 271 374 Jan 2003 EP
1 005 627 Oct 2003 EP
1 027 792 Jan 2004 EP
2 116 968 Nov 2009 EP
60-204071 Oct 1985 JP
11-327628 Nov 1999 JP
2002-96913 Apr 2002 JP
2003-85471 Mar 2003 JP
9726750 Jul 1997 WO
9803952 Jan 1998 WO
9830920 Jul 1998 WO
03023550 Mar 2003 WO
2007092711 Aug 2007 WO
Non-Patent Literature Citations (37)
Entry
“Miras GPS vehicle tracking using the Internet,” Business Wire, Nov. 22, 1996, URL=https://www.thefreelibrary.com/_/print/PrintArticle.aspx?id=18878231, download date Aug. 28, 2017, 2 pages.
“Nextel, Motorola and Symbol Technologies Offer First Wireless Bar Code Scanner for Mobile Phones,” The Auto Channel, Jun. 11, 2003, URL=https://www.theautochannel.com/news/2003/06/11/162927.html, download date Aug. 28, 2017, 4 pages.
“ObjectFX Integrates TrackingAdvisor with Qualcomm's FleetAdvisor System; Updated Version Offers Benefit of Visual Display of Vehicles and Routes to Improve Fleet Productivity,” Business Wire, Oct. 27, 2003, URL=https://www.thefreelibrary.com/_/print/PrintArticle.aspx?id=109283193, download date Aug. 28, 2017, 3 pages.
“Private fleets moving to wireless communications,” Drivers, May 1, 1997, URL=https://web.archive.org/web/20060511114359/driversmag.com/ar/fleet_private_fleets_moving/index.html, download date Aug. 28, 2017, 4 pages.
Albright, “Indiana embarks on ambitious RFID roll out,” Frontline Solutions, May 20, 2002, URL=https://web.archive.org/web/20021102141244/http://www.frontlinetoday.com/frontline/article/articleDetail.jsp?id=19358, download date Aug. 28, 2017, 3 pages.
Anonymous, “Transit agency builds GIS to plan bus routes”, American City & Country 118(4):14-16, Apr. 1, 2003. (4 pages).
Black, “OBD II Up Close,” MOTOR:28-34, Jul. 1998. (6 pages).
Child Checkmate Systems Inc., “What is the Child Check-Mate Safety System,” URLs=http://www.childcheckmate.com/what.html, http://www.childcheckmate.com/overview.html, http://www.childcheckmate.com/how.html, download date Apr. 7, 2004, 5 pages.
Detex, “Detex Announces the Latest Innovation in Guard Tour Verification Technology,” Jan. 1, 2003, URL=https://web.archive.org/web/20031208082505/http://www.detex.com/NewsAction.jspa!id=3 , download date Aug. 28, 2017, 1 page.
Dwyer et al., “Analysis of the Performance and Emissions of Different Bus Technologies on the City of San Francisco,” SAE Commercial Vehicle Engineering Congress and Exhibition, Rosemont, Illinois, USA, Oct. 26-28, 2004. (Abstract only) (2 pages).
FleeTTrakkeR, “D.O.T. Driver Vehicle Inspection Reports on your wireless phone!,” ReporTTrakkeR, URL=http://www.fleettrakker.com/web/index.jsp, download date Mar. 12, 2004, 3 pages.
GCS General Control Systems, “The Data Acquisition Unit Escorte,” Nov. 20, 2001, URL=http://www.gcs.at/eng/produkte/hw/escorte.htm, download date Apr. 21, 2005, 4 pages.
GCS General Control Systems, “The PenMaster,” and “The PSION Workabout,” Nov. 20, 2001, URL=http://www.gcs.at/eng/produkte/hw/penmaster.htm, download date Apr. 5, 2007, 3 pages.
GCS General Control Systems, News, Dec. 11, 2002, URL=http://www.gcs.at/eng/news/allgemein.htm, download date Apr. 21, 2005, 2 pages.
Guensler et al., “Development of a Comprehensive Vehicle Instrumentation Package for Monitoring Individual Tripmaking Behavior,” Technical Specifications and Analysis, Georgia Institute of Technology, School of Civil and Environmental Engineering, Atlanta, Georgia, USA, Feb. 1999, 31 pages.
Jenkins et al., “Real-Time Vehicle Performance Monitoring Using Wireless Networking,” Proceedings of the 3rd IASTED International Conference on Communications, Internet and Information Technology, St. Thomas, US Virgin Islands, Nov. 22-24, 2004, pp. 375-380.
Kurtz, “Indiana's E-Government: A Story Behind Its Ranking,” INContext 4(1):6-8, 2003.
Kwon, “Networking Technologies of In-Vehicle,” Seoul National University, School of Electrical Engineering, Seoul, South Korea, Mar. 8, 2000, 44 pages.
Leavitt, “The Convergence Zone,” FleetOwner, Jun. 1, 1998, URL=http://www.driversmag.com/ar/fleet_convergence_zone/index.html, download date Aug. 24, 2010, 4 pages.
MIRAS, “About SPS Technologies,” as archived on May 7, 1999, URL=http://replay.waybackmachine.org/19990507195047/http://www.miras.com/html/about_sps_technologies.html, download date Sep. 29, 2010, 1 page.
MIRAS, “How MIRAS Works,” as archived on Apr. 29, 1999, URL=http://replay.waybackmachine.org/19990429144910/http://www.miras.com/html/products.html, download date Sep. 29, 2010, 1 page.
MIRAS, “MIRAS Unit,” as archived on May 4, 1999, URL=http://replay.waybackmachine.org/19990504052250/http://www.miras.com/html/1000unit.html, download date Sep. 29, 2010, 1 page.
MIRAS, “Monitoring Vehicle Functions,” as archived on Apr. 27, 1999, URL=http://replay.waybackmachine.org/19990427152518/http://www.miras.com/html/monitoring.html, download date Sep. 29, 2010, 1 page.
MIRAS, “Remote Control,” as archived on Apr. 29, 1999, URL=http://replay.waybackmachine.org/19990429145717/http://www.miras.com/html/remote_control.html, download date Sep. 29, 2010, 1 page.
MIRAS, “Tracking & Monitoring Software,” as archived on Apr. 29, 1999, URL=http://replay.waybackmachine.org/19990429160322/http://www.miras.com/html/software.html, download date Sep. 29, 2010, 1 page.
MIRAS, MIRAS 4.0 Screenshot, as archived on May 7, 1999, URL=http://replay.waybackmachine.org/19990507205618/http://www.miras.com/html/largescreen.html, download date Sep. 29, 2010, 1 page.
Office Action, dated Nov. 26, 2018, for McQuade et al., U.S. Appl. No. 16/157,490, 8 pages.
Quaan et al., Guard Tour Systems, post dates Sep. 16, 2003, Oct. 3, 2003, and Sep. 4, 2004, URL=http://www.securitymanagement.com/ubb/Forum30/HTML/000066.html, download date Mar. 8, 2005, 1 page.
Senger, “Inside RF/ID: Carving a Niche Beyond Asset Tracking,” Business Solutions, Feb. 1, 1999, 3 pages.
Sterzbach et al., “A Mobile Vehicle On-Board Computing and Communication System,” Computers & Graphics 20(4):659-667, 1996.
The Gale Group, “Tracking out of route: software helps fleets compare planned routes to actual miles.(Technology),” Commercial Carrier Journal 162(10):S46, 2005. (4 pages).
Tiscor, “Inspection Manager 6.0,” Product User Guide, 2004, 73 pages.
Tiscor, “Inspection Manager: An Introduction,” Sep. 27, 2004, 19 pages.
Tsakiri et al., “Urban Fleet Monitoring with GPS and GLONASS,” The Journal of Navigation 51(3):382-393, 1998. (Abstract only) (2 pages).
Tuttle, “Digital RF/ID Enhances GPS,” Proceedings of the 2nd Annual Wireless Symposium, Santa Clara, California, USA, Feb. 15-18, 1994, pp. 406-411.
Want, “RFID: A Key to Automating Everything,” Scientific American 290(1):56-65, 2004.
Zujkowski, “Savi Technology, Inc.: Savi Security and Productivity Systems,” Remarks, ATA Security Forum, Chicago, Illinois, USA, May 15, 2002, 21 pages.
Related Publications (1)
Number Date Country
20190251315 A1 Aug 2019 US
Continuations (4)
Number Date Country
Parent 15980548 May 2018 US
Child 16391951 US
Parent 15235853 Aug 2016 US
Child 15980548 US
Parent 15083208 Mar 2016 US
Child 15235853 US
Parent 12942874 Nov 2010 US
Child 15083208 US