As the cost of sensors, communications systems and navigational systems has dropped, operators of commercial and fleet vehicles now have the ability to collect a tremendous amount of data about the vehicles that they operate, including geographical position data collected during the operation of the vehicle.
Vehicle fleet operators often operate vehicles along predefined and generally invariant routes. For example, buses frequently operate on predefined routes, according to a predefined time schedule (for example, along a route that is geographically, as well as temporally defined). Migrating route data from one software platform to another software platform can be a tedious task.
It would be desirable to provide such fleet operators with additional tools for moving data between different software platforms, and for collecting and analyzing data (such as Global Positioning System (GPS) data, as well as other route related data) collected from vehicles traversing a predefined route.
The following embodiments and aspects thereof are described and illustrated in conjunction with systems, tools and methods which are meant to be exemplary and illustrative, not limiting in scope. In various embodiments, one or more of the above-described problems have been reduced or eliminated, while other embodiments are directed to other improvements.
One concept disclosed herein is the collection of object identification data during the operation of a vehicle, where the vehicle interacts with the object at a definable geographical position. An identification sensor is coupled to a geographical position sensor, and whenever an object is identified a record is generated, the record including the identification of the object, the position of the vehicle when the interaction between the object and the vehicle occurs, and the time of the interaction. Exemplary (but not limiting) objects that are identified include passengers, containers (such as pallets, packages, boxes, envelopes), and documents. Many different types of interactions are possible, including, but not limited to, loading an object (such as a parcel, document, or container) into the vehicle, unloading an object (such as a parcel, document, or container) from the vehicle, boarding a passenger (the object) onto the vehicle, unloading a passenger (once again, the passenger being the object) from the vehicle, transferring a bulk material (such as a solid, liquid or compressed gas) from the vehicle into a specific container (the container being the object), and/or transferring a bulk material (such as a solid, liquid or compressed gas) from a specific container (the container being the object) to the vehicle. The record may also include additional data about a parameter of the object (for example, in some embodiments, it will be useful to include the object's weight in the record, or the weight/volume of a material being transferred to or from the vehicle to a specific container). Such a data record is referred to herein and in the claims that follow as object identification (ID) and location data, and/or object ID encoded position data (encoded in the sense that the object data is combined with the position data). In some embodiments, the object ID and location data is stored at the vehicle for transfer to a remote computing device at a later time, and in other embodiments, the object ID and location data is wirelessly transmitted to a remote computing device during operation of the vehicle. The term “object identification data” is intended to refer to data that identifies an object with which a vehicle interacts. For example, for a passenger, object identification data can include the passenger's name, or a passenger number (or an alphanumeric code or other type of code) that uniquely identifies an individual. For other objects, the object identification data is generally a numeric or alphanumeric code that uniquely identifies the object.
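By way of a non-limiting illustration, the following sketch shows one way such an object ID and location data record might be represented in software (Python is used purely for illustration; the class name, field names, and example values are assumptions made for this sketch and are not part of the disclosure).

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ObjectRecord:
    """One object ID and location data record (illustrative sketch only)."""
    object_id: str                 # unique identifier read from the object's token
    timestamp: datetime            # time of the interaction (the time index)
    latitude: float                # vehicle position when the interaction occurred
    longitude: float
    extra: Optional[dict] = None   # optional additional parameter data, e.g. {"weight_kg": 12.4}

# Example: a parcel loaded onto the vehicle at a known position and time
record = ObjectRecord(
    object_id="PKG-000123",
    timestamp=datetime.now(timezone.utc),
    latitude=47.6097,
    longitude=-122.3331,
    extra={"weight_kg": 12.4},
)
```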
Broadly speaking, position data from the vehicle is collected as the vehicle travels to the plurality of different locations, the position data identifying a specific geographical location of the vehicle at a specific point in time (thus, the vehicle position data is time indexed). Time indexed object identification data is collected as the vehicle interacts with objects at various locations visited by the vehicle. In some embodiments, the vehicle traverses a generally invariant route (such as a bus route), while in other embodiments the vehicle traverses a variable route (such as a parcel delivery vehicle). In an exemplary embodiment, the time indexing function is implemented by the geographical position sensing system. Periodically, the geographical position sensing system generates a record that documents the current time and current geographical position of the vehicle. Whenever the identification sensor identifies an object, the identification data is sent to the geographical position sensing system, which either appends the identification data to the most current record, or generates a new record that documents the identity of the object, the current time, and the current geographical position, thereby generating the object ID and location data. It should thus be recognized that the set of location data collected by the geographical position sensing system during operation of the vehicle will also include object identification data at those points in time at which the vehicle interacts with an object that has been tagged in some way with a unique identifier that can be detected by the object identification sensor. Exemplary tags or tokens include optical codes (such as bar codes and other optically recognizable codes), radio frequency identification (RFID) tags, and magnetic tags/magnetic strips. It should be understood that the set of location data collected by the geographical position sensing system during operation of the vehicle (which at some time points includes only location data, and at other time points includes location data and object identification data) is collectively referred to herein as the object ID and location data. Such object ID and location data are conveyed to a remote computing device for storage/processing, either in real-time (i.e., while the vehicle is being operated, such that the vehicle requires a transmitter to convey the data to the remote computing device) or at some point after the vehicle has traversed a route and collected the different types of data (the position data and the object identification data). The term real-time is not intended to imply that the data is transmitted instantaneously; rather, the data is collected over a relatively short period of time (a period of seconds or minutes) and transmitted to the remote computing device on an ongoing basis, as opposed to storing the data at the vehicle for an extended period of time (hours or days) and transmitting an extended data set to the remote computing device after the data set has been collected. Transmitting the object ID and location data at a later time, rather than in real time, is encompassed by the concepts disclosed herein, although real-time data transmission is likely to be popular with users.
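The append-or-create behavior described above for the geographical position sensing system can be sketched as follows; again, the class and method names, and the short time window used to decide whether to append to the most current record, are illustrative assumptions only.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class PositionRecord:
    """One time-indexed position record; object_ids is empty when no interaction occurred."""
    timestamp: datetime
    latitude: float
    longitude: float
    object_ids: List[str] = field(default_factory=list)

class PositionLogger:
    """Illustrative sketch of the record-keeping behavior described above."""

    def __init__(self, max_append_age_s: float = 2.0):
        self.records: List[PositionRecord] = []
        self.max_append_age_s = max_append_age_s  # how recent the last record must be to append to it

    def log_position(self, lat: float, lon: float) -> None:
        """Periodically document the current time and current geographical position."""
        self.records.append(PositionRecord(datetime.now(timezone.utc), lat, lon))

    def log_object(self, object_id: str, lat: float, lon: float) -> None:
        """Append the object ID to the most current record, or generate a new record."""
        now = datetime.now(timezone.utc)
        if self.records and (now - self.records[-1].timestamp).total_seconds() < self.max_append_age_s:
            self.records[-1].object_ids.append(object_id)
        else:
            self.records.append(PositionRecord(now, lat, lon, [object_id]))
```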
Note that transferring the object ID and location data at a later time can be achieved without requiring the vehicle to include a wireless transmitter (i.e., the object ID and location data can be transferred via a hardwire connection to either the remote computing device or an intermediate data collection device that is coupled to the vehicle to extract the object ID and location data, which is then conveyed to the remote computing device).
With respect to the remote computing device, in a preferred but not limiting embodiment, the time indexed object ID and location data are available in a networked computing environment. In at least one embodiment, the object ID and location data are stored by a company offering data management services to its clients, and clients can access the object ID and location data for each of their vehicles.
The object ID and location data will have a number of uses. In the context of objects being passengers, the object ID and location data can be used by school bus operators to provide parents with data about when and where their children entered and exited a school bus. The object ID and location data can also be used to alert drivers when students attempt to get off the bus at some location other than their normal stop. The object ID and location data can be used to provide proof of delivery (or pick up) of parcels, documents, and other objects. Historical object ID and location data for generally invariant routes (such as refuse collection routes and school bus routes) can be used to train new drivers, where historical object ID and location data is loaded onto the vehicle before the route is traversed, and that data is used to alert the driver of what objects (such as refuse containers or students) are associated with specific geographical locations in the route.
In addition to being implemented as a method, the concepts disclosed herein can also be implemented as a nontransitory memory medium storing machine instructions that when executed by a processor implement the method, and by a system for implementing the method. In such a system, the basic elements include a vehicle that is to be operated by a vehicle operator, a position data collection unit (such as a GPS tracking device), an object identification sensor (such as a token reader), a data link (which can be integrated into the GPS unit), and a remote computing device. In general, the remote computing device can be implemented by a computing system employed by an entity operating a fleet of vehicles. Entities that operate vehicle fleets can thus use such computing systems to track and process data relating to their vehicle fleet. It should be recognized that these basic elements can be combined in many different configurations to achieve the exemplary method discussed above. Thus, the details provided herein are intended to be exemplary, and not limiting on the scope of the concepts disclosed herein.
Identification of objects can be accomplished by using a reader to scan a token attached to the object. Exemplary tokens include optical codes (such as bar codes), radio frequency identification (RFID) tags, and magnetic strips. Readers can be handheld devices, or when appropriate can be attached to the vehicle. For example, RFID tag readers could be attached to the vehicle proximate a door used to load or unload the vehicle, to automatically interrogate each RFID tagged item loaded onto or unloaded from the vehicle. Generally it will be preferable to record both loading and unloading of an object, although the concepts disclosed herein encompass embodiments where data relating to only loading or unloading is collected. Where the object is a person (i.e., a passenger), the person will be issued a token to be carried with them as they enter (or exit) the vehicle. In some cases, it may be desirable to identify a person that interacts with the vehicle even if the person is not a passenger (or is not entering or exiting the vehicle). Such a person might be tasked with delivering something to the vehicle or servicing the vehicle.
With respect to identifying passengers, a reader can be used to read a token (such as a ticket or rider pass) when a person enters or exits a vehicle. Generally it will be preferable to record both entry and exit, although the concepts disclosed herein encompass embodiments where data relating to only entry or exit is determined. In an exemplary but not limiting embodiment, a magnetic card reader is used to scan passenger cards as they enter or exit a vehicle. A particularly useful application of this type of object ID and position data tracking is to enable school bus operators to collect ridership data about students, tracking where and when students enter and exit a school bus. Such historical data can be used for training purposes whenever a driver is assigned a new route, as the historical data can be used to teach the driver which children get on and off at a particular stop. Once such historical data has been collected, if desired, the data can be used to prevent children from getting off at an incorrect stop (the token reader will automatically check the historical data, and if that child attempts to get off at a stop that is not part of the historical data for that child, an alert can be issued to the driver).
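A minimal sketch of the stop-checking logic described above is shown below, assuming the historical object ID and location data has been reduced to a per-student list of stop coordinates and that a stop is matched within a small radius; all names and thresholds are illustrative assumptions.

```python
import math

def _distance_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters (haversine formula)."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def check_exit(student_id, lat, lon, historical_stops, radius_m=150.0):
    """Return None if this exit matches a historical stop for the student,
    otherwise return an alert string for the driver.

    historical_stops maps student_id -> list of (lat, lon) stops seen in
    previously collected object ID and location data."""
    for stop_lat, stop_lon in historical_stops.get(student_id, []):
        if _distance_m(lat, lon, stop_lat, stop_lon) <= radius_m:
            return None
    return f"Alert: student {student_id} is attempting to exit at an unrecorded stop"

# Example usage with hypothetical historical data
history = {"STU-42": [(47.6205, -122.3493)]}
print(check_exit("STU-42", 47.6101, -122.3420, history))
```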
While the above noted method is preferably implemented by a processor (such as a computing device implementing machine instructions to implement the specific functions noted above), note that such a method can also be implemented using a custom circuit (such as an application specific integrated circuit).
In addition to object identification data (i.e., data that uniquely identifies the object), many different types of object data can be collected. The following types of additional object data are intended to be exemplary, rather than limiting. Time indexing can be achieved by including a time stamp with the object data as the data is collected by the object identification sensor, or the time stamp can be provided by the position sensing system, generally as discussed above.
A first type of additional object data that can be collected during operation of the vehicle is a weight of the object. An exemplary embodiment of a vehicle collecting object ID and location data that includes weight is a refuse truck. In this embodiment, each refuse container serviced by the vehicle is tagged with a token that is detected by the identification sensor as the contents of the refuse container are loaded into the vehicle. In an exemplary but not limiting embodiment, the loading arms include an identification sensor that reads a token labeling each container as the containers are manipulated by the loading arms. The loading arms are also equipped with weight sensors that determine the weight of the refuse emptied from the container. Thus, the object ID and location data in this embodiment can be used to identify when a container was emptied, where the container was located when it was emptied, and how much refuse was removed. That data is collected automatically, and can be used to provide proof of service, and the weight function may be used for billing purposes if the client is to be billed by weight. Recycling containers can be tracked and weighed in a similar manner. Historical data about containers and position can be used for training purposes whenever a new driver is assigned to an existing route, as the historical data can be used to teach the new driver what containers are usually serviced at a particular location.
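As a non-limiting illustration of how the weight reported by the loading-arm sensors can be carried as additional object data and later used for proof of service or weight-based billing, consider the following sketch (the record layout, field names, and per-kilogram rate are assumptions for illustration only).

```python
def make_service_record(container_id, weight_kg, lat, lon, timestamp):
    """Combine container ID, emptied weight, time, and position into one record
    (illustrative layout; the field names are assumptions)."""
    return {
        "object_id": container_id,
        "timestamp": timestamp,
        "latitude": lat,
        "longitude": lon,
        "extra": {"weight_kg": weight_kg},
    }

def bill_by_weight(records, rate_per_kg):
    """Sum weight-based charges per container from collected service records."""
    charges = {}
    for rec in records:
        cid = rec["object_id"]
        charges[cid] = charges.get(cid, 0.0) + rec["extra"]["weight_kg"] * rate_per_kg
    return charges
```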
A second type of additional object data that can be collected during operation of the vehicle is volume. An exemplary embodiment of a vehicle collecting object ID and location data that includes volume is a liquid fuel or compressed gas delivery truck. In this embodiment, each fuel or gas container serviced by the vehicle is tagged with a token that is detected by the identification sensor as the contents of the truck are offloaded into the container. In an exemplary but not limiting embodiment, the connector used to fluidly couple the vehicle to the container includes an identification sensor that reads a token labeling each container. The identification sensor is coupled to a flow sensor or tank level sensor in the vehicle that keeps track of how much product is delivered. That volume data, as well as the container identification data, is sent to the vehicle's geographical position sensing system as the container is filled. Thus, the object ID and location data in this embodiment can be used to identify when a container was filled, where the container was located when it was filled, and what volume of product was delivered by the vehicle. That data is collected automatically, and can be used to provide proof of service, and the volume function may be used for billing purposes if the client is to be billed by volume. It should be noted that such liquid or compressed gas deliveries can also be tracked by weight. Related embodiments utilize data input devices to enable vehicle operators to manually enter container identifications and product weights/volumes into a processor or computing device that combines the weight/volume data and container ID data with the vehicle position data to generate the object ID and location data.
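One way the flow sensor's output might be accumulated while the vehicle remains coupled to a given container, and then combined with that container's identification into an object ID and location data record, is sketched below; the class and method names are illustrative assumptions only.

```python
class DeliveryTotalizer:
    """Accumulates delivered volume for the container currently being filled
    (illustrative sketch of combining flow-sensor and container-ID data)."""

    def __init__(self):
        self.container_id = None
        self.volume_l = 0.0

    def start(self, container_id: str) -> None:
        """Called when the identification sensor reads the container's token."""
        self.container_id = container_id
        self.volume_l = 0.0

    def on_flow_pulse(self, liters: float) -> None:
        """Called for each volume increment reported by the flow sensor."""
        self.volume_l += liters

    def finish(self, lat: float, lon: float, timestamp) -> dict:
        """Close out the delivery and emit an object ID and location data record."""
        return {
            "object_id": self.container_id,
            "timestamp": timestamp,
            "latitude": lat,
            "longitude": lon,
            "extra": {"volume_l": round(self.volume_l, 2)},
        }
```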
A third type of additional object data that can be collected during operation of the vehicle is object temperature. An exemplary embodiment of a vehicle collecting object ID and location data that includes temperature is a produce delivery truck. In an exemplary but not limiting embodiment, the temperature of each produce container delivered by the vehicle is measured as the container is loaded or unloaded from the vehicle. That temperature data, as well as the container identification data, is sent to the vehicle's geographical position sensing system as the container is loaded or unloaded. Thus, the object ID and location data in this embodiment can be used to identify when a container was loaded and/or unloaded, where the container/vehicle was located when the container was loaded and/or unloaded, and the temperature of the container. That data is collected, and can be used to provide proof of service, and the temperature function may be used for quality assurance purposes if the client asserts that poor product quality was caused by improper temperature conditions in transit. Related embodiments simply measure the temperature of the cargo area of the vehicle, rather than measuring the temperature of each container.
This Summary has been provided to introduce a few concepts in a simplified form that are further described in detail below in the Description. However, this Summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In addition to the exemplary aspects and embodiments described above, further aspects and embodiments will become apparent by reference to the drawings and by study of the following detailed descriptions.
Exemplary embodiments are illustrated in referenced drawings. It is intended that the embodiments and figures disclosed herein are to be considered illustrative rather than restrictive.
For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiments illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended. Any alterations and further modifications in the described embodiments, and any further applications of the principles of the invention as described herein are contemplated as would normally occur to one skilled in the art to which the invention relates.
In a block 16, location data (such as GPS data, recognizing that other location tracking systems are known, and the term GPS is intended to be exemplary of a position tracking system, and not limiting) is collected while the vehicle is in operation. The location data is time indexed, meaning that the location data being collected is the location of the vehicle at a particular point in time. While the vehicle is in operation, and when the object identification sensor detects a labeled object (as indicated by a decision block 18), the object ID data is added to the time indexed GPS data, as indicated by a block 20. In some embodiments, the object identification sensor is always enabled, and detection of labeled objects occurs automatically when the labeled object and the object identification sensor are proximate (or in the case of a magnetic card reader type sensor, when the card is swiped through the reader). In other embodiments, such as with a hand held object identification sensor, the object identification sensor must be enabled by the vehicle operator, and detection of labeled objects occurs when the vehicle operator brings the labeled object and the object identification sensor into proximity of one another.
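The flow of block 16, decision block 18, and block 20 can be illustrated by a simple polling loop such as the following sketch; the sensor interfaces (and the stub sensors used to make the example self-contained) are assumptions, not descriptions of any particular device. The `PositionLogger` from the earlier sketch could serve as the `log` argument.

```python
import random
import time

class StubPositionSensor:
    """Stand-in for a GPS receiver (illustrative only)."""
    def read(self):
        return 47.6097 + random.uniform(-1e-4, 1e-4), -122.3331

class StubIdSensor:
    """Stand-in for an object identification sensor that occasionally reads a token."""
    def poll(self):
        return "PKG-000123" if random.random() < 0.2 else None

def collect(position_sensor, id_sensor, log, period_s=1.0, duration_s=5.0):
    """Illustrative polling loop for block 16, decision block 18, and block 20."""
    end = time.monotonic() + duration_s
    while time.monotonic() < end:                 # block 16: vehicle in operation
        lat, lon = position_sensor.read()
        log.log_position(lat, lon)                # time-indexed location data
        object_id = id_sensor.poll()
        if object_id is not None:                 # decision block 18: labeled object detected
            log.log_object(object_id, lat, lon)   # block 20: add object ID to that data
        time.sleep(period_s)

# Example: collect(StubPositionSensor(), StubIdSensor(), PositionLogger())
```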
Referring once again to
In general, analysis of the object ID encoded position data will be carried out by a remote computing device. The remote computing device in at least one embodiment comprises a computing system controlled or accessed by the fleet operator. The remote computing device can be operating in a networked environment, and in some cases, may be operated by a third party under contract with the fleet operator to perform such services.
Also included in processing unit 254 are a random access memory (RAM) 256 and non-volatile memory 260, which can include read only memory (ROM) and may include some form of memory storage, such as a hard drive, optical disk (and drive), etc. These memory devices are bi-directionally coupled to CPU 258. Such storage devices are well known in the art. Machine instructions and data are temporarily loaded into RAM 256 from non-volatile memory 260. Also stored in the non-volatile memory are operating system software and ancillary software. While not separately shown, it will be understood that a generally conventional power supply will be included to provide electrical power at voltage and current levels appropriate to energize computing system 250.
Input device 252 can be any device or mechanism that facilitates user input into the operating environment, including, but not limited to, one or more of a mouse or other pointing device, a keyboard, a microphone, a modem, or other input device. In general, the input device will be used to initially configure computing system 250, to achieve the desired processing (i.e., analysis of the object ID encoded position data). Configuration of computing system 250 to achieve the desired processing includes the steps of loading appropriate processing software into non-volatile memory 260, and launching the processing application (e.g., loading the processing software into RAM 256 for execution by the CPU) so that the processing application is ready for use. Output device 262 generally includes any device that produces output information, but will most typically comprise a monitor or computer display designed for human visual perception of output. Use of a conventional computer keyboard for input device 252 and a computer display for output device 262 should be considered as exemplary, rather than as limiting on the scope of this system. Data link 264 is configured to enable object ID encoded position data to be input into computing system 250 for subsequent analysis. Those of ordinary skill in the art will readily recognize that many types of data links can be implemented, including, but not limited to, universal serial bus (USB) ports, parallel ports, serial ports, inputs configured to couple with portable memory storage devices, FireWire ports, infrared data ports, wireless data communication such as Wi-Fi and Bluetooth™, network connections via Ethernet ports, and other connections that employ the Internet.
It should be recognized that processors can be implemented as general purpose processors, where the functions implemented by the processor are changeable or customizable using machine instructions (i.e., software). Processors can also be implemented as customized hardware circuits, where the functions implemented are fixed by the design of the circuit (such processors are sometimes referred to as application specific integrated circuits). The flexibility of software controlled processors often results in software based processors being selected over hardware based processors, although it should be understood that the concepts disclosed herein can be implemented using both software based processors and hardware based processors.
As discussed above, the expected interaction can encompass different interactions between the vehicle and a labeled object, including but not limited to, picking up a passenger (where the passenger is the labeled object, or rather carries with them a token that can be read by the identification sensor and thereby uniquely identifies them), dropping off a passenger (where the passenger is the labeled object, or rather carries with them a token that can be read by the identification sensor and thereby uniquely identifies them), picking up an object (such as a parcel, package, container, letter, or document), delivering an object (such as a parcel, package, container, letter, or document), and servicing an object (such as a container or piece of equipment) disposed at the specified location. In particular, servicing an object includes, but is not limited to, removing refuse from a labeled container, removing recyclables from a labeled container, removing refuse from a container at a location that is labeled (i.e., the token is attached to a location where the container is disposed, as opposed to being attached to the container itself), removing recyclables from a container at a location that is labeled (i.e., the token is attached to a location where the container is disposed, as opposed to being attached to the container itself), transferring a bulk material (such as a solid material, a liquid, or a compressed gas) to a labeled container, transferring a bulk material (such as a solid material, a liquid, or a compressed gas) to a container at a location that is labeled (i.e., the token is attached to a location where the container is disposed, as opposed to being attached to the container itself), transferring a bulk solid material to a location that is labeled (i.e., the token is attached to a location, and there is no container, the bulk solid material simply being delivered to the location), and having the vehicle operator perform a service call on a piece of equipment or a structure at the specified location, where either or both the object being serviced or the location is labeled with a token. Those of ordinary skill in the art will recognize that the servicing of structures and/or equipment encompasses services performed by skilled tradesmen, including, but not limited to, plumbers, electricians, carpenters, technicians specializing in servicing specific types of equipment (including but not limited to computers, heating and ventilation equipment, construction equipment, and vehicles), and technicians responsible for other types of repair and maintenance functions.
In an exemplary, but not limiting embodiment, display 46 is used to inform the vehicle operator that the vehicle is approaching or has arrived at a location where an interaction between the vehicle and a labeled object (or labeled location, as noted above) is expected. The display will minimally identify the object, and in some embodiments can be used to provide more detailed information about the interaction. For example, where the interaction is a service call, details about the specific service required may be provided (i.e., replace a faulty component in a piece of equipment, or perform a specific type of scheduled maintenance on a piece of equipment, such services being exemplary and not limiting).
A dashed block 50 around GPS 42, processor 44, and display 46 is intended to indicate that in some embodiments, those three elements will be combined into a single device. It should be recognized that the concepts disclosed herein encompass the use of individual devices to implement each of GPS 42, processor 44, and display 46, as well as embodiments where the functions of one or more of GPS 42, processor 44, and display 46 (and memory 48) are implemented by a common device.
Referring once again to
Referring to
As noted above, the concepts disclosed herein also encompass bus 52 being configured to implement the method of
If the bus is configured to implement the method of
Referring to
If desired, a temperature sensor 69 can be included in the cargo area of the delivery truck, to measure the ambient temperature of the cargo area. The temperature measurement represents additional object data that will be combined with the object ID and the time and GPS data, to generate the object ID encoded position data. The temperature sensor, if present, is configured to communicate its data to the GPS unit, or the processor responsible for combining the object ID data, the temperature data, and the GPS data together to generate the time indexed object ID encoded position data. The temperature data may be important for temperature-sensitive cargo, and collecting such data and combining it with the object ID encoded position data will enable the delivery service to prove to the shipper that the cargo was maintained in the correct temperature controlled environment during transit. In a related embodiment, the temperature sensor can be incorporated into the object, and the temperature data can be manually entered into the GPS unit/processor during delivery, or acquired using a handheld sensor that logically communicates that data to the GPS unit/processor for incorporation into the object ID encoded position data.
In an exemplary embodiment, once the loading or unloading of cargo has occurred, the object ID encoded position data can be conveyed to the remote computing device of
As noted above, the concepts disclosed herein also encompass delivery truck 60 being configured to implement the method of
If delivery truck 60 is configured to implement the method of
As noted above, when the method of
Where the refuse truck is configured to implement the method of
Referring to
In an exemplary embodiment, once the container has been emptied, the object ID encoded position data can be conveyed to the remote computing device of
As noted above, the concepts disclosed herein also encompass refuse truck 74 being configured to implement the method of
If refuse truck 74 is configured to implement the method of
Where the tanker truck is configured to implement the method of
Referring to
If desired, a volume delivery sensor 104 can be included on the tanker truck, to measure the volume of bulk material being delivered. The volume measurement represents additional object data that will be combined with the object ID and the time and GPS data, to generate the object ID encoded position data. Referring to
In an exemplary embodiment, once the storage tank has been filled, the object ID encoded position data can be conveyed to the remote computing device of
As noted above, the concepts disclosed herein also encompass tanker truck 90 being configured to implement the method of
If tanker truck 90 is configured to implement the method of
With respect to any of the embodiments of
After the operator has used portable ID sensor 110 to identify each labeled object, the operator can transmit the object ID data that have been collected to the vehicle GPS or the processor that will combine the object ID data with the GPS data to generate the object ID encoded position data, using a data link 118 (in an exemplary embodiment the data link employs an RF transmission, though hardwired and other wireless type data links can be used).
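As a non-limiting illustration, the batch transfer of collected object IDs from a portable sensor to the on-vehicle GPS unit or processor over data link 118 might look like the following sketch, which assumes a TCP-style link and a simple JSON wire format (both assumptions; an RF or hardwired link could equally be used, as noted above).

```python
import json
import socket

def transmit_ids(scanned_ids, host, port):
    """Send the object IDs collected by a portable sensor to the on-vehicle unit
    over a TCP data link (illustrative; the wire format here is an assumption)."""
    payload = json.dumps({"object_ids": scanned_ids}).encode("utf-8")
    with socket.create_connection((host, port)) as conn:
        conn.sendall(payload)

# Example: transmit_ids(["PKG-000123", "PKG-000124"], "192.0.2.10", 5000)
```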
As noted above, the tokens that are affixed to the objects to be identified can be of several different types, depending upon the type of sensor 114 that is included on portable ID sensor 110. In a preferred form of one of the concepts disclosed herein, the token that is preferably employed is an RFID tag that is attached with a fastener or an appropriate adhesive to the object (or is carried by a passenger, or is attached to a location proximate the object, generally as discussed above). One type of RFID tag that is suitable for this purpose is the WORLDTAG™ token that is sold by Sokymat Corporation. This tag is excited by an RF transmission from portable ID sensor 110 via an antenna in sensor 114. In response to the excitation energy received, the RFID tag modifies the RF energy that is received from the antenna in sensor 114 in a manner that specifically identifies the component associated with the RFID tag, and the modified signal is detected by sensor 114.
An alternative type of token that can also be used in one of the concepts disclosed herein is an IBUTTON™ computer chip, which is armored in a stainless steel housing and is readily affixed to an object or location. The IBUTTON chip is programmed with JAVA™ instructions to provide a recognition signal when interrogated by a signal received from a nearby transmitter, such as from an antenna in sensor 114. The signal produced by the IBUTTON chip is received by sensor 114, which determines the type of component associated with a token. This type of token is less desirable since it is more expensive, although the program instructions that it executes can provide greater functionality.
Yet another type of token that might be used is an optical bar code in which a sequence of lines of varying width or other optical patterns encode data in the light reflected from the bar code tag. The encoded reflected light is received by an optical detector in sensor 114. Bar code technology is well understood by those of ordinary skill in the art and readily adapted for identifying a particular type of component and the location of the component on a vehicle or other system or apparatus. One drawback to the use of a bar code tag as a token is that the bar code can be covered with dirt or grime that must be cleaned before the sequence of bar code lines or other pattern can be properly read. If the bar code is applied to a plasticized adhesive strip, it can readily be mounted to any surface and then easily cleaned with a rag or other appropriate material.
Yet another type of token usable in one of the concepts disclosed herein is a magnetic strip in which a varying magnetic flux encodes data identifying the particular component associated with the token. Such magnetic strips are often used in access cards that are read by readers mounted adjacent to doors or in an elevator that provides access to a building. However, in this aspect of the concepts disclosed herein, the magnetic flux reader comprises sensor 114. The data encoded on such a token are readily read as the portable device is brought into proximity of the varying magnetic flux encoded strip comprising the token.
As yet another alternative, an active token can be employed that conforms to the BLUETOOTH™ specification for short distance data transfer between computing devices using an RF signal.
In at least one embodiment, the interaction between the vehicle and a labeled object is an inspection of the object. The vehicle is used to convey the inspector to the labeled object. A sensor attached to the vehicle or in a handheld device is used to collect the object ID data, which is combined with the position data collected by the vehicle to generate the object ID encoded position data, which can be used to verify that the inspector was proximate the specific object at a specific time. Objects that can be labeled for inspection include, but are not limited to, buildings, bridges, utility vaults, traffic signals, traffic signs, cell phone towers, transformers, pipelines, utility poles, and construction equipment.
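Verifying that the inspector was proximate a specific labeled object at a specific time then amounts to finding the position record whose time index matches the object ID scan; a minimal sketch follows, assuming the records are stored as dictionaries containing a timestamp and a list of scanned object IDs (field names and the time window are assumptions).

```python
def verify_inspection(scan_time, object_id, records, max_gap_s=60.0):
    """Return the position record closest in time to the scan in which the scanned
    object ID appears, or None if no such record exists (illustrative sketch).

    Each record is assumed to be a dict with datetime "timestamp" and list "object_ids"."""
    best = None
    for rec in records:
        gap = abs((rec["timestamp"] - scan_time).total_seconds())
        if gap <= max_gap_s and object_id in rec.get("object_ids", []):
            if best is None or gap < abs((best["timestamp"] - scan_time).total_seconds()):
                best = rec
    return best
```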
In at least one embodiment, the interaction between the vehicle and a labeled object does not include loading the object onto the vehicle or removing the object from the vehicle. Thus, in the claims that follow, it should be recognized that support exists for a negative claim limitation to that effect.
In at least one embodiment, the interaction between the vehicle and a labeled object involves loading or unloading the object from a cargo area in the vehicle that is not suitable for passengers. Thus, in the claims that follow, it should be recognized that support exists for a negative claim limitation to that effect.
Although the concepts disclosed herein have been described in connection with the preferred form of practicing them and modifications thereto, those of ordinary skill in the art will understand that many other modifications can be made thereto within the scope of the claims that follow. Accordingly, it is not intended that the scope of these concepts in any way be limited by the above description, but instead be determined entirely by reference to the claims that follow.
While a number of exemplary aspects and embodiments have been discussed above, those possessed of skill in the art will recognize certain modifications, permutations, additions and sub-combinations thereof. It is therefore intended that the following appended claims and claims hereafter introduced are interpreted to include all such modifications, permutations, additions and sub-combinations as are within their true spirit and scope.
This application is a continuation of prior copending application Ser. No. 15/235,853, filed Aug. 12, 2016, which is itself a continuation of Ser. No. 15/083,208, filed Mar. 28, 2016, and issued as U.S. Pat. No. 9,858,462 on Jan. 2, 2018, which itself is a continuation of application Ser. No. 12/942,874, filed Nov. 9, 2010, the benefit of the filing date of which is hereby claimed under 35 U.S.C. § 120.
Number | Name | Date | Kind |
---|---|---|---|
3573620 | Ashley et al. | Jun 1971 | A |
3990067 | Van Dusen et al. | Feb 1976 | A |
4009375 | White et al. | Feb 1977 | A |
4025791 | Lennington et al. | Apr 1977 | A |
4092718 | Wendt | May 1978 | A |
4258421 | Juhasz et al. | Mar 1981 | A |
4263945 | Van Ness | Apr 1981 | A |
4325057 | Bishop | Apr 1982 | A |
4469149 | Walkey et al. | Apr 1984 | A |
4602127 | Neely et al. | Jul 1986 | A |
4651157 | Gray et al. | Mar 1987 | A |
4658371 | Walsh et al. | Apr 1987 | A |
4713661 | Boone | Dec 1987 | A |
4763356 | Day, Jr. et al. | Sep 1988 | A |
4799162 | Shinakawa et al. | Jan 1989 | A |
4804937 | Barbiaux et al. | Feb 1989 | A |
4814711 | Olsen et al. | Mar 1989 | A |
4846233 | Fockens | Nov 1989 | A |
4897792 | Hosoi | Jan 1990 | A |
4934419 | Lamont et al. | Jun 1990 | A |
4935195 | Palusamy et al. | Jun 1990 | A |
5007786 | Bingman | Apr 1991 | A |
5014206 | Scribner | May 1991 | A |
5058044 | Stewart et al. | Oct 1991 | A |
5072380 | Randelman et al. | Oct 1991 | A |
5068656 | Sutherland | Nov 1991 | A |
5119894 | Crawford | Jun 1992 | A |
5128651 | Heckart | Jul 1992 | A |
5120942 | Holland | Sep 1992 | A |
5204819 | Ryan | Apr 1993 | A |
5206643 | Eckelt | Apr 1993 | A |
5209312 | Jensen | May 1993 | A |
5223844 | Mansell et al. | Jun 1993 | A |
5230393 | Mezey | Jul 1993 | A |
5243323 | Rogers | Jul 1993 | A |
5247160 | Zicker | Sep 1993 | A |
5304744 | Jensen | Apr 1994 | A |
5321629 | Shirata et al. | Jun 1994 | A |
5337003 | Carmichael et al. | Sep 1994 | A |
5359522 | Ryan | Oct 1994 | A |
5394136 | Lammers et al. | Feb 1995 | A |
5399844 | Holland | Mar 1995 | A |
5400020 | Jones et al. | Mar 1995 | A |
5442553 | Parrillo | Aug 1995 | A |
5459304 | Eisenmann | Oct 1995 | A |
5459660 | Berra | Oct 1995 | A |
5479479 | Braitberg et al. | Dec 1995 | A |
5488352 | Jasper | Jan 1996 | A |
5499182 | Ousborne | Mar 1996 | A |
5572192 | Berube | May 1996 | A |
5541845 | Klein | Jul 1996 | A |
5546305 | Kondo | Aug 1996 | A |
5557254 | Johnson et al. | Sep 1996 | A |
5557268 | Hughes et al. | Sep 1996 | A |
5585552 | Heuston et al. | Dec 1996 | A |
5594650 | Shah et al. | Jan 1997 | A |
5596501 | Comer et al. | Jan 1997 | A |
5600323 | Boschini | Apr 1997 | A |
5623258 | Dorfman | Apr 1997 | A |
5629678 | Gargano et al. | May 1997 | A |
5657010 | Jones | Aug 1997 | A |
5668543 | Jones | Sep 1997 | A |
5671158 | Fournier et al. | Sep 1997 | A |
5680328 | Skorupski et al. | Oct 1997 | A |
5610596 | Petitclerc | Nov 1997 | A |
5719771 | Buck et al. | Feb 1998 | A |
5731893 | Dominique | Mar 1998 | A |
5732074 | Spaur et al. | Mar 1998 | A |
5742915 | Stafford | Apr 1998 | A |
5745049 | Akiyama et al. | Apr 1998 | A |
5758299 | Sandborg et al. | May 1998 | A |
5758300 | Abe | May 1998 | A |
5781871 | Mezger et al. | Jul 1998 | A |
5808565 | Matta et al. | Sep 1998 | A |
5809437 | Breed | Sep 1998 | A |
5815071 | Doyle | Sep 1998 | A |
5835871 | Smith et al. | Oct 1998 | A |
5794164 | Beckert et al. | Nov 1998 | A |
5837945 | Cornwell | Nov 1998 | A |
5838251 | Brinkmeyer et al. | Nov 1998 | A |
5839112 | Schreitmueller et al. | Nov 1998 | A |
5867404 | Bryan | Feb 1999 | A |
5874891 | Lowe | Feb 1999 | A |
5884202 | Arjomand | Mar 1999 | A |
5890061 | Timm et al. | Mar 1999 | A |
5890520 | Johnson, Jr. | Jun 1999 | A |
5913180 | Ryan | Jun 1999 | A |
5922037 | Potts | Jul 1999 | A |
5923572 | Pollock | Jul 1999 | A |
5942753 | Dell | Aug 1999 | A |
5956259 | Hartsell, Jr. et al. | Sep 1999 | A |
5995898 | Tuttle | Nov 1999 | A |
6006159 | Schmier et al. | Dec 1999 | A |
6009355 | Obradovich et al. | Dec 1999 | A |
6009363 | Beckert et al. | Dec 1999 | A |
6016795 | Ohki | Jan 2000 | A |
6024142 | Bates | Feb 2000 | A |
6025776 | Matsuura | Feb 2000 | A |
6043661 | Gutierrez | Mar 2000 | A |
6127947 | Uchida et al. | Mar 2000 | A |
6128551 | Davis et al. | Mar 2000 | A |
6054950 | Fontana | Apr 2000 | A |
6061614 | Carrender et al. | May 2000 | A |
6064299 | Lesesky et al. | May 2000 | A |
6070156 | Hartsell, Jr. | May 2000 | A |
6078255 | Dividock et al. | Jun 2000 | A |
6084870 | Wooten et al. | Jul 2000 | A |
6092021 | Ehlbeck et al. | Jul 2000 | A |
6107915 | Reavell et al. | Aug 2000 | A |
6107917 | Carrender et al. | Aug 2000 | A |
6112152 | Tuttle | Aug 2000 | A |
6128959 | McGovern et al. | Oct 2000 | A |
6169938 | Hartsell, Jr. | Feb 2001 | B1 |
6169943 | Simon | Feb 2001 | B1 |
6202008 | Beckert et al. | Mar 2001 | B1 |
6208948 | Klingler et al. | Mar 2001 | B1 |
6256579 | Tanimoto | Mar 2001 | B1 |
6285953 | Harrison et al. | Apr 2001 | B1 |
6236911 | Kruger | May 2001 | B1 |
6240365 | Bunn | May 2001 | B1 |
6199099 | Gershman et al. | Jun 2001 | B1 |
6253129 | Jenkins et al. | Jun 2001 | B1 |
6263273 | Henneken et al. | Jul 2001 | B1 |
6263276 | Yokoyama et al. | Jul 2001 | B1 |
6278936 | Jones | Aug 2001 | B1 |
6295492 | Lang et al. | Sep 2001 | B1 |
6259358 | Fjordbotten | Oct 2001 | B1 |
6313760 | Jones | Nov 2001 | B1 |
6313791 | Klanke | Nov 2001 | B1 |
6330499 | Chou et al. | Nov 2001 | B1 |
6339745 | Novik | Jan 2002 | B1 |
6362730 | Razavi et al. | Mar 2002 | B2 |
6370454 | Moore | Apr 2002 | B1 |
6374176 | Schmier et al. | Apr 2002 | B1 |
6396413 | Hines et al. | May 2002 | B2 |
6411203 | Leseskey et al. | Jun 2002 | B1 |
6411891 | Jones | Jun 2002 | B1 |
6417760 | Mabuchi et al. | Jul 2002 | B1 |
6438472 | Tano et al. | Aug 2002 | B1 |
6450411 | Rash et al. | Sep 2002 | B1 |
6456039 | Lauper et al. | Sep 2002 | B1 |
6502030 | Hilleary | Dec 2002 | B2 |
6505106 | Lawrence et al. | Jan 2003 | B1 |
6507810 | Razavi et al. | Jan 2003 | B2 |
6614392 | Howard | Feb 2003 | B2 |
6539296 | Diaz et al. | Mar 2003 | B2 |
6529723 | Bentley | Apr 2003 | B1 |
6529808 | Diem | Apr 2003 | B1 |
6604033 | Banet et al. | May 2003 | B1 |
6594579 | Lowrey et al. | Jul 2003 | B1 |
6594621 | Meeker | Jul 2003 | B1 |
6597973 | Barich et al. | Jul 2003 | B1 |
6608554 | Leseskey et al. | Aug 2003 | B2 |
6609082 | Wagner | Aug 2003 | B2 |
6616036 | Streicher et al. | Sep 2003 | B2 |
6621452 | Knockeart et al. | Sep 2003 | B2 |
6636790 | Lightner et al. | Oct 2003 | B1 |
6664897 | Pape et al. | Dec 2003 | B2 |
6671646 | Manegold et al. | Dec 2003 | B2 |
6680694 | Knockeart et al. | Jan 2004 | B1 |
6683542 | Jones | Jan 2004 | B1 |
6744352 | Lesesky et al. | Jan 2004 | B2 |
6700506 | Winkler et al. | Mar 2004 | B1 |
6708113 | Von Gerlach et al. | Mar 2004 | B1 |
6714859 | Jones | Mar 2004 | B2 |
6727818 | Wildman et al. | Apr 2004 | B1 |
6732031 | Lightner et al. | May 2004 | B1 |
6732032 | Banet et al. | May 2004 | B1 |
6801841 | Tabe | May 2004 | B2 |
6754183 | Razavi et al. | Jun 2004 | B1 |
6768994 | Howard et al. | Jul 2004 | B1 |
6816762 | Hensey et al. | Sep 2004 | B2 |
6804606 | Jones | Dec 2004 | B2 |
6804626 | Manegold et al. | Dec 2004 | B2 |
6834259 | Markwitz et al. | Dec 2004 | B1 |
6856820 | Kolls | Feb 2005 | B1 |
6924750 | Flick | Feb 2005 | B2 |
6876642 | Adams et al. | Apr 2005 | B1 |
6880390 | Emord | Apr 2005 | B2 |
6952645 | Jones | Apr 2005 | B1 |
6894617 | Richman | May 2005 | B2 |
6899151 | Latka et al. | May 2005 | B1 |
6904359 | Jones | Jun 2005 | B2 |
6909947 | Douros et al. | Jun 2005 | B2 |
6972668 | Schauble | Jun 2005 | B2 |
6919804 | Cook et al. | Jul 2005 | B1 |
6928348 | Lightner et al. | Sep 2005 | B1 |
6946953 | Lesesky et al. | Sep 2005 | B2 |
6957133 | Hunt et al. | Oct 2005 | B1 |
6958701 | Storkamp et al. | Oct 2005 | B1 |
6879894 | Lightner et al. | Dec 2005 | B1 |
6988033 | Lowrey et al. | Jan 2006 | B1 |
7117121 | Brinton et al. | Mar 2006 | B2 |
7048185 | Hart et al. | May 2006 | B2 |
7080778 | Kressin et al. | Jul 2006 | B1 |
7103460 | Breed | Sep 2006 | B1 |
7113127 | Banet et al. | Sep 2006 | B1 |
7027955 | Markwitz et al. | Nov 2006 | B2 |
7155199 | Zalewski et al. | Dec 2006 | B2 |
7225065 | Hunt et al. | May 2007 | B1 |
7174243 | Lightner et al. | Jun 2007 | B1 |
7174277 | Vock et al. | Jun 2007 | B2 |
7228211 | Lowrey et al. | Jun 2007 | B1 |
7254516 | Case, Jr. et al. | Jul 2007 | B2 |
7362229 | Brinton et al. | Apr 2008 | B2 |
7447574 | Ivashicko et al. | Apr 2008 | B1 |
7343252 | Wiens | Nov 2008 | B2 |
7477968 | Lowrey et al. | Jan 2009 | B1 |
7480551 | Lowrey et al. | Jan 2009 | B1 |
7523159 | Williams et al. | Apr 2009 | B1 |
7564375 | Brinton et al. | Jul 2009 | B2 |
7596437 | Hunt et al. | Sep 2009 | B1 |
7604169 | Demere | Oct 2009 | B2 |
7532962 | Lowrey et al. | Dec 2009 | B1 |
7532963 | Lowrey et al. | Dec 2009 | B1 |
7640185 | Giordano et al. | Dec 2009 | B1 |
7650210 | Breed | Jan 2010 | B2 |
7672756 | Breed | Feb 2010 | B2 |
7672763 | Hunt et al. | Feb 2010 | B1 |
7680595 | Brinton et al. | Mar 2010 | B2 |
7778752 | Hunt et al. | Aug 2010 | B1 |
8378854 | Sills | Feb 2013 | B1 |
20010037174 | Dickerson | Nov 2001 | A1 |
20010047283 | Melick et al. | Nov 2001 | A1 |
20010053983 | Reichwein et al. | Dec 2001 | A1 |
20020016655 | Joao | Feb 2002 | A1 |
20020022979 | Whipp et al. | Feb 2002 | A1 |
20020022984 | Daniel et al. | Feb 2002 | A1 |
20020030595 | Kasik | Mar 2002 | A1 |
20020049054 | O'Connor et al. | Apr 2002 | A1 |
20020057212 | Hamilton et al. | May 2002 | A1 |
20020065698 | Schick et al. | May 2002 | A1 |
20020069017 | Schmier | Jun 2002 | A1 |
20020070882 | Jones | Jun 2002 | A1 |
20020089434 | Ghazarian | Jul 2002 | A1 |
20020099500 | Schmier et al. | Jul 2002 | A1 |
20020163449 | Flick | Jul 2002 | A1 |
20020104013 | Ghazarian | Aug 2002 | A1 |
20020107833 | Kerkinni | Aug 2002 | A1 |
20020107873 | Winkler et al. | Aug 2002 | A1 |
20020111725 | Burge | Aug 2002 | A1 |
20020122583 | Thompson | Sep 2002 | A1 |
20020133273 | Lowrey et al. | Sep 2002 | A1 |
20020133275 | Thibault | Sep 2002 | A1 |
20020147610 | Tabe | Oct 2002 | A1 |
20020150050 | Nathanson | Oct 2002 | A1 |
20020156558 | Hanson et al. | Oct 2002 | A1 |
20020178147 | Arroyo et al. | Nov 2002 | A1 |
20020188593 | Moser et al. | Dec 2002 | A1 |
20030004644 | Farmer | Jan 2003 | A1 |
20030030550 | Talbot | Feb 2003 | A1 |
20030033061 | Chen et al. | Feb 2003 | A1 |
20030120745 | Katagishi et al. | Jun 2003 | A1 |
20030206133 | Cheng | Nov 2003 | A1 |
20030109973 | Hensey et al. | Dec 2003 | A1 |
20030233190 | Jones | Dec 2003 | A1 |
20040009819 | Koga | Jan 2004 | A1 |
20040083054 | Jones | Apr 2004 | A1 |
20040236596 | Chowdhary et al. | Nov 2004 | A1 |
20050038572 | Krupowicz | Feb 2005 | A1 |
20050156759 | Aota | Jul 2005 | A1 |
20050203683 | Olsen et al. | Sep 2005 | A1 |
20050273250 | Hamilton et al. | Dec 2005 | A1 |
20060006228 | Poulter | Jan 2006 | A1 |
20060011721 | Olsen et al. | Jan 2006 | A1 |
20060055564 | Olsen et al. | Mar 2006 | A1 |
20060095277 | Noonan et al. | May 2006 | A1 |
20060097896 | Jones | May 2006 | A1 |
20060145837 | Horton et al. | Jul 2006 | A1 |
20060202030 | Kressin et al. | Sep 2006 | A1 |
20060208075 | Kressin et al. | Sep 2006 | A1 |
20060208087 | Kressin et al. | Sep 2006 | A1 |
20060232406 | Filibeck | Oct 2006 | A1 |
20060280582 | Kouri | Dec 2006 | A1 |
20070040672 | Chinigo | Feb 2007 | A1 |
20070069947 | Banet et al. | Mar 2007 | A1 |
20070172340 | Curotto | Jul 2007 | A1 |
20070179709 | Doyle | Aug 2007 | A1 |
20070262878 | Maruca | Nov 2007 | A1 |
20080140253 | Brown | Jun 2008 | A1 |
20080154489 | Kaneda et al. | Jun 2008 | A1 |
20080154712 | Wellman | Jun 2008 | A1 |
20080314263 | Martin | Dec 2008 | A1 |
20080319665 | Berkobin et al. | Dec 2008 | A1 |
20090177350 | Williams et al. | Jul 2009 | A1 |
20090222200 | Link, II et al. | Sep 2009 | A1 |
20090069999 | Bos | Dec 2009 | A1 |
20100088127 | Betancourt et al. | Aug 2010 | A1 |
20100278620 | Rimsa | Nov 2010 | A1 |
20110068954 | McQuade et al. | Mar 2011 | A1 |
20110116899 | Dickens | May 2011 | A1 |
20110316689 | Reyes | Dec 2011 | A1 |
20160210370 | McQuade et al. | Jul 2016 | A1 |
20160350567 | McQuade et al. | Dec 2016 | A1 |
20180260595 | McQuade et al. | Sep 2018 | A1 |
20180314865 | McQuade | Nov 2018 | A1 |
Number | Date | Country |
---|---|---|
2138378 | Nov 1994 | CA |
2326892 | Oct 1999 | CA |
2388572 | May 2001 | CA |
0755039 | Jan 1997 | EP |
0814447 | Dec 1997 | EP |
0926020 | Jun 1999 | EP |
1005627 | Jun 2000 | EP |
1027792 | Aug 2000 | EP |
1067498 | Jan 2001 | EP |
1271374 | May 2004 | EP |
2116968 | Nov 2009 | EP |
60-204071 | Oct 1985 | JP |
11-327628 | Nov 1999 | JP |
2002-96913 | Apr 2002 | JP |
2003-85471 | Mar 2003 | JP |
WO97026750 | Jul 1997 | WO |
WO98003952 | Jan 1998 | WO |
WO98030920 | Jul 1998 | WO |
WO03023550 | Mar 2003 | WO |
WO07092711 | Aug 2007 | WO |
Entry |
---|
“D.O.T. Driver Vehicle Inspection Reports on your wireless phone!” FleeTTrakkeR LLC 2002-2003 FleeTTrakkeR LLC. All rights reserved <http://www.fleettrakker.com/web/index.jsp> Accessed Mar. 12, 2004. |
“Detex Announces the Latest Innovation in Guard Tour Verification Technology.” Detex Life Safety, Security and Security Assurance. Jan. 1, 2003. 1pp. © 2002-2004 Detex Corporation. <http://www.detex.com/NewsAction.jspa?id=3>. |
“Nextel, Motorola and Symbol Technologies Offer First Wireless Bar Code Scanner for Mobile Phones.” Jun. 11, 2003. <http://theautochannel.com/news/2003/06/11/162927 .htm>. |
“OBD Up.” MOTOR: 28-34, Jul. 1998. |
“The Data Acquisition Unit Escorte.” The Proxi Escort.com. Nov. 20, 2001. 4pp. Copyright © 2000 GCS General control Systems. <http://www.gcs.at/eng/produkte/hw/escorte.htm>. |
“The PenMaster” and “The PSION Workabout.” Copyright 2000 GCS General Control Systems. <http://www.gcs.at/eng/produkte/hw/penmaster.htm>. |
“Tracking out of route: software helps fleets compare planned routes to actual miles. (Technology).” Commercial Carrier Journal. Published Oct. 1, 2005. 4pp. NDN-219-1054-1717-0. |
“Transit agency builds GIS to plan bus routes.” American City & County. vol. 118, No. 4. Published Apr. 1, 2003. 4pp. NDN-258-0053-0664-6. |
“What is the Child Check-Mate Safety System?” 2002 © Child Checkmate Systems Inc. <http://www.childcheckmate.com/what.html>. |
Albright, Brian: “Indiana Embarks on Ambitious RFID roll out.” Frontline Solutions. May 20, 2002; 2pp. Available at: <http://www.frontlinetoday.com/frontline/article/articleDetail.jsp?id=19358>. |
Contact: GCS (UK), Tewkesbury Gloucestershire. Dec. 11, 2002. 2pp. Copyright © 2000 GCS General Control Systems < http://www.gcs.at?eng/news allegemein.htm>. |
Dwyer, H.A., et al. Abstract: “Analysis of the Performance and Emissions of Different Bus Technologies on the city of San Francisco Routes.” Technical paper published by Society of Automotive Engineers, Inc. Published Oct. 26, 2004. 2pp. NDN-116-0014-3890-6. |
Guensler et al., “Development of a Comprehensive Vehicle Instrumentation Package for Monitoring Individual Tripmaking Behavior.” Georgia Institute of Technology: School of Civil and Environmental Engineering: 31pp., Feb. 1999. |
Jenkins et al., “Real-Time Vehicle Performance Monitoring Using Wireless Networking.” IASTED International conference on Communications, Internet, and Information Technology: 375-380, Nov. 22-24, 2004. |
Kurtz, Jennifer. “Indiana's E-Government: A Story Behind Its Ranking.” InContext: Indiana's Workforce and Economy. Jan.-Feb. 2003 vol. 4, No. 5pp. Available at <http://www.incontext.indiana.edu/2003/jan-feb03/government.html>. |
Kwon, W., “Networking Technologies of In-Vehicle.” Seoul National University: School of electrical engineering: 44pp., Mar. 8, 2000. |
Leavitt, Wendy., “The Convergence Zone.” FleetOwner, 4pp. <www.driversmag.com/ar/fleet_convergence_zone! index.html> 1998. |
Miras. “About SPS Technologies.” 1pg., May 7, 1999. |
Miras. “How MIRAS Works.” 1pg., Apr. 29, 1999. |
Miras. “Miras 4.0 Screenshot.” 1pg., May 7, 1999. Miras. “MIRAS Unit.” 1pg., May 4, 1999. |
Miras. “Monitoring Vehicle Functions.” 1pg., Apr. 27, 1999. Miras. “Remote Control.” 1pg., Apr. 29, 1999. |
Miras. “Tracking & Monitoring Software.” 1pg., Apr. 29, 1999. |
N.a., “MIRAS GPS vehicle tracking using the Internet.” Business Wire, 2pp., Nov. 22, 1996. |
N.a., “Private fleets moving to wireless communications.” FleetOwner, 4pp. <www.driversmag.com/ar/fleet_private _fleets_moving/index.html> 1997. |
Quaan et al., “Guard Tour Systems.” Security Management Online. Sep. 16, 2003. 1pg. © 2000 <http://www.securitymanagement.com/ubb/Forum30/HTML/000066.html>. |
Qualcomm. “Object FX Integrates TrackingAdvisor with QUALCOMM's FleetAdvisor System; Updated Version Offers Benefit of Visual Display of Vehicles and Routes to Improve Fleet Productivity.” Source: Newswire. Published Oct. 27, 2003. 4pp. NDN-121-0510-3002-5. |
Senger, Nancy. “Inside RF/ID: Carving a Niche Beyond Asset Tracking.” Business Solutions. Feb. 1999: 5pp. Available at: <http://www.businesssolutionsmag.com/ Articles/1999 _02/990208 .html>. |
Sterzbach et al., “A Mobile Vehicle On-Board Computing and Communication System.” Comput. & Graphics, vol. 20, No. 4: 659-667, 1996. |
Tiscor: Inspection Manager 6.0 User Guide. USA; 2004. 1-73. |
Tiscor: The Mobile Software Solutions Provider. Inspection Manager: An Introduction. Sep. 27, 2004. Slide presentation; 19pp. Available: www.TISCOR.com. |
Tsakiri, M et al. Abstract: “Urban fleet monitoring with GPS and GLONASS.” Journal of Navigation, vol. 51, No. 3. Published Sep. 1998. 2pp. NDN-174-0609-4097-3. |
Tuttle, John R. “Digital RF/ID Enhances GPS” Proceedings of the Second Annual Wireless Symposium, pp. 406-411, Feb. 15-18, 1994, Santa Clara, CA. |
Want, Roy, “RFID: A Key to Automating Everything.” Scientific American, Jan. 2004, pp. 58-65. |
Zujkowski, Stephen. “Savi Technology, Inc: Savi Security and Productivity Systems.” ATA Security Forum 2002, Chicago, IL: 21pp., May 15, 2002. |
Number | Date | Country | |
---|---|---|---|
20190042816 A1 | Feb 2019 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15235853 | Aug 2016 | US |
Child | 16157490 | US | |
Parent | 15083208 | Mar 2016 | US |
Child | 15235853 | US | |
Parent | 12942874 | Nov 2010 | US |
Child | 15083208 | US |