The invention is generally related to encoded information reading (EIR) terminals and is specifically related to EIR terminals comprising an imaging device.
RFID methods are widely used in a number of applications, including smart cards, item tracking in manufacturing, inventory management in retail, etc. An RFID tag can be attached, e.g., to an inventory item. An RFID reading terminal can be configured to read the memory of an RFID tag attached to an inventory item.
In one embodiment, there is provided an apparatus that may include a motion sensing device, a communication interface, and a processor. The processor may receive a point of origin that corresponds to a spatial position of the apparatus. The processor may, responsive to detecting a user interface command and based on a plurality of values received from the motion sensing device, determine a spatial position of the apparatus relative to the point of origin and acquire an image of an object in a field of view of the apparatus. The processor may also determine a scale factor of the image based on at least the spatial position of the apparatus relative to the point of origin.
In another embodiment, there is provided an encoded information reading (EIR) terminal comprising a microprocessor, a memory, an EIR device including a two-dimensional imaging device, a motion sensing device, and a communication interface. The EIR device can be configured to output raw image data containing an encoded message and/or to output decoded message data corresponding to an encoded message. The EIR terminal can be configured, responsive to detecting a pre-defined pattern in a first plurality of values received from the motion sensing device, to store in the memory a point of origin equal to a first spatial position and orientation of the EIR terminal. The EIR terminal can be further configured, responsive to detecting a user interface command, to determine, based on a second plurality of values received from the motion sensing device, a second spatial position and orientation of the EIR terminal relative to the point of origin, and to acquire an image of an object in a field of view of the imaging device. The EIR terminal can be further configured to determine the image scale factor based on at least the second spatial position. The image scale factor can be provided by a ratio of the size of the object along a chosen direction to the size of the image of the object in the same direction.
In a further aspect, the motion sensing device can be provided by at least three accelerometers configured to measure proper acceleration values of the EIR terminal along at least three mutually-perpendicular axes. In one embodiment, the motion sensing device can be provided by a 9-DOF (degree of freedom) motion sensing unit containing a 3-axis accelerometer, a 3-axis magnetometer, and a 3-axis gyroscope.
In a further aspect, the EIR terminal can be further configured to determine a change of its spatial position and orientation based at least on the proper acceleration values received from the accelerometers.
In a further aspect, the EIR terminal can be further configured to process the acquired image before determining the image scale factor, with the purpose of removing various image distortions including but not limited to keystone-related distortion and/or rotation-related distortions.
In a further aspect, the EIR terminal can be further configured to process the acquired image to detect a plurality of edges of the object and to determine one or more dimensions of the object.
In a further aspect, the EIR terminal can be further configured to transmit the acquired image to an external computer via the communication interface. In one embodiment, the EIR terminal can be further configured to also transmit the imaged object identifier, the object description, and/or one or more dimensions of the object to the external computer.
In a further aspect, the EIR terminal can be further configured to identify the imaged object, e.g., by scanning a bar code attached to the object, or by reading an RFID tag attached to the object.
In a further aspect, the EIR terminal can comprise a second EIR device provided by a bar code reading device, an RFID reading device, or a magnetic card reading device. The second EIR device can be configured to output raw message data containing an encoded message and/or to output decoded message data corresponding to an encoded message.
In another embodiment, there is provided a method of producing an image of an object by an EIR terminal comprising a microprocessor, a memory, a two-dimensional imaging device, and a motion sensing device. The method can comprise the step of storing in the memory of the EIR terminal a first spatial position of the EIR terminal as a point of origin, responsive to detecting a pre-defined pattern in a first plurality of values received from the motion sensing device. The method can further comprise the step of determining, based on a second plurality of values received from the motion sensing device, a second spatial position of the EIR terminal relative to the point of origin, responsive to detecting a user interface command. The method can further comprise the step of acquiring an image of an object in the field of view of the imaging device. The method can further comprise the step of determining the image scale factor based on at least the second spatial position.
In a further aspect, the motion sensing device can be provided by at least three accelerometers configured to measure proper acceleration values of the EIR terminal along at least three mutually-perpendicular axes. In one embodiment, the motion sensing device can be provided by a 9-DOF (degree of freedom) motion sensing unit containing a 3-axis accelerometer, a 3-axis magnetometer, and a 3-axis gyroscope.
In a further aspect, the method can further comprise the step of processing the image before determining the image scale factor, with the purpose of removing various image distortions including but not limited to keystone-related distortion and/or rotation-related distortions.
In a further aspect, the method can further comprise the steps of processing the image to detect a plurality of edges of the object, and determining, by the EIR terminal, one or more dimensions of the object.
In a further aspect, the method can further comprise the step of transmitting the image to an external computer. In one embodiment, the method can further comprise the step of transmitting the imaged object identifier, the object description, and/or one or more dimensions of the object to the external computer.
In a further aspect, the method can further comprise the step of identifying the imaged object, e.g., by scanning a bar code attached to the object and/or by reading an RFID tag attached to the object.
For the purpose of illustrating the invention, the drawings show aspects of one or more embodiments of the invention. However, it should be understood that the present invention is not limited to the precise arrangements and instrumentalities shown in the drawings.
The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention. In the drawings, like numerals are used to indicate like parts throughout the various views.
In one embodiment, there is provided an EIR terminal comprising a microprocessor, a memory, an EIR device including an imaging device, and a motion sensing device. Using the motion sensing data, the EIR terminal can be configured to determine its spatial position relative to the object being imaged by the imaging device, or at least the distance to the surface of the object being imaged, as described in detail herein infra. Based on the known distance to the imaged physical object, the EIR terminal can be further configured to calculate the scale factor of the image.
The above described functionality can be particularly useful for a portable RFID reading terminal configured to display a scan trace overlaid over an image of a physical structure containing inventory items, thus providing the terminal's operator with a visual feedback with respect to the scanning progress, as described in the commonly assigned U.S. patent application Ser. No. 13/359,005 entitled “Portable RFID Reading Terminal with Visual Indication of Scan Trace” filed on Jan. 26, 2012, which is incorporated herein by reference in its entirety.
At any moment in time, the RF signal coverage emitted by an RFID reading terminal can be defined by a 3D shape. The form and size of the 3D shape defining the RF signal coverage depend, among other factors, on the RFID transmit power level and the number and configuration of the RF antennas employed by the RFID reading device. Hence, a target scan area by an RFID reading terminal can be visualized as a projection of the 3D RF signal coverage shape onto an arbitrarily chosen plane. For a moving RFID reading terminal, a visual scan trace can be provided by a line defined by a multitude of time varying points, each point being a projection of the 3D RF signal coverage shape onto the arbitrarily chosen plane at a given moment in time. The imaginary plane onto which the visual scan trace is projected can be chosen to intersect a physical structure (e.g., a shelving unit) containing a plurality of items to be inventoried, and thus the scan trace can be overlaid over an image of the physical structure.
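By way of a non-limiting illustration, the following Python sketch computes one scan-trace point as the intersection of the antenna boresight ray with the chosen plane; the function and parameter names are assumptions introduced here for illustration, and the 3D coverage shape is reduced to its central axis for simplicity.

```python
import numpy as np

def scan_trace_point(terminal_pos, boresight, plane_point, plane_normal):
    """Project the center of the RF coverage shape onto the imaginary plane:
    intersect the antenna boresight ray with the plane, yielding one
    time-varying point of the visual scan trace."""
    boresight = boresight / np.linalg.norm(boresight)
    denom = float(np.dot(boresight, plane_normal))
    if abs(denom) < 1e-9:
        return None  # boresight parallel to the plane: no intersection
    t = float(np.dot(plane_point - terminal_pos, plane_normal)) / denom
    if t < 0:
        return None  # the plane is behind the terminal
    return terminal_pos + t * boresight

# Terminal held 2 m in front of a shelving unit lying in the z = 0 plane.
point = scan_trace_point(np.array([0.0, 0.0, 2.0]),
                         np.array([0.1, 0.0, -1.0]),
                         np.zeros(3), np.array([0.0, 0.0, 1.0]))
```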
The images of the physical structures (e.g., shelving units disposed in retail or storage facilities) having known spatial positions and known dimensions can be acquired by the EIR terminal disclosed herein, and can be transmitted to a database and/or to the portable RFID reading terminal employed to read RFID tags attached to items stored in a manufacturing, retail, and/or storage facility.
In one embodiment, the EIR terminal disclosed herein can be equipped with an RFID reading device. The EIR terminal can be configured to acquire an image of a physical structure, determine the scale factor of the image, as described in detail herein infra, and then display an RFID scan trace, as described in the above mentioned U.S. patent application Ser. No. 13/359,005.
The operator of the EIR terminal can be instructed, before activating the shutter release control for acquiring an image, to bring EIR terminal 100 into a mechanical contact with a surface of a physical object 190 to be imaged, as schematically shown in the drawings.
In a further aspect, the motion sensing device can be provided by at least three accelerometers configured to measure the proper acceleration values of the EIR terminal along three mutually perpendicular axes. Bringing the EIR terminal into a mechanical contact with a stationary object would result in a distinctive spike in the data returned by the motion sensing device, caused by the mechanical velocities and proper accelerations of the EIR terminal along the three axes becoming zeroes almost immediately, and remaining at zero levels while the operator holds the terminal in a mechanical contact with the physical structure.
Responsive to detecting the above described pattern in the data returned by the motion sensing device, the EIR terminal can be configured to set a point of origin at its current spatial position, and to start tracking, by the motion sensing device, any future movements of the EIR terminal relative to this point of origin. As follows from the above explanations, the point of origin should coincide with either a pre-defined point or an arbitrarily chosen point on the surface of the object of interest.
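A minimal sketch of such a pattern detector follows. It assumes gravity-compensated acceleration samples arriving as (ax, ay, az) tuples at a fixed rate; the thresholds, window length, and function name are illustrative assumptions rather than values taken from this disclosure.

```python
from collections import deque

SPIKE_G = 2.0        # illustrative: deceleration spike marking the contact
QUIET_G = 0.05       # illustrative: residual magnitude treated as "at rest"
QUIET_SAMPLES = 50   # illustrative: ~0.5 s of quiet at a 100 Hz sample rate

def find_tap(samples):
    """Return the sample index at which the terminal came to rest against the
    object (i.e., where the point of origin should be set), or None.

    `samples` is an iterable of gravity-compensated (ax, ay, az) readings in g;
    the pattern sought is a sharp spike followed by a sustained near-zero window."""
    quiet = deque(maxlen=QUIET_SAMPLES)
    spike_seen = False
    for i, (ax, ay, az) in enumerate(samples):
        mag = (ax * ax + ay * ay + az * az) ** 0.5
        if mag > SPIKE_G:
            spike_seen = True
            quiet.clear()
        elif spike_seen:
            quiet.append(mag)
            if len(quiet) == QUIET_SAMPLES and max(quiet) < QUIET_G:
                return i  # stationary against the object: set point of origin
    return None
```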
Responsive to detecting a user interface command (e.g., activation of a shutter release button or a touch screen GUI control), the EIR terminal can determine its current position relative to the previously defined point of origin, and release the shutter of the imaging device, thus acquiring an image of the object in the field of view of the imaging lens of the imaging device.
In one embodiment, the operator of EIR terminal 100 can bring EIR terminal 100 into a physical contact with a pre-defined point 199 (hereinafter referred to as the "pre-defined tap point") on the surface of the physical object 190, as schematically shown in the drawings.
In another embodiment, the operator of EIR terminal 100 can bring EIR terminal 100 into a physical contact with an arbitrarily chosen point 197 (hereinafter referred to as an "arbitrarily chosen tap point") on the surface of the physical object 190, as schematically shown in the drawings.
Thus, in either of the two illustrative embodiments described above, EIR terminal 100 can be configured to determine the distance along the Z-axis between itself and the imaged physical object, based on the known position of the EIR terminal relative to the pre-defined point of origin at the time of releasing the shutter. The acquired image, the distance between the EIR terminal and the imaged object, and/or the position of the EIR terminal relative to the pre-defined point of origin (including the orientation of the EIR terminal, i.e., the direction in which the EIR terminal was pointing when the shutter release control was activated) can then be stored in the memory of the EIR terminal.
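The position tracking described above can be sketched as a double integration of the motion sensing data. The sketch below assumes the samples have already been gravity-compensated and rotated into the object's reference frame using the gyroscope and magnetometer readings; a production implementation would use a proper inertial-navigation filter.

```python
import numpy as np

def displacement_from_origin(accel_samples, dt):
    """Dead-reckon the terminal's displacement from the point of origin by twice
    integrating acceleration samples (an N x 3 array, in m/s^2).

    Velocity and position both start at zero because the point of origin is set
    while the terminal rests against the object; integration drift is tolerable
    only because the interval between the tap and the shutter release is short."""
    velocity = np.zeros(3)
    position = np.zeros(3)
    for a in np.asarray(accel_samples, dtype=float):
        velocity += a * dt
        position += velocity * dt
    return position  # meters, in the object's reference frame
```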
In a further aspect, an identifier of the imaged physical object can also be stored in the memory of the EIR terminal. In one embodiment, the imaged object can be identified by scanning a bar code label attached to the object and decoding the bar code to retrieve the object identifier. In a further aspect, the bar code label can be attached to the surface of the object in a visual reference with the pre-defined tap point.
In a further aspect, the message encoded in the bar code can further include the object description, such as the position of the object on the facility floor plan and/or characteristics of the object comprising, for example, physical dimensions of the object, the number, sizes, and locations of shelves. In a further aspect, the message encoded in the bar code can further include the coordinates of the pre-defined tap point in the reference frame.
In another embodiment, the imaged physical object can be identified by reading an RFID tag attached to the object. In one embodiment, the RFID tag identifying the physical object can be attached to the surface of the physical object 190 at or in the vicinity of the pre-defined tap point 199. Alternatively, the RFID tag identifying the physical object can be mounted elsewhere on the surface or within the physical object. To distinguish the object identifier RFID tag from other RFID tags which can be present within the RFID reading range of the physical object, the object identifier tag can include a pre-defined data pattern, e.g., as a part of the tag identifier (TID). The pre-defined data pattern can serve as an indicator that the RFID tag's memory contains the physical object identifier, the object description, and/or the coordinates of the pre-defined tap point in the reference frame. Should more than one RFID tag be read (as would be the case if the shelving unit contained inventory items with RFID tags attached to them), the EIR terminal can select and further interrogate the tag having the TID containing the pre-defined data pattern, as shown in the sketch below. In a further aspect, the message encoded in the RFID tag memory can further include the object description, such as the position of the object on the facility floor plan and/or the description of the object comprising, for example, physical dimensions and the number, sizes, and locations of shelves.
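A minimal sketch of the tag-selection step, assuming the reader returns (TID, tag memory) pairs; the marker bytes standing in for the pre-defined data pattern are purely hypothetical.

```python
OBJECT_ID_MARKER = bytes.fromhex("E2F0")  # hypothetical pre-defined TID pattern

def select_object_identifier_tag(tags):
    """Given (tid, tag_memory) pairs returned by an inventory round, return the
    tag flagged as the physical-object identifier, ignoring item-level tags."""
    for tid, tag_memory in tags:
        if tid.startswith(OBJECT_ID_MARKER):
            return tid, tag_memory
    return None
```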
In yet another embodiment, the object identifier, the object description, and/or the coordinates of the pre-defined tap point in the reference frame can be entered into the EIR terminal via the user interface.
In a further aspect, the acquired image can be assumed to include the entire physical object (e.g., a physical structure sustaining a plurality of inventory items, such as a shelving unit). The EIR terminal can be configured to process the acquired image to detect the edges corresponding to the boundaries of the imaged object 2010, as schematically shown in the drawings.
In one embodiment, the EIR terminal can be configured to detect edges by computing a plurality of derivatives of image pixel brightness, followed by searching for local maxima of the first-order derivatives of image pixel brightness (e.g., by searching for zero crossings of the second-order derivatives). Image pixels corresponding to the local maxima of the first-order derivatives of pixel brightness can be presumed to indicate the edges within the image. A skilled artisan would appreciate the fact that other methods of edge detection are within the scope of this disclosure.
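The following NumPy sketch implements a rudimentary version of this derivative-based criterion; the threshold is an illustrative assumption, and a production implementation would more likely rely on a library edge detector.

```python
import numpy as np

def edge_map(gray, thresh=30.0):
    """Flag pixels whose brightness-gradient magnitude is both above `thresh`
    and a local maximum along the x or y axis, approximating the derivative-based
    edge criterion described above. `gray` is a 2-D array of pixel brightness."""
    gy, gx = np.gradient(gray.astype(float))   # first-order derivatives
    mag = np.hypot(gx, gy)                     # gradient magnitude
    edges = np.zeros(mag.shape, dtype=bool)
    inner = mag[1:-1, 1:-1]
    edges[1:-1, 1:-1] = (inner > thresh) & (
        ((inner >= mag[1:-1, :-2]) & (inner >= mag[1:-1, 2:])) |   # maximum in x
        ((inner >= mag[:-2, 1:-1]) & (inner >= mag[2:, 1:-1])))    # maximum in y
    return edges
```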
In a further aspect, the EIR terminal can be configured to process the acquired image to correct any keystone- and/or rotation-related image distortions. Keystone-related distortions, often nicknamed the "keystone effect," can be caused by the optical axis of the imaging device not being substantially perpendicular to the surface of the imaged object, so that the image of a rectangle on the object's surface becomes a trapezoid (the shape of an architectural keystone, hence the name).
Rotation-related image distortions are deviations of horizontal (and/or vertical) lines within the imaged object from the respective horizontal and vertical axes in the image frame.
In some situations, the imager can be rotated during the exposure period with respect to the frame of reference of the physical object 2010f, as schematically illustrated in the drawings.
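One common way to remove both kinds of distortion at once is a perspective (homography) warp that maps the detected corners of the object's rectangular front face onto an axis-aligned rectangle. The sketch below assumes OpenCV is available and that the four corners have already been located, e.g., by the edge-detection step; it is one possible implementation, not the one mandated by this disclosure.

```python
import cv2
import numpy as np

def rectify(image, corners, out_w, out_h):
    """Remove keystone- and rotation-related distortion by mapping the four
    detected corners of the object's rectangular front face onto an axis-aligned
    out_w x out_h rectangle. `corners` are ordered top-left, top-right,
    bottom-right, bottom-left, in pixel coordinates."""
    src = np.asarray(corners, dtype=np.float32)
    dst = np.array([[0, 0], [out_w - 1, 0],
                    [out_w - 1, out_h - 1], [0, out_h - 1]], dtype=np.float32)
    homography = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(image, homography, (out_w, out_h))
```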
In a further aspect, the EIR terminal can be configured to determine an image scale factor as a function of the distance between the EIR terminal and the imaged object, measured in the direction orthogonal to the front surface of the imaged object (i.e., the direction of the Z axis as depicted in the drawings):
S=ƒ(d),
wherein S is the image scale factor; and
d is the distance between the EIR terminal and the surface of the imaged object, measured in the direction orthogonal to the front surface of the imaged object.
The image scale factor can be defined as the ratio of the physical size of the object (in the units of length, e.g., feet and inches) in a chosen direction to the size of the image of the object (in pixels) in the same direction.
In one embodiment, the EIR terminal can be configured to calculate the scale factor of the acquired image using the function ƒ(d) defined by calibrating the camera lens with reference images of known dimensions at selected values of the distance d. In another embodiment, the EIR terminal can be configured to calculate the scale factor of the acquired image using the lensmaker's equation and the image sensor specification.
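A calibration-based implementation of ƒ(d) can be as simple as a lookup table with interpolation; the table values below are hypothetical placeholders for measurements taken with reference images at selected distances.

```python
import bisect

# Hypothetical calibration table: distance in meters vs. measured scale factor
# (meters per pixel), obtained by imaging reference targets of known size.
CAL_D = [0.5, 1.0, 1.5, 2.0, 3.0]
CAL_S = [0.00042, 0.00084, 0.00126, 0.00168, 0.00252]

def scale_from_calibration(d):
    """Evaluate the calibrated function f(d) by piecewise-linear interpolation,
    clamping to the table's end points outside the calibrated range."""
    i = bisect.bisect_left(CAL_D, d)
    if i == 0:
        return CAL_S[0]
    if i == len(CAL_D):
        return CAL_S[-1]
    t = (d - CAL_D[i - 1]) / (CAL_D[i] - CAL_D[i - 1])
    return CAL_S[i - 1] + t * (CAL_S[i] - CAL_S[i - 1])
```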
In one illustrative embodiment, schematically shown in the drawings, the image scale factor can be determined as follows:
S=G/|P0−P1|,
wherein S is the image scale factor measured in the units of length per image pixel;
|P0−P1| is the separation of the corresponding edges in the image in the given direction, measured in pixels; and
G is the field of view of the imaging lens at the distance d, which can be determined as follows:
G=2*d*tan(α/2),
wherein α is the maximum angle of view of the imaging lens; and
d is the distance to the object determined using motion sensing data as described herein supra.
In a further aspect, a physical dimension of the imaged object can be calculated as follows:
D=|D2−D3|=S*|P2−P3|,
wherein |D2−D3| is the distance between two points D2 and D3 situated on the surface of the imaged object, measured in the units of length; and
|P2−P3| is the distance, measured in pixels, between P2 and P3, the images of points D2 and D3, respectively.
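A direct transcription of the two equations above into Python might read as follows; the numeric values in the usage example are illustrative only.

```python
import math

def image_scale_factor(d, alpha_deg, p0, p1):
    """S = G / |P0 - P1| with G = 2 * d * tan(alpha / 2): d is the motion-derived
    distance to the object in meters, alpha_deg the lens's maximum angle of view,
    and p0, p1 the pixel coordinates of the corresponding edges."""
    g = 2.0 * d * math.tan(math.radians(alpha_deg) / 2.0)
    return g / abs(p1 - p0)

def object_dimension(s, p2, p3):
    """D = S * |P2 - P3|: physical distance between two surface points, from the
    pixel distance between their images."""
    return s * abs(p3 - p2)

# Illustrative numbers: 1.5 m from the object, 60-degree angle of view.
s = image_scale_factor(1.5, 60.0, 140, 1180)   # meters per pixel
width = object_dimension(s, 300, 900)          # meters
```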
In one embodiment, the acquired image, the object identifier, the object description, and/or the calculated image scale factor can be transmitted by EIR terminal 100 to an external computer via a wired or wireless communication interface. In one embodiment, the acquired image, the object identifier, the object description, and/or the calculated image scale factor can be stored in a database together with a description of the physical structure.
A component-level diagram of one embodiment of the EIR terminal disclosed herein is now described with reference to the drawings.
EIR terminal 100 can further comprise a communication interface 340 communicatively coupled to the system bus 370. In one embodiment, the communication interface can be provided by a wireless communication interface. The wireless communication interface can be configured to support, for example, but not limited to, the following protocols: at least one protocol of the IEEE 802.11/802.15/802.16 protocol family, at least one protocol of the HSPA/GSM/GPRS/EDGE protocol family, TDMA protocol, UMTS protocol, LTE protocol, and/or at least one protocol of the CDMA/1×EV-DO protocol family. In another embodiment, the communication interface 340 can be provided by a wired interface. In yet another embodiment, the communication interface 340 can be provided by an optical interface. A skilled artisan would appreciate the fact that other types of communication interfaces are within the scope of this disclosure.
EIR terminal 100 can further comprise a battery 356. In one embodiment, the battery 356 can be provided by a replaceable rechargeable battery pack. EIR terminal 100 can further comprise a GPS receiver 380. EIR terminal 100 can further comprise at least one connector 390 configured to receive a subscriber identity module (SIM) card.
EIR terminal 100 can further comprise one or more EIR devices 330. An EIR device 330 can be provided, for example, by a bar code reading device, a magnetic card reading device, a smart card reading device, or an RFID reading device. A skilled artisan would appreciate the fact that other types of EIR devices are within the scope of this disclosure. In one embodiment, EIR device 330 can be configured to output raw message data containing an encoded message. Alternatively, EIR device 330 can be configured to output decoded message data corresponding to an encoded message. For example, a bar code reading device can be configured to output an image containing a bar code and/or to output a byte sequence containing a decoded message corresponding to a scanned bar code. In another example, an RFID reading device can be configured to read and output a byte sequence from a memory of an RFID tag.
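The raw-versus-decoded output contract can be sketched as a small data type; the names below are hypothetical stand-ins for whatever driver interface a particular EIR device exposes.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EIRReadResult:
    """Output of an EIR device 330: raw message data (e.g., an image containing
    a bar code, or bytes read from an RFID tag) and/or the decoded message."""
    raw: Optional[bytes] = None
    decoded: Optional[str] = None

def message_from(result: EIRReadResult) -> str:
    # Prefer on-device decoding; otherwise report that raw data awaits host-side decode.
    if result.decoded is not None:
        return result.decoded
    if result.raw is not None:
        return f"<{len(result.raw)} raw bytes pending host-side decode>"
    raise ValueError("EIR device returned neither raw nor decoded data")
```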
As used herein, “message” is intended to denote a bit sequence or a character string comprising alphanumeric and/or non-alphanumeric characters. An encoded message can be used to convey information, such as identification of the source and the model of an item, for example, in an EPC code.
As noted herein supra, EIR device 330 can comprise an imaging device 333 comprising a two-dimensional image sensor and at least one imaging lens which can be employed to focus an image of the target object onto the image sensor.
In one embodiment, EIR terminal 100 can further comprise a graphical user interface including a display adapter 175 and a keyboard 179. In one embodiment, the EIR terminal 100 can further comprise an audio output device, e.g., a speaker 181.
It is not necessary that a device's primary function involve reading encoded messages in order to be considered an EIR terminal; for example, a cellular telephone, a smart phone, a PDA, or other portable computing device that is capable of acquiring two-dimensional images can be referred to as an EIR terminal for purposes of this disclosure.
In a further aspect, EIR terminal 100 can be incorporated in a data collection system. One embodiment of the data collection system, schematically shown in the drawings, can include a plurality of EIR terminals 100a-100z.
An EIR terminal 100a-100z can establish a communication session with an external computer 171 (provided, for example, by a database server 171a or a portable RFID reading terminal 171b). In one embodiment, network frames can be exchanged by the EIR terminal 100 and the external computer 171 via one or more routers 140, access points 135, and other infrastructure elements. In another embodiment, the external computer 171 can be reachable by the EIR terminal 100 via a local area network (LAN). In yet another embodiment, the external computer 171 can be reachable by the EIR terminal 100 via a wide area network (WAN). In yet another embodiment, the external computer 171 can be reachable by the EIR terminal 100 directly (e.g., via a wired or wireless interface). A skilled artisan would appreciate the fact that other methods of providing interconnectivity between the EIR terminal 100 and the external computer 171 relying upon LANs, WANs, virtual private networks (VPNs), and/or other types of networks are within the scope of this disclosure.
A “computer” herein shall refer to a programmable device for data processing and control, including a central processing unit (CPU), a memory, and at least one communication interface. For example, in one embodiment, a computer can be provided by a server running a single instance of a multi-tasking operating system. In another embodiment, a computer can be provided by a virtual server, i.e., an isolated instance of a guest operating system running within a host operating system. A “network” herein shall refer to a set of hardware and software components implementing a plurality of communication channels between two or more computers. A network can be provided, e.g., by a local area network (LAN) or a wide area network (WAN). While different networks can be designated herein, it is recognized that a single network as seen from the application layer interface to the network layer of the OSI model can comprise a plurality of lower layer networks: what can be regarded as a single Internet Protocol (IP) network can include a plurality of different physical networks.
The communications between the EIR terminal 100 and the external computer 171 can comprise a series of requests and responses transmitted over one or more TCP connections. A skilled artisan would appreciate the fact that using various transport and application level protocols is within the scope of this disclosure.
As noted herein supra, at least one of the messages transmitted by EIR terminal 100 to external computer 171 can include an image of a physical object (e.g., a physical structure sustaining one or more retail items), the object identifier, and the image scale factor calculated by the EIR terminal. In one embodiment, at least one of the messages transmitted by the EIR terminal 100 to external computer 171 can further comprise physical dimensions of the object which can be calculated by EIR terminal 100 as described herein supra and/or inputted by the EIR terminal 100, e.g., by decoding a bar code or querying an RFID tag attached to the imaged object. In another embodiment, at least one of the messages transmitted by the EIR terminal 100 to external computer 171 can further comprise the object description, such as the position of the object on the facility floor plan and/or characteristics of the object comprising, for example, the number, sizes, and locations of shelves.
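A minimal sketch of such a message follows, assuming a length-prefixed JSON header followed by the raw image bytes; the field names and framing are assumptions, since the disclosure prescribes only a series of requests and responses over TCP.

```python
import json
import socket

def send_object_record(host, port, image_bytes, object_id,
                       scale_factor, dimensions=None, description=None):
    """Transmit one acquired-object record: a 4-byte length, a JSON header, then
    the raw image bytes. Field names and framing are illustrative only."""
    header = json.dumps({
        "object_id": object_id,
        "scale_factor": scale_factor,    # units of length per image pixel
        "dimensions": dimensions,        # optional, e.g. {"width_m": 1.2}
        "description": description,      # optional floor-plan / shelf data
        "image_length": len(image_bytes),
    }).encode("utf-8")
    with socket.create_connection((host, port)) as sock:
        sock.sendall(len(header).to_bytes(4, "big") + header + image_bytes)
```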
As noted herein supra, in one embodiment the external computer 171 can be provided by a database server 171a configured to store images and descriptions of physical objects (e.g., physical structures employed to sustain inventory items in manufacturing, retail, or storage facilities). In another embodiment, the external computer 171 can be provided by a portable RFID reading terminal 171b employed to read RFID tags attached to items stored in a manufacturing, retail, and/or storage facility. A skilled artisan would appreciate the fact that other types and uses of external computers 171 are within the scope of this disclosure.
One embodiment of a method of acquiring an image of a physical object by an EIR terminal disclosed herein is now described with reference to the drawings.
At steps 5010-5020, EIR terminal 100 can perform a data input loop acquiring data from its motion sensing device; responsive to detecting, at step 5020, a pre-defined pattern in the acquired data, the processing can continue at step 5030; otherwise, the method can loop back to step 5010. As noted herein supra, the pre-defined data pattern can be chosen to correspond to the mechanical velocities and proper accelerations of the EIR terminal along three mutually perpendicular axes immediately becoming zero, which can be caused by the operator of the EIR terminal bringing the EIR terminal into a mechanical contact with a stationary physical object.
At step 5030, EIR terminal 100 can store in the memory its current spatial position and orientation as a point of origin. As explained herein supra, the operator of the EIR terminal can be instructed, before activating the shutter release control, to bring the EIR terminal 100 into a mechanical contact with a pre-defined area 199 of a physical object 190 to be imaged, which would result in the EIR terminal setting the point of origin to coincide with a pre-defined point on the surface of the object 190.
At step 5035, EIR terminal 100 can input the identifier of the imaged physical object. In one embodiment, the physical object can be identified by scanning a bar code label attached to the object and decoding the bar code to retrieve the object identifier. In another embodiment, the imaged object can be identified by reading an RFID tag attached to the object, as described in detail herein supra. In a further aspect, the message encoded in the bar code or in the RFID tag can further include the object description, such as the position of the object on the facility floor plan and/or the description of the object comprising, for example, object dimensions and the number, sizes, and locations of shelves. In an alternative embodiment, the step of inputting the imaged object identifier can precede the step 5010 of acquiring motion sensing data, i.e., the operator of EIR terminal 100 can scan a bar code label attached to the surface of the imaged object (or, in another embodiment, EIR terminal 100 can read an RFID tag attached to the surface of the imaged object) either before or after “tapping” a designated point on the surface of the imaged object.
At steps 5040-5050, EIR terminal 100 can perform a user interface input loop, and responsive to establishing at step 5050 that the shutter release button has been activated by the operator of EIR terminal 100, the processing can continue at step 5060; otherwise, the method can loop back to step 5040. A skilled artisan would appreciate the fact that other ways of initiating an image acquiring operation are within the scope of this disclosure.
The user interface loop can comprise a step 5045 of acquiring motion data from a motion sensing device. As noted herein supra, in one embodiment, the motion sensing device can be provided by at least three accelerometers configured to measure proper acceleration values of the EIR terminal along at least three mutually-perpendicular axes. In another embodiment, the motion sensing device can be provided by a 9-DOF (degree of freedom) motion sensing unit containing a 3-axis accelerometer, a 3-axis magnetometer, and a 3-axis gyroscope.
At step 5060, EIR terminal 100 can determine the current position of the EIR terminal relative to the previously identified point of origin, based on the data received from the motion sensing device at step 5045 since the moment of detecting a mechanical contact with a stationary object.
At step 5070, EIR terminal 100 can acquire an image of the object in the field of view of the imaging lens.
At step 5075, EIR terminal 100 can process the image. In one embodiment, image processing comprises removing keystone- and rotation-related distortions as described in detail herein supra. In another embodiment, image processing further comprises detecting edges within the image as described in detail herein supra.
At step 5080, EIR terminal 100 can calculate the scale factor of the acquired image and the dimensions of the imaged object, as described in detail herein supra.
At step 5100, EIR terminal 100 can transmit to an external computer the object identifier, the acquired image of the object, the object description, and/or the calculated image scale factor, and the method can terminate.
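The flow of steps 5010-5100 can be summarized by the following sketch; the `terminal` object and all of its methods are hypothetical stand-ins for the EIR terminal's firmware interfaces.

```python
def acquire_object_image(terminal):
    """Top-level flow mirroring steps 5010-5100; `terminal` and its methods are
    hypothetical stand-ins for the EIR terminal's firmware interfaces."""
    while not terminal.tap_detected():            # steps 5010-5020
        terminal.poll_motion_sensor()
    terminal.set_point_of_origin()                # step 5030
    object_id, description = terminal.read_object_identifier()   # step 5035
    while not terminal.shutter_pressed():         # steps 5040-5050
        terminal.accumulate_motion_data()         # step 5045
    position = terminal.position_from_origin()    # step 5060
    image = terminal.capture_image()              # step 5070
    image = terminal.remove_distortions(image)    # step 5075
    scale, dims = terminal.scale_and_dimensions(image, position)  # step 5080
    terminal.transmit(object_id, image, description, scale, dims) # step 5100
```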
One embodiment of the EIR terminal 100 is schematically shown in the drawings.
While the present invention has been particularly shown and described with reference to certain exemplary embodiments, it will be understood by one skilled in the art that various changes in detail may be effected therein without departing from the spirit and scope of the invention as defined by claims that can be supported by the written description and drawings. Further, where exemplary embodiments are described with reference to a certain number of elements, it will be understood that the exemplary embodiments can be practiced utilizing fewer than the certain number of elements.
This is a Continuation of application Ser. No. 13/451,744 filed Apr. 20, 2012. The disclosure of the prior application is hereby incorporated by reference herein in its entirety.