This invention relates to a method and apparatus for automatically determining the characteristics of a package, such as weight and dimensions, at the point of package acceptance.
It may be desirable for the employee of the shipping organization to know certain physical characteristics of a package being dropped off by the customer. For example, the price which the organization charges the customer may be based in whole or in part on the weight of the package. Alternatively, the price may be determined in light of the shape or dimensions of the package. In another example, the employee may be required to sort the package based on its physical characteristics. If the customer is unable to provide the dimensions, or if the employee is to verify the customer's reported dimensions, some form of characteristic determination must be employed. Deferring such determinations until the package is transferred to a conveyor system equipped to perform them may impose significant delay on a waiting customer. Alternatively, if the customer is excused before such determinations are made, insufficient or excessive fees may be assessed. For these reasons, and others, it may be desirable to enable the employee to perform characteristic determination quickly and accurately at the point of package acceptance.
Determining the dimensions and weight of a package is essential to modern methods of shipping. For example, the price charged to ship an item may be based in whole or in part on the shape, size, or weight of the item. In another example, shippers may choose to sort or organize packages based on the absolute or relative dimensions or weight of each package in order to optimize the way in which transport vehicles are loaded and routed. For these and many other reasons, numerous efforts have been made to facilitate quick and effective determination of object characteristics such as physical dimensions and weight. For example, numerous characteristic acquisition systems have been patented. The systems and methods known in the art for object characterization are commonly designed for large scale use as part of a conveyor system and involve elaborate arrays of sensors and other assorted hardware.
Capturing, storing, and processing a picture of the customer and the package would also be useful for tracking the package and the person submitting it.
For the foregoing reasons, there is a need for a convenient, economical means of determining characteristics of an object at the point of package acceptance.
The present invention is directed to systems and methods for determining the characteristics of an object or package, as well as information about the customer. In one embodiment, a camera and a set of lasers are positioned at a distance from an object or package. The lasers are directed towards the object and the camera is directed to detect the location of the laser beam on the object. A processor is used to determine the relative positions of the laser beam projections within the field of detection of the camera and to determine the distance of the object from the camera or other reference surface.
In another embodiment, a processor may use this distance measurement along with the profile of the object within the camera's field of detection to determine one or more additional physical characteristics of the object. In one example, these one or more additional physical characteristics may be the perimeter of the object, or, if the object is rectangular, the object's length and width.
In another embodiment, the reference surface may be a weight measuring device and the weight of the object may be determined before, during, or after the process of determining other physical characteristics of the object. In another embodiment, additional information may be collected to associate the object or package to the customer.
In another embodiment, the above features may be integrated with other standard features such as processors, scales, printers, scanners, etc. into a single mailing point of sale or self service device. The preceding is meant only to illustrate some of the embodiments of the present invention and is not to be read to limit the scope of the invention. A more detailed description may be found below.
The detailed description set forth below in connection with the appended drawings is intended as a description of presently-preferred embodiments of the invention and is not intended to represent the only forms in which the present invention may be constructed or utilized. The description sets forth the functions and the sequence of steps for constructing and operating the invention in connection with the illustrated embodiments. It is to be understood, however, that the same or equivalent functions and sequences may be accomplished by different embodiments that are also intended to be encompassed within the spirit and scope of the invention.
It will be further appreciated that while the package or object is depicted as a rectangular box, embodiments of the present invention are not limited to operation on similarly shaped packages. For clarity in explanation, however, when rectangular box type objects are described, the following standard reference system will be used unless otherwise stated. The dimension extending perpendicularly from a reference surface 102 upon which an object 10 rests will be referred to as the object's height H. The longer of the remaining two dimensions will be referred to as the object's length L. The remaining dimension will be referred to as the object's width W. Again, it will be appreciated that this reference is adopted solely for the purpose of explanation and not as a limitation on the scope or operation of embodiments of the present invention.
As shown in
The reference surface 102 may be a passive element upon which the object 10 rests. In some embodiments, the reference surface 102 may be a device for determining the weight or mass of object 10. For example, the reference surface 102 may be a scale for measuring weight.
As shown in
The laser beam 206 may be perpendicular to the reference surface 102 or it may be set at a predetermined angle. An object 10 may be placed on the reference surface 102 so that the laser beam 206 projects onto the object 10. In some embodiments, the measurement system 104 may have two lasers 204, 208 directed towards the reference surface 102. The two lasers 204, 208 may be arranged in any configuration relative to each other, projecting their respective laser beams 206, 210 towards the reference surface 102 or an object 10. In a preferred embodiment, the two lasers 204, 208 are positioned bilaterally to the optical detection device 200. In another embodiment, a plurality of lasers may direct a plurality of laser beams towards the reference surface 102. The laser beam 206 projected onto object 10 may be in the form of line segments, circles, squares, rectangles, triangles, or any other geometric configuration.
As depicted in
If the height H of the reference object 10 is known, the height H′ of the uncharacterized object 10′ can be calculated from the known height H, the measured reference distance D, and the measured variable distance D′, because the ratio of the reference height H to the reference distance D should be the same as the ratio of the variable height H′ to the variable distance D′. In other words, H/D is proportional to H′/D′. A conversion factor C can be determined based on how the change in height of a reference object correlates with the change in pixels from a first height to a second height. Once the measurement system 104 is calibrated with a reference object of a known height, a new distance D′ can be measured using the measurement system 104 and the new height H′ can be calculated using the equation H′=(H/D)*D′*C, where H is the known height of the reference object 10, D is the measured distance between the reference laser beam images 214, 216 in the reference image frame 212, D′ is the measured distance between the variable laser beam images 214′, 216′ in the variable image frame 212′, and H′ is the height of the uncharacterized object 10′.
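The height equation above can be sketched as a short function. This is an illustrative sketch only; the parameter names and the default conversion factor are assumptions, not part of the disclosed system.

```python
# Hypothetical sketch of the height calculation H' = (H / D) * D' * C.
# H: known height of the reference object; D: pixel distance between the
# reference laser beam images; D_prime: pixel distance measured for the
# uncharacterized object; C: empirically determined conversion factor.
# All names are illustrative, not from the source.

def object_height(H, D, D_prime, C=1.0):
    """Return the estimated height H' of the uncharacterized object."""
    if D == 0:
        raise ValueError("reference distance D must be non-zero")
    return (H / D) * D_prime * C
```

For example, with a 10-inch reference object whose laser spots are 50 pixels apart, a new measurement of 75 pixels would yield an estimated height of 15 inches (with C = 1).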
Again it will be appreciated that more or fewer lasers could be used. For example, in some embodiments as shown in
In some embodiments, the package dimensioner and reader may utilize a plurality of lasers. For example, if four lasers are used, the distance between each of two pairs of laser beam spots may be calculated. Advantageously, this pair of measurements provides for the possibility of error detection, thereby improving accuracy through averaging.
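The error-detection-through-averaging idea described above can be sketched as follows. The function name, tolerance parameter, and return convention are illustrative assumptions.

```python
# Hypothetical sketch: fuse two independent laser-spot-pair distance
# measurements by averaging, and flag the pair when they disagree by
# more than a relative tolerance (an assumed error-detection criterion).

def fused_distance(d1, d2, tolerance=0.05):
    """Return (average, ok): the averaged distance and a consistency flag."""
    mean = (d1 + d2) / 2.0
    ok = abs(d1 - d2) <= tolerance * mean
    return mean, ok
```

Two measurements of 100 and 102 pixels would fuse to 101 pixels and pass the consistency check, while 100 and 130 pixels would be flagged as inconsistent.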
While the words laser, optical, and camera are used for convenience in explanation, it will be appreciated that aspects of the present invention may be implemented using similar devices that function in other ranges of the electromagnetic spectrum or by other means of transmission. For example, radiation sources operating outside of the visible spectrum coupled with a detector capable of detecting such radiation may also be used under certain conditions.
In some embodiments, the distance measurements are made in units of pixels. After determining the distance D′ between the laser beam images 214′, 216′, the actual height H′ of the uncharacterized object 10′ is calculated 808. In some embodiments, the correlation between various heights H′ and the distances D′ can be stored in a database and made readily available as a look-up table. A table of actual heights and pixel measurements can be generated beforehand, and the height H′ corresponding to the current distance D′ measurement can be quickly accessed. Thus, prior to characterizing any uncharacterized object 10′, a reference object 10 can be used to generate a conversion factor for the database. In one embodiment, the height dimension is determined to an accuracy of tenths of an inch. Advantageously, the present method and the associated system provide a means for quickly and accurately determining the height H′ of an uncharacterized object 10′.
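The pre-computed look-up table described above might be built as in the following sketch. The function name, pixel range, and rounding to tenths (matching the stated accuracy) are illustrative assumptions.

```python
# Hypothetical sketch of a pre-computed height look-up table: map every
# whole-pixel distance D' in a calibrated range to a height rounded to
# tenths of an inch, using H' = (H / D) * D' * C. Names are illustrative.

def build_height_table(H, D, C, d_min, d_max):
    """Tabulate heights for each whole-pixel distance in [d_min, d_max]."""
    return {d: round((H / D) * d * C, 1) for d in range(d_min, d_max + 1)}
```

At run time, a measured distance D′ then becomes a single dictionary access rather than a recomputation.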
In addition to determining the height H′ of an uncharacterized object 10′, the present system may also be used to determine additional characteristics of an uncharacterized object 10′ in accordance with an embodiment of the present invention. For example, the length L′ and width W′ of an uncharacterized object 10′ may be determined as well.
As shown in
For example, if the uncharacterized object 10′ is square or rectangular, the length L′ and width W′ of the object may be the additional desired characteristics. If the uncharacterized object 10′ is circular, the radius or circumference may be additional desired characteristics. For other shapes, the perimeter may be a desired characteristic. In one embodiment, the representative values of these desired characteristics are determined by comparing the reference image 212 frame with the variable image frame 212′ generated during execution of the steps described above. Differences between the reference image frame 212 and the variable image frame 212′ are analyzed to generate an outline of the uncharacterized object 10′.
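The frame-differencing step described above, in which the reference image frame is compared with the variable image frame to outline the object, can be sketched minimally as follows. Frames are modeled here as 2-D lists of grayscale values; the threshold and all names are illustrative assumptions.

```python
# Hypothetical sketch: pixels that differ between the reference frame
# and the variable frame by more than a threshold are marked as part of
# the object, and a bounding box gives length and width in pixels.

def object_mask(reference, variable, threshold=30):
    """Mark pixels where the two frames differ by more than threshold."""
    return [[abs(v - r) > threshold for r, v in zip(rrow, vrow)]
            for rrow, vrow in zip(reference, variable)]

def bounding_box(mask):
    """Return (row_min, col_min, row_max, col_max), or None if empty."""
    coords = [(i, j) for i, row in enumerate(mask)
              for j, hit in enumerate(row) if hit]
    if not coords:
        return None
    rows = [i for i, _ in coords]
    cols = [j for _, j in coords]
    return min(rows), min(cols), max(rows), max(cols)
```

For a rectangular object, the bounding box directly yields the pixel length and width; for other shapes, the same mask could feed a perimeter trace instead.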
In one embodiment, as depicted in
Using a reference object 10 with known dimensions, a conversion factor between pixel-length and a unit of distance (e.g. inches, centimeters, etc.) may be determined. Using the determined conversion factor, the representative pixel values may be converted into actual measurements. For example, if the uncharacterized object 10′ is square or rectangular and the length L′ and width W′ had been determined in terms of pixels, a conversion factor such as P pixels per inch or per centimeter could be determined based on how the pixel counts within an object change with the height of the object (or the variable camera height Z′). Dividing the pixel-lengths by the conversion factor would yield the actual length and width of the object. In one embodiment, as with converting a pixel measurement into the height of the object, these conversion factors may be determined beforehand for quick and easy access during processing. Algebraic, trigonometric, geometric, and other mathematical principles and formulae may be applied to calculate the actual dimensions of an uncharacterized object 10′, such as length, width, height, diameter, perimeter, area, circumference, etc., from pixel counts.
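The calibration and conversion just described amount to two small operations, sketched below. The function names and error handling are illustrative assumptions; in practice P would be tabulated per object height, as the text notes.

```python
# Hypothetical sketch of pixel-to-length conversion. P (pixels per inch)
# is calibrated once from a reference object of known size, then used to
# convert measured pixel-lengths into inches. Names are illustrative.

def calibrate(known_length_in, measured_pixels):
    """Derive the conversion factor P from a reference object."""
    return measured_pixels / known_length_in

def pixels_to_inches(pixel_length, pixels_per_inch):
    """Convert a pixel measurement into inches using factor P."""
    if pixels_per_inch <= 0:
        raise ValueError("conversion factor must be positive")
    return pixel_length / pixels_per_inch
```

For example, a 10-inch reference edge spanning 200 pixels calibrates P to 20 pixels per inch, so a measured 300-pixel edge converts to 15 inches.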
As shown in
For example, a laser line beam 1000 may be projected across the field of detection 202 either orthogonal or oblique to reference surface 102. An uncharacterized object 10′ having a non-uniform top surface (e.g. cylindrical container on its side, pyramidal container, trapezoidal container, etc.) may be placed under the laser line beam 1000 such that the non-uniform portion or the point of change 1004 intersects the laser line beam 1000. The laser line beam 1000 projects a laser line segment 1002 onto the top of the uncharacterized object 10′, and that segment has uniform characteristics where the top surface is uniformly flat. If, however, the distance of the top surface to the laser source 204 changes (e.g. the top surface is not uniformly flat due to a curve, slope, dip, etc.), then a change in the characteristic of the laser line segment 1002 would be present. For example, the portion of the laser line beam 1000 projecting onto the point of change 1004 of the surface of the uncharacterized object 10′ may cause a diffraction, deflection, or an otherwise altered absorption of the laser line beam 1000. This change would indicate that the top surface is not uniform and would translate into an alteration 1006 of the laser line segment 1002. If the laser line beam 1000 is orthogonal to the reference surface 102, then the alteration 1006 in the laser line segment may be a change in contrast as shown in
In some embodiments, the laser line segment may project onto the uncharacterized object 10′ at an oblique angle or an angle not orthogonal to the top surface. Again, where the top surface is uniform, the projected laser line segment 1002 is also uniform in shape. At the location where the top surface changes, the laser line segment 1002 projected onto the surface at the point of change 1004 experiences an alteration 1006 in characteristic. For example, the laser line segment 1002 may appear bent at the point of change 1004 on the top surface as shown in
In some embodiments, the laser line beam 1000 may be projected incident to the reference surface 102 and the optical detection device 200 may be pointed at an oblique angle to the reference surface 102 so that a perspective view of the uncharacterized object 10′ is seen. In such an embodiment, the location where the change 1004 in the top surface of the uncharacterized object 10′ occurs, results in a break, bend or some other alteration 1006 in the laser line segment 1002 on the uncharacterized object 10′ depending on whether the change on the top surface is abrupt or gradual and the extent of the change 1004.
In some embodiments, using other types of light sources, a change in the distance of the top surface from the light source 204 results in a change in the dimension of the line segment formed on the top surface. For example, as shown in
Using control objects 10, the degree of the alteration 1006 in the line segment 1002 (e.g. the degree of change in contrast, the degree of change in the bend, the degree of change in the width of the line segment, etc.) may be used to calculate the extent of the change in the top surface.
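One simple way to quantify a bend-type alteration in the projected line segment, consistent with the controls-based approach above, is to measure how far the detected segment deviates from a straight chord between its endpoints. This is an illustrative sketch; the point representation and metric are assumptions.

```python
# Hypothetical sketch: sample the detected laser line segment as (x, y)
# image points and find the point farthest from the straight line through
# the two endpoints. A large maximum deviation indicates a bend (an
# alteration in the segment) at that point.

def max_deviation(points):
    """Return (deviation, index) of the point farthest from the chord."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    length = (dx * dx + dy * dy) ** 0.5
    best = (0.0, 0)
    for i, (x, y) in enumerate(points):
        # perpendicular distance from (x, y) to the endpoint chord
        d = abs(dy * (x - x0) - dx * (y - y0)) / length
        if d > best[0]:
            best = (d, i)
    return best
```

With control objects of known shape, the measured deviation could then be mapped to the extent of the change in the top surface.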
Many different variations of placement of the light source 204 and the optical detection device 200 relative to the uncharacterized object 10′ or the reference surface 102 have been contemplated. In each case, an alteration 1006 in the laser line segment 1002 projected on the uncharacterized object 10′ at the point of change 1004 can be detected. Once the controls have been established, the changes in the line segment characteristics may be quantified to determine the precise shape of the uncharacterized object. In addition, a plurality of light sources may be used to more fully characterize the object using these principles.
In some embodiments, optional upside package details such as sender and recipient addresses, barcode package information, payment transaction number, additional services requested, and customs information can be examined 908. The processor may further comprise optical character recognition (OCR) and/or zoom capabilities to examine the captured images for additional processing. Thus, any text or barcode information on the top surface of an uncharacterized object may be captured by the optical detection device 200 and read by the processor to determine additional information.
This additional information may include a picture of the customer, a picture of the package, an OCR reading of the package sender and receiver information, a payment transaction number, barcode information containing packaging information, additional services requested, and customs information.
In some embodiments, the optical detection device 200 may be movable to alter the field of detection 202. For example, the optical detection device 200 may be directed towards the uncharacterized object 10′, then moved to an oblique position relative to the lasers to take a picture of the customer.
In some embodiments, the package dimensioner and reader may comprise a second optical detection device 110 to capture an image of the customer or any other intended image.
The package dimensioner and reader 100 may also include a support member 108. As illustrated in
The package dimensioner and reader may be integrated into a single mailing point of sale system. The processor 106 communicates with the measurement system 104, a scale for measuring weight, a credit/debit card reader 1100, printer 1102, and barcode reader 1104. It will be appreciated that the processor may be internal to either the measurement system 104 or the scale, or may be housed separately. For example, the processor 106, with associated memory, executes code to orchestrate the interaction of the systems. In another example, the processor 106 may be a personal computer (PC) or other general-purpose computer, an application specific integrated circuit (ASIC), or other programmable logic designed to carry out the described functionality.
In use, a reference image frame 212 is generated. The reference surface 102 comprising a scale alerts the processor 106 that the scale has reached a steady state, non-zero weight after an uncharacterized object 10′ was placed on the reference surface 102. The processor 106 alerts the measurement system 104 to activate the lasers 204, 208. The processor 106 alerts the measurement system 104 to activate the cameras 200, 800. The measurement system 104 generates a variable image frame 212′ and sends it to the processor 106. The processor 106 determines the height H′ of the uncharacterized object 10′ by converting the variable distance D′ between laser beam images 214′, 216′ into height H′ based on a predetermined conversion factor. The processor 106 determines an outline of the uncharacterized object 10′ by comparing the reference image frame 212 to the variable image frame 212′. The processor 106 determines the length L′ and width W′, or other pertinent characteristics, in numbers of pixels. The processor 106 determines the actual length L′ and width W′, or other pertinent characteristics, by converting from pixels to actual length based on the conversion factor. The processor 106 determines and processes optional upside package details and the customer picture.
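The operating sequence above can be sketched at a high level as follows. Every name here is an illustrative assumption; the hardware interactions (scale, lasers, camera) are passed in as stubs rather than modeled.

```python
# Hypothetical sketch of the dimensioning sequence: wait for a steady
# scale reading, capture a frame with the lasers on, convert the laser
# spot distance into height, and convert the pixel outline into length
# and width. All names and the frame format are illustrative.

def characterize(scale_steady, capture_frame, height_factor, pixels_per_inch):
    """Run the dimensioning sequence once the scale reports steady weight."""
    weight = scale_steady()                  # steady, non-zero weight
    frame = capture_frame()                  # variable image frame, lasers on
    d_prime = frame["laser_spot_distance"]   # pixels between laser images
    height = d_prime * height_factor         # predetermined conversion factor
    length_px, width_px = frame["outline_px"]  # from frame differencing
    return {
        "weight": weight,
        "height": height,
        "length": length_px / pixels_per_inch,
        "width": width_px / pixels_per_inch,
    }
```

In a deployed system each stub would be replaced by the corresponding hardware call, with the optional OCR and customer-picture steps appended after dimensioning.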
The foregoing description of the preferred embodiment of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. It is intended that the scope of the invention not be limited by this detailed description, but by the claims and the equivalents to the claims appended hereto.