This disclosure relates generally to computing devices, and more specifically to determining physical locations of items within an environment.
In the operation of a physical environment, such as a warehouse or datacenter, various items may be moved throughout the environment over time. For example, tools, computer systems, or other items may be moved between locations based on the needs of a user at a particular time, rather than according to a schedule. Determining the physical location of items within a large area may therefore be burdensome in some cases. In a warehouse, for example, in which there are a large number of items belonging to various parties distributed throughout the warehouse, it may be particularly difficult to determine a physical location of an item within the environment.
Techniques are disclosed relating to determining a physical location of an item within an environment. For example, in various embodiments, a location system may determine a location of a first item of a plurality of items. In some embodiments, the location system may emit a pulse of light via a light source. The location system may receive a plurality of reflections that have been reflected from retroreflective material on one or more of the plurality of items. Further, in some embodiments, the location system may determine a direction of the location of the first item relative to a reference location. For example, in some embodiments, the location system may determine the direction based on an angle of a reflection corresponding to the first item. Further, in various embodiments, the location system may determine a distance between the reference location and the first item. Additionally, in some embodiments, the location system may determine identification information associated with the first item.
Although specific embodiments are described below, these embodiments are not intended to limit the scope of the present disclosure, even where only a single embodiment is described with respect to a particular feature. Examples of features provided in the disclosure are intended to be illustrative rather than restrictive unless stated otherwise. The description herein is intended to cover such alternatives, modifications, and equivalents as would be apparent to a person skilled in the art having the benefit of this disclosure.
Although the embodiments disclosed herein are susceptible to various modifications and alternative forms, specific embodiments are shown by way of example in the drawings and are described herein in detail. It should be understood, however, that drawings and detailed description thereto are not intended to limit the scope of the claims to the particular forms disclosed. On the contrary, this application is intended to cover all modifications, equivalents and alternatives falling within the spirit and scope of the disclosure of the present application as defined by the appended claims.
This disclosure includes references to “one embodiment,” “a particular embodiment,” “some embodiments,” “various embodiments,” or “an embodiment.” The appearances of the phrases “in one embodiment,” “in a particular embodiment,” “in some embodiments,” “in various embodiments,” or “in an embodiment” do not necessarily refer to the same embodiment. Particular features, structures, or characteristics may be combined in any suitable manner consistent with this disclosure.
Referring now to
As discussed in more detail below with reference to
For example, in various embodiments, location system 102A may be configured to emit a pulse of light in environment 100. In some embodiments, light from that pulse may strike a subset of the items 104 in environment 100 and be reflected back in the direction of its source, location system 102A. In the depicted embodiment, for example, light emitted by location system 102A may strike items 104B, 104C, and 104D. In various embodiments, when the light strikes the retroreflective material included on the items 104B-104D, some portion of the light is reflected back to and received by location system 102A.
After receiving the reflections 108, location system 102A may determine a location of one or more of items 104B-104D. For example, location system 102A may determine a location of item 104B based on reflection 108B corresponding to item 104B. In some embodiments, location system 102A may determine a direction of a location of item 104B relative to a reference location (e.g., the location of item 104A) based on an angle at which location system 102A receives reflection 108B. Further, in various embodiments, location system 102A may be configured to determine a distance between the reference location and item 104B based on reflection 108B, as described in more detail below with reference to
Note that, in the depicted embodiment, light emitted by location system 102A does not reach item 104E due to obstruction 106 located between items 104A and 104E. Accordingly, in such an embodiment, location system 102A would not receive a reflection corresponding to item 104E, and would therefore be unable to determine a location of item 104E. The location of item 104E may nonetheless be determined according to various embodiments of the disclosed systems. For example, in various embodiments, multiple location systems 102 may perform the process outlined herein to determine the location of items 104 within their line of sight. Further, once a location system 102 determines the location of one or more items 104, this location information may be communicated to or otherwise shared with other location systems 102 in the environment 100. For example, in one embodiment, location systems 102 may be configured to communicate with a server computer system (not shown), and may transmit location information corresponding to the items 104 to the server computer system. In some such embodiments, the server computer system may communicate this location information to the other location systems 102 in the environment 100. Thus, although location system 102A may be unable to determine the location of item 104E, location system 102D, for example, may be within a line of sight of item 104E and may be able to determine its location as described herein, according to some embodiments. Location system 102D may then communicate the location information to other location systems 102 or a server computer system. Thus, in some embodiments, location systems 102 may be capable of determining a location of a plurality of items 104 in environment 100.
Further, note that a given location system 102 (e.g., location system 102A) may determine a location of an item 104 relative to a reference location, such as the location of location system 102A and/or item 104A to which it is attached. In various embodiments, once the physical location of any of the location systems 102 is established, the physical locations of the remaining location systems 102 may also be determined. For example, if the physical location of item 104A and its associated location system 102A are known to a user, this information may be used to determine a physical location for the remaining items 104 whose locations are not known to the user. In some embodiments, the user may establish a known location reference point by introducing an item 104 with a location system 102 at a known location within environment 100. Additionally, in some embodiments, environment 100 may include retroreflective patterns at known locations, which may be used by the location systems 102 to establish their physical location within environment 100, as discussed in more detail below with reference to
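One way to picture how a single known reference point anchors the remaining items: each measured direction and distance can be converted into absolute coordinates. The following Python sketch assumes a flat 2-D coordinate system with a bearing measured counterclockwise from the +x axis; the function name and coordinate convention are illustrative assumptions, not part of the disclosed system.

```python
import math

def absolute_position(ref_xy, bearing_deg, distance):
    """Convert a direction/distance measurement taken relative to a known
    reference location into absolute (x, y) coordinates.

    bearing_deg is measured counterclockwise from the +x axis.
    """
    rx, ry = ref_xy
    theta = math.radians(bearing_deg)
    return (rx + distance * math.cos(theta),
            ry + distance * math.sin(theta))

# Example: a reference item sits at a known location (10, 20); another
# item is measured at a bearing of 90 degrees and a distance of 50 feet.
x, y = absolute_position((10.0, 20.0), 90.0, 50.0)
print(round(x, 6), round(y, 6))  # → 10.0 70.0
```

Once one item's absolute position is established this way, it can in turn serve as the reference location for further measurements, letting position information propagate through the environment.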
In various embodiments, the location information corresponding to items 104 in environment 100 may be utilized according to various techniques. For example, as noted, one or more location systems 102 may transmit location information to a server computer system, in some embodiments. In some such embodiments, a user may utilize a software application, for example on a mobile device, to view the location information. For example, the software application, in one embodiment, may provide a map or blueprint of environment 100, and may overlay the location of the items 104 at their corresponding location on the map.
Note that, in various embodiments, the nature of the items 104 and the environment 100 may vary without departing from the scope of this disclosure. For example, in one embodiment, environment 100 may include a warehouse in which various items 104, such as tools, computing devices, machinery, merchandise, etc., are distributed. In various embodiments, a location system 102 may include various items of hardware and software (described in more detail below with reference to
In various embodiments, the disclosed systems and methods for determining a physical location of one or more items in an environment may provide various advantages. For example, consider the situation described above, in which various items, such as tools, computing devices, machinery, merchandise, etc., are distributed throughout a warehouse. In such a situation, the user may be required to manually search the warehouse to locate a given item. The disclosed systems and methods, however, may allow the location systems to determine the location of the user's items in the environment. Further, in some embodiments, the user may view a blueprint or map of the environment in a software application, with a marker provided at the location of each of the user's items. This, in turn, may reduce the time spent retrieving the items, reduce the number of items lost by the user, and simplify the user's inventory process. Thus, in various embodiments, the disclosed systems and methods may provide various advantages, particularly as they relate to determining the physical location of items within an environment.
Turning now to
In various embodiments, light source 202 may include one or more fluorescent lamps, LEDs, or any other light source suitable to emit a flash of light to induce reflections from other location systems 102 in a given environment. In some embodiments, location systems 102 may be configured to communicate via visible light communication (“VLC”). In such embodiments, light source 202 may be any light source suitable for transmitting information via VLC. Further note that, in some embodiments in which location systems 102 communicate via VLC, location system 102 may further include one or more optical sensors (e.g., photodiodes) suitable for use in VLC. Location system 102 of
Location system 102 further includes retroreflective pattern 206. As discussed in more detail below with reference to
As depicted in
As shown in
Referring now to
In various embodiments, retroreflective pattern 300 may include retroreflective material arranged in a particular pattern on an item 104. In the embodiment depicted in
Turning now to
In some embodiments, it may be advantageous to include more than two retroreflective points 352 in a retroreflective pattern, for example to facilitate more accurate distance determinations by a location system 102. For example, in an embodiment in which location system 102A emits a pulse of light that strikes an item 104D that includes retroreflective pattern 350, retroreflective points 352A-352C may each reflect a portion of the light back to its source, location system 102A. In such an embodiment, location system 102A may determine the distance between the reference location and item 104D based on three reference points (corresponding to retroreflective points 352A-352C) and two separation distances 354 and 356. Because more information is available, the addition of a third retroreflective point 352 may permit location system 102A to make a more accurate distance determination, according to some embodiments. Further, as discussed in more detail below with reference to
Note that, in various embodiments, the separation distances 304, 354, and 356 depicted in
Further, note that, in some embodiments, the retroreflective patterns may include retroreflective material arranged in a barcode, a QR code, or any other suitable machine-readable optical code format. In such embodiments, the retroreflective pattern may include information encoded into the machine-readable code, such as identification information associated with the item 104 on which the retroreflective pattern is attached, separation distance(s) between two or more retroreflective points on the item 104, etc. Further, in such embodiments, one or more of the reflections received back at location system 102A may include a reflected version of a machine-readable code (e.g., QR code), which location system 102A may use in determining identification information associated with the item from which the reflected machine-readable code originated.
Additionally, in some embodiments, a retroreflective pattern, such as retroreflective pattern 300 or 350, may be placed on an item without a location system 102. For example, in some embodiments, it may be desirable to place a retroreflective pattern on a fixed object of known location, such as a wall, to allow other location systems 102 to determine their own location relative to a known location within a given environment 100. For example, a distinct retroreflective pattern (e.g., a retroreflective pattern with a particular number of retroreflective points) may be placed in the environment 100 on a fixture of known location, according to some embodiments. In such embodiments, location systems 102 may recognize the distinct retroreflective pattern as a known geographic location point, and may determine their own location within environment 100 based on this known geographic location. Further, in some embodiments, placing a retroreflective pattern on an item 104 without a complete location system 102 may be desirable for situations in which it is impractical to attach the other hardware components included in a location system on the item, for example due to the item's size.
Referring now to
As discussed above, location system 102A may be configured to determine the location of one or more items 104 within an environment 400. In various embodiments, location system 102A may emit a pulse of light, for example using light source 202. When this pulse of light is received at item 104B, it may strike one or more retroreflective points 302 in retroreflective pattern 300. For example, in the depicted embodiment, the pulse of light emitted by location system 102A strikes retroreflective points 302A and 302B. Once the pulse of light reaches item 104B, retroreflective points 302A and 302B are configured to reflect some portion of that light back to its source, location system 102A. For example, as shown in
In various embodiments, after receiving the reflections 402, location system 102A may be configured to detect one or more reference points in the reflection 402 corresponding to item 104B. For example, in the depicted embodiment, location system 102A may be configured to detect first and second reference points in the reflections 402, where the first and second reference points respectively correspond to retroreflective points 302A-302B in retroreflective pattern 300. Further, in various embodiments, location system 102A may be configured to determine an angle between the first reference point and the second reference point. For example, as shown in
In various embodiments, location system 102A may be configured to determine the distance between a reference location, such as the location of item 104A, and item 104B based on the angle θ between the first and second reference points and the separation distance 304 between retroreflective points 302A-302B. For example, in embodiments in which the distance between items 104A and 104B (e.g., 50 feet) is much larger than the separation distance 304 (e.g., 6 inches), the distance between items 104A and 104B may be approximated as follows:

D ≈ SD / θ

Where D is the distance between items 104A and 104B, SD is the separation distance 304 between retroreflective points 302A and 302B, and θ is the angle, in radians, between the first and second reference points. Note, however, that this described technique for determining the distance between items 104 is provided merely as an example and is not intended to limit the scope of this disclosure. One of ordinary skill in the art with the benefit of this disclosure will recognize that various techniques may be implemented to determine such a distance without departing from the scope of the present disclosure.
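The small-angle approximation described above can be sketched in a few lines of Python; the function name and example values below are illustrative assumptions, not part of the disclosure.

```python
def approximate_distance(separation_distance, angle_rad):
    """Small-angle approximation: when the distance to the item is much
    larger than the separation between its two retroreflective points,
    the subtended angle (in radians) satisfies angle ≈ separation /
    distance, so distance ≈ separation / angle."""
    return separation_distance / angle_rad

# A 6 inch (0.5 ft) separation subtending 0.01 radians at the sensor
# implies a distance of roughly 50 feet.
print(approximate_distance(0.5, 0.01))
```

Note that the approximation degrades as the subtended angle grows (i.e., as the item gets close to the sensor relative to the separation distance), at which point an exact trigonometric formulation would be needed.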
Thus, in various embodiments, location system 102A may determine a distance between a reference location (e.g., the location of item 104A) and an item 104 based on the reflections received back at the location system 102A. In various embodiments, this may eliminate the need for additional hardware, such as a laser range finder, to determine the distance between items 104 in environment 100.
Location system 102A may determine separation distance 304 according to various techniques. For example, in some embodiments, location system 102A may store information corresponding to retroreflective patterns 300, such as information indicative of the separation distance 304 between retroreflective points 302, which may permit location system 102A to determine the distance between an item 104 and a reference location based on this stored information. In one embodiment, for example, each of the retroreflective patterns 300 implemented within a given environment 400 may be the same, such that each has the same separation distance 304 between retroreflective points 302A and 302B. In such embodiments, location system 102A may store information corresponding to retroreflective pattern 300, such as the value of the separation distance 304, and use this information to determine the distance between items 104.
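A minimal sketch of the stored-pattern approach, assuming an in-memory table keyed by the number of retroreflective points in a recognized pattern; the keys and distance values below are hypothetical, not taken from the disclosure.

```python
# Hypothetical registry of known retroreflective patterns: maps a
# pattern's point count to its known separation distance in feet.
PATTERN_SEPARATIONS = {
    2: 0.5,   # two-point pattern, 6 inch separation
    3: 0.75,  # three-point pattern
}

def separation_for_pattern(num_points):
    """Look up the stored separation distance for a recognized
    retroreflective pattern, keyed by its number of points."""
    return PATTERN_SEPARATIONS[num_points]

print(separation_for_pattern(3))  # → 0.75
```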
In other embodiments, however, multiple retroreflective patterns may be implemented in a given environment 400. For example, in one embodiment, both retroreflective patterns 300 and 350 of
Further, in some embodiments, location systems 102 may be configured to communicate information corresponding to the separation distances 304. For example, after determining a direction to the location of item 104B, location system 102A of
As noted above, in some embodiments, location system 102A may be configured to determine orientation information associated with an item 104 based on the reflections 402 received back by location system 102A. For example, in embodiments in which a retroreflective pattern includes three or more retroreflective points, location system 102A may be configured to determine an orientation of an item 104 based on the reference points included in the reflection from that retroreflective pattern. For example, consider an item 104C that includes a retroreflective pattern (not shown) with four retroreflective points arranged in a square pattern. In such an embodiment, the separation distances between the four retroreflective points may be the same. If, however, location system 102A determines that, based on the measured angles between the reference points and the distance between the item 104C and the reference location, the perceived separation distances are not the same, location system 102A may determine that item 104C is turned relative to the location system 102A.
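The foreshortening idea above can be sketched under a simplified 2-D model in which the perceived separation between two retroreflective points shrinks by the cosine of the tilt angle; the function name and values are illustrative assumptions.

```python
import math

def tilt_angle_deg(actual_separation, perceived_separation):
    """Recover an item's tilt from the ratio of perceived to actual
    separation between two retroreflective points, assuming a simplified
    2-D model where perceived = actual * cos(tilt)."""
    ratio = min(perceived_separation / actual_separation, 1.0)
    return math.degrees(math.acos(ratio))

# A 6 inch separation that appears as only 3 inches implies the item is
# turned roughly 60 degrees relative to the viewer.
print(round(tilt_angle_deg(6.0, 3.0)))  # → 60
```

With four points arranged in a square, as in the example above, comparing the perceived horizontal and vertical separations in the same way would indicate which axis the item is turned about.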
Turning now to
Method 500 then proceeds to block 504, which includes receiving a plurality of reflections that have been reflected from the retroreflective material on a subset of the plurality of items. For example, location system 102A may receive reflections that have been reflected off of retroreflective material on items 104B, 104C, and 104D. Location system 102A may not, however, receive a reflection from one or more other items 104 in environment 100. For example, in the embodiment depicted in
Method 500 then proceeds to block 506, which includes determining a direction of the location of the first item relative to the reference location. In some embodiments, the reference location is a location of the location system 102 that is determining the location of one or more other items 104 in environment 100. For example, location system 102A may determine a direction of the location of item 104B relative to its own location using light direction sensor 204.
Method 500 then proceeds to block 508, which includes determining a distance between the reference location and the first item based on the reflection corresponding to the first item. For example, as explained in more detail above with reference to
Method 500 then proceeds to block 510, which includes, subsequent to determining the direction of the location of the first item, determining identification information associated with the first item. Location system 102A may use various techniques to determine identification information associated with items 104B-104D. For example, in some embodiments, location system 102A may use light source 202 and camera 208 to communicate with one or more of location systems 102B-102D via visible light communication. In such embodiments, block 510 may include location system 102A controlling camera 208 to point in the direction of item 104B, sending, by location system 102A to location system 102B, a request for identification information associated with item 104B, and receiving, by location system 102A, the identification information from location system 102B.
As shown in
With reference to
Referring now to
As shown in
Further, portion 606 may include an identifier 608 (not shown). In various embodiments, identifier 608 may include a machine-readable optical code, such as a bar code or QR code, with information associated with object 604 encoded into the identifier. In some embodiments, identifier 608 may uniquely identify object 604 such that data associated with object 604 may be retrieved. For example, in the depicted embodiment, AR device 600 may detect the identifier 608 from the image 602. Further, AR device 600 may send a request to a remote server computer system (not necessarily object 604) for data associated with object 604 based on the identifier 608. In various embodiments, AR device 600 may be configured to overlay computer-generated graphic content in portion 606 that is painted in the particular color.
Turning now to
In various embodiments, AR device 600 may generate graphics frame 650 based on image 602. For example, in some embodiments, AR device 600 may determine a location to overlay graphic content 652 by detecting a boundary of the portion 606 painted in the particular color. Further, in some embodiments, AR device 600 may remove (e.g., filter) some percentage of the portion 606 based on the particular color and, in that space within graphics frame 650, overlay graphic content 652. In various embodiments, the graphic content 652 may be based on the data associated with object 604. Note that the nature of the graphic content 652 may vary based on the object 604 on which it is overlaid. In the depicted example, in which object 604 is a server computer system, graphic content 652 may include identification or status information associated with the server computer system, such as a MAC address, reported performance issues, etc.
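The filter-and-overlay step can be sketched as a simple color-keying operation, assuming the painted portion is a single saturated color. This is an illustrative NumPy sketch in which a flat fill stands in for the rendered graphic content 652; it is not the device's actual pipeline.

```python
import numpy as np

def overlay_on_keyed_region(frame, key_color, content_color, tol=10):
    """Replace pixels close to the designated key color with
    computer-generated content (here, a flat fill standing in for
    rendered graphics). frame is an (H, W, 3) uint8 array."""
    # Per-channel absolute difference from the key color, in int16 to
    # avoid uint8 wraparound.
    diff = np.abs(frame.astype(np.int16) - np.array(key_color, dtype=np.int16))
    # A pixel matches when every channel is within tolerance.
    mask = diff.max(axis=-1) <= tol
    out = frame.copy()
    out[mask] = content_color
    return out

# A tiny 2x2 image whose left column is painted the key color (pure green).
frame = np.zeros((2, 2, 3), dtype=np.uint8)
frame[:, 0] = (0, 255, 0)
result = overlay_on_keyed_region(frame, key_color=(0, 255, 0),
                                 content_color=(255, 255, 255))
print(result[:, 0].tolist())  # → [[255, 255, 255], [255, 255, 255]]
```

The boundary detection described above would amount to locating the edges of this mask; a real implementation would also need to tolerate lighting variation, for instance by keying in HSV space rather than on raw RGB values.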
In various embodiments, the disclosed systems and methods for providing computer-generated content in an augmented reality environment may provide various improvements to the functioning of an AR device and to the AR user experience. For example, when viewing an object 604 using an AR device, it may be desirable to incorporate computer-generated graphic content onto the object to view useful information relating to the object. Most physical objects, however, were not specifically designed for viewing in an AR environment. For example, consider a situation in which the surface of an object includes numerous components (e.g., buttons, labels, handles, etc.). In such a situation, when the AR device incorporates the computer-generated graphic content with the image of the object, the resulting image may be visually cluttered, making it difficult for a user to discern the added graphic content from the numerous features on the surface of the object. This result renders the addition of the graphic content less useful and detracts from the user experience. The disclosed systems and methods, however, may allow for more effective incorporation of computer-generated graphic content onto an object when viewed using an AR device. For example, by painting a portion of the object, such as the components or casing, with the particular color, the AR device may be able to detect a boundary of the portion, filter some amount of that portion from the image, and overlay graphic content in that space within a graphics frame. In various embodiments, this may result in a graphics frame with useful graphic content and less visual clutter, rendering the AR device more useful to the user. Thus, in various embodiments, the disclosed systems and methods may provide various advantages, particularly as they relate to incorporating computer-generated graphic content into an image in an AR context.
Note that, although only one portion 606 is shown in
Turning now to
As depicted in
AR device 600 further includes graphics unit 704. Graphics unit 704 may include one or more processors and/or one or more graphics processing units (GPUs). Graphics unit 704 may receive graphics-oriented instructions and execute GPU instructions or perform other operations based on the received graphics-oriented instructions. Graphics unit 704 may generally be configured to build images in a frame buffer for output to a display. AR device 600 further includes display unit 706. Display unit 706 may be configured to read data from a frame buffer and provide a stream of pixel values for display. Further, display unit 706 may include one or more interfaces for coupling to display 601.
AR device 600 further includes wireless interface 708, which, in various embodiments, may be configured to use various communication protocols, such as near-field communications (NFC), WiFi Direct, Bluetooth, etc. In various embodiments, wireless interface 708 may be configured to send requests for data associated with one or more objects to a server computer system and receive, from the server computer system, the data associated with the object, for example.
Referring now to
Method 800 then proceeds to block 804, which includes detecting an identifier in the first image. For example, in one embodiment, AR device 600 may detect an identifier in the form of a machine-readable optical code, such as a QR code, in the first image. Further, in some embodiments, the identifier may be included in the portion painted in the particular color. In other embodiments, however, the identifier may be included on some portion of the object other than the portion painted in the particular color.
Method 800 then proceeds to block 806, which includes sending a request for data associated with the object to a server computer system based on the identifier. For example, in some embodiments, AR device 600 may send a request to a server computer system storing information associated with the object. In such embodiments, the server computer system may be configured to service requests from AR device 600 by retrieving data associated with the object based on the identifier and sending that data to AR device 600. Method 800 then proceeds to block 808, which includes receiving, from the server computer system, the data associated with the object.
Method 800 then proceeds to block 810, which includes generating a graphics frame based on the first image. In some embodiments, block 810 may include generating the graphics frame by detecting a boundary of the portion painted in the particular color, and overlaying graphic content based on the data associated with the object. For example, as depicted in
Turning now to
Processor subsystem 920 may include one or more processors or processing units. In various embodiments of computer system 900, multiple instances of processor subsystem 920 may be coupled to interconnect 980. In various embodiments, processor subsystem 920 (or each processor unit within 920) may contain a cache or other form of on-board memory.
System memory 940 is usable to store program instructions executable by processor subsystem 920 to cause system 900 to perform various operations described herein. System memory 940 may be implemented using different physical, non-transitory memory media, such as hard disk storage, floppy disk storage, removable disk storage, flash memory, random access memory (RAM—SRAM, EDO RAM, SDRAM, DDR SDRAM, RAMBUS RAM, etc.), read only memory (PROM, EEPROM, etc.), and so on. Memory in computer system 900 is not limited to primary storage such as system memory 940. Rather, computer system 900 may also include other forms of storage such as cache memory in processor subsystem 920 and secondary storage on I/O Devices 970 (e.g., a hard drive, storage array, etc.). In some embodiments, these other forms of storage may also store program instructions executable by processor subsystem 920.
I/O interfaces 960 may be any of various types of interfaces configured to couple to and communicate with other devices, according to various embodiments. In one embodiment, I/O interface 960 is a bridge chip (e.g., Southbridge) from a front-side to one or more back-side buses. I/O interfaces 960 may be coupled to one or more I/O devices 970 via one or more corresponding buses or other interfaces. Examples of I/O devices 970 include storage devices (hard drive, optical drive, removable flash drive, storage array, SAN, or their associated controller), network interface devices (e.g., to a local or wide-area network), or other devices (e.g., graphics, user interface devices, etc.). In one embodiment, I/O devices 970 includes a network interface device (e.g., configured to communicate over WiFi, Bluetooth, Ethernet, etc.), and computer system 900 is coupled to a network via the network interface device.
As used herein, the term “based on” is used to describe one or more factors that affect a determination. This term does not foreclose the possibility that additional factors may affect the determination. That is, a determination may be solely based on specified factors or based on the specified factors as well as other, unspecified factors. Consider the phrase “determine A based on B.” This phrase specifies that B is a factor that is used to determine A or that affects the determination of A. This phrase does not foreclose that the determination of A may also be based on some other factor, such as C. This phrase is also intended to cover an embodiment in which A is determined based solely on B. As used herein, the phrase “based on” is synonymous with the phrase “based at least in part on.”
As used herein, the phrase “in response to” describes one or more factors that trigger an effect. This phrase does not foreclose the possibility that additional factors may affect or otherwise trigger the effect. That is, an effect may be solely in response to those factors, or may be in response to the specified factors as well as other, unspecified factors. Consider the phrase “perform A in response to B.” This phrase specifies that B is a factor that triggers the performance of A. This phrase does not foreclose that performing A may also be in response to some other factor, such as C. This phrase is also intended to cover an embodiment in which A is performed solely in response to B.
As used herein, the terms “first,” “second,” etc. are used as labels for nouns that they precede, and do not imply any type of ordering (e.g., spatial, temporal, logical, etc.), unless stated otherwise. For example, if a location system determines locations of four items in an environment, the terms “first item” and “second item” can be used to refer to any two of the four items.
When used in the claims, the term “or” is used as an inclusive or and not as an exclusive or. For example, the phrase “at least one of x, y, or z” means any one of x, y, and z, as well as any combination thereof (e.g., x and y, but not z).
It is to be understood that the present disclosure is not limited to particular devices or methods, which may, of course, vary. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting. As used herein, the singular forms “a,” “an,” and “the” include singular and plural referents unless the content clearly dictates otherwise. Furthermore, the word “may” is used throughout this application in a permissive sense (i.e., having the potential to, being able to), not in a mandatory sense (i.e., must). The term “include,” and derivations thereof, mean “including, but not limited to.” The term “coupled” means directly or indirectly connected.
Within this disclosure, different entities (which may variously be referred to as “units,” “circuits,” other components, etc.) may be described or claimed as “configured” to perform one or more tasks or operations. This formulation [entity] configured to [perform one or more tasks] is used herein to refer to structure (i.e., something physical, such as an electronic circuit). More specifically, this formulation is used to indicate that this structure is arranged to perform the one or more tasks during operation. A structure can be said to be “configured to” perform some task even if the structure is not currently being operated. A “memory device configured to store data” is intended to cover, for example, an integrated circuit that has circuitry that performs this function during operation, even if the integrated circuit in question is not currently being used (e.g., a power supply is not connected to it). Thus, an entity described or recited as “configured to” perform some task refers to something physical, such as a device, circuit, memory storing program instructions executable to implement the task, etc. This phrase is not used herein to refer to something intangible.
The term “configured to” is not intended to mean “configurable to.” An unprogrammed FPGA, for example, would not be considered to be “configured to” perform some specific function, although it may be “configurable to” perform that function after programming.
Reciting in the appended claims that a structure is “configured to” perform one or more tasks is expressly intended not to invoke 35 U.S.C. § 112(f) for that claim element. Accordingly, none of the claims in this application as filed are intended to be interpreted as having means-plus-function elements. Should Applicant wish to invoke Section 112(f) during prosecution, it will recite claim elements using the “means for” [performing a function] construct.
The scope of the present disclosure includes any feature or combination of features disclosed herein (either explicitly or implicitly), or any generalization thereof, whether or not it mitigates any or all of the problems addressed herein. Accordingly, new claims may be formulated during prosecution of this application (or an application claiming priority thereto) to any such combination of features. In particular, with reference to the appended claims, features from dependent claims may be combined with those of the independent claims and features from respective independent claims may be combined in any appropriate manner and not merely in the specific combinations enumerated in the appended claims.