A facility can experience issues during the course of operation. These issues may take many forms, such as issues relating to personal security, physical hazards, product conditions, product availability, and other conditions. Before the issues can be addressed, they must first be identified.
Illustrative embodiments are shown by way of example in the accompanying drawings and should not be considered as a limitation of the present disclosure:
Described in detail herein is an issue reporting and detection system. A user can capture an image of an issue at a location in a facility using an image capturing device integrated with the user's mobile device. An application executing on the mobile device can receive user-provided input associated with the image and transmit the image and the input associated with the image from the user's mobile device to a computing system associated with the facility. The computing system can receive the image and the input associated with the image and can extract a set of attributes from the image. As explained further below, the computing system can analyze the image to extract attributes such as product barcodes, shelf or other facility identifiers, or some other type of attribute that can be used to determine a location in the facility. The computing system can query the issue types database to retrieve a type of facility issue using the input associated with the image. The computing system can determine the location at which the image was taken in the facility based on the set of attributes and can then transmit a command, based on the type of facility issue, to a selected Unmanned Aerial/Ground Vehicle (UAGV) to navigate to the determined location in the facility. In one embodiment, the selected UAGV is one of a group of UAGVs available in the facility and may be selected based on proximity to the determined location. In another embodiment, the selected UAGV may be selected based on remaining battery life or other criteria.

The selected UAGV can capture an image of the reported issue at the location in the facility using an image capturing device coupled to the UAGV. The selected UAGV can transmit the image to the computing system. The computing system is further configured to extract a set of attributes from the image captured by the image capturing device coupled to the UAGV in order to confirm the type of facility issue based on the extracted set of attributes. The computing system may also transmit an alert in response to confirming the type of facility issue. In one embodiment, the alert may be directed to an authorized individual at the facility to address the determined issue. For example, if the issue is a missing product or broken glass, the alert may be sent to an employee able to replace the product or clean up the broken glass. In another embodiment, instead of alerting an employee, the computing system may transmit an alert to a UAGV with ground-based navigational capability that also has the ability to dispose of broken glass. In an embodiment, the UAGV that is alerted may be the same UAGV that captured an image to confirm the type of issue.
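As a non-limiting illustration of the flow just described, the following Python sketch traces a report from submission through confirmation. Every function, identifier, and data value here is hypothetical; the disclosure describes behavior, not an implementation.

```python
# Hypothetical end-to-end sketch of the reporting/confirmation flow.
# All function bodies are stand-ins for the components described above.

def extract_attributes(image: bytes) -> dict:
    """Stand-in for video analytics; e.g., decoding a product barcode."""
    return {"barcode": "012345"}

def lookup_issue_type(user_input: str) -> str:
    """Stand-in for a query against the issue types database."""
    return "spill" if "spill" in user_input.lower() else "unknown"

def lookup_location(attributes: dict) -> str | None:
    """Stand-in for a query against the physical objects database."""
    return {"012345": "aisle 7, bay 3"}.get(attributes.get("barcode"))

class StubUAGV:
    def navigate_to(self, location: str) -> None:
        print("UAGV en route to", location)
    def capture_image(self) -> bytes:
        return b"<jpeg>"

def handle_report(image: bytes, user_input: str, uagv: StubUAGV) -> None:
    attributes = extract_attributes(image)
    issue_type = lookup_issue_type(user_input)
    location = lookup_location(attributes)
    uagv.navigate_to(location)
    confirmation = uagv.capture_image()   # analyzed to confirm the issue
    if confirmation:                      # confirmation logic elided here
        print(f"alert: {issue_type} confirmed at {location}")

handle_report(b"<user jpeg>", "Spill near the juice shelf", StubUAGV())
```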
As used herein, the term “UAGV” should be understood to encompass unmanned vehicles having either or both of ground-based and aerial-based navigational capability. It will be appreciated that certain tasks described herein are appropriately performed by vehicles that are primarily land-based, such as, but not limited to, picking up hazards, stocking shelves, and moving objects to their correct locations, while certain other tasks may be more appropriate for aerial-based vehicles, such as, but not limited to, scanning for issues, reconnaissance of user-reported issues, and searching for sources of hazards to feed the information to a ground-based UAGV. It should further be appreciated that some tasks may be performed by UAGVs regardless of whether their primary navigation mode is ground-based or aerial-based.
With reference to
With reference to
Once a user either captures an image or selects an image from the stored photo library, the mobile application displays the image within the upload photo input box. The image can be a photo of a location within the facility in which the missing physical object is designated to be disposed. The image can include information such as a machine-readable element encoded with an identifier associated with the physical object, or a neighboring physical object. In some embodiments, the size of the image will be reduced so that it can fit into the upload photo input box 118. The user can crop and/or move the image within the upload photo input box 118. The user can enter a name of the missing physical object in the product name input box 120. The product name input box 120 may accept alphanumeric input. Once the user has uploaded the image in the upload photo input box 118 and entered the name of the missing physical object in the product name input box 120, the user can select the submit button. In response to selecting the submit button, the image and the name of the physical object can be transmitted to a computing system 300. The computing system 300 will be discussed in further detail with reference to
With reference to
Once a user either captures an image or selects an image from the stored photo library, the mobile application may display the image within the upload photo input box. The image can be a photo of the location of the emergency within the facility. The image can include the emergency itself. The image can also include information indicating the location at which the image was taken, such as a machine-readable element encoded with an identifier associated with a physical object disposed at the location, or some sort of landmark at the location. In some embodiments, the size of the image will be reduced so that it can fit into the upload photo input box 132. The user can crop and/or move the image within the upload photo input box 132. The user can enter input describing the emergency in the emergency type input box 134. The emergency type input box 134 may accept alphanumeric input. In some embodiments, each emergency can have a specific alphanumeric code/identifier. Once the image appears in the upload photo input box 132 and the user has entered the type of emergency, the user can select the submit button. In response to selecting the submit button, the image and the input regarding the type of emergency can be transmitted to a computing system 300. The computing system 300 will be discussed in further detail with reference to
With reference to
Once a user either captures an image or selects an image from the stored photo library, the mobile application can display the image within the upload photo input box. The image can be a photo of a location within the facility where there is an in-store issue. The image can include the in-store issue. The image can also include information indicating the location at which the image was taken, such as a machine-readable element encoded with an identifier associated with a physical object disposed at the location, or some sort of landmark at the location. In some embodiments, the size of the image will be reduced so that it can fit into the upload photo input box 140. The user can crop and/or move the image within the upload photo input box 140. The user can enter text regarding the issue in the issue type input box 142. The issue type input box 142 may accept alphanumeric input. In some embodiments, each issue can have a specific alphanumeric code/identifier. Once the image appears in the upload photo input box 140 and the user has entered input regarding the type of issue, the user can select the submit button 144. In response to selecting the submit button 144, the image and the type of issue can be transmitted to a computing system 300. The computing system 300 will be discussed in further detail with reference to
With further reference to
The UAGV 200 can include a speaker system 206, a light source 208, and an image capturing device 210. The image capturing device 210 can be configured to capture still or moving images. The light source 208 can be configured to generate various types of light and to generate various effects using the light. The speaker system 206 can be configured to generate audible sounds. The UAGV 200 can include a controller 212a and an inertial navigation system, which can include a GPS receiver 212b, an accelerometer 212c, and a gyroscope 212d. The UAGV 200 can also include a motor 212e. The controller 212a can be programmed to control the operation of the image capturing device 210, the GPS receiver 212b, the accelerometer 212c, the gyroscope 212d, the motor 212e, and the motive assemblies 204 (e.g., via the motor 212e), in response to various inputs including inputs from the GPS receiver 212b, the accelerometer 212c, and the gyroscope 212d. The motor 212e can control the operation of the motive assemblies 204 directly and/or through one or more drive trains (e.g., gear assemblies and/or belts). The motive assemblies 204 can be, but are not limited to, wheels, tracks, rotors, rotors with blades, and propellers.
The GPS receiver 212b can be an L-band radio processor capable of solving navigation equations in order to determine a position, velocity, and precise time (PVT) of the UAGV 200 by processing a signal broadcast by GPS satellites. The accelerometer 212c and gyroscope 212d can determine the direction, orientation, position, acceleration, velocity, tilt, pitch, yaw, and roll of the UAGV 200. In exemplary embodiments, the controller can implement one or more algorithms, such as a Kalman filter, for determining a position of the UAGV 200.
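The disclosure names a Kalman filter as one position-estimation option but gives no formulation. The following is a minimal one-dimensional sketch that fuses noisy position fixes under a constant-position model; a real UAGV filter would track position, velocity, and attitude in several dimensions using the IMU inputs described above, and the noise values here are invented.

```python
def kalman_1d(measurements, meas_var=25.0, process_var=1.0):
    """Minimal 1-D Kalman filter: fuses noisy position fixes into one estimate.

    Assumes a constant-position process model with Gaussian noise; the
    variances are illustrative, not taken from the disclosure.
    """
    estimate, error_var = measurements[0], meas_var
    for z in measurements[1:]:
        error_var += process_var                  # predict: uncertainty grows
        gain = error_var / (error_var + meas_var)
        estimate += gain * (z - estimate)         # update: blend in measurement
        error_var *= (1.0 - gain)
    return estimate

# Noisy GPS fixes (meters along an aisle); the true position is near 12 m.
print(kalman_1d([10.8, 13.1, 11.5, 12.6, 12.2]))
```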
In one embodiment, the UAGV 200 can also include a sensor 214. The sensor 214 can be one or more of a moisture sensor, an ultraviolet light sensor, or a molecular scanner. In the event the sensor 214 is a moisture sensor, the sensor can detect moisture emitted by physical objects. In the event the sensor 214 is an ultraviolet light sensor, the sensor 214 can be configured to detect ultraviolet light in a facility. In the event the sensor 214 is a molecular scanner, the sensor 214 can use a near-IR spectroscopy method to determine the contents of a physical object. The vibrations of the object's molecules can be detected and referenced against a database of molecular compositions and their vibrational signatures. Using the detected molecular vibrations, the contents of a physical object can be determined. As a non-limiting example, molecular scanners can be used for determining the contents of the following physical objects: pharmaceuticals, food, beverages, art, collectibles, and jewelry.
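The near-IR matching step can be pictured as a nearest-neighbor search over reference spectra. In the sketch below, spectra are assumed to be pre-binned into fixed-length intensity vectors, and the reference names and values are invented purely for illustration.

```python
import math

# Invented reference "spectra": intensity vectors binned over near-IR bands.
REFERENCE_SPECTRA = {
    "acetaminophen": [0.12, 0.80, 0.35, 0.05],
    "sucrose":       [0.40, 0.10, 0.75, 0.30],
}

def cosine(a, b):
    """Cosine similarity between two equal-length intensity vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def identify(sample):
    """Return the reference substance whose spectrum best matches the sample."""
    return max(REFERENCE_SPECTRA,
               key=lambda name: cosine(sample, REFERENCE_SPECTRA[name]))

print(identify([0.11, 0.78, 0.33, 0.06]))   # -> acetaminophen
```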
In exemplary embodiments, the autonomous UAGV 200 may receive instructions from the computing system 300 to confirm an issue which has been reported as described in
In an example embodiment, one or more portions of the communications network 315 can be an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless wide area network (WWAN), a metropolitan area network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, a wireless network, a WiFi network, a WiMax network, any other type of network, or a combination of two or more such networks.
The computing system 300 includes one or more computers or processors configured to communicate with the databases 305, the mobile devices 100, and the UAGVs 200 via the network 315. In one embodiment, the computing system 300 is associated with a facility. The computing system 300 hosts one or more applications configured to interact with one or more components of the issue reporting and confirmation system 350. The databases 305 may store information/data, as described herein. For example, the databases 305 can include an images database 345, a physical objects database 335, and an issue types database 325. The images database 345 can store images captured by the image capturing device 210 of the UAGVs 200 and/or images captured by the user on their mobile device. The physical objects database 335 can store information associated with physical objects. The issue types database 325 can include types of issues related to facilities and types of emergencies. The issue types database 325 can include alphanumeric codes and/or identifiers of types of issues. The databases 305 and server 310 can be located at one or more geographically distributed locations from each other or from the computing system 300. Alternatively, the databases 305 can be included within server 310 or computing system 300.
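The three databases could be realized in many ways; the following sketch uses SQLite purely for illustration, and every table and column name is an assumption rather than part of the disclosure.

```python
import sqlite3

# Minimal, hypothetical schema for the three databases named above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE issue_types (
    code        TEXT PRIMARY KEY,   -- alphanumeric code/identifier
    description TEXT NOT NULL       -- e.g., 'spill', 'broken fixture'
);
CREATE TABLE physical_objects (
    identifier          TEXT PRIMARY KEY,  -- decoded machine-readable element
    name                TEXT NOT NULL,
    designated_location TEXT NOT NULL
);
CREATE TABLE images (
    id     INTEGER PRIMARY KEY,
    source TEXT CHECK (source IN ('user', 'uagv')),
    data   BLOB
);
""")

conn.execute("INSERT INTO physical_objects VALUES ('012345', 'juice 1L', 'aisle 7, bay 3')")
row = conn.execute(
    "SELECT designated_location FROM physical_objects WHERE identifier = ?",
    ("012345",)).fetchone()
print(row[0])   # aisle 7, bay 3
```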
In exemplary embodiments, a user can discover an issue in a facility. For example, the issue can be a missing physical object, an issue associated with the facility, or an emergency. In the event the issue is a missing physical object, a physical object can be missing from a designated location within the facility. Exemplary issues associated with a facility can be, but are not limited to, one or more of a need for more associates in a certain location, a broken fixture in the facility, damaged or decomposing products, and/or other assistance needed in the facility. Exemplary emergencies can be, but are not limited to, one or more of a fire, a spill, broken glass, theft, and/or any other dangerous condition in the facility. The user's mobile device 100 can execute an application associated with the facility. The application can display a user interface prompting the user to upload an image of the facility issue and enter input associated with the facility issue. In one embodiment, the mobile device 100 can automatically execute the image capturing device 103 in response to the user interacting with the user interface displayed by the executed application. The user can capture an image of the issue using the image capturing device 103 of their mobile device 100. Alternatively, the user can capture an image of the issue prior to executing the application, and interaction with the user interface displayed by the application can automatically retrieve stored images so that the user can select at least one of the stored images corresponding to the issue. The user can upload the captured or selected image and enter input associated with the issue. For example, the user can enter alphanumeric input associated with the name of the missing physical object, the type of issue associated with the facility, and/or the type of emergency. The user can submit the uploaded image and the input associated with the issue. The uploaded image and input associated with the issue can be transmitted to the computing system 300.
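On the mobile side, the transmission step might look like the following sketch. The endpoint URL and field names are hypothetical; the disclosure specifies only that the image and the user's input are sent to the computing system 300.

```python
import requests  # third-party; pip install requests

# Hypothetical endpoint and field names; the disclosure defines no wire format.
ENDPOINT = "https://facility.example.com/api/issue-reports"

def submit_report(image_path: str, issue_text: str) -> int:
    """Upload the captured/selected photo plus the user's input; return HTTP status."""
    with open(image_path, "rb") as f:
        response = requests.post(
            ENDPOINT,
            files={"image": f},          # the captured or selected photo
            data={"input": issue_text},  # e.g., product name or issue type
            timeout=10,
        )
    return response.status_code

# Example (endpoint is fictitious, so the call is shown but not executed):
# submit_report("spill.jpg", "spill in aisle 7")
```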
The computing system 300 can execute the routing engine 320 in response to receiving the image and the input associated with the image. The routing engine 320 can extract a set of attributes from the image using video analytics and/or machine vision. The routing engine 320 can determine the location of the issue within the facility based on the extracted set of attributes. For example, the extracted set of attributes can include a machine-readable element encoded with an identifier of a physical object designated to be disposed in the facility. The routing engine 320 can extract the machine-readable element from the image and decode the identifier associated with the machine-readable element. The routing engine 320 can query the physical objects database 335 using the identifier to determine the location at which the physical object is designated to be disposed in the facility. In some embodiments, the routing engine 320 can extract a landmark from the image to determine the location of the reported issue. The routing engine 320 can query the issue types database 325 using the input associated with the image to determine the type of facility issue. In some embodiments, the user input will match a stored type of issue. In other embodiments, the input will be parsed based on pre-defined criteria (e.g., keywords) to determine the type of issue the user is attempting to report. In one embodiment, in the event the issue being reported is a missing physical object, the routing engine 320 can query the physical objects database 335 using the input associated with the name of the missing physical object to determine the identity of the physical object.
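The two-stage lookup described above (an exact match against stored issue types, then keyword parsing as a fallback) can be sketched as follows; the codes and keywords are invented for illustration.

```python
# Invented issue-type codes and keyword criteria, standing in for the
# contents of the issue types database 325.
ISSUE_TYPES = {"SPILL-01": "spill", "GLASS-02": "broken glass", "OOS-03": "missing product"}
KEYWORDS = {"wet": "SPILL-01", "spill": "SPILL-01", "glass": "GLASS-02",
            "empty": "OOS-03", "missing": "OOS-03"}

def resolve_issue_type(user_input: str) -> str | None:
    """Exact code or name match first; otherwise fall back to keyword parsing."""
    text = user_input.strip().lower()
    if text.upper() in ISSUE_TYPES:           # user entered the stored code itself
        return text.upper()
    for code, name in ISSUE_TYPES.items():    # user entered the stored type name
        if text == name:
            return code
    for keyword, code in KEYWORDS.items():    # free text: pre-defined keywords
        if keyword in text:
            return code
    return None                               # no match; issue type unresolved

print(resolve_issue_type("there is broken glass by the dairy case"))  # GLASS-02
```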
The routing engine 320 can instruct a UAGV 200 to confirm the reported issue. In some embodiments, the routing engine 320 can detect a UAGV 200 within a specified threshold distance of the determined location of the issue and instruct the detected UAGV 200 to confirm the reported issue based on proximity. In another embodiment, the UAGV 200 may be selected based on other criteria, such as remaining battery life or UAGV capabilities. The instructions can include the determined location, the image, the identification of the physical object, and the determined type of facility issue.
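One way to realize these selection criteria is sketched below. The numeric values and the tie-breaking rule are illustrative assumptions; the disclosure names only proximity, battery life, and capabilities as criteria.

```python
from dataclasses import dataclass
import math

@dataclass
class UAGV:
    ident: str
    x: float
    y: float
    battery_pct: float

def select_uagv(fleet, target, threshold=50.0, min_battery=20.0):
    """Pick the nearest UAGV within the threshold distance with enough battery."""
    tx, ty = target
    candidates = [u for u in fleet
                  if u.battery_pct >= min_battery
                  and math.hypot(u.x - tx, u.y - ty) <= threshold]
    # Nearest first; higher remaining battery breaks distance ties.
    return min(candidates,
               key=lambda u: (math.hypot(u.x - tx, u.y - ty), -u.battery_pct),
               default=None)

fleet = [UAGV("A", 5, 5, 80), UAGV("B", 2, 1, 15), UAGV("C", 40, 40, 95)]
print(select_uagv(fleet, target=(0.0, 0.0)).ident)   # "A": nearest eligible unit
```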
The UAGV 200 can receive instructions from the routing engine 320 to confirm the reported issue. The UAGV 200 can navigate to the determined location. The UAGV 200 can capture an image of the location of the reported issue using the image capturing device 210. In some embodiments, the UAGV 200 can capture multiple images of the location of the reported issue. The UAGV 200 can transmit the images to the computing system 300. In some embodiments, the UAGV 200 can scan the location using the image capturing device 210 to detect the issue based on the received facility issue type.
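On the UAGV side, a confirmation task might be structured as follows. The `drive`, `camera`, and `uplink` interfaces are hypothetical stand-ins for the controller 212a, the image capturing device 210, and the network link; none of these names appear in the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Instruction:
    location: tuple      # coordinates resolved by the routing engine 320
    issue_type: str      # from the issue types database 325
    shots: int = 3       # number of frames to capture (assumed)

class ConfirmationTask:
    """Hypothetical UAGV-side handler for a 'confirm issue' instruction."""

    def __init__(self, drive, camera, uplink):
        self.drive, self.camera, self.uplink = drive, camera, uplink

    def run(self, instr: Instruction) -> None:
        self.drive.navigate_to(instr.location)
        # Several frames make confirmation by the computing system more
        # robust than a single image.
        images = [self.camera.capture() for _ in range(instr.shots)]
        self.uplink.send({"issue_type": instr.issue_type, "images": images})

class _Stub:  # stand-in for the drive assembly, camera, and network link
    def navigate_to(self, loc): print("navigating to", loc)
    def capture(self): return b"<jpeg>"
    def send(self, payload): print("uploading", len(payload["images"]), "images")

s = _Stub()
ConfirmationTask(s, s, s).run(Instruction(location=(7, 3), issue_type="spill"))
```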
The computing system 300 can receive the images from the UAGV 200. The routing engine 320 can extract attributes from the images captured by the UAGV 200. The routing engine 320 can query the images database 345 to retrieve the image received from the user. The routing engine 320 can perform video analysis of the image received from the UAGV 200 to extract attributes from the image. In one embodiment, the routing engine can compare the extracted attributes of the images received from the UAGV 200 with the extracted attributes of the image received from the user. In another embodiment, the image from the UAGV 200 may be analyzed to detect an issue without relying on the image received from the user. The routing engine 320 can confirm the reported issue following analysis of the image received from the UAGV 200.
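The attribute comparison can be as simple as counting agreements between the two extracted attribute sets. The rule below is a deliberately naive stand-in for whatever comparison the routing engine 320 actually performs; the attribute names are invented.

```python
def confirm_issue(user_attrs: dict, uagv_attrs: dict, required_overlap: int = 1) -> bool:
    """Confirm a report by comparing attributes extracted from the two images.

    Counts attribute key/value pairs present in both sets; the overlap
    threshold is an illustrative assumption.
    """
    overlap = sum(1 for k, v in user_attrs.items() if uagv_attrs.get(k) == v)
    return overlap >= required_overlap

user_attrs = {"barcode": "012345", "issue": "spill"}
uagv_attrs = {"barcode": "012345", "issue": "spill", "landmark": "endcap 7"}
print(confirm_issue(user_attrs, uagv_attrs))   # True: report confirmed
```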
In one embodiment, in the event the reported issue is associated with a missing physical object and the routing engine 320 is able to confirm that the physical object is missing from the designated location within the facility, the computing system 300 can transmit an alert to an authorized individual in the facility.
In another embodiment, the routing engine 320 can instruct the UAGV 200 to retrieve and deposit like physical objects in the designated location of the missing physical object within the facility. The routing engine can query the physical objects database 335 to determine a location, other than the designated location, at which like physical objects are disposed within the facility, and provide that information to the UAGV 200. The UAGV 200 can navigate to the location, pick up a specified quantity of like physical objects, and carry the like physical objects to the designated location. The UAGV 200 can deposit the like physical objects in the designated location. The UAGV 200 can update the physical objects database 335 based on the specified quantity of like physical objects deposited in the designated location. The UAGV 200 can also transmit an alert to the routing engine indicating that it has deposited like physical objects in the designated location. The routing engine 320 can transmit an alert to the mobile device 100 indicating that the like physical objects have been deposited. The mobile device 100 can display the alert on the display 102. In some embodiments, the alert can be displayed on the user interface displayed by the application.
In an embodiment, the UAGV 200 may detect that items in the facility are misplaced, such as being in a wrong shelf location or located on the floor in a facility aisle. The UAGV can inform the routing engine of a misplaced item, receive an assigned location for the misplaced item from the routing engine following the routing engine's query of the physical objects database 335, retrieve the misplaced item, and return the misplaced item to its assigned location. In some cases, replacing the misplaced item may only require the UAGV 200 to place the item into a different location on the same shelf. In other circumstances, the UAGV 200 may need to navigate large distances in the facility to perform the replacement operation.
In one embodiment, upon items being confirmed as missing by a UAGV, the routing engine can query the physical objects database 335 to determine whether like physical objects are disposed in a location other than the designated location within the facility. If no alternate location within the facility is found, the routing engine may programmatically initiate a resupply order for a quantity of the missing items from a facility supplier for delivery to the facility.
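The restock-or-reorder decision can be sketched as follows; the stock index, quantities, and reorder policy are invented for illustration.

```python
# Hypothetical in-facility stock index: object identifier -> {location: quantity}.
STOCK = {"012345": {"backroom bay 2": 12}}

def restock_or_reorder(object_id: str, designated_location: str, reorder_qty: int = 24):
    """Decide between an in-facility restock and a supplier resupply order."""
    alternates = {loc: qty for loc, qty in STOCK.get(object_id, {}).items()
                  if loc != designated_location and qty > 0}
    if alternates:
        source = max(alternates, key=alternates.get)  # largest on-hand quantity
        return ("restock", source)                    # dispatch a UAGV to move stock
    return ("reorder", reorder_qty)                   # programmatic supplier order

print(restock_or_reorder("012345", "aisle 7, bay 3"))  # ('restock', 'backroom bay 2')
```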
In one embodiment, in the event the issue is an emergency and the routing engine 320 is able to confirm the emergency at the location within the facility, the routing engine 320 can transmit an alert. The routing engine 320 can also instruct the UAGV 200 to sound an alarm at the location of the emergency. For example, the UAGV 200 may generate a light effect using the light source 208 and generate audible sounds using the speaker system 206 at the location of the emergency in response to receiving the instructions.
In one embodiment, in the event the issue is an issue associated with the facility and the routing engine 320 is able to confirm the issue at the location within the facility, the routing engine 320 can transmit an alert. In a non-limiting example, the routing engine 320 can transmit an alert to a mobile device of an associate working within the facility.
As described above, in one embodiment, the UAGV 200 can also include a sensor 214. The sensor 214 can be one or more of a moisture sensor, an ultraviolet light sensor, a molecular scanner, or a sensor providing an X-ray capability. In the event the sensor 214 is a moisture sensor, the sensor can detect moisture emitted by physical objects. In the event the sensor 214 is an ultraviolet light sensor, the sensor 214 can be configured to detect ultraviolet light in a facility. In the event the sensor 214 is a molecular scanner, the sensor 214 can use a near-IR spectroscopy method to determine the contents of a physical object. The vibrations of the object's molecules can be detected and referenced against a database of molecular compositions and their vibrational signatures, and the contents of the physical object can be determined from the detected molecular vibrations. As a non-limiting example, molecular scanners can be used for determining the contents of the following physical objects: pharmaceuticals, food, beverages, art, collectibles, and jewelry. In the event the sensor 214 provides an X-ray capability, the sensor 214 can take an X-ray to view the inside of physical objects and determine their state. The sensor 214 can detect broken, damaged, spoiled, deteriorating, degenerating, decaying, or decomposing physical objects in the facility. In response to detecting such physical objects, the image capturing device 210 can capture an image of the affected physical objects and transmit the image to the computing system 300. The computing system 300 can receive the image, and the routing engine 320 can instruct the UAGV 200 to take remedial measures. For example, the routing engine 320 can instruct the UAGV 200 to clean up a broken physical object, replace the damaged physical objects, and/or transmit an alert regarding the broken/damaged physical objects.
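The choice of remedial measure can be modeled as a simple lookup from detected condition to ordered actions, as sketched below. The mapping itself is an assumption; the disclosure names cleanup, replacement, and alerting only as examples.

```python
# Hypothetical mapping from detected condition to remedial measures.
REMEDIES = {
    "broken":      ("clean_up", "alert"),
    "damaged":     ("replace", "alert"),
    "spoiled":     ("replace", "alert"),
    "decomposing": ("replace", "alert"),
}

def remedial_measures(condition: str) -> tuple:
    """Return the ordered remedial actions for a detected condition."""
    return REMEDIES.get(condition, ("alert",))   # default: alert only

print(remedial_measures("broken"))   # ('clean_up', 'alert')
```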
As a non-limiting example, the issue reporting and confirmation system 350 can be implemented in a retail store. A customer can travel around the retail store and report an issue within the retail store using a mobile application associated with the retail store that is executed on the customer's smartphone or other mobile device.
As noted above, the routing engine 320 can extract a set of attributes from an image using video analytics and/or machine vision. The types of machine vision and/or video analytics used by the routing engine 320 can be, but are not limited to: stitching/registration, filtering, thresholding, pixel counting, segmentation, inpainting, edge detection, color analysis, blob discovery and manipulation, neural net processing, pattern recognition, barcode/Data Matrix and “2D barcode” reading, optical character recognition, and gauging/metrology. The routing engine 320 can determine the location of the issue within the facility based on the extracted set of attributes.
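As a non-limiting example of the barcode-reading step, the following sketch uses the third-party pyzbar and Pillow libraries; any machine-vision stack with barcode support would serve equally well, and the file name shown is hypothetical.

```python
# One possible realization of the barcode-reading step.
# pip install pyzbar pillow
from pyzbar.pyzbar import decode
from PIL import Image

def extract_barcodes(image_path: str) -> list[str]:
    """Return the decoded identifier for every barcode found in the image."""
    return [result.data.decode("utf-8")
            for result in decode(Image.open(image_path))]

# Example (hypothetical file):
# identifiers = extract_barcodes("report_photo.jpg")
# Each identifier can then key a lookup in the physical objects database 335.
```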
As noted above, the UAGV 200 can capture an image of the location of the reported issue using the image capturing device 210. In some embodiments, the UAGV 200 can capture multiple images of the location of the reported issue.
Virtualization may be employed in the computing device 400 so that infrastructure and resources in the computing device 400 may be shared dynamically. A virtual machine 412 may be provided to handle a process running on multiple processors so that the process appears to be using only one computing resource rather than multiple computing resources. Multiple virtual machines may also be used with one processor.
Memory 406 may include a computer system memory or random access memory, such as DRAM, SRAM, EDO RAM, and the like. Memory 406 may include other types of memory as well, or combinations thereof.
A user may interact with the computing device 400 through a visual display device 414, such as a computer monitor, which may display one or more graphical user interfaces 416; a multi-touch interface 420; a pointing device 418; and an image capturing device 434. The image capturing device 434 can be configured to capture still or moving images. The light source 436 can be configured to generate light effects. The speakers 432 can be configured to generate audible sounds.
The computing device 400 may also include one or more storage devices 426, such as a hard drive, CD-ROM, or other computer-readable media, for storing data and computer-readable instructions and/or software that implement exemplary embodiments of the present disclosure (e.g., applications such as the routing engine 320). For example, exemplary storage device 426 can include one or more databases 428 for storing information associated with physical objects, captured images, and issue types. The databases 428 may be updated manually or automatically at any suitable time to add, delete, and/or update one or more data items in the databases.
The computing device 400 can include a network interface 408 configured to interface via one or more network devices 424 with one or more networks, for example, a Local Area Network (LAN), Wide Area Network (WAN), or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (for example, 802.11, T1, T3, 56 kb, X.25), broadband connections (for example, ISDN, Frame Relay, ATM), wireless connections, controller area network (CAN), or some combination of any or all of the above. In exemplary embodiments, the computing system can include one or more antennas 422 to facilitate wireless communication (e.g., via the network interface) between the computing device 400 and a network and/or between the computing device 400 and other computing devices. The network interface 408 may include a built-in network adapter, network interface card, PCMCIA network card, card bus network adapter, wireless network adapter, USB network adapter, modem, or any other device suitable for interfacing the computing device 400 to any type of network capable of communication and performing the operations described herein.
The computing device 400 may run any operating system 410, such as versions of the Microsoft® Windows® operating systems, different releases of the Unix and Linux operating systems, versions of the MacOS® for Macintosh computers, embedded operating systems, real-time operating systems, open source operating systems, proprietary operating systems, or any other operating system capable of running on the computing device 400 and performing the operations described herein. In exemplary embodiments, the operating system 410 may be run in native mode or emulated mode. In an exemplary embodiment, the operating system 410 may be run on one or more cloud machine instances.
In describing exemplary embodiments, specific terminology is used for the sake of clarity. For purposes of description, each specific term is intended to at least include all technical and functional equivalents that operate in a similar manner to accomplish a similar purpose. Additionally, in some instances where a particular exemplary embodiment includes multiple system elements, device components, or method steps, those elements, components, or steps may be replaced with a single element, component, or step. Likewise, a single element, component, or step may be replaced with multiple elements, components, or steps that serve the same purpose. Moreover, while exemplary embodiments have been shown and described with references to particular embodiments thereof, those of ordinary skill in the art will understand that various substitutions and alterations in form and detail may be made therein without departing from the scope of the present disclosure. Further still, other aspects, functions, and advantages are also within the scope of the present disclosure.
Exemplary flowcharts are provided herein for illustrative purposes and are non-limiting examples of methods. One of ordinary skill in the art will recognize that exemplary methods may include more or fewer steps than those illustrated in the exemplary flowcharts, and that the steps in the exemplary flowcharts may be performed in a different order than the order shown in the illustrative flowcharts.
This application claims priority to U.S. Provisional Application No. 62/459,876, filed on Feb. 16, 2017, and U.S. Provisional Application No. 62/467,510, filed on Mar. 6, 2017, the content of each of which is hereby incorporated by reference in its entirety.
| Number | Date | Country |
| --- | --- | --- |
| 62467510 | Mar 2017 | US |
| 62459876 | Feb 2017 | US |