The invention relates to a device and method for assisting with the safe use of work machines such as mobile stone crushers and screeners and in particular to safe movement on or around work machines.
A crusher is a machine designed to reduce large rocks into smaller rocks, gravel, or rock dust. Crushers may be used to reduce the size, or change the form, of waste materials so they can be more easily disposed of or recycled, or to reduce the size of a solid mix of raw materials (as in rock ore), so that pieces of different composition can be differentiated.
Mining, quarrying, demolition and recycling operations use crushers, commonly classified by the degree to which they fragment the starting material, with primary and secondary crushers handling coarse materials, and tertiary and quaternary crushers reducing ore particles to finer gradations. Each crusher is designed to work with a certain maximum size of raw material, and often delivers its output to a screening machine which sorts and directs the product for further processing.
Machines such as the crushers illustrated in
As is apparent from
Risks include:
In general, hazards are shown by a black symbol inside a yellow triangle with a black outline. Prohibitions are shown by a black symbol inside a red circle with a diagonal red bar that extends across the black symbol. Mandatory actions are illustrated by a white symbol inside a blue circle. Such symbols are well known and specific examples can be found in machine health and safety manuals. In addition, operator safety features are included, for example:
In accordance with a first aspect of the invention there is provided a device for assisting with the safe use of a machine, the device comprising:
a user interface for displaying an augmented reality scene;
a camera for capturing in real time, video of an area of interest for inclusion in an augmented reality scene;
a positioning module for determining the position of the device relative to the machine;
a shape recognition module for recognising an actual location near the machine in the video;
a database of virtual safe areas around the machine which are displayable in the augmented reality scene;
a matching module for matching the virtual safe areas to the corresponding actual locations, to identify to a user, in the augmented reality scene, areas around the machine which are safe and corresponding areas which are unsafe.
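Purely by way of illustration, the cooperation of these modules may be sketched as follows. The class and method names (SafeArea, PositioningModule, ShapeRecognitionModule, MatchingModule, classify) are hypothetical and do not form part of the invention; the sketch merely shows one possible decomposition of the first aspect.

```python
# Illustrative sketch only; hypothetical names, not a definitive implementation.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SafeArea:
    name: str                           # e.g. "rear of conveyor"
    polygon: List[Tuple[float, float]]  # boundary relative to the machine origin, in metres
    safe_when_active: bool              # is the area safe even while the machine is running?

class PositioningModule:
    """Determines the position of the device relative to the machine."""
    def device_position(self) -> Tuple[float, float]:
        raise NotImplementedError

class ShapeRecognitionModule:
    """Recognises an actual location near the machine in the live video."""
    def recognise(self, frame):
        raise NotImplementedError

class MatchingModule:
    """Matches virtual safe areas to the scene so safe and unsafe areas can be shown."""
    def __init__(self, database: List[SafeArea]):
        self.database = database

    def classify(self, machine_active: bool) -> List[Tuple[SafeArea, bool]]:
        # Each virtual area is paired with a safe/unsafe flag for display in the AR scene.
        return [(area, area.safe_when_active or not machine_active)
                for area in self.database]
```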
In at least one embodiment of the invention, sensors determine the active status of the machine and the safety of a zone is determined depending on whether the zone is in or near an area where the machine is inactive or active.
In at least one embodiment, the machine is paired with the device using a near field communication system that detects the device is in the vicinity of the machine and then instructs the user to look at the machine or scan the reference marker before proceeding.
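By way of a non-limiting example, the pairing sequence described above might proceed as sketched below, where nfc_detect, prompt and scan_reference_marker are hypothetical placeholders for a near field communication listener, a user prompt and a marker scanner respectively.

```python
# Sketch of the pairing flow; all callables passed in are hypothetical placeholders.
def pair_device_with_machine(nfc_detect, prompt, scan_reference_marker):
    machine_id = nfc_detect()            # fires when the device is in the vicinity of the machine
    if machine_id is None:
        return None
    prompt("Machine detected. Please look at the machine or scan its reference marker.")
    marker = scan_reference_marker()     # e.g. a QR code at a known location on the machine
    if marker is None or marker.machine_id != machine_id:
        prompt("Marker not recognised - please try again.")
        return None
    return machine_id                    # pairing complete; AR content can now be loaded
```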
It will be appreciated that the term Augmented Reality (AR) may be substituted with the term Mixed Reality (MR). Both terms convey the use of technology to provide a user interface that merges real and virtual worlds.
In at least one embodiment, the device is a handheld device.
In at least one embodiment, the handheld device is a tablet computer or smartphone.
In at least one embodiment, the device is an augmented reality headset. The term headset includes but is not limited to, headwear, eyewear and other similar wearable technology.
In at least one embodiment, the user interface is an audio output.
In at least one embodiment, the user interface combines a graphical user interface and an audio output.
In at least one embodiment, the graphical user interface is controlled by physical interaction with the handheld device.
In at least one embodiment, the graphical user interface is controlled by gestures which interact with objects in the augmented reality environment.
In at least one embodiment, the graphical user interface comprises augmented reality video and animation combined with instruction windows.
In at least one embodiment, the positioning module comprises a GPS location device.
In at least one embodiment, the positioning module comprises a local network device which determines the position of the device with respect to nodes in a local network.
In at least one embodiment, the positioning module comprises a combination of GPS and local network.
The local network may comprise a Wi-Fi™, Bluetooth™, mesh or other network; alternatively, the device may be connected to the machine's telematics API through an internet connection (GSM, 3G, 4G or 5G).
In at least one embodiment, the positioning module detects and plots the position of the device with respect to the machine.
In at least one embodiment, the positioning module uses a grid/mesh network of gaming objects positioned a set distance away from the machine.
In at least one embodiment, the positioning module defines an area relative to the machine and detects when the device has moved into/out of an area corresponding to a virtual safe area.
In at least one embodiment, the positioning module provides updated device position information to allow the augmented reality scene to reflect a change of location of the device with respect to the machine.
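As a non-limiting illustration of how entry into and exit from a virtual safe area may be detected from updated device positions, the following sketch uses a standard ray-casting point-in-polygon test; the function names and the zone representation are assumptions made for the example only.

```python
# Minimal geofencing sketch: detects when the device crosses into or out of a
# virtual area defined relative to the machine. Names are illustrative only.
from typing import Iterable, List, Tuple

def point_in_polygon(p: Tuple[float, float], poly: List[Tuple[float, float]]) -> bool:
    # Standard ray-casting test; polygon vertices are relative to the machine origin.
    x, y = p
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def track_zone_transitions(positions: Iterable[Tuple[float, float]],
                           zone: List[Tuple[float, float]]):
    """Yield 'entered' or 'exited' events as successive device positions arrive."""
    was_inside = False
    for p in positions:
        now_inside = point_in_polygon(p, zone)
        if now_inside != was_inside:
            yield "entered" if now_inside else "exited"
        was_inside = now_inside
```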
In at least one embodiment, the matching module indicates matching by an alert on the user interface.
In at least one embodiment, the alert is a flashing virtual area.
In at least one embodiment, the alert is a change in colour of the virtual area.
In at least one embodiment, the alert is a sound.
In at least one embodiment, the machine is a stone crusher.
In at least one embodiment, the machine is a screener.
In at least one embodiment, a reference marker is provided at a predetermined location on the machine to orient the device with respect to the machine.
In at least one embodiment, the reference marker is a barcode.
In at least one embodiment, the reference marker is a QR code.
In at least one embodiment, the reference marker is scanned and the augmented reality scene is mapped out on the user interface with respect to that reference point.
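By way of illustration only, once the reference marker has been scanned, virtual areas stored in machine coordinates may be transformed into the coordinate frame of the augmented reality scene. The sketch below assumes a simple two-dimensional rigid transform and that the marker position and yaw are supplied by the underlying AR framework; the function names are hypothetical.

```python
# Sketch: map zones defined in machine coordinates into the AR scene's frame,
# using the scanned reference marker as the common reference point.
import math
from typing import List, Tuple

def machine_to_scene(point: Tuple[float, float],
                     marker_position: Tuple[float, float],
                     marker_yaw_rad: float) -> Tuple[float, float]:
    # 2D rigid transform: rotate by the marker's yaw, then translate to its position.
    x, y = point
    c, s = math.cos(marker_yaw_rad), math.sin(marker_yaw_rad)
    return (marker_position[0] + c * x - s * y,
            marker_position[1] + s * x + c * y)

def map_zone_into_scene(zone: List[Tuple[float, float]],
                        marker_position: Tuple[float, float],
                        marker_yaw_rad: float) -> List[Tuple[float, float]]:
    return [machine_to_scene(p, marker_position, marker_yaw_rad) for p in zone]
```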
In at least one embodiment of the present invention, the machine acts as its own independent server to store the augmented reality content required to be displayed on the headwear.
In at least one embodiment of the present invention, a handheld device acts as its own independent server to store the augmented reality content required to be displayed on the headwear.
In at least one embodiment, the device may stop the machine.
In at least one embodiment, a plurality of devices may be used in conjunction with a single machine.
In another embodiment, multiple machines may be controlled from a single device.
In at least one embodiment, at least one of the plurality of devices is provided with location information on the other devices.
In accordance with a second aspect of the invention there is provided a computer implemented method for assisting with the safe use of a machine, the method comprising the steps of:
capturing, in real time, video of an area of interest for inclusion in an augmented reality scene displayed on a user interface of a device;
determining the position of the device relative to a machine;
recognising an actual area near the machine in the video;
accessing a database of virtual safe areas which are displayable in the augmented reality scene;
matching the virtual safe areas to the corresponding actual area to show a user in the augmented reality scene, areas around the machine which are safe and corresponding areas which are unsafe.
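The steps of the second aspect may, purely by way of example, be arranged as a per-frame loop such as the sketch below. The camera, positioning, recogniser, database, matcher and renderer objects and their method names are hypothetical placeholders rather than required interfaces.

```python
# Illustrative frame loop for the method of the second aspect; all helper
# objects and methods are hypothetical placeholders, not a prescribed API.
def run_safety_overlay(camera, positioning, recogniser, database, matcher, renderer, machine):
    safe_areas = database.load_safe_areas(machine.id)       # access the database of virtual safe areas
    while True:
        frame = camera.capture_frame()                       # capture real time video of the area of interest
        position = positioning.estimate_position(machine)    # position of the device relative to the machine
        actual_area = recogniser.recognise_area(frame)        # recognise an actual area near the machine
        overlays = matcher.match(safe_areas, actual_area,     # match virtual safe areas to the actual area
                                 position, machine.is_active())
        renderer.render_overlay(frame, overlays)              # show safe and unsafe areas in the AR scene
```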
In at least one embodiment of the invention, the active status of the machine is detected using sensors and the safety of a zone is determined depending on whether the zone is in or near an area where the machine is inactive or active.
In at least one embodiment, the user interface is a graphical user interface which is controlled by gestures which interact with objects in the augmented reality environment.
In at least one embodiment, the step of determining the position of the device relative to a machine uses GPS.
In at least one embodiment, the step of determining the position of the device relative to a machine determines the position of the device with respect to nodes in a local network.
In at least one embodiment, the step of determining the position of the device relative to a machine combines GPS and the local network.
The local network may comprise a Wi-Fi™, Bluetooth™, mesh or other network; alternatively, the device may be connected to the machine's telematics API through an internet connection (GSM, 3G, 4G or 5G).
In at least one embodiment, the step of determining the position of the device relative to a machine uses a grid/mesh network of invisible gaming objects positioned a set distance away from the machine.
In at least one embodiment, the step of determining the position of the device relative to a machine defines an area relative to the machine and detects when the device has moved into/out of the area.
In at least one embodiment, an alert is provided to the device if it moves out of the area.
In at least one embodiment, the step of determining the position of the device relative to a machine detects and plots the position of the device with respect to the machine.
In at least one embodiment, in the step of determining the position of the device relative to a machine, the device position information is updated to allow the augmented reality scene to reflect a change of location of the device with respect to the machine.
In at least one embodiment, in the step of determining the position of the device relative to a machine, the device position information is updated to allow a change in the position of the virtual component with respect to the machine as shown by the video image of the area of interest.
In at least one embodiment, the virtual component is overlaid with an actual component image received from the camera.
In at least one embodiment, matching is indicated by an alert on the user interface.
In at least one embodiment, the alert is a flashing virtual area.
In at least one embodiment, the alert is a change in colour of the virtual area.
In at least one embodiment, the alert is a sound.
In at least one embodiment, a reference marker is provided at a predetermined location on the machine to orient the AR experience with respect to the machine.
In at least one embodiment, the reference marker is a barcode.
In at least one embodiment, the reference marker is a QR code.
In at least one embodiment, the reference marker is scanned and the augmented reality scene is mapped out in the device with respect to that reference point.
In at least one embodiment, the device is connected to the machine to allow it to control the machine.
In at least one embodiment, the device may stop the machine.
In at least one embodiment, a plurality of devices may be used in conjunction with a single machine.
In at least one embodiment, the plurality of devices are provided with location information on the other devices.
In another embodiment, multiple machines may be controlled from a single device.
The invention will be more clearly understood from the following description of an embodiment thereof, given by way of example only, with reference to the accompanying drawings, in which:
The present invention provides a device with an augmented reality interface which assists a user in identifying safe areas around a work machine. The device of the present invention may be implemented as an AR headset, or as a handheld device such as a rugged tablet computer.
The user's device can be connected directly to the machine or to a remote service tablet via a wireless network connection such as Wi-Fi™, Bluetooth™ or a mesh network, or alternatively connected to the machine's telematics API through an internet connection (GSM, 3G, 4G, 5G, etc.).
The machine or remote service tablet may act as its own independent server to store the augmented reality content required to be displayed on the headwear. This independent server removes the need for an internet connection, so the user can access the experience in offline, remote geographic locations. Based on whether the machine is running or not, digital content is displayed on or around the machine to notify the user of “unseen” dangers. These notifications can be tailored to the location of the user relative to the machine.
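As a non-limiting sketch of how such an independent server might select content to display based on machine status and user location, the example below assumes AR content items are stored locally as JSON files with hypothetical requires_running and max_distance_m fields.

```python
# Sketch: select locally stored AR content based on whether the machine is
# running and how far away the user is. File layout and field names are
# illustrative assumptions only.
import json
from pathlib import Path

def select_content(content_dir: Path, machine_running: bool, distance_m: float):
    notifications = []
    for path in content_dir.glob("*.json"):
        item = json.loads(path.read_text())
        if item.get("requires_running", False) and not machine_running:
            continue                                   # hazard only applies while running
        if distance_m > item.get("max_distance_m", float("inf")):
            continue                                   # user too far away for this notification
        notifications.append(item)
    return notifications
```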
The AR environment includes Environmental Health & Safety (EHS) prompts where applicable to inform/remind the user of all the associated hazards/dangers around the perimeter of the machine and upon or under it. It is known to list EHS warnings in the operators' manual and operating procedures documentation. However, by integrating them into an AR system in which this information is combined with real time video of the machine, an extra level of context and understanding is provided. In at least one embodiment, the device of the present invention compels a user to acknowledge that they have seen/read/listened to content before the user may proceed to a subsequent piece of content. This feature also confirms that the user has understood and accepted the instruction.
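A minimal sketch of this acknowledgement gating is given below; the show and confirm callables are hypothetical user interface hooks and the loop simply prevents progression until the acknowledgement is given.

```python
# Sketch of acknowledgement gating: each EHS prompt must be accepted before the
# next piece of content is unlocked. show() and confirm() are hypothetical hooks.
def present_ehs_prompts(prompts, show, confirm):
    for prompt in prompts:
        show(prompt)
        while not confirm("I have read and understood this instruction"):
            # The user cannot proceed until the acknowledgement is given.
            show(prompt)
    return True   # all prompts acknowledged; subsequent content may be shown
```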
The AR environment creates zones around the physical machine which can be highlighted in different colours. These virtual components map onto the actual components of the machine to show which zones around the machine are safe. Arrows, warning symbols and audio prompts can also be used to provide safety guidance to a user around the machine and to notify the user of any dangers as they move around the machine. The handheld device solution has arrows, voice commands and other symbols incorporated with real time video of the machine. The AR headset allows the user to look straight at the machine when viewing the AR content.
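By way of illustration, the colour given to each virtual zone may be derived from the zone's safety status and the running state of the machine, as in the following sketch; the colour values and parameter names are illustrative assumptions only.

```python
# Sketch: choose a highlight colour for a virtual zone from its safety status.
SAFE_COLOUR = (0, 200, 0, 96)     # translucent green
UNSAFE_COLOUR = (220, 0, 0, 96)   # translucent red

def zone_colour(zone_safe_when_active: bool, machine_active: bool):
    is_safe = zone_safe_when_active or not machine_active
    return SAFE_COLOUR if is_safe else UNSAFE_COLOUR
```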
Positional awareness of the device may be provided by a grid/mesh network of invisible gaming objects positioned a set distance away from the machine. If the user collides with one of these gaming objects, a notification can be displayed on the device and a signal sent to the machine. The machine, upon receipt of this information, checks its status to determine whether it is in operation, e.g. tracking or crushing/screening. Based upon that status, the machine can then trigger another event, such as stopping the machine from moving (if tracking), stopping a belt and the crusher/screen (if crushing/screening), or allowing the user to proceed towards the machine if it is not operating.
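The collision-handling flow described above may, for example, be sketched as follows; the status strings and machine control methods (get_status, stop_tracks, stop_belt_and_chamber) are hypothetical and stand in for whatever control interface the machine exposes.

```python
# Sketch of the collision-handling flow; all machine methods and status strings
# are hypothetical placeholders for the machine's real control interface.
def on_gaming_object_collision(machine, notify_user):
    notify_user("You are approaching an active zone of the machine.")
    status = machine.get_status()          # e.g. "tracking", "crushing", "screening", "idle"
    if status == "tracking":
        machine.stop_tracks()              # stop the machine from moving
    elif status in ("crushing", "screening"):
        machine.stop_belt_and_chamber()    # stop the belt and the crusher/screen
    else:
        notify_user("Machine is not operating - you may proceed.")
```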
In addition to the above, the user will also have the ability to remotely stop the machine at any point directly from the headwear/handheld device if the user is within the immediate vicinity of the machine. This drastically cuts down the time taken to stop the machine compared with reaching for the nearest E-stop on it, especially if the user is away from the machine when the alert is raised and would otherwise have to put themselves in potential danger by approaching the machine to stop it.
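A minimal sketch of such a remote stop, assuming a hypothetical emergency_stop method on the machine and an arbitrary 10 m vicinity threshold, is given below.

```python
# Sketch: the remote stop request is only accepted when the device is within
# the immediate vicinity of the machine. The threshold value is illustrative.
VICINITY_THRESHOLD_M = 10.0

def remote_stop(machine, device_distance_m: float) -> bool:
    if device_distance_m <= VICINITY_THRESHOLD_M:
        machine.emergency_stop()
        return True
    return False   # outside the immediate vicinity; the request is ignored
```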
Multiple users may share the same experience. This means that if more than one user is in the vicinity of the machine, each user can be notified of the whereabouts/location of the others, for example if a second user/device is at the other side of the machine and not in the first user's line of sight.
This also covers the case where the other user/device is in a machine that is moving into the area the first user occupies. The user in danger can then be notified of the immediate risk through an alarm or visual alert and told to exit that area to avoid coming to harm.
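Purely as an illustration of the shared experience, the sketch below checks the broadcast positions of other devices against the user's own position and raises a warning when another device comes within a hypothetical warning radius.

```python
# Sketch: warn a user when another device sharing the experience is nearby.
# The position dictionary and the warning radius are illustrative assumptions.
from typing import Dict, Tuple

def check_other_users(own_id: str,
                      positions: Dict[str, Tuple[float, float]],
                      warning_radius_m: float = 15.0):
    own = positions[own_id]
    warnings = []
    for other_id, pos in positions.items():
        if other_id == own_id:
            continue
        distance = ((own[0] - pos[0]) ** 2 + (own[1] - pos[1]) ** 2) ** 0.5
        if distance <= warning_radius_m:
            warnings.append(f"{other_id} is within {distance:.0f} m of your position")
    return warnings
```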
The device uses a positioning module 67 which determines the position of the device relative to the machine and which may use a reference marker, such as a QR code, to orient the device relative to the machine. A shape recognition module 69 is used to recognise the actual component of the machine in the video. The database of safe areas 73 provides a graphical representation displayable in the augmented reality scene. A matching module 71 matches a location in a safe area to the corresponding actual area around or on the work machine.
recognising an actual component of the machine in the video 103;
accessing a database of virtual safe areas which are displayable in the augmented reality scene 104;
matching the virtual component to the corresponding actual component to show a user in the augmented reality scene, areas around the machine which are safe and corresponding areas which are unsafe 105.
The description of the invention including that which describes examples of the invention with reference to the drawings may comprise a computer apparatus and/or processes performed in a computer apparatus. However, the invention also extends to computer programs, particularly computer programs stored on or in a carrier adapted to bring the invention into practice. The program may be in the form of source code, object code, or a code intermediate source and object code, such as in partially compiled form or in any other form suitable for use in the implementation of the method according to the invention. The carrier may comprise a storage medium such as ROM, e.g. CD ROM, or magnetic recording medium, e.g. a memory stick or hard disk. The carrier may be an electrical or optical signal which may be transmitted via an electrical or an optical cable or by radio or other means.
The description of the invention, including that which describes examples of the invention, describes the use of augmented reality (AR) systems and apparatus.
AR overlays virtual objects onto the real-world environment. AR devices such as the Microsoft HoloLens and various enterprise-level “smart glasses” are transparent, letting the user see everything in front of them as if wearing a weak pair of sunglasses. The technology is designed for completely free movement while projecting images over whatever the user looks at. The term mixed reality refers to the overlaying and anchoring of virtual objects to the real world.
In the specification the terms “comprise, comprises, comprised and comprising” or any variation thereof and the terms “include, includes, included and including” or any variation thereof are considered to be totally interchangeable and they should all be afforded the widest possible interpretation and vice versa.
The invention is not limited to the embodiments hereinbefore described but may be varied in both construction and detail.