The present application relates to a management information display system and a management information display method.
Storage states of objects, such as tools, that are placed on predetermined storage locations, such as shelves and stands, may be managed, the storage states indicating, for example, whether or not the objects have been taken out and by whom the objects have been taken out. A known technique for such management enables the storage states of objects to be known readily by, for example, storing the storage states in a storage and displaying them on a display as needed (see Japanese Patent Application Publication No. 2009-070259).
According to the technique described in Japanese Patent Application Publication No. 2009-070259, the display is arranged at a position different from the positions where the objects are placed. Therefore, it is difficult for an operator returning one of the objects to instantly know, for example, the specific position where the object is to be placed.
A management information display system and a management information display method are disclosed.
According to one aspect, there is provided a management information display system, comprising: a management state detecting unit configured to detect a placement state of an object placed on a predetermined placement area such that the object is able to be taken out; a projecting device configured to be capable of projecting a projection image onto the predetermined placement area; and a control device configured to: determine, based on a detection result from the management state detecting unit, whether or not the object has been taken out from the predetermined placement area; set projection information including information on the object that has been taken out, when the control device determines that the object has been taken out; and control the projecting device such that the projection image representing the projection information is projected onto a position where the object had been placed on the predetermined placement area.
According to one aspect, there is provided a management information display method, including: detecting a placement state of an object placed on a predetermined placement area such that the object is able to be taken out; determining, based on a result of the detecting of the placement state, whether or not the object has been taken out from the predetermined placement area; setting projection information including information on the object that has been taken out when it is determined that the object has been taken out; and projecting, by means of a projecting device, a projection image representing the projection information onto a position where the object had been placed on the predetermined placement area.
The above and other objects, features, advantages and technical and industrial significance of this application will be better understood by reading the following detailed description of presently preferred embodiments of the application, when considered in connection with the accompanying drawings.
Embodiments of a management information display system and a management information display method according to the present disclosure will hereinafter be described based on the drawings. The invention is not limited by these embodiments. Furthermore, elements in the following embodiments include those easily substitutable by persons skilled in the art or those that are substantially the same.
The object T is placed on a placement surface (placement area) 40a of a placement stand 40, for example, in a state where the object T is able to be taken out. Examples of this object T include a tool to be used in an operation and an exhibit to be exhibited at an exhibition. However, the object T is not limited to these examples, and the object T may be any other type of article that is placed on a predetermined placement area such that the object T is able to be picked up from the predetermined placement area. Tools T1 and T2 of different kinds will be described as examples of the object T with respect to the embodiment.
One or more placement stands 40 are provided in the storeroom, for example. With respect to this embodiment, a case in which one object T is placed on a single placement stand 40 will be described as an example. However, the embodiment is not limited to this example, and plural objects T may be placed on a single placement stand. The placement surface 40a may be, for example, planar, but is not limited to this example; the placement surface 40a may be a non-flat surface, such as an area provided with projections and depressions.
The management state detecting unit 10 detects a management state of the object T. The management state detecting unit 10 has an object detecting unit 11, an RFID detecting unit 12, and a person detecting unit 13. One object detecting unit 11, one RFID detecting unit 12, and one person detecting unit 13 may be provided for detection of one placement stand 40 or one object T, for example, or one or more of each may be provided for detection of the objects in the whole storeroom.
The object detecting unit 11 detects a placement state of the object T on the placement surface 40a. Any of various sensors, such as a three-dimensional measurement sensor like an RGB-D camera, and an optical sensor, may be used as the object detecting unit 11, for example. When the object T is on the placement surface 40a of the placement stand 40, the object detecting unit 11 is able to detect, as a placement state, a shape of the object T and a position and a posture of the object T on the placement surface 40a, for example. Furthermore, when the object T is not on the placement surface 40a, the object detecting unit 11 detects the placement surface 40a in a state in which the object T is not placed thereon. That is, in this case, the object detecting unit 11 detects, as a placement state, the state in which the object T is not placed on the placement surface 40a. The object detecting unit 11 transmits the placement state information I1 that is a detection result to the control device 30.
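Purely as an illustrative sketch and not as part of the embodiment, the following Python fragment shows one way the object detecting unit 11 might convert a depth measurement of the placement surface 40a into the placement state information I1; the sensor interface is abstracted to a height map, and the threshold, field names, and data layout are assumptions introduced only for this illustration.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class PlacementStateInfo:
    """Placement state information I1 (illustrative field layout)."""
    object_present: bool
    position: Optional[Tuple[float, float]] = None  # centroid of the object on surface 40a
    footprint_cells: int = 0                        # rough proxy for the detected shape/size

def detect_placement_state(height_map: List[List[float]],
                           empty_baseline: List[List[float]],
                           min_height: float = 0.01) -> PlacementStateInfo:
    """Compare a depth-derived height map with a baseline of the empty surface 40a."""
    occupied = []
    for r, (row, base_row) in enumerate(zip(height_map, empty_baseline)):
        for c, (h, b) in enumerate(zip(row, base_row)):
            if h - b > min_height:  # this cell is raised above the bare placement surface
                occupied.append((r, c))
    if not occupied:
        # Nothing on the surface: I1 indicates that the object T is not placed thereon.
        return PlacementStateInfo(object_present=False)
    cx = sum(r for r, _ in occupied) / len(occupied)
    cy = sum(c for _, c in occupied) / len(occupied)
    return PlacementStateInfo(object_present=True, position=(cx, cy),
                              footprint_cells=len(occupied))
```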
The RFID (radio frequency identifier) detecting unit 12 reads a predetermined RF tag 14 and obtains information that has been embedded in the RF tag 14. The RFID detecting unit 12 is capable of reading the RF tag 14 positioned within a predetermined distance from the RFID detecting unit 12, for example. This predetermined distance may be set as appropriate. The RFID detecting unit 12 is provided for each placement stand 40, for example. The RFID detecting unit 12 may be provided at another location, such as a doorway of the storeroom, for example. An RFID reader is used, for example, as the RFID detecting unit 12. The RF tag 14 is owned by a person, such as an operator P, who has been permitted to come in and out of the storeroom, for example. Identification information for identifying the operator P who is the owner thereof, for example, has been embedded in the RF tag 14. The RFID detecting unit 12 obtains the identification information of the operator P by reading the RF tag 14. The RFID detecting unit 12 transmits the obtained identification information I2 to the control device 30.
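As a minimal sketch, again outside the embodiment itself, the identification information I2 forwarded by the RFID detecting unit 12 could be represented as follows; the tag-reading interface is replaced by a plain list of tag IDs, since no concrete RFID driver is specified.

```python
from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class IdentificationInfo:
    """Identification information I2 read from an RF tag 14 (illustrative)."""
    operator_id: str  # identification of the operator P embedded in the tag
    stand_id: str     # placement stand 40 with which this reader is associated

def poll_rfid_reader(tag_ids_in_range: Iterable[str],
                     stand_id: str) -> Optional[IdentificationInfo]:
    """Return I2 for the first tag within the predetermined distance, or None if no tag is read."""
    for tag_id in tag_ids_in_range:
        return IdentificationInfo(operator_id=tag_id, stand_id=stand_id)
    return None
```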
The person detecting unit 13 detects a person in the room, such as the storeroom. A device that may be used as the person detecting unit 13 is, for example, a device capable of obtaining information on a person that enables the appearance of the person, such as the face and figure of the person, to be identified. Examples of this device include an imaging device, such as a camera, and a three-dimensional measurement sensor, such as an RGB-D camera. When such a device is used, the appearance of the person and the identification information are stored in a storage (a storage 35 of the control device 30 or an external storage) in association with each other, and a person is able to be identified by comparison between a detection result and the stored appearance through image processing, for example. The identification of the person may be performed by the control device 30. The person detecting unit 13 transmits the obtained person information I3 to the control device 30.
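The appearance comparison performed on the person information I3 could, for instance, be sketched as a nearest-match search over stored appearance features; the feature vectors, similarity measure, and threshold below are assumptions for illustration only.

```python
from dataclasses import dataclass
from math import sqrt
from typing import Dict, List, Optional

@dataclass
class PersonInfo:
    """Person information I3 (illustrative): an appearance feature of a detected person."""
    feature: List[float]

def _cosine(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = sqrt(sum(x * x for x in a)), sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def match_person(info: PersonInfo,
                 stored_appearances: Dict[str, List[float]],
                 threshold: float = 0.9) -> Optional[str]:
    """Return the identification of the most similar stored appearance, or None if no match."""
    best_id, best_score = None, threshold
    for person_id, feature in stored_appearances.items():
        score = _cosine(info.feature, feature)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id
```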
The projecting device 20 is provided on a wall or a ceiling of the storeroom. The projecting device 20 is able to project a projection image onto the placement surface 40a of the placement stand 40. In this embodiment, based on control by the control device 30 described later, the projecting device 20 is able to project the projection image.
Based on a detection result from the management state detecting unit 10, the control device 30 sets projection information according to a management state. The control device 30 controls the projecting device 20 such that a projection image representing the set projection information is projected onto the placement surface 40a. The control device 30 has a management state acquiring unit 31, a determining unit 32, a person identifying unit 33, a display controller 34, and the storage 35.
The management state acquiring unit 31 obtains the detection results transmitted from the management state detecting unit 10. The management state acquiring unit 31 obtains the placement state information I1 on the object T detected by the object detecting unit 11. The management state acquiring unit 31 obtains the identification information I2 on the person detected by the RFID detecting unit 12. The management state acquiring unit 31 obtains the person information I3 obtained by the person detecting unit 13.
Based on the detection result from the object detecting unit 11, the determining unit 32 determines whether or not the object T has been taken out. When the placement state information I1 indicates the shape of the object T and the position and the posture of the object T on the placement surface 40a, for example, the determining unit 32 is able to determine that the object T has not been taken out. Furthermore, when the placement state information I1 indicates a state in which the object T is not placed on the placement surface 40a, the determining unit 32 is able to determine that the object T has been taken out. When the determining unit 32 determines that the object T has been taken out, the person identifying unit 33 described later identifies an object person who has taken out the object T. When plural objects T are placed, the determining unit 32 makes a determination for each of the plural objects T.
After the object person who has taken out the object T is identified and a projection image is projected onto the placement surface 40a, the determining unit 32 determines whether or not the object person who has taken out the object T is present around the placement stand 40. First, based on a detection result from the RFID detecting unit 12 or the person detecting unit 13, the determining unit 32 determines whether or not a person is present around the placement stand 40. When a person is detected within a range of a predetermined distance or less from the placement stand 40, for example, the determining unit 32 is able to determine that a person is present around the placement stand 40. Next, based on an identification result from the person identifying unit 33 described later, the determining unit 32 determines whether or not the person detected around the placement stand 40 is the object person. When the person identified by the person identifying unit 33 is the object person, the determining unit 32 is able to determine that the object person who has taken out the object T is present around the placement stand 40.
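One possible way to organize the presence determination described above is sketched below; the two-dimensional positions and the predetermined distance are hypothetical stand-ins for whatever the RFID detecting unit 12 or the person detecting unit 13 actually reports.

```python
from math import hypot
from typing import Optional, Tuple

def person_near_stand(person_xy: Optional[Tuple[float, float]],
                      stand_xy: Tuple[float, float],
                      max_distance: float = 1.5) -> bool:
    """True when a detected person is within the predetermined distance of the placement stand 40."""
    if person_xy is None:
        return False
    return hypot(person_xy[0] - stand_xy[0], person_xy[1] - stand_xy[1]) <= max_distance

def object_person_is_present(identified_person: Optional[str],
                             object_person: str,
                             near_stand: bool) -> bool:
    """Combine the presence check with the identification result from the person identifying unit 33."""
    return near_stand and identified_person == object_person
```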
After the object T is taken out, the determining unit 32 determines, based on a detection result from the object detecting unit 11, whether or not the object T has been returned onto the placement surface 40a. When the placement state information I1 has changed from a state indicating the placement surface 40a with the object T not placed thereon to a state including the shape of the object T and the position and the posture of the object T on the placement surface 40a, the determining unit 32 determines that the object T has been returned. Furthermore, when the placement state information I1 indicates that the placement surface 40a is still in the state in which the object T is not placed thereon, the determining unit 32 determines that the object T has not been returned.
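Taken together, the take-out and return determinations of the determining unit 32 can be viewed as transitions between two consecutive placement states I1, as in the following simplified sketch (which ignores position and posture):

```python
from enum import Enum

class PlacementEvent(Enum):
    NONE = "none"
    TAKEN_OUT = "taken_out"
    RETURNED = "returned"

def classify_transition(was_present: bool, is_present: bool) -> PlacementEvent:
    """Interpret two consecutive placement states I1 for one object T."""
    if was_present and not is_present:
        return PlacementEvent.TAKEN_OUT  # the object T has been taken out from surface 40a
    if not was_present and is_present:
        return PlacementEvent.RETURNED   # the object T has been returned onto surface 40a
    return PlacementEvent.NONE           # no change in the placement state
```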
When it is determined that the object T has been taken out, the person identifying unit 33 identifies, based on the detection result from the RFID detecting unit 12 or the person detecting unit 13, the person who has taken out the object T as the object person. When the person identifying unit 33 has identified the object person, the person identifying unit 33 stores, into the storage 35, information on the object person and information on the object T taken out in association with each other. Furthermore, when the determining unit 32 determines that a person is present around the placement stand 40 after a projection image is projected onto the placement surface 40a, the person identifying unit 33 identifies, based on the detection result from the RFID detecting unit 12 or the person detecting unit 13, the person present around the placement stand 40.
When the detection result from the RFID detecting unit 12 is used, the person identifying unit 33 is able to identify an object person based on the identification information transmitted from the RFID detecting unit 12. When the detection result from the person detecting unit 13 is used, the appearance of the person and the identification information are stored in association with each other in the storage 35 in advance, for example. By performing image processing, for example, based on a captured image or a detection result and the appearance of the person stored in the storage 35, the person identifying unit 33 is able to identify an object person.
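A compact sketch of this identification, preferring the RF-tag result I2 and falling back to the appearance match on I3, together with the write into the object information storage 37, might look as follows (the dictionary layout is an assumption):

```python
from typing import Dict, Optional, Tuple

def identify_and_record(rfid_operator_id: Optional[str],
                        appearance_match_id: Optional[str],
                        object_id: str,
                        object_info_storage: Dict[str, Tuple[str, bool]]) -> Optional[str]:
    """Identify the object person and record the association for the taken-out object T."""
    object_person = rfid_operator_id if rfid_operator_id is not None else appearance_match_id
    if object_person is not None:
        # Object information storage 37: object T -> (object person, taken-out flag).
        object_info_storage[object_id] = (object_person, True)
    return object_person
```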
When the determining unit 32 determines that the object T has been taken out, the display controller 34 sets projection information including information on the object T that has been taken out. The projection information includes, for example, the information on the object T taken out, and information indicating that an object person has taken out the object T. The information on the object T may include, for example, at least one of: textual information indicating a name of the object T; and image information indicating an appearance of the object T. When plural objects T are taken out, the display controller 34 sets projection information for each of the objects T. The display controller 34 may set the projection information so that a moving image is displayed.
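The projection information set by the display controller 34 could be represented, for illustration only, by a small record combining the textual information, the image information, and the object person:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProjectionInfo:
    """Projection information for one taken-out object T (illustrative fields)."""
    object_name: str                  # textual information indicating the name of the object T
    object_image_path: Optional[str]  # image information indicating the appearance of the object T
    taken_out_by: Optional[str]       # information indicating the object person

def set_projection_info(object_name: str,
                        object_image_path: Optional[str],
                        object_person: Optional[str]) -> ProjectionInfo:
    """Build the projection information for one object T that has been taken out."""
    return ProjectionInfo(object_name=object_name,
                          object_image_path=object_image_path,
                          taken_out_by=object_person)
```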
After setting the projection information, the display controller 34 controls the projecting device 20 such that a projection image representing the set projection information is projected onto the placement surface 40a. The display controller 34 transmits a signal including, for example, the projection information and specification information specifying a projection range on the placement surface 40a, to the projecting device 20. When the projecting device 20 has received the signal, the projecting device 20 projects, based on the projection information and the specification information, the projection image onto the placement surface 40a. The specification information may correspond to, for example, a range including a position on the placement surface 40a, the position being where the object T had been placed.
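The specification information in the signal to the projecting device 20 could, as a sketch, be a rectangle derived from the last detected position and footprint of the object T; the coordinate convention and margin are assumptions, and such a signal would bundle this region with the ProjectionInfo record from the preceding sketch.

```python
from typing import Tuple

def projection_region(last_position: Tuple[float, float],
                      footprint: Tuple[float, float],
                      margin: float = 0.02) -> Tuple[float, float, float, float]:
    """Return (x, y, width, height) on the placement surface 40a covering the position
    where the object T had been placed, expanded by a small margin."""
    x, y = last_position
    w, h = footprint
    return (x - w / 2 - margin, y - h / 2 - margin, w + 2 * margin, h + 2 * margin)
```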
When the determining unit 32 determines that the object person is present around the placement stand 40 after the projection image is projected onto the placement surface 40a, the display controller 34 controls the projecting device 20 to cause the projection image to be highlight-displayed. That is, the display controller 34 controls the projecting device 20 such that the projection image projected onto the placement surface 40a where the object T taken out by the object person is to be placed is highlight-displayed. By this highlight-display, the object person is able to be clearly notified of where to return the object T. When plural objects T have been taken out by the same object person and plural projection images are projected, the display controller 34 controls the projecting device 20 such that the plural projection images corresponding to the objects T taken out by the object person are highlight-displayed. The display controller 34 controls the projecting device 20 such that any projection image corresponding to an object T taken out by a person different from the object person is not highlight-displayed.
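Selecting which projection images to highlight when a particular object person is detected around the placement stand 40 reduces to a filter over the recorded associations, roughly as follows (the mapping used here is hypothetical):

```python
from typing import Dict, List

def images_to_highlight(taken_out_by: Dict[str, str], object_person: str) -> List[str]:
    """Return the IDs of objects T whose projection images should be highlight-displayed.

    `taken_out_by` maps each currently taken-out object T to the object person recorded
    for it; only the images for objects taken out by the person now present are highlighted.
    """
    return [obj_id for obj_id, person in taken_out_by.items() if person == object_person]
```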
Furthermore, when the determining unit 32 determines that the object T has been returned onto the placement surface 40a, the display controller 34 controls the projecting device 20 to cause the projection image to be erased. That is, the display controller 34 controls the projecting device 20 such that the projection image projected on the placement surface 40a is erased.
The storage 35 stores various kinds of information. The storage 35 has a management information storage 36 and an object information storage 37. The management information storage 36 stores information (hereinafter, referred to as management information) indicating a management state of an object T on each placement stand 40. Examples of the management information include identification information of each placement stand 40, various types of information on an object T placed on each placement stand 40, and information on whether or not the objects T have been taken out. Examples of the various types of information on the objects T include names and appearances of the objects T, information for identification of the objects T, maintenance frequencies of the objects T, and an increase or decrease in the number of objects T. The identification information of the object T may be stored in association with the identification information of the placement stand 40 where the object T is placed. The object information storage 37 stores information on whether or not an object T has been taken out and, when the object T has been taken out, information on the object person who has taken out the object T, in association with each other. Information on the object T and information on the placement stand 40 in the management information storage 36 may be stored in association with information on the object T and information on the object person in the object information storage 37.
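For illustration, the two storages described above might be laid out as simple keyed records; the field names and keys below are assumptions and not part of the embodiment.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class ManagementRecord:
    """Management information storage 36: one record per placement stand 40 (illustrative)."""
    stand_id: str
    object_id: str
    object_name: str
    taken_out: bool = False

@dataclass
class TakeOutRecord:
    """Object information storage 37: which object person has taken out which object T."""
    object_id: str
    object_person: Optional[str] = None

@dataclass
class Storage:
    """Storage 35 combining both storages; records may be cross-referenced by object_id."""
    management_info: Dict[str, ManagementRecord] = field(default_factory=dict)  # keyed by stand_id
    object_info: Dict[str, TakeOutRecord] = field(default_factory=dict)         # keyed by object_id
```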
Operation of the management information display system 100 configured as described above will be described next.
First, a state in which the tool T1 is placed on the placement surface 40a of the placement stand 40 will be described.
The control device 30 obtains, at the management state acquiring unit 31, the placement state information I1. In this case, the placement state included in the placement state information I1 is a state in which the tool T1 is placed on the placement surface 40a, for example. The determining unit 32 thus determines that the object T has not been taken out.
The operator P then accesses the placement stand 40 on which the tool T1 is placed.
After accessing the placement stand 40, the operator P may take out the tool T1. When the tool T1 is taken out, the placement state information I1 indicates the state in which the tool T1 is not placed on the placement surface 40a, and the determining unit 32 thus determines that the tool T1 has been taken out.
When the determining unit 32 determines that the tool T1 has been taken out, the person identifying unit 33 identifies, based on the detection result from the RFID detecting unit 12 or the person detecting unit 13, the object person who has taken out the tool T1. Based on the identification information of the operator P included in the identification information I2 or image information on the operator P included in the person information I3, for example, the person identifying unit 33 is able to identify the operator P as the object person. After identifying the operator P as the object person, the person identifying unit 33 stores, into the object information storage 37 of the storage 35, information on the operator P who is the object person and information on the tool T1 taken out in association with each other.
Based on the determination result from the determining unit 32 and the identification result from the person identifying unit 33, the display controller 34 sets projection information. The display controller 34 sets the projection information such that the projection information includes, for example, information on the tool T1 taken out and information indicating that the operator P, who is the object person, has taken out the tool T1. The display controller 34 controls the projecting device 20 such that a projection image representing the set projection information is projected onto the placement surface 40a. The display controller 34 transmits a signal including the projection information and specification information specifying a projection range on the placement surface 40a to the projecting device 20. In this case, the specification information may correspond to, for example, a range including the position on the placement surface 40a where the tool T1 had been placed. When the operator P has taken out another object T different from the tool T1, the display controller 34 sets projection information similarly for that other object T.
The projecting device 20 receives the signal from the control device 30 and projects the projection image representing the projection information described above over the specified projection range. As a result, the projection image D representing the projection information is projected onto the position on the placement surface 40a where the tool T1 had been placed.
After the projection image D is projected onto the placement surface 40a, the determining unit 32 determines, based on the detection result from the RFID detecting unit 12 or the person detecting unit 13, whether or not the object person who has taken out the tool T1 is present around the placement stand 40. Firstly, the determining unit 32 determines whether or not any person is present around the placement stand 40. For example, when an RF tag 14 has been read by the RFID detecting unit 12 or when a person has been detected by the person detecting unit 13, the determining unit 32 is able to determine that a person is present around the placement stand 40. In this example, the operator P is present around the placement stand 40, and the determining unit 32 thus determines that a person is present around the placement stand 40.
When the determining unit 32 has determined that a person is present around the placement stand 40, the person identifying unit 33 identifies the person around the placement stand 40. When a determination is made by use of a detection result from the RFID detecting unit 12, the person identifying unit 33 is able to identify the person based on the identification information embedded in the RF tag 14 of the person around the placement stand 40. When a detection result from the person detecting unit 13 is used, the person identifying unit 33 is able to identify that the person is the operator P based on image information on the person included in the person information I3 and appearance information on the operator P stored in the storage 35 in advance.
Subsequently, the determining unit 32 determines, based on the identification result from the person identifying unit 33, whether or not the person present around the placement stand 40 is the object person who has taken out the tool T1. The person identifying unit 33 has identified that the person present around the placement stand 40 is the operator P. Based on this identification result and information stored in the object information storage 37 of the storage 35, the determining unit 32 is able to determine that the person present around the placement stand 40 is the operator P.
When the determining unit 32 determines that the operator P who has taken out the tool T1 is present around the placement stand 40, the display controller 34 controls the projecting device 20 such that the projection image D is highlight-displayed. By this control, a projection image Da that has been highlighted is projected onto the placement surface 40a.
When the operator P returns the tool T1 to the placement stand 40, the placement surface 40a of the placement stand 40 is brought into the state in which the tool T1 is placed thereon. Based on the placement state information I1 indicating this state, the determining unit 32 determines that the tool T1 has been returned onto the placement surface 40a.
When it is determined that the tool T1 has been returned properly, the display controller 34 controls the projecting device 20 to cause the projection image Da to be erased. That is, the display controller 34 controls the projecting device 20 such that the projection image Da projected on the placement surface 40a is erased. This control causes the projecting device 20 to stop projecting the projection image Da onto the placement surface 40a. As a result, the placement surface 40a returns to the state in which no projection image is projected thereon.
When the object person who has taken out the object T is identified, the display controller 34 sets the projection information (Step S40). At Step S40, the display controller 34 sets the projection information such that the projection information includes information on the object T taken out and information indicating that the object person has taken out the object T. The display controller 34 controls the projecting device 20 such that the projection image D representing the set projection information is projected onto the placement surface 40a. This control causes the projection image D to be projected onto the placement surface 40a (Step S50).
After the projection image D is projected, the determining unit 32 determines, based on the detection result from the person detecting unit 13, whether or not the object person who has taken out the tool T1 is present around the placement stand 40 (Step S60). When it is determined at Step S60 that the object person who has taken out the tool T1 is not present around the placement stand 40 (No at Step S60), the processing of Step S60 is repeated. Furthermore, when it is determined that the object person who has taken out the tool T1 is present around the placement stand 40 (Yes at Step S60), the projecting device 20 is controlled such that the image projected onto the placement surface 40a where the tool T1 is to be placed is highlight-displayed. This control causes the projection image Da that has been highlighted to be displayed on the placement surface 40a (Step S70).
Thereafter, the determining unit 32 determines whether or not the tool T1 has been returned (Step S80). When it is determined at Step S80 that the tool T1 has not been returned onto the placement surface 40a (No at Step S80), the processing of Step S80 is repeated. Furthermore, when it is determined that the tool T1 has been returned onto the placement surface 40a (Yes at Step S80), the projecting device 20 is controlled such that the position to which the tool T1 has been returned is regarded as a new regular position and the image projected on the placement surface 40a is erased. This control causes the projecting device 20 to stop the projection onto the placement surface 40a (Step S90).
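The ordering of Steps S40 to S90 described above may be summarized by the following loop; the `controller` object and its methods are placeholders standing in for the sensing, determination, and projection processing of the control device 30 and the projecting device 20, and are not part of the embodiment.

```python
import time

def run_projection_flow(controller, poll_interval: float = 0.5) -> None:
    """Illustrative ordering of Steps S40 to S90 for one taken-out object T."""
    info = controller.set_projection_info()       # Step S40: set the projection information
    controller.project(info)                      # Step S50: project the projection image D
    while not controller.object_person_nearby():  # Step S60: repeat while "No"
        time.sleep(poll_interval)
    controller.highlight_projection()             # Step S70: highlight-display (image Da)
    while not controller.object_returned():       # Step S80: repeat while "No"
        time.sleep(poll_interval)
    controller.erase_projection()                 # Step S90: erase the projection
```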
As described above, the management information display system 100 according to the embodiment includes: the management state detecting unit 10 configured to detect the placement state of the object T placed on the predetermined placement surface 40a such that the object T is able to be taken out; the projecting device 20 configured to be capable of projecting a projection image onto the placement surface 40a; and the control device 30 configured to determine, based on the detection result from the management state detecting unit 10, whether or not the object T has been taken out from the placement surface 40a; to set projection information including information on the object T that has been taken out when the control device 30 determines that the object T has been taken out; and to control the projecting device 20 such that the projection image D representing the projection information is projected onto the position where the object T had been placed on the placement surface 40a.
Furthermore, the management information display method according to the embodiment includes: detecting the placement state of the object T placed on the predetermined placement surface 40a such that the object T is able to be taken out; determining, based on the result of the detecting of the placement state, whether or not the object T has been taken out from the placement surface 40a; setting the projection information including information on the object T that has been taken out when it is determined that the object T has been taken out; and projecting, by means of the projecting device 20, the projection image D representing the projection information onto the position where the object T had been placed on the placement surface 40a.
According to this configuration, the projection image D representing the information on the object T that has been taken out is projected onto the placement surface 40a when the object T has been taken out from the placement surface 40a. Therefore, an administrator who sees the placement surface 40a having the projection image D projected thereon is able to readily know the information related to the object T that had been placed on the placement surface 40a. The administrator is thereby able to readily know the management state of the object T which corresponds to the placement state of the object T.
In the management information display system 100 according to the embodiment, the management state detecting unit 10 is capable of detecting the information for identifying the object person who has taken out the object T, and the control device 30 identifies, based on the detection result from the management state detecting unit 10, the person who has taken out the object T as the object person, when it is determined that the object T has been taken out; and sets projection information such that the projection information includes information indicating that the object person has taken out the object T. This configuration enables an administrator to know the object T that has been taken out from the placement surface 40a and the object person who has taken out the object T in association with each other.
In the management information display system 100 according to the embodiment, the management state detecting unit 10 is capable of detecting information for identifying the object person who has taken out the object T and information for identifying the person present around the placement surface 40a, and the control device 30 identifies, based on the detection result from the management state detecting unit 10, the person who has taken out the object T as the object person, when it is determined that the object T has been taken out; determines, based on the detection result from the management state detecting unit 10, whether or not the object person is present around the placement surface 40a after the projection image D is projected onto the placement surface 40a; and controls the projecting device 20 such that projection information is highlight-displayed when it is determined that the object person is present around the placement surface 40a. This configuration enables the object person to readily know where to return the object T.
In the management information display system 100 according to the embodiment, after the projection image D is projected onto the placement surface 40a, the control device 30 determines, based on the detection result from the management state detecting unit 10, whether or not the object T has been returned to the placement surface 40a; and when the control device 30 determines that the object T has been returned, the control device 30 controls the projecting device 20 such that the projection of the projection image D is erased. This configuration enables the object person or the administrator to readily know that the object T has been returned since the projection of the projection image D is erased when the object T has been returned.
In the management information display system 100 according to the embodiment, the control device 30 sets the projection information such that the projection information includes image information representing the appearance of the object T when it is determined that the object T has been taken out. This configuration enables the object person or the administrator to readily know the appearance of the object T to be returned to the placement surface 40a.
The technical scope of the present application is not limited to the above described embodiment, and modifications may be made as appropriate without departing from the gist of the present application. For example, in addition to the above described configuration, the management information display system 100 may have a configuration to transmit information to the operator P by output of sound. For example, the management information display system 100 may be configured to be able to determine whether or not the operator P who is the object person that has taken out the object T is looking at the placement surface 40a of the placement stand 40 when the operator P is present around the placement stand 40 on which the object T had been placed. In this configuration, the management information display system 100 may be configured to output, by means of a sound output unit, for example, information to instruct the operator P to look at the placement surface 40a, when it is determined that the operator P is not looking at the placement surface 40a.
Furthermore, the case in which the object T is placed on the placement surface 40a that is planar has been described as an example in the above described embodiment, but the embodiment is not limited to this example. A configuration in which the object T is placed in a three-dimensional placement portion or inside a container, such as a case, may be adopted. In this case, the projecting device 20 may project a projection image according to the shape of the placement portion for the object T.
Furthermore, the above described embodiment is configured to project the projection image D representing information on the object T that has been taken out onto the placement surface 40a when the object T has been taken out from the placement surface 40a, but the projection image D does not necessarily need to be projected constantly.
A management information display system and a management information display method according to the present application may be used in, for example, a processing apparatus, such as a computer.
The present application provides a management information display system and a management information display method that enable management states of objects to be readily known.
Although the application has been described with respect to specific embodiments for a complete and clear disclosure, the appended claims are not to be thus limited but are to be construed as embodying all modifications and alternative constructions that may occur to one skilled in the art that fairly fall within the basic teaching herein set forth.
Number | Date | Country | Kind
---|---|---|---
2020-092228 | May 2020 | JP | national
This application is a Continuation of PCT International Application No. PCT/JP2021/014679 filed on Apr. 6, 2021 which claims the benefit of priority from Japanese Patent Application No. 2020-092228 filed on May 27, 2020, the entire contents of both of which are incorporated herein by reference.
Relationship | Number | Date | Country
---|---|---|---
Parent | PCT/JP2021/014679 | Apr 2021 | US
Child | 17961593 | | US