The technique disclosed herein relates to an image display device configured to be used by being worn on a head of a user.
For example, JP 2014-93050 A (hereinbelow termed Patent Literature 1) describes an image display device used by being worn on a user's head. This type of image display device is provided with a display unit configured to display an image of a range corresponding to the user's view (that is, a reality image) and a computer configured to compose an object image, which indicates an object related to the image displayed on the display unit, with the reality image and to cause the composed images to be displayed. Such a technique, which uses a computer to enhance and expand the world of reality perceivable to a human, is known as Augmented Reality (AR).
In the image display device of Patent Literature 1, a situation may be expected in which, while an article image of a target article is displayed on the display unit, an object image indicating a manual related to handling of the target article is displayed in combination with the article image on the display unit.
However, even in such a case, the image display device cannot determine whether or not the target article was actually handled in accordance with the procedure instructed by the manual, and thus users and the like cannot confirm whether or not the target article was handled in accordance with that procedure.
This description discloses a technique that enables users and the like to confirm whether or not a target article was handled in accordance with a procedure instructed by a manual.
An image display device disclosed herein may be configured to be used by being worn on a head of a user. The image display device may comprise: a display unit; a first camera configured to capture a specific range corresponding to a range of view of the user; a second camera provided in a different position from the first camera, and configured to capture the specific range; a sensor capable of detecting a posture of the image display device; a controller; and a memory configured to store a manual related to handling of a target article. The controller may be configured to: specify spatial information for specifying features of a space around the image display device based on a first calibration image acquired from the first camera and a second calibration image acquired from the second camera; specify a position and a posture of the image display device in the space based on the spatial information, a first captured image acquired from the first camera, a second captured image acquired from the second camera, and the posture of the image display device detected by the sensor; cause the display unit to display a first instruction screen in a case where the target article is included in the specific range, the first instruction screen including a first object image in combination with the target article, the first object image indicating a first procedure for handling the target article in accordance with the manual stored in the memory; determine whether or not an operation that the user actually performed on the target article in the specific range follows the first procedure, based on the first captured image and the second captured image, while the first instruction screen is displayed on the display unit; and cause the memory to store a result of the determination.
According to the above configuration, while the first instruction screen is displayed on the display unit, the controller determines whether or not the operation that the user actually performed on the target article in the specific range follows the first procedure based on the first captured image and the second captured image, and causes the memory to store this determination result. Due to this, by confirming the determination result stored in the memory, users and the like can confirm whether or not the target article was handled in accordance with the procedures instructed by the manual.
Here, the “first captured image” may be the same image as the “first calibration image” or an image different therefrom. Similarly, the “second captured image” may be the same image as the “second calibration image” or an image different therefrom. Further, “handling of the target article” includes various types of work for handling the target article, such as assembly, disassembly, use, and repair of the target article. The “first object image” includes both still images and video images.
A controlling method, a computer program, and a computer-readable recording medium storing the computer program for implementing the image display device as above are also novel and useful.
Primary features of embodiments described below will be listed. The technical elements described herein are each independent technical elements, which exhibit technical usefulness solely or in various combinations, and are not limited to combinations recited in the claims as originally filed.
(Feature 1) The controller may be configured to cause the display unit to display a second instruction screen instead of the first instruction screen in a case where it is determined that the operation follows the first procedure, the second instruction screen including a second object image in combination with the target article, the second object image indicating a second procedure to be performed after the first procedure in accordance with the manual stored in the memory. The “second object image” includes both a still image and a video image.
According to this configuration, the second instruction screen is displayed on the display unit in a case where it is determined that the operation follows the first procedure. That is, the controller may not display the second instruction screen in a case where it is determined that the operation does not follow the first procedure. Due to this, the likelihood that the user properly handles the target article in accordance with the procedures indicated by the manual is increased.
(Feature 2) The image display device may comprise: a receiving unit configured to receive action information related to an action content of a tool from the tool. The operation may include an operation of using the tool. The controller may be configured to determine whether the operation follows the first procedure or not based on the first captured image, the second captured image, and the action information acquired from the receiving unit while the first instruction screen is displayed on the display unit.
According to this configuration, the controller determines whether or not the operation follows the first procedure based on the action information acquired from the tool via the receiving unit in addition to the first captured image and the second captured image. Due to this, according to the above configuration, the controller can more suitably determine whether or not the operation follows the first procedure.
(Feature 3) The image display device may further comprise a sending unit configured to send information to an external server. The image display device may send work information including the result of the determination stored in the memory to the external server via the sending unit.
According to this configuration, the controller sends the work information including the determination result stored in the memory to the external server via the sending unit. Due to this, the external server accumulates the work information. An administrator and the like of the external server can browse through the accumulated work information to confirm whether or not the target article was handled properly.
Further, the disclosure herein also discloses a computer program for a terminal device configured to communicate with the external server storing the work information sent by an image display device as above. The terminal device is provided with a display unit and a computer. The computer program causes the computer to receive the work information from the external server by communicating with the external server, and to cause the display unit to display a browse screen represented by the received work information.
According to this configuration, a user of the terminal device can browse through the browse screen displayed on the display unit to confirm whether or not the target article was handled properly.
(Configuration of Communication System 2)
A communication system 2 comprises an image display device 10, a tool 40, a server 50, and an external PC 70. The image display device 10 is capable of performing BT communication with the tool 40 and of performing Wi-Fi communication with the server 50 via the Internet 4. The external PC 70 is also capable of communicating with the server 50 via the Internet 4.
(Configuration of Image Display Device 10)
The image display device 10 is provided with a support body 12, display units 14a, 14b, projection units 15a, 15b, a first camera 16, a second camera 18, and a control box 19.
The support body 12 is a member in the shape of a glasses frame. The user can wear the image display device 10 on the head by wearing the support body 12 as one would wear glasses.
The display units 14a, 14b are transparent display members. When the user wears the image display device 10 on the head, the display unit 14a is arranged at a position facing a right eye of the user and the display unit 14b is arranged at a position facing a left eye of the user. Hereinbelow, the left and right display units 14a, 14b may collectively be called a display unit 14.
The projection units 15a, 15b are members configured to project images on the display units 14a, 14b. The projection units 15a, 15b are provided at lateral sides of the display units 14a, 14b. Hereinbelow, the left and right projection units 15a, 15b may collectively be called a projection unit 15. In this embodiment, the projection unit 15 projects a predetermined object image on the display unit 14 in accordance with an instruction from a controller 26. Due to this, through the display unit 14, the user can see the object image as if it were composed over an object in the real world visible to the user and/or at a predetermined position in the space. Hereinbelow, when explaining that the controller 26 causes the display unit 14 to display a desired image by instructing the projection unit 15 to project that image, the operations of the projection unit 15 will be omitted from the explanation, and this may be expressed simply as “the controller 26 causes the display unit 14 to display the desired image”.
The first camera 16 is a camera arranged on the support body 12 at a position above the display unit 14a (that is, at a position corresponding to the right eye of the user). On the other hand, the second camera 18 is a camera arranged on the support body 12 at a position above the display unit 14b (that is, at a position corresponding to the left eye of the user). Each of the first camera 16 and the second camera 18 is capable of capturing a range corresponding to the range of view of the user wearing the image display device 10 (hereinbelow termed a “specific range”), and the two cameras capture the specific range from different angles.
The control box 19 is a box attached to a part of the support body 12. The control box 19 accommodates respective elements functioning as a control system of the image display device 10.
Specifically, the control box 19 accommodates a sensor 20, a BT I/F 22, a Wi-Fi I/F 24, the controller 26, and a memory 28.
The sensor 20 is a triaxial acceleration sensor. The sensor 20 detects acceleration along three axes, namely the X, Y, and Z axes. The controller 26 is capable of specifying a posture and a motion state of the image display device 10 using detection values from the sensor 20.
The BT I/F 22 is an I/F configured to perform BT communication with an external device (for example, the tool 40).
The Wi-Fi I/F 24 is an I/F configured to perform Wi-Fi communication with an external device (for example, the server 50) via the Internet 4.
The controller 26 is configured to perform various processes according to programs stored in the memory 28. Contents of the processes that the controller 26 performs will be described later in detail. Further, the controller 26 is electrically connected to the display unit 14, the projection unit 15, the first camera 16, the second camera 18, the sensor 20, the BT I/F 22, the Wi-Fi I/F 24, and the memory 28, and is configured to control operations of these elements.
The memory 28 stores various programs. The programs include various types of programs such as a manual application program 30. Further, the memory 28 stores an article table 32, manual data 34, and a list storage region 36.
(Configuration of Tool 40)
The tool 40 is a tool used for tightening screws, and is provided with a controller 42, a memory 44, and a BT I/F 46.
The BT I/F 46 is an I/F configured to perform BT communication with an external device (for example, the image display device 10). The controller 42 is configured to perform sending action information related to an action content of the tool to the image display device 10 via the BT I/F 46 in accordance with a program stored in the memory 44. Specifically, the controller 42 detects a torque value upon tightening a screw, and performs the process of sending the action information including the detected torque value to the image display device 10 via the BT I/F 46. The memory 44 stores various programs.
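As a rough sketch of this exchange, the tool-side message carrying the detected torque value might look like the following; the wire format, field names, and use of JSON are assumptions made for illustration, since the description does not specify how the action information is encoded for BT communication.

```python
import json
import time

def build_action_info(torque_value_nm: float) -> bytes:
    """Build an action-information message containing the torque value
    detected upon tightening a screw (hypothetical JSON wire format)."""
    message = {
        "type": "action_info",
        "torque_nm": torque_value_nm,  # torque detected by the tool 40
        "timestamp": time.time(),
    }
    return json.dumps(message).encode("utf-8")

def parse_action_info(payload: bytes) -> float:
    """Device-side decoding: extract the torque value from a received message."""
    return json.loads(payload.decode("utf-8"))["torque_nm"]
```

In this sketch the image display device would call `parse_action_info` on each payload received via its BT I/F to recover the torque value used in the S60 determination.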
(Configuration of Server 50)
The server 50 is provided with a display unit 52, an operation unit 54, a Wi-Fi I/F 56, a controller 58, and a memory 60.
The display unit 52 is a display configured to display various types of information. The operation unit 54 includes a keyboard and a mouse. A user of the server 50 can input various instructions to the server 50 by operating the operation unit 54. The Wi-Fi I/F 56 is an I/F for performing Wi-Fi communication with an external device (for example, the image display device 10 and the external PC 70) via the Internet 4. The controller 58 is configured to perform various processes in accordance with programs stored in the memory 60. The memory 60 stores various programs. Further, the memory 60 stores work information received from the image display device 10 when the image display device 10 performs a manual process (described later).
(Configuration of External PC 70)
The external PC 70 is provided with a display unit 71, an operation unit 72, a Wi-Fi I/F 73, a controller 74, and a memory 75.
The display unit 71 is a display configured to display various types of information. The operation unit 72 includes a keyboard. A user of the external PC 70 can input various instructions to the external PC 70 by operating the operation unit 72. The Wi-Fi I/F 73 is an I/F for performing Wi-Fi communication with the server 50 via the Internet 4. The controller 74 is configured to perform various processes in accordance with programs stored in the memory 75. The memory 75 stores various programs. In the present embodiment, the memory 75 especially stores a browsing program 76 used for the external PC 70 to communicate with the server 50 to browse the work information stored in the server 50.
(Display Device Process)
A display device process executed by the controller 26 of the image display device 10 of the present embodiment will be described.
In S10, the controller 26 displays a predetermined calibration screen on the display unit 14. The calibration screen is a screen for allowing the user to perform calibration. Here, “calibration” is a process for specifying spatial information (that is, calibration data) for specifying features of the space surrounding the image display device 10. The “features of the space surrounding the image display device 10” include, for example, various types of information characterizing an indoor space in a case where the image display device 10 exists indoors, such as a distance between a wall and the device, a direction of the wall, a distance between a ceiling and the device, a height of the ceiling, an area of a floor, a position of furniture, a distance to the furniture, and the like. On the other hand, in a case where the image display device 10 exists outdoors, the “features of the space surrounding the image display device 10” include various types of information characterizing the surrounding space of the device, such as a distance to a target object in the surroundings.
In subsequent S12, the controller 26 monitors completion of the specification of the spatial information. As aforementioned, the spatial information is specified by the user performing, after the calibration screen is displayed, an operation of following the pointer P with his or her eyes (that is, moving the head according to a motion of the pointer P). When the specification of the spatial information is completed, the controller 26 determines YES in S12 and proceeds to S14.
In S14, the controller 26 initiates a real-time process (described below).
(Real-Time Process)
In S30, the controller 26 acquires a first captured image from the first camera 16 and a second captured image from the second camera 18.
In subsequent S32, the controller 26 calculates a distance between a specified feature point, which is found commonly in the first and second captured images, and the image display device 10. The “feature point” mentioned herein is, for example, one of the plural feature points included in the spatial information specified in the case of YES in S12.
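Although the description does not give the calculation itself, a distance to a feature point seen by both cameras can be recovered by standard stereo triangulation from the parallax between the first and second captured images. A minimal sketch, assuming a pinhole stereo model with two horizontally offset cameras (the focal length and baseline values below are illustrative, not from the description):

```python
def stereo_distance(focal_px: float, baseline_m: float,
                    x_left_px: float, x_right_px: float) -> float:
    """Distance to a feature point seen in both captured images.

    disparity = horizontal shift of the point between the two images (px);
    depth     = focal_length * baseline / disparity (pinhole stereo model).
    """
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("feature point must have positive disparity")
    return focal_px * baseline_m / disparity
```

For example, with a 700 px focal length, a 6 cm baseline between the cameras, and a 30 px disparity, the feature point lies about 1.4 m from the device.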
In subsequent S34, the controller 26 calculates the posture of the image display device 10 at this timepoint based on the detection values of the sensor 20. Specifically, based on the detection values of the sensor 20 (that is, the acceleration along each of the X, Y, and Z axes), the controller 26 calculates tilt angles (θx, θy, θz) of the X-axis, Y-axis, and Z-axis in a case of setting the direction of gravity as 0°, and calculates the posture of the image display device 10 (that is, its tilt relative to a horizontal plane) at this timepoint based on these tilt angles.
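One common way to obtain such tilt angles from a triaxial acceleration sensor at rest (gravity being the only measured acceleration) is sketched below; the axis convention — each angle measured between a device axis and the gravity vector, with an axis aligned with gravity reading 0° — is an assumption, as the description does not fix one.

```python
import math

def tilt_angles(ax: float, ay: float, az: float) -> tuple:
    """Tilt of the device relative to gravity, from static accelerations.

    Returns (theta_x, theta_y, theta_z) in degrees, each angle being the
    angle between the corresponding device axis and the gravity vector.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)  # magnitude of gravity
    if g == 0:
        raise ValueError("no acceleration measured")
    return tuple(math.degrees(math.acos(a / g)) for a in (ax, ay, az))
```

With the device level and its Z-axis pointing along gravity, this yields roughly (90°, 90°, 0°).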
In subsequent S36, the controller 26 specifies the position and the posture of the image display device 10 in the space at this timepoint by using the spatial information specified in the case of YES in S12, the distance calculated in S32, and the posture calculated in S34.
After completing S36, the controller 26 returns to S30 and repeatedly executes the respective processes of S30 to S36. That is, by repeatedly executing the processes of S30 to S36, the controller 26 can specify, on a real-time basis, the position and the posture of the image display device 10 in the space where the image display device 10 exists.
(Continuation of Display Device Process: From S16)
As above, when the controller 26 initiates the real-time process in S14, the controller 26 proceeds to S16. In S16, the controller 26 causes the display unit 14 to display a menu object image 80.
In subsequent S18, the controller 26 monitors detection of a user operation in the specific range. Here, the “user operation in the specific range” includes various operations such as gestures that the user performs on an object image such as the menu object image (for example, a gesture instructing moving the image or changing its size, a gesture instructing terminating display of the image, a gesture selecting an icon, a gesture instructing turning the power of the image display device 10 off, etc.), a movement of the user in the space, a change of the direction of the user's view, and the like. In S18, the controller 26 determines whether or not the user performed an operation in the specific range based on the first captured image from the first camera 16, the second captured image from the second camera 18, and the detection values from the sensor 20. When an operation performed by the user in the specific range is detected, the controller 26 determines YES in S18 and proceeds to S20.
In S20, the controller 26 determines whether or not the operation performed by the user is the predetermined gesture instructing turning the power of the image display device 10 off (hereinbelow termed a “shutdown gesture”). When the operation performed by the user is determined as being the shutdown gesture, the controller 26 determines YES in S20, proceeds to S24, and turns off the power of the image display device 10. In this case, the display device process ends. On the other hand, when the operation performed by the user is not the shutdown gesture, the controller 26 determines NO in S20 and proceeds to S22.
In S22, the controller 26 performs a process corresponding to the operation. For example, when the operation performed by the user is an operation to move a display position of the menu object image 80, the controller 26 changes the display position of the menu object image 80 in accordance with the operation.
The controller 26 returns to S18 after completing S22, and monitors the user's operation being performed again. Due to this, each time the user performs an operation such as performing a gesture in the specific range or changing the direction of the user's view, the controller 26 changes display positions and manners of the object images and the guide image displayed in the display unit 14 in accordance with the operation. The controller 26 repeatedly executes the respective processes of S18 to S22 until the shutdown gesture is performed (YES in S20).
(Manual Process)
The manual process performed by the controller 26 of the image display device 10 will be described.
In S50, the controller 26 identifies features of an article existing in a specific range (hereinbelow may be termed a “target article”) based on the first captured image from the first camera 16 and the second captured image from the second camera 18. Specifically, in S50, the controller 26 identifies the features of the target article (such as a shape, color, material, etc.) based on images of the target article included in the first captured image and the second captured image.
In subsequent S52, the controller 26 specifies an article ID of the target article having the features identified in S50. Specifically, in S52, the controller 26 determines whether or not the article having the features identified in S50 is included in the article table 32 in the memory 28. The article table 32 stores plural pieces of combination information, in each of which an article ID identifying an article is associated with features of that article. In a case where the article having the identified features is included in the article table 32, the controller 26 specifies the article ID associated with those features.
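The lookup in S52 can be pictured as a simple match of identified features against registered combination information; the feature encoding and article IDs below are illustrative assumptions, not taken from the description.

```python
# Hypothetical article table 32: identified features -> article ID.
ARTICLE_TABLE = {
    ("rectangular", "black", "resin"): "P1",
    ("cylindrical", "silver", "metal"): "P2",
}

def specify_article_id(shape: str, color: str, material: str):
    """S52: return the article ID whose registered features match the
    features identified from the captured images, or None when the
    article is not included in the table (NO in S52)."""
    return ARTICLE_TABLE.get((shape, color, material))
```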
For example, in a case where the target article has the features indicated by the combination information 102, the controller 26 specifies the article ID “P1” associated therewith.
In S54, the controller 26 reads a manual corresponding to the article ID specified in S52 from the manual data 34 in the memory 28.
In subsequent S55, the controller 26 creates a procedure list using the manual read in S54.
A procedure list 200 is an example of a procedure list created in S55, and includes a procedure table 202 in which each procedure instructed by the manual (for example, “cover the lid” and “tighten the screws”) is associated with a result column.
A procedure list 300 is another example of a procedure list, and similarly includes a procedure table 302.
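A minimal sketch of such a procedure list as a data structure — each procedure from the manual paired with an initially blank result column, updated as operations are confirmed (the field names are assumptions):

```python
def create_procedure_list(article_id: str, manual_steps: list) -> dict:
    """S55: build a procedure list like the lists 200/300 — one row per
    procedure, with the result column left blank until confirmed."""
    return {
        "article_id": article_id,
        "table": [{"procedure": step, "result": ""} for step in manual_steps],
    }

def record_ok(procedure_list: dict, step: str) -> None:
    """S62: record that the operation following `step` was performed."""
    for row in procedure_list["table"]:
        if row["procedure"] == step:
            row["result"] = "OK"

def all_completed(procedure_list: dict) -> bool:
    """S64: every result column contains "OK"."""
    return all(row["result"] == "OK" for row in procedure_list["table"])
```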
In subsequent S56, the controller 26 determines whether or not an arranging direction of the target article existing in the specific range matches a specific arranging direction designated by the manual read in S54. In a case where the arranging direction of the target article matches the specific arranging direction, the controller 26 determines YES in S56 and proceeds to S58. On the other hand, in a case where the arranging direction of the target article does not match the specific arranging direction, the controller 26 determines NO in S56 and proceeds to S57. In S57, the controller 26 displays a message for prompting the user to change the arranging direction of the target article to the specific arranging direction on the display unit 14. After S57, the controller 26 returns to the determination of S56.
In S58, the controller 26 causes the display unit 14 to display an instruction screen for instructing to perform a first procedure according to the procedure list created in S55.
For example, in the case where the article ID “P1” (see the combination information 102) is specified in S52, the controller 26 causes the display unit 14 to display an instruction screen instructing the first procedure (for example, “cover the lid”) of the procedure list 200.
Then, in S60, the controller 26 monitors an operation following the procedure instructed in the instruction screen of S58 (which may hereinbelow be termed the “specific procedure”) being actually performed. The controller 26 determines whether or not the operation following the specific procedure has been performed based on at least one of the first captured image, the second captured image, and the action information sent from the tool 40. When determining that the operation following the specific procedure has been performed, the controller 26 determines YES in S60 and proceeds to S62. While it is determined that the operation following the specific procedure has not been performed, the controller 26 does not determine YES in S60. Here, cases where the controller 26 determines that “the operation following the specific procedure is not performed” include various cases, for example, a case in which a component to be attached is incorrect, a case in which a screw-tightening torque value is lower than a predetermined value, a case in which an incorrect operation has been performed, and the like.
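The S60 determination can be sketched as a check that combines what the cameras observed with the torque reported by the tool; the parameter names and threshold handling below are assumptions made for illustration.

```python
def operation_follows_procedure(observed_component: str,
                                expected_component: str,
                                torque_nm: float,
                                min_torque_nm: float) -> bool:
    """S60: YES only when no failure case from the description applies."""
    if observed_component != expected_component:
        return False  # a component to be attached is incorrect
    if torque_nm < min_torque_nm:
        return False  # screw-tightening torque below the predetermined value
    return True
```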
For example, in the case where the instruction screen instructing the procedure “cover the lid” is displayed, the controller 26 monitors the lid being actually put on the target article based on the first captured image and the second captured image.
In S62, the controller 26 updates the procedure list. That is, the controller 26 records that the operation following the procedure instructed by the instruction screen of S58 was performed in the procedure list created in S55.
For example, in the case where the operation following the procedure “cover the lid” is performed, the controller 26 records “OK” in the result column corresponding to that procedure in the procedure table 202 of the procedure list 200.
In subsequent S64, the controller 26 determines whether or not all of the procedures have been completed. Specifically, the controller 26 determines whether all of the procedures indicated by the procedure list created in S55 have been completed (that is, whether “OK” is recorded in their result columns). In a case where all of the procedures indicated by the procedure list have been completed, the controller 26 determines YES in S64 and proceeds to S66. On the other hand, in a case where not all of the procedures have been completed at the timepoint of S64, the controller 26 determines NO in S64 and returns to S58. In this case, the controller 26 repeatedly performs the processes of S58 to S62 until YES is determined in S64.
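Taken together, S58 to S64 amount to the following loop; `display_instruction` and `monitor_operation` are hypothetical stand-ins for the display and camera/tool machinery described above.

```python
def run_manual_process(procedure_list: dict,
                       display_instruction,
                       monitor_operation) -> dict:
    """Repeat S58-S62 for each procedure until every result column holds
    "OK" (YES in S64). `procedure_list` is assumed to have the shape
    {"table": [{"procedure": ..., "result": ...}, ...]}."""
    for row in procedure_list["table"]:
        display_instruction(row["procedure"])           # S58: show instruction screen
        while not monitor_operation(row["procedure"]):  # S60: wait for the operation
            pass
        row["result"] = "OK"                            # S62: update procedure list
    return procedure_list                               # S64: all procedures completed
```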
Detailed description will be given according to a specific example. For example, in the case where the result corresponding to the procedure “cover the lid” in the procedure table 202 of the procedure list 200 has been changed to “OK” but the procedure “tighten the screws” has not yet been completed, the controller 26 determines NO in S64, returns to S58, and causes the display unit 14 to display an instruction screen instructing the next procedure “tighten the screws”.
At this occasion, in S60 of the second cycle, the controller 26 monitors the screws being actually threaded into the screw holes H1 to H6. The user can thread the actual screws into the screw holes H1 to H6 using the tool 40 while looking at the instruction screen.
In this case, in subsequent S62 in the second cycle, the controller 26 changes the result corresponding to the procedure “tighten the screws” in the procedure table 202 of the procedure list 200 to “OK”.
In S66, the controller 26 stores the completed procedure list in the list storage region 36 of the memory 28.
In subsequent S68, the controller 26 creates work information including the procedure list stored in the list storage region 36 and sends the same to the server 50 via the Wi-Fi I/F 24. The work information includes various types of information (such as sending time, etc.) in addition to the procedure list. When S68 is completed, the controller 26 terminates the manual process of
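The work information built in S68 might be serialized as follows before being sent via the Wi-Fi I/F 24; the payload schema is an assumption, since the description only says that the work information contains the procedure list and information such as the sending time.

```python
import json
import time

def build_work_information(procedure_list: dict) -> str:
    """S68: wrap the completed procedure list together with the sending
    time into a single payload (hypothetical schema)."""
    work_info = {
        "sent_at": time.strftime("%Y-%m-%dT%H:%M:%S"),
        "procedure_list": procedure_list,
    }
    return json.dumps(work_info)
```

The server side would decode this payload and accumulate it in its memory for later browsing.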
As above, the manual process is executed.
(Processes by Controller 58 of Server 50)
Next, processes performed by the controller 58 of the server 50 will be described. As aforementioned, the controller 26 of the image display device 10 performs the manual process and sends the work information to the server 50 (S68). The controller 58 receives the work information via the Wi-Fi I/F 56 and stores the received work information in the memory 60.
Further, the user of the server 50 can input a browse request for browsing the work information in the memory 60 to the server 50 by operating the operation unit 54. When the browse request is inputted, the controller 58 reads the work information designated by the browse request from the memory 60 and causes the display unit 52 to display a screen represented by the work information. In this case, the screen represented by the work information includes information similar to the procedure list described above.
Due to this, the user of the server 50 can see the screen displayed on the display unit 52 to confirm whether or not the target article was handled properly by the user of the image display device 10.
Further, the user of the external PC 70 can input an operation for browsing the work information in the server 50 to the external PC 70 by operating the operation unit 72. In this case, the controller 74 of the external PC 70 can send a request signal for browsing the work information in the memory 60 to the server 50 via the Wi-Fi I/F 73. In this case, the controller 58 receives the request signal via the Wi-Fi I/F 56. Then, the controller 58 reads the work information designated by the request signal from the memory 60, and sends the read work information to the external PC 70 via the Wi-Fi I/F 56.
The controller 74 of the external PC 70 receives the work information via the Wi-Fi I/F 73. Then, the controller 74 can cause the display unit 71 to display a browsing screen represented by the received work information. In this case as well, the browsing screen represented by the work information includes information similar to the procedure list described above. Due to this, the user of the external PC 70 can see the browsing screen displayed on the display unit 71 to confirm whether or not the target article was handled properly by the user of the image display device 10.
As above, the configuration and actions of the communication system 2 of the present embodiment were described. As aforementioned, in the present embodiment, the controller 26 of the image display device 10 determines whether or not the operation that the user actually performed on the target article within the specific range follows the procedure instructed by the instruction screen (S60), and causes the memory 28 to store the procedure list including the determination results (S66). Due to this, by confirming the stored procedure list, users and the like can confirm whether or not the target article was handled in accordance with the procedures instructed by the manual.
In the present embodiment, when the operation following the specific procedure instructed by the instruction screen is performed (YES in S60), the controller 26 causes the display unit 14 to display the instruction screen instructing the next procedure (NO in S64, S58). That is, the controller 26 does not display the instruction screen for the next procedure until the operation following the specific procedure is performed. Due to this, the likelihood that the user properly handles the target article in accordance with the procedures indicated by the manual is increased.
Further, in the present embodiment, the controller 26 of the image display device 10 receives, from the tool 40 via the BT I/F 22, the action information including the torque values at which the user tightened the screws. The controller 26 therefore determines whether or not the screws were correctly threaded into the screw holes H1 to H6 at the predetermined tightening torque values based on the action information received from the tool 40 in addition to the first captured image and the second captured image (S60). Due to this, the controller 26 can properly determine whether or not the screws were threaded into the screw holes H1 to H6 at the predetermined tightening torque.
Further, in the present embodiment, the controller 26 of the image display device 10 creates the work information including the procedure list stored in the list storage region 36, and sends the same to the server 50 via the Wi-Fi I/F 24 (S68). Due to this, the work information is accumulated in the server 50. The user of the server 50, the user of the external PC 70, and the like can confirm whether or not the target article was properly handled by seeing the screen represented according to the work information accumulated in the server 50.
The procedure “cover the lid” in the procedure table 202 is an example of the “first procedure”, and the procedure “tighten the screws” is an example of the “second procedure”.
An image display device 1010 of a second embodiment will be described. The image display device 1010 differs from the image display device 10 of the first embodiment in that a display unit 1014 is a light-shielding display.
As aforementioned, in the present embodiment, since the display unit 1014 is a light-shielding display, when power of the image display device 1010 is turned on, the controller 26 causes the first captured image (that is, the captured image from the first camera 16) to be displayed in a region facing the right eye of the user and causes the second captured image (that is, the captured image from the second camera 18) to be displayed in a region facing the left eye of the user. Then, for example, in the case where the arrangement position of the menu object image 80 is included in the specific range, the controller 26 causes the display unit 1014 to display a screen in which the menu object image 80 is composed over the first captured image and the second captured image.
As above, details of the embodiments have been described; however, these are merely illustrations and do not restrict the scope of the claims. The techniques described in the claims encompass various variants and modifications of the specific examples illustrated above. For example, the following variants may be employed.
(Variant 1) In the second embodiment, the controller 26 displays the first captured image in the region facing the right eye of the user and the second captured image in the region facing the left eye of the user. Not being limited hereto, the controller 26 may display one of the first captured image and the second captured image on the display unit 1014. Further, the controller 26 may display an image that is made by composing the first captured image and the second captured image on the display unit 1014.
(Variant 2) In the respective embodiments as above, the controller 26 monitors the detection of the user operation in the specific range in S18 of the display device process.
(Variant 3) In the respective embodiments as above, the controller 26 initiates the real-time process (S14) after having executed calibration (S10, YES in S12).
(Variant 4) In the respective embodiments as above, both the image display devices 10, 1010 have a support frame that is substantially in the shape of glasses, and they can be worn on the head of the user similar to how the glasses are worn. Not being limited to this, the image display device may have an arbitrary support frame, such as in a hat shape, a helmet shape, and the like so long as it is wearable on the head of the user.
(Variant 5) The image display device may be configured by attaching the first camera 16, the second camera 18, and the control box 19 on an eyewear generally used for an orthoptic purpose or for eye protection (such as glasses, sunglasses, etc.). In this case, lens portions of the eyewear may be used as the display unit.
(Variant 6) In each of the above embodiments, the controller 26 identifies the features of the target article based on the first and second captured images (S50). The method for identifying the target article is not limited to this.
(Variant 7) The tool 40 is not limited to the screw-turning tool (a so-called screwdriver) that the user uses by holding it; other arbitrary types of tools (such as a wrench, a Vernier caliper, etc.) may be employed. In such a case as well, the tool 40 simply needs to be configured capable of sending the action information to the image display device 10.
(Variant 8) In each of the above embodiments, the respective object images, such as the menu object image 80, are displayed on the display unit 14. The display manner of these object images is not limited to the above examples.
(Variant 9) In each of the above embodiments, in the case where it is determined that the specific procedure is not completed (NO in S60), the controller 26 continues the monitoring of S60.
(Variant 10) In each of the above embodiments, the controller 26 monitors, in S60, the operation indicated by the specific procedure being actually performed (that is, completed).
(Variant 11) In each of the above embodiments, the procedure tables 202, 302 in the procedure lists 200, 300 record the result of each procedure. The manner of recording the results is not limited to this.
(Variant 12) In each of the above embodiments, in the case where it is determined that the specific procedure is not completed (NO in S60), the controller 26 does not proceed to S62.
(Variant 13) The plural procedures included in the handling method of the target article indicated by the manual may include a procedure that can be skipped (that is, a procedure which does not adversely affect the handling of the target article even if it is not completed). Such skippable procedures may include optional procedures such as “clean the surroundings between working steps” and “adhere a decorative sticker on a surface of the target article”. In this case, while displaying an instruction screen instructing the skippable procedure on the display unit 14, the controller 26 may cause the display unit 14 to display an instruction screen instructing the next procedure in response to a user operation, even before the skippable procedure is completed.
Further, technical elements described in this description and drawings exhibit technical usefulness solely or in combination thereof, and are not limited to combinations recited in the claims as originally filed. Further, the art described in the description and the drawings may concurrently achieve a plurality of aims, and technical significance thereof resides in achieving any one of such aims.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2016/050674 | 1/12/2016 | WO | 00 |