This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2023-176263, filed on Oct. 11, 2023; the entire contents of which are incorporated herein by reference.
Embodiments of the invention generally relate to a control method, a mixed reality system, a mixed reality device, and a storage medium.
Conventionally, a mixed reality device is used to perform a task efficiently. The mixed reality device can display a virtual space to overlap real space, and can provide various information to a worker. The worker can perform the task more efficiently by referring to the information displayed by the mixed reality device. Technology that can support a task performed by multiple workers using mixed reality devices is desirable.
According to one embodiment, a control method includes causing a first mixed reality device to display a first virtual object at a first fastening location. The control method includes causing a second mixed reality device to display a second virtual object at a second fastening location. The control method includes, when a screw is determined to have been turned at the first fastening location and a screw is determined to have been turned at the second fastening location, causing the first mixed reality device to display a third virtual object at a third fastening location, and causing the second mixed reality device to display a fourth virtual object at a fourth fastening location.
Embodiments of the invention will now be described with reference to the drawings. The drawings are schematic or conceptual; and the relationships between the thicknesses and widths of portions, the proportions of sizes between portions, etc., are not necessarily the same as the actual values thereof. The dimensions and/or the proportions may be illustrated differently between the drawings, even in the case where the same portion is illustrated. In the drawings and the specification of the application, components similar to those described thereinabove are marked with like reference numerals, and a detailed description is omitted as appropriate.
The embodiment of the invention relates to a mixed reality device (an MR device). For example, as shown in
In the illustrated example, the MR device 100 is a binocular head mounted display. Two lenses, i.e., the lens 111 and the lens 112, are fitted into the frame 101. The projection device 121 and the projection device 122 respectively project information onto the lenses 111 and 112.
The projection device 121 and the projection device 122 display a recognition result of a body of a worker, a virtual object, etc., on the lenses 111 and 112. Only one of the projection device 121 or the projection device 122 may be included, and information may be displayed on only one of the lens 111 or the lens 112.
The lens 111 and the lens 112 are light-transmissive. The wearer of the MR device 100 can visually recognize real space through the lenses 111 and 112. The wearer of the MR device 100 also can visually recognize information projected onto the lenses 111 and 112 by the projection devices 121 and 122. The information (the virtual space) is displayed to overlap real space by being projected by the projection devices 121 and 122.
The image camera 131 detects visible light and obtains a two-dimensional image. The depth camera 132 emits infrared light and obtains a depth image based on the reflected infrared light. The sensor 140 is a six-axis detection sensor and is configured to detect angular velocities about three axes and accelerations along three axes. The microphone 141 accepts an audio input.
The processing device 150 controls components of the MR device 100. For example, the processing device 150 controls the display by the projection devices 121 and 122. The processing device 150 detects movement of the visual field based on a detection result of the sensor 140. The processing device 150 changes the display by the projection devices 121 and 122 according to the movement of the visual field. The processing device 150 also is configured to perform various processing by using data obtained from the image camera 131 and the depth camera 132, data of the storage device 170, etc.
The battery 160 supplies power necessary for the operations to the components of the MR device 100. The storage device 170 stores data necessary for the processing of the processing device 150, data obtained by the processing of the processing device 150, etc. The storage device 170 may be located outside the MR device 100, and may communicate with the processing device 150.
The MR device according to the embodiment is not limited to the illustrated example, and may be a monocular head mounted display. The MR device may be an eyeglasses-type as illustrated, or may be a helmet-type.
As shown in
In the example shown in
As shown in
When starting the task, the image camera 131 and the depth camera 132 image the marker 210. The processing device 150 recognizes the marker 210 based on the captured image. The processing device 150 sets the three-dimensional coordinate system by using the position and orientation of the marker 210 as a reference.
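As an illustration of this coordinate-system setup, the following is a minimal Python sketch that converts points measured by the device into a coordinate system defined by the marker pose; the pose representation (a rotation matrix and a translation vector) and the function names are assumptions for illustration and are not part of the embodiment.

```python
import numpy as np

def build_task_frame(marker_rotation: np.ndarray, marker_position: np.ndarray) -> np.ndarray:
    """Build a 4x4 transform that maps points measured in device coordinates into
    the task coordinate system whose origin and axes are given by the marker pose
    (an assumed pose representation)."""
    device_from_marker = np.eye(4)
    device_from_marker[:3, :3] = marker_rotation   # marker axes expressed in device coordinates
    device_from_marker[:3, 3] = marker_position    # marker origin in device coordinates
    return np.linalg.inv(device_from_marker)       # marker (task) frame from device frame

def to_task_coordinates(marker_from_device: np.ndarray, point_in_device: np.ndarray) -> np.ndarray:
    """Express a 3D point measured by the device in the marker-based coordinate system."""
    return (marker_from_device @ np.append(point_in_device, 1.0))[:3]
```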
In the task, the image camera 131 and the depth camera 132 image the article 200, a left hand 251 of the worker, and a right hand 252 of the worker. The processing device 150 uses hand tracking to recognize the left and right hands 251 and 252 based on the captured image. The processing device 150 may cause the projection devices 121 and 122 to display the recognition result on the lenses 111 and 112. Hereinafter, the processing device using the projection device to display information on the lens also is called simply “the processing device displaying information”.
For example, as shown in
When the left hand 251 and the right hand 252 are recognized, the processing device 150 measures the coordinates of the hands. Specifically, each hand includes multiple joints such as a DIP joint, a PIP joint, an MP joint, a CM joint, etc. The coordinate of any of these joints is used as the coordinate of the hand. The centroid position of multiple joints may be used as the coordinate of the hand. Or, the center coordinate of the entire hand may be used as the coordinate of the hand.
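The following is a minimal sketch of one of the options described above, namely using the centroid of the tracked joints as the coordinate of the hand; the joint names and the data layout are illustrative assumptions.

```python
import numpy as np

def hand_coordinate(joint_positions):
    """Return a representative coordinate for a tracked hand.

    joint_positions maps joint names (e.g., "DIP", "PIP", "MP", "CM") to 3D
    positions. Here the centroid of the tracked joints is used; a single
    joint or the center of the entire hand could be used instead."""
    return np.stack([np.asarray(p) for p in joint_positions.values()]).mean(axis=0)
```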
As shown in
For example, as shown in
At this time, the worker disposes the extension bar 290 so that the extension bar 290 approaches or contacts the virtual object 313. Also, the worker grips the head of the wrench 280 so that the hand contacts the virtual object 303. By displaying the virtual object, the worker can easily ascertain the positions at which the tool and the hand are to be located when turning the screw at the fastening location 203. The work efficiency can be increased thereby.
In the illustrated example, the virtual objects 301 to 305 are spherical; and the virtual objects 311 to 315 are rod-shaped. The shapes of the virtual objects are not limited to the examples as long as the worker can visually recognize the virtual objects. For example, the virtual objects 301 to 305 may be cubic; and the virtual objects 311 to 315 may be wire-shaped. Virtual objects similar to the virtual objects 301 to 305 and the virtual objects 311 to 315 also are displayed for fastening locations 206 to 208 not shown in
After the virtual object is displayed, the processing device 150 may determine whether or not a prescribed object contacts the virtual objects 301 to 305. For example, the processing device 150 determines whether or not a hand contacts the virtual object. Specifically, the processing device 150 calculates the distances between the coordinate of the hand and the virtual objects 301 to 305. When one distance is less than a preset threshold, the processing device 150 determines that the hand contacts the virtual object. As an example in
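A minimal sketch of this distance-based contact determination follows; the threshold value and the data layout are illustrative assumptions.

```python
import numpy as np

CONTACT_THRESHOLD = 0.03  # assumed threshold in meters; the actual value is preset

def find_contacted_object(hand_coordinate, virtual_object_centers):
    """Return the ID of the virtual object that the hand contacts, or None.

    virtual_object_centers maps a virtual object ID (e.g., "301") to the object's
    center coordinate in the task coordinate system. Contact is determined when
    the distance to the hand coordinate is less than the preset threshold."""
    for object_id, center in virtual_object_centers.items():
        if np.linalg.norm(np.asarray(hand_coordinate) - np.asarray(center)) < CONTACT_THRESHOLD:
            return object_id
    return None
```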
The processing device 150 may determine whether or not the tool contacts the virtual objects 301 to 305. For example, as shown in
When the hand or the tool contacts one of the virtual objects 301 to 305, it is determined that a screw is being turned at the fastening location corresponding to the one of the virtual objects 301 to 305. In the example shown in
The tool that is used may be a digital tool that can detect torque. In such a case, the processing device 150 receives the detected torque from the tool. The torque (the threshold) that is necessary for the fastening may be preset, and the tool may refer to the necessary torque. The tool determines whether or not the necessary torque is detected, and transmits the determination result to the processing device 150. The tool also can transmit the rotation angle, the time at which the torque was detected, etc., to the processing device 150. The processing device 150 associates the data received from the tool with data related to the task location. A more detailed task record is automatically generated thereby.
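The following sketch illustrates, under assumed field names, the kind of data a digital tool might transmit and how completion of a fastening could be judged from it; the actual data format of any particular tool is not specified by the embodiment.

```python
from dataclasses import dataclass
import datetime

@dataclass
class ToolReport:
    """Data assumed to be transmitted by a digital tool; the field names are illustrative."""
    torque: float                   # detected torque
    rotation_angle: float           # rotation angle of the tool
    detected_at: datetime.datetime  # time at which the torque was detected
    reached_target: bool            # tool-side judgment against the preset necessary torque

def is_fastening_complete(report, necessary_torque=None):
    """Judge completion of the fastening from a tool report.

    When the tool itself judges against the preset torque, its determination is
    used; otherwise the received torque is compared with the preset value here."""
    if necessary_torque is None:
        return report.reached_target
    return report.torque >= necessary_torque
```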
According to the embodiment, the multiple MR devices 100 collaborate to control the displays of the virtual objects. The tasks that are performed are mainly classified into the three tasks of “simultaneous task”, “alternating task”, and “arbitrary task”. The MR device 100 switches the display method of the virtual object according to the task to be performed.
In the simultaneous task, multiple workers respectively turn screws simultaneously at multiple fastening locations. Here, “simultaneous” means that at least a part of the period in which one worker turns a screw and at least a part of the period in which another worker turns a screw overlap.
In the example shown in
The worker W1 starts the task at the fastening location 201 according to the display of the MR device 100a. The worker W2 starts the task at the fastening location 205 according to the display of the MR device 100b. When the hand or tool of the worker W1 contacts the virtual object 301, the MR device 100a determines that the task is performed at the fastening location 201. When the hand or tool of the worker W2 contacts the virtual object 305, the MR device 100b determines that the task is performed at the fastening location 205. The MR device 100a transmits the determination result related to the fastening location 201 to the MR device 100b. The MR device 100b transmits the determination result related to the fastening location 205 to the MR device 100a.
Subsequently, when it is determined that the screws are turned at the fastening locations 201 and 205, the MR devices 100a and 100b display the virtual objects at the next fastening locations. For example, as shown in
In the simultaneous task, the display of the MR device 100a and the display of the MR device 100b guide the workers W1 and W2 to simultaneously turn the screws of the article 200. In the illustrated example, the workers W1 and W2 are guided to simultaneously turn the screws at opposite positions of the article 200.
For example, the MR device 100 refers to a preset sequence of the task for the fastening locations 201 to 208. The MR device 100 sequentially displays the virtual objects at the fastening locations according to the sequence of the task. The priority of the task at each fastening location may be preset. In such a case, the MR device 100 sequentially displays the virtual objects at the fastening locations according to the priorities. For example, the MR device 100 sequentially displays the virtual objects so that the screws are turned in order from the fastening locations having higher priorities.
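A minimal sketch of ordering the fastening locations by preset priority follows; the field names are illustrative assumptions.

```python
def order_by_priority(fastening_locations):
    """Order fastening locations so that screws are turned from the highest priority downward.

    Each location is assumed to be a dict with at least an "id" and a numeric
    "priority" (larger means higher priority); when an explicit task sequence
    is preset, that sequence would be used instead of this ordering."""
    return sorted(fastening_locations, key=lambda loc: loc["priority"], reverse=True)
```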
One of the MR devices 100a and 100b may notify its wearer of the task state of the wearer of the other MR device by using a display, a voice, a vibration, etc. For example, when the prescribed object contacts the virtual object, the MR device 100 determines that the worker is in a “preparation completion” state. After preparation completion is determined, when a torque that is greater than a threshold is detected by the tool, the MR device 100 determines that the worker is in a “task completion” state.
For example, when preparation completion or task completion is determined for the MR device 100a, the MR device 100a notifies the worker W1 of the determination result and transmits the determination result to the MR device 100b. The MR device 100b notifies the worker W2 of the determination result of the MR device 100a. Similarly, when preparation completion or task completion is determined for the MR device 100b, the MR device 100b notifies the worker W2 of the determination result and transmits the determination result to the MR device 100a. The MR device 100a notifies the worker W1 of the determination result of the MR device 100b. As a result, the workers W1 and W2 can ascertain each other's states.
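The following sketch illustrates the state determination and the mutual notification described above; the state names follow the description, while the notification and communication functions are placeholders.

```python
from enum import Enum, auto

class WorkerState(Enum):
    WORKING = auto()
    PREPARATION_COMPLETE = auto()   # the prescribed object contacts the virtual object
    TASK_COMPLETE = auto()          # the tool detected a torque greater than the threshold

def update_state(state, contacting, torque_reached):
    """Advance the worker state from the contact and torque determinations."""
    if state is WorkerState.WORKING and contacting:
        return WorkerState.PREPARATION_COMPLETE
    if state is WorkerState.PREPARATION_COMPLETE and torque_reached:
        return WorkerState.TASK_COMPLETE
    return state

def share_determination(state, notify_local_worker, send_to_peer_device):
    """Notify the local worker and transmit the same determination result to the paired device.

    notify_local_worker and send_to_peer_device are placeholders for the device's
    display / voice / vibration output and its communication link, respectively."""
    notify_local_worker(state)
    send_to_peer_device(state)
```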
In the alternating task, multiple workers alternately turn screws at the fastening locations. In the alternating task, the period in which one worker turns a screw and the period in which another worker turns a screw do not overlap.
As an example, screws are turned in the order of the fastening location 201, the fastening location 205, the fastening location 202, the fastening location 206, the fastening location 203, the fastening location 207, the fastening location 204, and the fastening location 208. In such a case, first, as shown in
The MR device 100a transmits the determination result related to the fastening location 201 to the MR device 100b. When it is determined that the screw has been turned at the fastening location 201, the MR device 100b displays the virtual objects 305 and 315 at the fastening location 205 as shown in
Similarly thereafter, the task by the worker W1 and the task by the worker W2 are alternately performed. When the MR device 100a receives the determination result related to the fastening location 205, the MR device 100a displays the virtual objects 302 and 312 at the fastening location 202 as shown in
In the alternating task, the display of the MR device 100a and the display of the MR device 100b guide the workers W1 and W2 to alternately turn the screws of the article 200. In the illustrated example, the workers W1 and W2 are guided to alternately turn the screws at opposite positions of the article 200.
The fastening location at which the virtual object is initially displayed may be determined according to the position of the worker. Or, the sequence for the fastening locations may be preset. In such a case, the MR device 100 displays the virtual objects at the fastening locations according to the set sequence.
In the alternating task, one of the MR devices 100a and 100b may notify its wearer of the task state of the wearer of the other MR device by using a display, a voice, a vibration, etc. For example, when task completion is determined for the MR device 100a, the determination result is transmitted to the MR device 100b. When receiving the determination result, the MR device 100b notifies the worker W2 of the task completion of the wearer (the worker W1) of the MR device 100a.
In the arbitrary task, multiple workers turn screws at any fastening location. The arbitrary task is performed when it is unnecessary to simultaneously perform the tasks at two or more fastening locations, when the sequence of the fastening locations at which the tasks are performed is not specified, etc.
In the arbitrary task, the MR devices 100a and 100b display the virtual objects 301 to 308 and 311 to 318 at the fastening locations 201 to 208 as shown in
The priority of the task at each fastening location may be preset. In such a case, the MR device 100 sequentially displays virtual objects at the fastening locations according to the priorities.
After the display of the virtual object, the display of the virtual object corresponding to the fastening location may change or disappear when the screw is turned at the fastening location. As a result, the workers can easily ascertain the fastening locations of the screws that are not yet turned.
When the screw is turned multiple times at one fastening location, the display of the virtual object may be changed according to the number of times. For example, the color, shape, size, etc., of the virtual object changes according to the number of times. As a result, the workers can easily ascertain the number of times that the screw is turned at each fastening location.
In the processing method shown in
When starting the task, the image camera 131 images the marker 210 (step S2). The MR device 100 recognizes the marker 210 based on the image and sets a three-dimensional coordinate system by using the marker 210 as the origin (step S3). The MR device 100 performs spatial mapping to acquire a spatial map (step S4). In the spatial map, surfaces of objects in the surrounding area of the MR device 100 are represented by meshes. The spatial map includes the coordinates of the meshes.
The MR device 100 acquires the number of workers designated in step S1, and the position of the MR device 100 worn by each worker (step S5). The position of the MR device 100 is calculated when the MR device 100 performs the spatial mapping. The MR devices 100 that collaborate with each other appropriately communicate their own positions.
The MR device 100 determines whether the type of the designated task is a simultaneous task, an alternating task, or an arbitrary task (step S6). The simultaneous task (step S10), the alternating task (step S20), or the arbitrary task (step S30) is performed according to the determination result of step S6.
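The dispatch of step S6 could be sketched as follows; the task-type strings and the handler interface are illustrative assumptions.

```python
def run_designated_task(task_type, handlers, context):
    """Sketch of the dispatch in step S6.

    task_type : assumed to be one of "simultaneous", "alternating", or "arbitrary"
    handlers  : maps each task type to its routine (steps S10, S20, and S30)
    context   : whatever data the routines need (assignments, fastening locations, etc.)
    """
    if task_type not in handlers:
        raise ValueError(f"unknown task type: {task_type}")
    handlers[task_type](context)
```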
In the simultaneous task (step S10), first, the MR device 100 assigns the fastening locations of the task for each worker (step S11). The MR device 100 determines the task sequence for the assigned fastening locations for each worker (step S12). When the sequence is prespecified, the MR device 100 determines the task sequence to be the prespecified sequence. When the priority is set for each fastening location, the sequence is determined so that the task is performed in order from the fastening locations having higher priorities. The result of the assignment and the sequence of the task are shared between the MR device 100a and the MR device 100b.
Subsequently, the processing of step S10a is performed by the MR device 100a. The processing of step S10b is performed by the MR device 100b. Specifically, in step S10a, the MR device 100a displays the virtual object at the fastening location at which the task is to be performed (step S13a). The MR device 100a determines whether or not the task preparation by the worker W1 and the task preparation by the worker W2 are completed (step S14a). For example, when the hand or tool of the worker contacts the displayed virtual object, the preparation of the worker is determined to be completed. The determination result of step S14a is shared between the MR device 100a and the MR device 100b.
When the preparation is determined to be completed, the MR device 100a notifies the worker W1 that the preparations of the workers are completed (step S15a). Then, the MR device 100a notifies the worker W1 of the start of the task (step S16a). The worker W1 starts the task according to the notification.
The MR device 100a determines whether or not both the task of the worker W1 and the task of the worker W2 are completed (step S17a). For example, the MR device 100a determines the task to be completed when the necessary torque is transmitted from the tool. The determination result of step S17a is shared between the MR device 100a and the MR device 100b.
When the task of the worker W1 and the task of the worker W2 are determined to be completed, the MR device 100a notifies the worker W1 that the task is completed (step S18a). The MR device 100a determines whether or not all of the task steps included in the designated task are completed (step S19). The screw is turned at one fastening location in one task step. When a fastening location at which the screw is to be turned still remains, the virtual object is displayed at the remaining fastening location (step S13a).
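The per-device flow of the simultaneous task (steps S13 to S19) could be sketched as follows; the ui and peer objects are placeholders for the device's display/notification functions and the link to the other MR device, and the polling loop is only illustrative.

```python
import time

def run_simultaneous_task_for_device(assigned_locations, ui, peer):
    """Per-device sketch of the simultaneous task (steps S13 to S19).

    assigned_locations : fastening locations assigned to this device's wearer, in task order
    ui                 : placeholder for this device's display / notification functions
    peer               : placeholder for the link to the other MR device (shared determinations)
    """
    for location in assigned_locations:
        ui.show_virtual_object(location)                                  # step S13
        while not (ui.local_preparation_done() and peer.preparation_done()):
            time.sleep(0.1)                                               # step S14
        ui.notify("preparations of the workers are complete")             # step S15
        ui.notify("start the task")                                       # step S16
        while not (ui.local_task_done() and peer.task_done()):
            time.sleep(0.1)                                               # step S17
        ui.notify("task complete")                                        # step S18
    # Exhausting assigned_locations corresponds to step S19 (all task steps completed).
```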
The processing by the MR device 100b is similar to the processing by the MR device 100a. In other words, the MR device 100b displays the virtual object (step S13b). The MR device 100b determines whether or not the task preparations by the workers W1 and W2 are completed (step S14b). The determination result of step S14b is shared between the MR device 100a and the MR device 100b.
When the preparations are determined to be completed, the MR device 100b notifies the worker W2 that the preparations of the workers are completed (step S15b). The MR device 100b notifies the worker W2 of the start of the task (step S16b). The MR device 100b determines whether or not the tasks of the workers W1 and W2 are completed (step S17b). The determination result of step S17b is shared between the MR device 100a and the MR device 100b.
When the task is determined to be completed, the MR device 100b notifies the worker W2 that the task is completed (step S18b). The MR device 100b determines whether or not all of the steps included in the designated task are completed (step S19). When a fastening location at which the screw is to be turned still remains, the virtual object is displayed at the remaining fastening location (step S13b).
The MR device 100a may display messages 321a and 321b as shown in
The MR device 100a may display messages 322a and 322b as shown in
When the preparations by the workers W1 and W2 are completed, the MR device 100a may display messages 323a and 323b as shown in
The MR device 100a may display messages 324a and 324b as shown in
The MR device 100b also may display messages similar to
In the alternating task (step S20), first, the MR device 100 assigns the fastening locations of the task for each worker (step S21). The MR device 100 determines the sequence of the task at the assigned fastening locations for each worker (step S22). When the sequence is prespecified, the MR device 100 determines the sequence of the task to be the prespecified sequence. When the priority is set for each fastening location, the sequence is determined so that the task is performed in order from the fastening locations having higher priorities. The result of the assignment and the sequence of the task are shared between the MR device 100a and the MR device 100b.
Here, an example will be described in which the task is started from the worker W1 wearing the MR device 100a. The MR device 100a displays the virtual object at the fastening location at which the task is to be performed (step S23a). The MR device 100a notifies the worker W1 of the task start (step S23b). The MR device 100a determines whether or not the task of the worker W1 is completed (step S23c). For example, the MR device 100a determines the task to be completed when the necessary torque is transmitted from the tool after the hand or the tool contacts the virtual object. When the task of the worker W1 is determined to be completed, the MR device 100a notifies the worker W1 that the task is completed (step S23d). Also, the MR device 100a notifies the MR device 100b that the task is completed.
During the task of the worker W1, the MR device 100b outputs a wait instruction to the worker W2 (step S23e). The MR device 100b determines whether or not the notification of the task completion is received from the MR device 100a (step S23f).
When the MR device 100b receives the notification of the task completion, the MR device 100b displays the virtual object at the fastening location at which the task is to be performed (step S24a). The MR device 100b notifies the worker W2 of the task start (step S24b). The MR device 100b determines whether or not the task of the worker W2 is completed (step S24c). When the task of the worker W2 is determined to be completed, the MR device 100b notifies the worker W2 that the task is completed (step S24d). Also, the MR device 100b notifies the MR device 100a that the task is completed.
During the task of the worker W2, the MR device 100a outputs a wait instruction to the worker W1 (step S24e). The MR device 100a determines whether or not the notification of the task completion is received from the MR device 100b (step S24f).
In steps S23a to S23f and steps S24a to S24f, the workers W1 and W2 take turns turning the screws at two fastening locations. When the task of the worker W2 is completed, the MR devices 100a and 100b determine whether or not all task steps included in the designated task are completed (step S25). When a fastening location at which the screw is to be turned still remains, the virtual object is displayed at the remaining fastening location (step S23a). By alternately performing steps S23a to S23f and steps S24a to S24f, the screws are alternately turned at the fastening locations.
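The alternating flow of steps S23a to S23f and S24a to S24f could be sketched as follows; as before, the ui and peer objects are placeholders and the polling is illustrative.

```python
import time

def run_alternating_task_for_device(assigned_locations, ui, peer, start_first):
    """Per-device sketch of the alternating task (steps S23a-S23f and S24a-S24f).

    assigned_locations : fastening locations assigned to this device's wearer, in task order
    ui, peer           : placeholders as in the simultaneous-task sketch
    start_first        : True for the device whose wearer performs the first task step
    """
    my_turn = start_first
    for location in assigned_locations:
        if not my_turn:
            ui.notify("please wait")                       # steps S23e / S24e
            while not peer.received_task_completion():     # steps S23f / S24f
                time.sleep(0.1)
        ui.show_virtual_object(location)                   # steps S23a / S24a
        ui.notify("start the task")                        # steps S23b / S24b
        while not ui.local_task_done():                    # steps S23c / S24c
            time.sleep(0.1)
        ui.notify("task complete")                         # steps S23d / S24d
        peer.send_task_completion(location)
        my_turn = False   # after the first step, always wait for the other worker's completion
    # Exhausting assigned_locations on both devices corresponds to step S25.
```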
The MR device 100a may display messages 331a and 331b as shown in
During the task of the worker W1, the MR device 100b may display messages 332a and 332b as shown in
The MR device 100a may display messages 333a and 333b as shown in
The MR device 100b may display messages 334a and 334b as shown in
In the arbitrary task (step S30), the processing by the MR device 100a and the processing by the MR device 100b are common. First, the MR device 100 refers to the priorities of the fastening locations (step S31). The MR device 100 selects the fastening location having the highest priority among the fastening locations at which the screws are not yet turned (step S32). The MR device 100 displays the virtual object at the selected fastening location (step S33). In the arbitrary task, either the worker W1 or the worker W2 can perform the task at the displayed fastening location. The MR device 100 determines whether or not the task is completed at the selected fastening location (step S34). The result of the determination is shared between the MR device 100a and the MR device 100b.
The MR device 100 determines whether or not all task steps included in the designated task are completed (step S35). When a fastening location at which the screw is to be turned still remains, step S32 is re-performed. Steps S31 and S32 are omitted when priorities are not set for the fastening locations. In such a case, in step S33, the virtual objects are displayed at all of the fastening locations at which the screws are to be turned.
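The arbitrary-task flow of steps S31 to S35 could be sketched as follows; the priority field, the placeholder objects, and the completion checks are illustrative assumptions.

```python
import time

def run_arbitrary_task(fastening_locations, ui, peer):
    """Sketch of the arbitrary task (steps S31 to S35); the processing is common to both devices.

    fastening_locations : dicts with at least an "id", and optionally a numeric "priority"
    ui, peer            : placeholders as in the previous sketches; either worker may
                          perform the task at a displayed location
    """
    remaining = list(fastening_locations)
    use_priorities = all("priority" in loc for loc in remaining)     # steps S31/S32 may be omitted
    while remaining:                                                 # step S35
        if use_priorities:
            targets = [max(remaining, key=lambda loc: loc["priority"])]   # steps S31, S32
        else:
            targets = list(remaining)      # no priorities: display all remaining locations
        for loc in targets:
            ui.show_virtual_object(loc)                              # step S33
        while not any(ui.task_done_at(loc["id"]) or peer.task_done_at(loc["id"])
                      for loc in targets):                           # step S34 (result is shared)
            time.sleep(0.1)
        remaining = [loc for loc in remaining
                     if not (ui.task_done_at(loc["id"]) or peer.task_done_at(loc["id"]))]
```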
The MR device 100a may display messages 341a and 341b as shown in
For example, as shown in
A method that uses a marker, a method that uses a hand gesture, etc., are registered as the method for identifying the origin. The ID of the fastening location is a unique character string for identifying each fastening location. The coordinate in the three-dimensional coordinate system based on the origin is registered as the position of the fastening location. The model of the tool indicates the classification of the tool by structure, exterior shape, performance, etc. For example, the length of the extension bar is designated based on the model of the extension bar. The angle is the limit on the angle at which the extension bar can engage the screw when the screw is turned at the fastening location.
In the task, there are cases where a mark is made when the screw is turned at the fastening location. “Mark color” refers to the color of the mark provided at each fastening location. When the screw is marked with different colors according to the number of times that the screw is turned, the color of the mark for the number of times is registered. The virtual object ID is a character string for identifying the data of the preregistered virtual object; and the virtual object ID is associated with each fastening location.
The MR device 100 can display the virtual objects at the fastening locations by referring to the data stored in the fastening location data 172.
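One possible record layout for the fastening location data 172, reflecting the fields described above, is sketched below; the field names and types are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class FasteningLocationRecord:
    """One entry of the fastening location data 172; field names and types are illustrative."""
    location_id: str               # unique character string identifying the fastening location
    position: tuple                # coordinate in the origin-based three-dimensional coordinate system
    tool_model: str                # classification of the tool to be used
    extension_bar_model: str       # determines, e.g., the length of the extension bar
    engagement_angle_limit: float  # limit of the angle at which the bar can engage the screw
    mark_colors: list = field(default_factory=list)  # mark color for each number of times the screw is turned
    virtual_object_id: str = ""    # ID of the preregistered virtual object to display
```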
The history data 173 stores the task record. When the task is being performed, the MR device 100 associates the torque detected by the tool with the ID of the fastening location at which the screw is determined to be turned, and stores the torque in the history data 173. As shown in
When the task location is determined in the simultaneous task, the alternating task, or the arbitrary task, the MR device 100 stores the data in the history data 173. The task record is automatically generated thereby.
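One possible record layout for the history data 173 is sketched below; only the torque and the fastening location ID are taken from the description above, and the timestamp and worker fields are illustrative additions.

```python
from dataclasses import dataclass
import datetime

@dataclass
class HistoryRecord:
    """One entry of the history data 173; the timestamp and worker fields are assumed additions."""
    location_id: str                # fastening location determined to have been worked on
    torque: float                   # torque detected by the digital tool
    recorded_at: datetime.datetime  # when the record was stored (assumed field)
    worker: str = ""                # wearer of the device that made the determination (assumed field)

def record_fastening(history, location_id, torque, worker=""):
    """Append a record automatically when a screw is determined to have been turned."""
    history.append(HistoryRecord(location_id, torque, datetime.datetime.now(), worker))
```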
The task data 171, the fastening location data 172, and the history data 173 are stored in the storage device 170 of the MR device 100. Or, the task data 171, the fastening location data 172, and the history data 173 may be stored in a memory region outside the MR device 100. In such a case, the MR device 100 accesses the task data 171, the fastening location data 172, and the history data 173 via wireless communication or a network.
Advantages of the embodiment will now be described.
When turning screws at multiple fastening locations, there are cases where the timing of the task or the sequence of the task is prespecified. In such a case, rework of the task becomes necessary if the timing of the task or the sequence of the task is mistaken. In particular, it is difficult and requires time and effort to mutually confirm the timing of the task or the sequence of the task when multiple workers collaborate to turn the screws. For example, when one worker cannot view the other worker, it is difficult to confirm the timing of the task or the sequence of the task. When the work site is noisy, it is difficult for the workers to confirm the timing or the sequence by voice.
According to the embodiment, in the simultaneous task as shown in
According to this method, the MR devices 100a and 100b display the virtual objects at the fastening locations at which the task is to be performed simultaneously. By performing the task according to the displays of the MR devices 100a and 100b, the workers W1 and W2 can perform the task at the prescribed timing at the multiple fastening locations.
According to the embodiment, in the alternating task as shown in
According to this method, the MR devices 100a and 100b display the virtual objects at the fastening locations at which the task is to be performed alternately. By performing the task according to the displays of the MR devices 100a and 100b, the workers W1 and W2 can perform the task in the prescribed sequence for the multiple fastening locations.
According to the embodiment, multiple workers can be efficiently supported to perform the task at the prescribed timing or in the prescribed sequence for the multiple fastening locations. During the task, it is favorable for the MR device 100 to provide notifications as shown in
During the task, it is favorable for the MR device 100 to automatically generate the task record. For example, when the screw is determined to be turned at a fastening location, the MR device 100 associates the torque detected by the tool with the data related to that fastening location. As a result, it is unnecessary for the worker to generate the task record. The burden on the worker can be reduced. Also, mistakes by the worker when generating the record can be avoided.
The examples described above mainly describe a screw being tightened at a fastening location. Embodiments of the invention are applicable not only when a screw is tightened at a fastening location, but also when a screw is loosened at a fastening location. For example, a screw is loosened at a fastening location when performing maintenance, inspection, or repair of a product. According to embodiments of the invention, virtual objects are displayed by the mixed reality devices at the prescribed timing or in the prescribed sequence when loosening the screw. As a result, the multiple workers can be efficiently supported to perform the task at the prescribed timing or in the prescribed sequence for the multiple fastening locations.
The MR device 100 includes, for example, a computer 90 shown in
The ROM 92 stores programs controlling operations of the computer 90. The ROM 92 stores programs necessary for causing the computer 90 to realize the processing described above. The RAM 93 functions as a memory region into which the programs stored in the ROM 92 are loaded.
The CPU 91 includes a processing circuit. The CPU 91 uses the RAM 93 as work memory and executes the programs stored in at least one of the ROM 92 or the storage device 94. When executing the programs, the CPU 91 executes various processing by controlling configurations via a system bus 98.
The storage device 94 stores data necessary for executing the programs and/or data obtained by executing the programs. The storage device 94 includes a solid state drive (SSD), etc. The storage device 94 may be used as the storage device 170.
The input interface (I/F) 95 can connect the computer 90 with an input device. The CPU 91 can read various data from the input device via the input I/F 95. The output interface (I/F) 96 can connect the computer 90 and an output device. The CPU 91 can transmit data to the output device (e.g., the projection devices 121 and 122) via the output I/F 96, and can cause the output device to display information.
The communication interface (I/F) 97 can connect the computer 90 and a device outside the computer 90. For example, the communication I/F 97 connects a digital tool and the computer 90 by Bluetooth (registered trademark) communication.
The data processing performed by the processing device 150 may be performed by only one computer 90. A part of the data processing may be performed by a server or the like via the communication I/F 97.
Processing of various types of data described above may be recorded, as a program that can be executed by a computer, on a magnetic disk (examples of which include a flexible disk and a hard disk), an optical disk (examples of which include a CD-ROM, a CD-R, a CD-RW, a DVD-ROM, a DVD±R, and DVD±RW), a semiconductor memory, or another non-transitory computer-readable storage medium.
For example, information recorded on a recording medium can be read by a computer (or an embedded system). The recording medium can have any record format (storage format). For example, the computer reads a program from the recording medium and causes the CPU to execute instructions described in the program, on the basis of the program. The computer may obtain (or read) the program through a network.
Embodiments of the invention include the following features.
A control method, including:
A control method, including:
The method according to feature 1 or 2, further including:
The method according to any one of features 1 to 3, further including:
The method according to feature 4, further including:
The method according to any one of features 1 to 5, in which
The method according to feature 6, further including:
A mixed reality system, including:
A mixed reality device, configured to:
A program, when executed by a mixed reality device configured to display a virtual space to overlap a real space, causing the mixed reality device to:
A non-transitory computer-readable storage medium configured to store the program according to feature 10.
According to the embodiments above, a control method, a mixed reality system, a mixed reality device, a program, and a storage medium are provided in which multiple workers can be supported to perform a task at a prescribed timing or in a prescribed sequence for multiple fastening locations.
In the specification, “or” indicates that at least one of the items listed in the sentence can be adopted.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the invention. Moreover, the above-described embodiments can be combined with each other and carried out.