This application is based upon and claims the benefit of priority from Japanese patent application No. 2023-116830, filed on Jul. 18, 2023, the disclosure of which is incorporated herein in its entirety by reference.
The present disclosure relates to a position information acquisition system, a position information acquisition method, and a program.
International Patent Publication No. WO 2018/087844 describes a work recognition method in which a sensor recognizes a worker's body information and an object and associates them with a work. The work recognition system of International Patent Publication No. WO 2018/087844 includes a sensor data acquisition unit for acquiring sensor data and a body part information acquisition unit for detecting a worker's body part and acquiring body part information about the worker's body part based on the sensor data. The work recognition system also includes an object information acquisition unit for detecting an object and acquiring object information about the object based on the sensor data, and an association unit for associating the object with the body part of the worker who has performed a work using the object based on the body part information and the object information. The work recognition system further includes a recognition result analysis unit for recognizing the work performed by the worker based on association information about a result of the association made by the association unit.
When acquiring work performance data using a device such as a camera, a GPS, or an IMU, the device may not be usable due to obstacles such as blind spots of the camera, or may not function accurately due to, for example, reflection and shielding of GPS radio waves, depending on the process and the positional relationship between the device and the product. It may therefore be difficult to ensure the robustness of data acquisition under various work environments and conditions in a work recognition system that recognizes a work based on a machine-learned model or a rule using inputs from a plurality of devices. Accordingly, an object of the present disclosure is to provide a position information acquisition system that can ensure the robustness and accuracy of data acquisition by providing a plurality of devices and automatically selecting the most appropriate device according to a work position.
A position information acquisition system according to the present disclosure includes:
According to the above configuration, it is possible to provide a position information acquisition system that can ensure the robustness and accuracy of data acquisition by providing a plurality of devices and automatically selecting the most appropriate device according to a work position.
In the position information acquisition system according to the present disclosure, the divided areas are an object separation area separated from the object, an object contact area in contact with the object, and an object interior area.
According to the above configuration, the peripheral area can be divided into the object separation area, the object contact area, and the object interior area.
In the position information acquisition system,
According to the above configuration, by considering the accuracy and reliability of each means (indoor GPS, image skeleton estimation, and IMU) in each of the divided areas, and by employing priority and backup logic, the optimal means can be flexibly selected, ensuring robustness.
The position information acquisition system according to the present disclosure further includes:
With the above configuration, it is possible to receive the production instruction and set the input control of the plurality of sensors or devices in accordance with the production instruction.
A position information acquisition method according to the present disclosure includes:
According to the above configuration, it is possible to provide a position information acquisition method that can ensure the robustness and accuracy of data acquisition by providing a plurality of devices and automatically selecting the most appropriate device according to a work position.
A program according to the present disclosure causes an information processing apparatus to execute:
According to the above configuration, it is possible to provide a program that causes an information processing apparatus to acquire position information and can ensure the robustness and accuracy of data acquisition by providing a plurality of devices and automatically selecting the most appropriate device according to a work position.
According to the present disclosure, it is possible to provide a position information acquisition system that can ensure the robustness and accuracy of data acquisition by providing a plurality of devices and automatically selecting the most appropriate device according to a work position.
The above and other objects, features and advantages of the present disclosure will become more fully understood from the detailed description given hereinbelow and the accompanying drawings which are given by way of illustration only, and thus are not to be considered as limiting the present disclosure.
Embodiments of the present disclosure will now be described with reference to the drawings. However, the claimed disclosure is not limited to the following embodiments. Moreover, not all of the configurations described in the embodiments are essential as means for solving the problem. For the sake of clarity, the following descriptions and drawings have been omitted or simplified as appropriate. In each drawing, the same elements are denoted by the same reference signs, and repeated descriptions are omitted as appropriate.
As shown in
The external devices and equipment 101 are external elements provided in the processes for purposes other than the system. The external devices and equipment 101 are, for example, cameras, indoor GPS (Global Positioning System), other equipment, jigs and tools, conveyance systems, and so on.
The wearable device 102 is a variety of sensors worn by a worker. The wearable device 102 is, for example, a first-person viewpoint camera, a microphone, a vibration sensor, a pressure sensor, an IMU (Inertial Measurement Unit), a strain sensor, a vital sensor, or the like.
The work recognition platform 134 is composed of an information processing apparatus (not shown) that is the center of the work recognition system according to the present disclosure. The information processing apparatus includes at least a processor (e.g., a CPU (Central Processing Unit)) that executes a program for executing a process and a memory that stores the program. The information processing apparatus may include a cloud server that distributes some or all of the functions. The work recognition platform 134 includes an input I/F (interface) 103, a time series synchronization unit 104, data preparation for work recognition 105, a learning/setting unit 114 based on the work recognition data preparation, a work recognition model library 115, a work recognition model library dynamic filtering unit 131, a work recognition multimodal AI 132, and a unit work-specific work performance data output unit 133.
The input I/F 103 is an interface for connecting with the external devices and equipment 101, and the wearable device 102.
The time series synchronization unit 104 temporally synchronizes data of each of the external devices and equipment 101 and the wearable device 102.
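As an illustration only (not part of the disclosed embodiment), the following minimal Python sketch shows one common way such temporal synchronization could be implemented: resampling each device's timestamped stream onto a shared clock by nearest-neighbor matching. The function name, data layout, and sampling period are all assumptions.

```python
import bisect

def synchronize(streams, period_s=0.1):
    """Align timestamped samples from several devices onto a common clock.

    streams: dict mapping device name -> sorted list of (timestamp, value).
    Returns a list of (timestamp, {device: value}) frames, where each device
    contributes its sample nearest to the frame timestamp.
    """
    start = max(s[0][0] for s in streams.values())   # latest stream start
    end = min(s[-1][0] for s in streams.values())    # earliest stream end
    frames = []
    t = start
    while t <= end:
        frame = {}
        for name, samples in streams.items():
            times = [ts for ts, _ in samples]
            i = bisect.bisect_left(times, t)
            # pick the neighboring sample closest to t
            if i > 0 and (i == len(times) or t - times[i - 1] < times[i] - t):
                i -= 1
            frame[name] = samples[i][1]
        frames.append((t, frame))
        t += period_s
    return frames
```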
In the data preparation for work recognition 105, data for a model for recognizing work components related to workers, work objects, equipment and tools, and so on are prepared. For workers, the data preparation for work recognition 105 includes a body and finger skeleton estimation unit 106, a tactile and auditory sensor data unit 107, a body part-specific position estimation unit 108, a worker identification unit 109, and the like. For work objects, the data preparation for work recognition 105 includes a position estimation unit 110 and an appearance image unit 111. For equipment and tools, the data preparation for work recognition 105 includes an equipment/jig and tool signal/data unit 112. The data preparation for work recognition 105 also includes a diversity input control unit 113.
The body and finger skeleton estimation unit 106 estimates the skeleton of the body and fingers using data from sensors or devices. The body and finger skeleton estimation unit 106 prioritizes and switches the sensors or devices used for skeleton estimation based on instructions from the diversity input control unit 113.
The tactile and auditory sensor data unit 107 acquires input data such as vibration, friction, reaction force, and sound from sensors or devices. The body part-specific position estimation unit 108 estimates the position of each body part using data from a sensor or a device. The body part-specific position estimation unit 108 prioritizes and switches sensors or devices used for position estimation of each body part by an instruction from the diversity input control unit 113. The worker identification unit 109 identifies a worker by ID (Identification) or the like.
The position estimation unit 110 estimates the position of a work object. The appearance image unit 111 acquires an appearance image by capturing an image of an appearance of the work object using the external devices and equipment 101 and the wearable device 102.
The equipment/jig and tool signal/data unit 112 acquires data of equipment and tools during work. The diversity input control unit 113 prioritizes and switches the sensors or devices to input data according to the work position.
The learning/setting unit 114 based on the work recognition data preparation is for constructing a work component recognition model 143 described later. The data prepared in the data preparation for work recognition 105 is learned or set to construct each part of the work component recognition model 143.
The work recognition model library 115 is a reference part of the work recognition platform 134. The work recognition model library 115 includes the work component recognition model 143, a learning/setting unit 123 based on the work component recognition model, a general-purpose work recognition module 124, an area-specific reference coordinate control unit 127, a position definition unit 128, the learning/setting unit 129 based on a work position, and an advanced work recognition model 130.
The work component recognition model 143 is a storage unit for storing each recognition model. The work component recognition model 143 includes a posture and action recognition model 116 and a gesture and audio recognition model 117 for a worker. The work component recognition model 143 includes an object recognition model 118, a physical F/B (feedback) recognition model 119, and a state recognition and measurement model 120 for a work object. The work component recognition model 143 includes an equipment and tool data recognition model 121 for equipment and tools. The work component recognition model 143 includes a work position estimation model 122 for a work position.
The posture and action recognition model 116 is a model for recognizing posture and action by learning or setting them from the body and finger skeleton estimation unit 106 and the tactile and auditory sensor data unit 107. The gesture and audio recognition model 117 is a model for recognizing gestures and audio by learning or setting them from the body and finger skeleton estimation unit 106 and the tactile and auditory sensor data unit 107.
The object recognition model 118 is a model for recognizing an object by learning or setting it from the body and finger skeleton estimation unit 106, the tactile and auditory sensor data unit 107, the body part-specific position estimation unit 108, and the appearance image unit 111. The physical F/B recognition model 119 is a model for recognizing physical feedback by learning or setting it from the tactile and auditory sensor data unit 107. The state recognition and measurement model 120 is a model for recognizing and measuring the state of a work object by learning or setting it from the appearance image unit 111.
The equipment and tool data recognition model 121 is a model for recognizing equipment and tool data by learning or setting them from the equipment/jig and tool signal/data unit 112. The work position estimation model 122 is a model for estimating the work position by acquiring information from the body part-specific position estimation unit 108, the position estimation unit 110, and the equipment/jig and tool signal/data unit 112. The work position estimation model 122 is set on a process reference or a product reference. The work position estimation model 122 acquires information from the area-specific reference coordinate control unit 127. Furthermore, the work position estimation model 122 provides data of the position estimation result to the learning/setting unit 129 based on the work position.
The learning/setting unit 123 based on the work component recognition model is a learning/setting unit for constructing a unit work recognition model 125 described later. The data prepared by the work component recognition model 143 is learned or set to construct the unit work recognition model 125.
The general-purpose work recognition module 124 is a unit work recognition unit that recognizes a unit work by inputting, into the unit work recognition model, information about the worker's movement, touch, hearing, and position, camera images, and equipment and tool signals at the time of the work on the work object. That is, the general-purpose work recognition module 124 recognizes what actions the worker has performed and how. The general-purpose work recognition module 124 includes the unit work recognition model 125 and a unit work definition unit 126.
The unit work recognition model 125 is a model for recognizing a work that can be defined generically among various works independent of products. The unit work recognition model 125 is generated in a unit work recognition model generation unit by combining, based on work contents, components independent of products among the work component recognition models. For example, a general-purpose work independent of products is a combination of a work verb, such as tightening, fitting, or taking out, and a work object that is a part common across products, and contains no product-specific information; this will be described later. The unit work recognition model 125 is a model that recognizes a unit work by learning or setting it from the posture and action recognition model 116, the gesture and audio recognition model 117, the object recognition model 118, the physical F/B recognition model 119, the state recognition and measurement model 120, and the equipment and tool data recognition model 121.
The unit work definition unit 126 defines a work for creating the unit work recognition model 125.
The area-specific reference coordinate control unit 127 controls changing the reference coordinate according to the area of the process. The area-specific reference coordinate control unit 127 provides fixed reference and product reference information to the work position estimation model 122.
The position definition unit 128 defines the position from the product 3D, the equipment 3D, and the like, as will be described later. The position definition unit 128 adds position information to the unit work definition unit 126 in order to define the position where the unit work is performed. The position definition unit 128 provides learning or setting data to the learning/setting unit 129 based on the work position.
The learning/setting unit 129 based on the work position performs learning or setting from the work position estimation model 122, the general-purpose work recognition module 124, the position definition unit 128, and a work item 138 of the product-specific process information 137 described later. The learning/setting unit 129 based on the work position performs learning or setting to construct the advanced work recognition model 130.
The advanced work recognition model 130 is a model for recognizing where, what, and how a worker performs a work. The advanced work recognition model 130 recognizes a work with high accuracy by adding information such as a work position on a product, which is product-specific information, to general-purpose unit work recognition.
The work recognition model library dynamic filtering unit 131 excludes models other than those for the corresponding product, based on production instructions from the MES 141, from a process library including models of a plurality of vehicle models. Furthermore, completed models are excluded based on the progress of the work from the start to the completion of the work. The work recognition model library dynamic filtering unit 131 acquires information about who the worker is from the worker identification unit 109.
The work recognition multimodal AI (Artificial Intelligence) 132 is an artificial intelligence for recognizing who, when, where, what, and how a worker has performed work. The work recognition multimodal AI 132 acquires information from the data preparation for work recognition 105 and the work recognition model library dynamic filtering 131.
The unit work-specific work performance data output unit 133 outputs work performance data for each unit work. The unit work-specific work performance data output unit 133 acquires information from the work recognition multimodal AI 132. The data output by the unit work-specific work performance data output unit 133 is utilized in the application (APPL) 142 for safety and ergonomics improvement, quality assurance improvement, productivity improvement, etc.
The BOM (Bill of Materials) 135 is parts information necessary for manufacturing products according to the parts list, and is essential information for understanding how parts are assembled.
The PDM (Product Data Management), BOP (Bill of Process), and BOE (Bill of Equipment) 136 include a work element master 139, product-specific process information 137, and product and process drawing information 140.
The work element master 139 is master information that standardizes and generalizes work components across products, and includes, for example, work objects, work verbs, tools, work positions, and so on. The product-specific process information 137 includes the work item 138 which is a combination of work elements defined in the work element master 139 and detailed work procedure information. The product and process drawing information 140 includes information of parts 3D, process layout, equipment 3D, and so on.
The work element master 139 provides information to the unit work definition unit 126. The product-specific process information 137 acquires information from the work element master 139, provides information to the product and process drawing information 140, and provides learning or setting data to the learning/setting unit 129 based on the work position. The product-specific process information 137 is provided, through the work element master 139 that is defined in a standardized manner across products, to the general-purpose work recognition module 124, which provides information to the position definition unit 128 and the area-specific reference coordinate control unit 127.
The MES 141 is a manufacturing execution system that monitors and manages factory equipment and workers' work by linking with each part of a factory production line. The MES 141 acquires information from the PDM, BOP, and BOE 136. The MES 141 also includes a production instruction database that provides production instruction information to the work recognition model library dynamic filtering unit 131. Production instructions are instructions to people and equipment, such as the parts and procedures to be used in addition to the types and specifications of individual products, in accordance with the production plan.
As shown in
The work element master 139 includes a work object 202 and information of a work verb and a tool 203 as a general-purpose work element. The work element master 139 includes information of a work position 204 as a product-specific work element. The information of the work object 202, the work verb/tool 203, and the work position 204 is acquired from a work item 206 of the BOP 205.
The work item 206 defines a series of works for producing a product by a combination of work objects and work verbs. Work elements are elements constituting the work item 206. Main work elements are work verbs, work objects, tools, work positions, and so on. Among them, work verbs, work objects, and tools can be defined as general master items to perform data aggregation and analysis across products and factories. Since work positions may vary depending on the product even for the same part, classification codes for work positions can be standardized, but the work positions themselves are treated as product-specific work elements.
The general-purpose work recognition module 124 includes a unit work definition unit (general-purpose) 209, a work component recognition model correlation definition unit (other than work positions) 211, and a general-purpose unit work recognition model 125.
Furthermore, the unit work 208 is a further subdivision of the content of general-purpose work elements such as work verbs, work objects, and tools. Specifically, a unit work is a work verb that is elementally decomposed into a unit suitable for work recognition, and the work is segmented by that unit. The unit work 208 includes a unit work definition unit (general-purpose) 209 and a unit work definition unit (product-specific) 210.
The unit work definition unit (general-purpose) 209 includes a step, which is a work sequence; a unit verb defined in a hierarchical structure according to its granularity; a recognition category; and information of the recognition item. The unit work definition unit (general-purpose) 209 is associated with the work object 202 and the work verbs and tools 203. The unit verb is a verb of a granularity that enables recognition of work. Although the work verb in the work item 206 can be defined at any desired granularity, the work recognition according to the present disclosure is intended to recognize, specifically and in detail, the various operations performed on a manufacturing site. Therefore, it is necessary to further subdivide work verbs with coarse granularity into recognizable granularities. If the difference between the granularity of the work verb and that of the unit verb is large, the subdivision may be performed in multiple layers.
The recognition categories include two targets for recognizing unit works, motion and event, which are defined according to the characteristics of the tasks and the recognition purposes. Therefore, the general-purpose work recognition module 124 can recognize various works in detail. A motion is recognized as a block of work performed continuously in a time-series manner; thus, a continuous action such as walking is recognized. An event captures a specific instant of a work, such as the start or the completion of tightening; thus, an accurate work segmentation timing is recognized.
Recognition items are used to define specific recognition objects such as start and end for recognition categories of unit verbs.
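Purely as a hypothetical illustration of the definitions described above (the disclosure does not specify any implementation), a unit work definition could be represented as a small data structure; all field names below are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class UnitWorkDefinition:
    """Hypothetical record mirroring the general-purpose unit work definition."""
    step: int                      # work sequence
    unit_verb: str                 # verb decomposed to a recognizable granularity
    category: str                  # "motion" (continuous) or "event" (instantaneous)
    items: list = field(default_factory=list)  # recognition items, e.g. ["start", "end"]

# A continuous motion versus an event with explicit segmentation items
walking = UnitWorkDefinition(step=1, unit_verb="walk", category="motion")
tightening = UnitWorkDefinition(step=2, unit_verb="tighten", category="event",
                                items=["start", "completion"])
```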
The work component recognition model correlation definition unit (other than work positions) 211 assigns a work component recognition model (other than the work position recognition model, which is a product-specific work component) suitable for recognition according to the work characteristics to each combination of work element (work verb x work object) and unit work (unit verb, recognition category, item), and defines the correlation. The work component recognition model correlation definition unit (other than work positions) 211 includes information of the degree of necessity, pattern, variation, and reliability.
The degree of necessity is defined for each work component recognition model 143 so that the assignment of the unit work 208 and the work component recognition model 143 can be performed efficiently, and so that robustness can be maintained and controlled when the output from a part of the models is missing. Patterns and variations are the patterns and detailed variations to be recognized by the work component recognition model 143. Reliability is defined for each work component recognition model 143 to enable recognition accuracy evaluation in the system. The reliability is set based on arbitrary settings or actual recognition result data. This evaluation is used, for example, when a plurality of candidates emerge during work recognition with the general-purpose work recognition module 124, to rank these candidates.
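As a hedged sketch of how the degree of necessity and reliability could drive candidate ranking (the actual logic is not specified in the disclosure), the following assumes a "required"/"optional" necessity flag and numeric reliability values; all names are hypothetical.

```python
def rank_candidates(candidates, correlations):
    """Rank unit work candidates by the combined reliability of the
    work component recognition models that actually fired.

    candidates: dict of candidate name -> set of model names that matched.
    correlations: dict of model name -> {"necessity": ..., "reliability": ...}.
    Candidates missing a 'required' model output are dropped; 'optional'
    models only contribute to the score, preserving robustness when a
    part of the model outputs is missing.
    """
    required = {m for m, c in correlations.items() if c["necessity"] == "required"}
    scored = []
    for name, fired in candidates.items():
        if not required <= fired:
            continue  # a mandatory model output is missing
        score = sum(correlations[m]["reliability"] for m in fired if m in correlations)
        scored.append((score, name))
    return [name for score, name in sorted(scored, reverse=True)]
```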
The general-purpose work recognition module 124 is shared and reused across products when there are common parts or similar parts among products.
As shown in
Specifically, as in the example shown in
As shown in
The method of unit work recognition in the production of a product is shown in
The result output is passed to the advanced work recognition model 130. Further, the processing of 703 to 710 is carried out continuously until the completion of the work of the process N.
Thus, a general-purpose work recognition module which can recognize a unit work independent of products is provided.
As shown in
As shown in the upper drawing of
As shown in the middle drawing of
In the case where the conveying item reference is applied, the position information of the work object on the conveying system is acquired in real time by connecting with the conveying system, and the work position is estimated by the conveying item reference (more precisely, the conveyed work object reference).
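For illustration, a conveying item reference can be realized by expressing a measured position relative to the conveyed work object's real-time pose. The Python sketch below assumes a planar rotation about the vertical axis; the function name and interfaces are hypothetical, not the disclosed implementation.

```python
import numpy as np

def to_conveying_item_reference(worker_pos_fixed, object_pos_fixed, object_yaw_rad):
    """Convert a position from the factory-fixed frame to the frame of the
    conveyed work object (translation plus rotation about the vertical axis).

    worker_pos_fixed, object_pos_fixed: (x, y, z) in factory coordinates.
    object_yaw_rad: heading of the work object on the conveying system.
    """
    c, s = np.cos(-object_yaw_rad), np.sin(-object_yaw_rad)
    rot = np.array([[c, -s, 0.0],
                    [s,  c, 0.0],
                    [0.0, 0.0, 1.0]])
    return rot @ (np.asarray(worker_pos_fixed) - np.asarray(object_pos_fixed))

# A worker keeping pace with a product on the line keeps roughly constant
# product-referenced coordinates even as both advance on the conveyor.
```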
As shown in
The advanced work recognition according to the embodiment will be described with reference to
As shown in
As shown in
The unit work definition unit (product-specific) 210 includes a reference coordinate which is work position detail, a target part, a position type, and a range specification method. The unit work definition unit (product-specific) 210 is included in the unit work 208, and acquires a work position code which is set as a standard for all products from the product-specific work position 204 of the work element master 139. The unit work definition unit (product-specific) 210 acquires 3D coordinate information and process layout information of a product as a detailed work position from a PDM and BOP 207.
The reference coordinate is selected from either the fixed reference defined in
The target part indicates a body part of the worker whose work position is to be obtained. The target part includes, for example, the head, arms, and legs.
A position type is a method for specifying a position. The position type is, for example, a two-dimensional plane or a three-dimensional space.
The range specification method is a method of specifying a range of a work position. The range specification method involves, for example, specifying the position range using two diagonal points of a three-dimensional object or a planar shape, or specifying the position range along a path.
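As a minimal sketch of the diagonal-points range specification (an illustrative assumption, not the disclosed implementation), an axis-aligned containment check could look like this:

```python
def in_work_range(point, corner_a, corner_b):
    """Check whether a target body part position lies inside the work
    position range specified by two diagonal corners of a box.

    Works for both the two-dimensional plane and three-dimensional space
    position types: pass 2- or 3-element coordinates.
    """
    return all(min(a, b) <= p <= max(a, b)
               for p, a, b in zip(point, corner_a, corner_b))

# e.g. a head position inside a hypothetical 3D work region
print(in_work_range((1.2, 0.5, 1.1), (0.0, 0.0, 0.0), (2.0, 1.0, 1.5)))  # True
```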
The work component recognition model correlation definition unit (work position) 212 assigns a work component recognition model (work position recognition model which is a work component specific to a product) suitable for recognition to the work position and defines the correlation. The work component recognition model correlation definition unit (work position) 212 includes information of necessity, reference coordinates, target part, position type, and reliability.
Necessity is defined for each work component recognition model 143 so that assignment of the work position and the work component recognition model 143 can be efficiently performed, and robustness can be maintained and controlled when an output from a part of the model is missing.
The reference coordinates, the target part, and the position type use the information acquired in the unit work definition unit (product-specific) 210.
The reliability is defined for each work component recognition model 143 so that recognition accuracy evaluation can be performed in the system. The reliability is set based on arbitrary settings or actual recognition result data. This evaluation is used, for example, when a plurality of candidates appear during work recognition in the advanced work recognition 213, to rank a plurality of candidates.
The advanced work recognition model 130 combines the general-purpose work recognition module 124 and the work position estimation model 122 for work component recognition (other models are added as necessary) to enable detailed and highly accurate work recognition according to products.
As shown in
The advanced work recognition is performed as shown in
When the information processing apparatus outputs the unit work recognition (general-purpose) result 710, the advanced work recognition unit of the work recognition multimodal AI 132 acquires a unit work recognition result (general-purpose) 1301. In parallel, the area-specific reference coordinate control unit 127 acquires the position information of the worker from the work component recognition 707 and transmits the information of the reference coordinate to the work position estimation model 122. The work position estimation model 122 acquires the position information of the worker from the work component recognition 707, estimates the work position coordinate by acquiring the information of the reference coordinate from the area-specific reference coordinate control unit 127, and outputs a result of the estimation to the advanced work recognition unit of the work recognition multimodal AI 132. The advanced work recognition unit acquires the work position coordinate estimation result 1306, performs matching evaluation of the unit work and the actual work position using the advanced work recognition model 130, narrows down the unit work candidates to the corresponding candidates 1307, and outputs a candidate ranking 1308.
Next, the advanced work recognition unit of the work recognition multimodal AI 132 recognizes (estimates) the unit work with the highest accuracy 1302 from the unit work recognition candidate and candidate ranking acquired from the result output 710 of the unit work recognition (general-purpose) and the result output of the unit work recognition (product-specific) 1308. In parallel, the advanced work recognition unit performs work element recognition by monitoring individual unit work recognition data in time series with respect to work elements which are a set of a plurality of unit works 1303 (work object=part X, work verb=tightening, tool=tool Y, work position code=Z, work status: in progress, completed, etc.). Next, the advanced work recognition unit performs work item recognition in conjunction with the output of the work element recognition result with respect to the work item which is a combination of work elements (example: work item=part X tightening, work status: in progress, completed, etc.) 1304. The advanced work recognition unit then outputs the aforementioned recognition results (unit work recognition 1302, work element recognition 1303, work item recognition 1304) at any time 1305. The recognition results are sent to a database, data analysis, application, or the like. Further, the processing of the advanced work recognition unit is continuously performed until the completion of the work in the process N.
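The matching evaluation and candidate ranking could, for example, be sketched as intersecting the general-purpose candidates with the unit works whose defined position ranges contain the estimated work position coordinate. The disclosure does not prescribe this logic, and all names below are assumptions.

```python
def advanced_recognition(general_candidates, position_estimate, position_ranges):
    """Narrow general-purpose unit work candidates using the estimated
    work position, mirroring the matching evaluation described above.

    general_candidates: dict of unit work name -> score from the
        general-purpose work recognition module.
    position_estimate: estimated work position coordinate (x, y, z).
    position_ranges: dict of unit work name -> (corner_a, corner_b) giving
        where that unit work can physically occur on the product.
    """
    def inside(p, a, b):
        return all(min(x, y) <= v <= max(x, y) for v, x, y in zip(p, a, b))

    matched = {work: score for work, score in general_candidates.items()
               if work in position_ranges
               and inside(position_estimate, *position_ranges[work])}
    # candidate ranking: best-matching unit work first
    return sorted(matched, key=matched.get, reverse=True)
```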
When the work element cannot be recognized in the work element recognition 1303, the processing returns to the unit work recognition 1302. The work element recognition 1303 is performed using a plurality of results of the unit work recognition 1302.
Thus, in the advanced work recognition, the advanced work recognition unit takes as inputs the unit work recognition candidates (general-purpose), which are the output of the general-purpose work recognition part of the work recognition multimodal AI 132, and the unit work recognition candidates (product-specific) related to the product-specific work position, which are derived from the estimation result of the work position estimation model, and recognizes the work with high accuracy. In this way, the work recognition system and the work recognition method are provided.
As shown in the upper drawing of
Furthermore, progress filtering means excluding completed models in accordance with the progress of work from the start of work to the completion of work.
As shown in the lower drawing in
As described above, the work recognition system includes a product selection filtering unit for selecting a work recognition model based on a production instruction, a progress filtering unit for selecting a work recognition model in accordance with the progress of a work, and a work recognition unit for recognizing a work from the selected work recognition model. Further, the work recognition system narrows the work recognition model to a work of one product by the product selection filtering unit. Further, the work recognition system excludes the work recognition model for the completed work by the progress filtering unit. Further, the work recognition system determines whether or not the work can be recognized, and when the work cannot be recognized, the product selection filtering is removed, the work is recognized again, and then the work recognition result is output.
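A minimal sketch of this two-stage filtering and retry flow, assuming a simple list-of-dicts model library (the actual data structures are not specified in the disclosure), might look like the following.

```python
def filter_model_library(library, production_instruction, completed_works):
    """Dynamically filter the work recognition model library.

    library: list of models, each a dict with "product" and "work" keys.
    production_instruction: product identifier from the MES.
    completed_works: set of work identifiers already finished in the process.
    """
    selected = [m for m in library
                if m["product"] == production_instruction   # product selection filtering
                and m["work"] not in completed_works]       # progress filtering
    if not selected:
        # the work could not be recognized: remove the product selection
        # filtering and retry with models for all products
        selected = [m for m in library if m["work"] not in completed_works]
    return selected
```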
In this way, a learning model is selected in accordance with the progress of the work, and the processing time required for the determination by the unnecessary learning model is shortened, thereby providing a work recognition system capable of performing advanced work recognition at high speed.
When acquiring work performance data using a plurality of sensors or devices such as a camera, a GPS, or an IMU, the sensors or devices may not be usable due to obstacles (such as blind spots of the camera), or may not function properly (due to reflection and shielding of GPS radio waves), depending on the process and the positional relationship between the sensors or devices and the product. Therefore, in a system provided with a plurality of sensors or devices, the diversity input control unit 113 ensures the robustness and accuracy of data acquisition by automatically selecting the most appropriate device according to the work position.
As shown in the lower left drawing of
Using the acquisition of the work position as an example, as shown in the right lower drawing of
Similarly, in the zone B, an image in which the object is captured can be obtained, so that "position estimation based on an image", that is, position estimation based on image skeleton estimation, image object recognition, or the like, can be used; among these, position estimation based on image skeleton estimation is the most advantageous in terms of accuracy (although, in some cases, other devices may be more accurate depending on the angle of inclination and the distance from the camera). In the zone B, the IMU is used as the second backup and the indoor GPS is used as the third backup, because the accuracy of the indoor GPS is poor in areas where there are many radio-wave shielding and reflecting objects.
In the zone C, position estimation using the IMU is given the highest priority. Inside the object, the object shields radio waves, so that the indoor GPS cannot be used or its accuracy is severely degraded. In addition, "position estimation based on an image", that is, position estimation based on image skeleton estimation, image object recognition, or the like, which captures an image of the object, can be used in some cases in areas that are not blind spots, and is thus used as a backup.
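The zone-dependent priority and backup logic described above can be summarized in a small sketch; the zone labels follow the description (zone A: object separation area, zone B: object contact area, zone C: object interior area), while the sensor names and the availability interface are assumptions for illustration.

```python
# Priority tables per divided area, highest priority first, as described above.
ZONE_PRIORITY = {
    "A": ["indoor_gps", "image_skeleton", "imu"],   # object separation area
    "B": ["image_skeleton", "imu", "indoor_gps"],   # object contact area
    "C": ["imu", "image_skeleton"],                 # object interior area (no GPS)
}

def select_position_source(zone, available):
    """Pick the highest-priority usable sensor for the current zone.

    zone: divided area label ("A", "B", or "C").
    available: set of sensors currently producing valid data (a camera in a
        blind spot or a shielded GPS receiver would be absent from this set).
    """
    for sensor in ZONE_PRIORITY[zone]:
        if sensor in available:
            return sensor
    return None  # no usable source; the caller may hold the last known position

# Example: in zone C the IMU is preferred; when it is unavailable but the
# worker is visible to a camera, image skeleton estimation is the backup.
print(select_position_source("C", {"indoor_gps", "image_skeleton"}))  # image_skeleton
```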
When the worker performs the work on a work path as shown in the upper drawing of
As shown in
As described above, the diversity input control unit includes the work position information acquisition unit for acquiring the position of the worker, and the control unit for prioritizing and switching among a plurality of sensors or devices whose data are to be input, in accordance with the previously divided work areas. The divided areas are an object separation area separated from the object, an object contact area in contact with the object, and an object interior area. In addition, position information is acquired in the object separation area by preferentially using, for example, the indoor GPS; position information is acquired in the object contact area by preferentially using, for example, image skeleton estimation; and position information is acquired in the object interior area by preferentially using, for example, the IMU. Further, the diversity input control unit includes the reception unit for receiving production instructions, and the input control setting acquisition unit for acquiring input control settings of the plurality of sensors or devices in accordance with the production instructions.
In this way, a position information acquisition system and a position information acquisition method can be provided, in which a plurality of sensors or devices are provided, and the robustness and accuracy of data acquisition can be ensured by automatically selecting the optimum sensor or device according to the work position.
Some or all of the processing in the information processing apparatus described above can be implemented as a computer program. Such a program can be stored and provided to a computer using any type of non-transitory computer readable media. Non-transitory computer readable media include any type of tangible storage media. Examples of non-transitory computer readable media include magnetic storage media (such as floppy disks, magnetic tapes, and hard disk drives), magneto-optical storage media (e.g., magneto-optical disks), CD-ROM (compact disc read only memory), CD-R (compact disc recordable), CD-R/W (compact disc rewritable), and semiconductor memories (such as mask ROM, PROM (programmable ROM), EPROM (erasable PROM), flash ROM, and RAM (random access memory)). The program may be provided to a computer using any type of transitory computer readable media. Examples of transitory computer readable media include electric signals, optical signals, and electromagnetic waves. Transitory computer readable media can provide the program to a computer via a wired communication line (e.g., electric wires and optical fibers) or a wireless communication line.
It should be noted that the present disclosure is not limited to the above embodiments and can be suitably modified to the extent that it does not deviate from the purpose.
From the disclosure thus described, it will be obvious that the embodiments of the disclosure may be varied in many ways. Such variations are not to be regarded as a departure from the spirit and scope of the disclosure, and all such modifications as would be obvious to one skilled in the art are intended for inclusion within the scope of the following claims.