The present invention relates to a non-transitory computer-readable recording medium storing an information processing program, an information processing method, and an information processing device.
Typically, a technology for recognizing a motion of a person is desired in fields such as sports, health care, and entertainment. For example, there is a technology for specifying three-dimensional coordinates of each joint of a person, based on multi-viewpoint images captured from different angles, using deep learning.
As related art, for example, there is a technology for outputting any one of a result of first processing, a result of second processing, and a result of third processing, as a skeleton recognition result of a subject, based on a likelihood of the result of the first processing, a likelihood of the result of the second processing, and a likelihood of the result of the third processing. Furthermore, for example, there is a technology for recognizing a heat map image projecting likelihoods of a plurality of joint positions of a subject from a plurality of directions, from a distance image of the subject. Furthermore, for example, there is a technology for performing optimization calculation based on inverse kinematics using a position candidate of a feature point and a multi-joint structure of a target, acquiring each joint angle of the target, performing forward kinematics calculation using the joint angle, and acquiring a position of a feature point including the joint of the target. Furthermore, for example, there is a behavior detection technology using a recurrent neural network.
International Publication Pamphlet No. WO 2021/064942, International Publication Pamphlet No. WO 2021/002025, Japanese Laid-open Patent Publication No. 2020-42476, and U.S. Patent Application Publication No. 2017/0344829 are disclosed as related art.
According to an aspect of the embodiments, a non-transitory computer-readable recording medium storing an information processing program for causing a computer to execute a process, the process including acquiring time-series data of skeleton information that includes a position of each of a plurality of portions of a subject, specifying a type of an operation of the subject that corresponds to skeleton information at a first time point in the acquired time-series data, based on a feature amount of the skeleton information of the acquired time-series data, determining a model of a probability distribution that restricts a temporal change in a position of any one portion of the plurality of portions in the skeleton information at the first time point in the acquired time-series data, according to tendency of a motion of the any one portion that corresponds to the specified type of the operation, generating a graph that includes a node that indicates a position of each portion at each time point, a first edge that couples between nodes that indicate positions of different portions that are biologically connected at each time point, and a second edge that couples between nodes that indicate the positions of the any one portion at different time points, and in which the determined model is associated with the second edge, and correcting the skeleton information at the first time point in the time-series data, based on the generated graph.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention.
There is a case where it is difficult for the related art to accurately specify the three-dimensional coordinates of each joint of the person. For example, the three-dimensional coordinates of the joint of the right hand of the person may be erroneously identified as the three-dimensional coordinates of the joint of the left hand of the person. For example, there is a case where three-dimensional coordinates of a part of an object other than a person imaged in a multi-viewpoint image are erroneously recognized as the three-dimensional coordinates of the joint of the person.
In one aspect, an object of the present invention is to make it possible to accurately specify a position of a portion of a subject.
Hereinafter, an embodiment of an information processing program, an information processing method, and an information processing device according to the present invention will be described in detail, with reference to the drawings.
Typically, there is a technology for specifying a temporal change in three-dimensional coordinates of each joint of a person, by specifying the three-dimensional coordinates of each joint of the person, based on a multi-viewpoint image imaged from different angles, at each time point, using deep learning.
Specifically, it is considered to detect a region where a person is imaged, in the multi-viewpoint image, specify two-dimensional coordinates of each joint of the person based on the detected region, and specify the three-dimensional coordinates of each joint of the person, based on the specified two-dimensional coordinates, in consideration of the angle. Specifically, when the three-dimensional coordinates of each joint of the person are specified, a model trained using the deep learning is used. Regarding an example of this technology, specifically, Reference Documents 1 and 2 below can be referred to.
However, there is a case where it is difficult for the related art to accurately specify the three-dimensional coordinates of each joint of the person. For example, although a distance between the joints of the same person is constant at different time points, when a model is trained by the deep learning, the constraint that the distance between the joints of the person is constant is not taken into consideration. Therefore, it is not possible to accurately specify the three-dimensional coordinates of each joint of the person, and it is not possible to accurately specify a temporal change in the three-dimensional coordinates of each joint of the person.
When referring to the specified three-dimensional coordinates of each joint of the person, an analyst who analyzes a motion of a person tends to intuitively have an impression that the three-dimensional coordinates of each joint of the person are wrong. Specifically, the analyst has an impression that an arm length of the person extends or shortens. Furthermore, specifically, the analyst has an impression that the arm of the person is moving at a speed that a human cannot achieve.
Therefore, in the present embodiment, an information processing method that makes it possible to accurately specify the position of the joint of the subject will be described.
In
(1-2) The information processing device 100 specifies a type of an operation of the subject corresponding to skeleton information 101 at a first time point in the acquired time-series data, based on a feature amount of the skeleton information 101 in the acquired time-series data. The type of the operation is, for example, walking, running, jumping, sitting, lying, lateral rotation such as turning or spinning, or longitudinal rotation such as tumbling or a high bar movement, or the like. The feature amount may be, for example, the position of each portion of the subject indicated by the skeleton information 101. The feature amount may be, for example, a deviation of the positions of each portion of the subject indicated by the skeleton information 101 at different time points. The feature amount may be, for example, a distance between positions of different portions of the subject indicated by the skeleton information 101.
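The following is a minimal sketch, in Python with NumPy, of how such feature amounts may be computed from the time-series data; the array layout of (time points x portions x 3D coordinates), the joint count, and the use of random data are assumptions for illustration only.

    import numpy as np

    def feature_amounts(positions):
        # positions: array of shape (T, J, 3) holding the position of each of
        # J portions of the subject at each of T time points.
        # Feature amount 1: the raw positions themselves.
        raw = positions
        # Feature amount 2: deviation of the position of each portion between
        # consecutive time points.
        deviation = np.diff(positions, axis=0)                     # (T-1, J, 3)
        # Feature amount 3: distance between the positions of different
        # portions at each time point.
        diffs = positions[:, :, None, :] - positions[:, None, :, :]
        pair_distances = np.linalg.norm(diffs, axis=-1)            # (T, J, J)
        return {"positions": raw, "deviation": deviation,
                "pair_distances": pair_distances}

    # Example with random data: 30 time points, 17 portions.
    feats = feature_amounts(np.random.rand(30, 17, 3))
    print(feats["deviation"].shape, feats["pair_distances"].shape)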
The information processing device 100 includes, for example, a first model used to specify the type of the operation of the subject. The first model has, for example, a function for enabling to determine the type of the operation of the subject, according to an input of the feature amount of the skeleton information 101. The information processing device 100 specifies the type of the operation of the subject corresponding to the skeleton information 101 at the first time point in the acquired time-series data, using the first model. In the example in
(1-3) The information processing device 100 determines a second model of a probability distribution that restricts a temporal change in the position of any one portion of the plurality of portions, in the skeleton information 101 at the first time point in the acquired time-series data, according to tendency of a motion of any one portion corresponding to the specified type of the operation. The tendency of the motion is, for example, tendency of an iso-position motion, a uniform motion, or a uniform acceleration motion. In the example in
(1-4) The information processing device 100 generates a graph 110 including a node 111 indicating a position of each portion at each time point, a first edge 112 that couples between the nodes 111, and a second edge 113 that couples between the nodes 111. The first edge 112 couples between the nodes 111 indicating positions of different portions that are biologically connected, at each time point. The second edge 113 couples between the nodes 111 indicating positions of any one portion at different time points.
When generating the graph 110, the information processing device 100 associates the determined second model with the second edge 113. In the example in
(1-5) The information processing device 100 corrects the skeleton information 101 at the first time point in the time-series data, based on the generated graph 110. For example, the information processing device 100 corrects the position of the joint 1 of the subject included in the skeleton information 101 at the time point T in the time-series data. As a result, the information processing device 100 can accurately specify the position of each joint of the subject. The information processing device 100 can accurately specify the temporal change in the position of each joint of the subject.
Here, a case has been described where the information processing device 100 specifies the type of the operation of the subject, using the first model. However, the present embodiment is not limited to this. For example, there may be a case where the information processing device 100 specifies the type of the operation of the subject, without using the first model.
Although a case where the information processing device 100 operates alone has been described herein, the embodiment is not limited to this. For example, there may be a case where a plurality of computers cooperates to implement a function as the information processing device 100. Specifically, there may be a case where a computer that specifies the type of the operation of the subject cooperates with a computer that generates the graph 110 and a computer that corrects the skeleton information 101 at the first time point in the time-series data based on the graph 110.
Next, an example of an information processing system 200, to which the information processing device 100 illustrated in
In the information processing system 200, the information processing device 100 and the image capturing device 201 are coupled via a wired or wireless network 210. The network 210 is, for example, a local area network (LAN), a wide area network (WAN), the Internet, or the like. Furthermore, in the information processing system 200, the information processing device 100 and the client device 202 are coupled via the wired or wireless network 210.
The information processing device 100 acquires a plurality of images obtained by imaging the subject from different angles at each time point, from the one or more image capturing devices 201. The information processing device 100 specifies a distribution of an existence probability of each portion of the subject in a three-dimensional space, based on the plurality of acquired images, at each time point and specifies three-dimensional coordinates of each portion of the subject.
The information processing device 100 specifies the type of the operation of the subject, at each time point, based on the specified three-dimensional coordinates of each portion of the subject. The information processing device 100 specifies any one portion corresponding to the type, from among the plurality of portions of the subject, based on the specified type of the operation of the subject, at each time point. The information processing device 100 determines a model of a probability distribution that restricts a temporal change in a position of the specified any one portion based on the specified type of the operation of the subject, at each time point, according to the type.
The information processing device 100 generates a graph including a node indicating the three-dimensional coordinates of each specified portion of the subject at each time point. When generating the graph, the information processing device 100 generates the graph so that the first edge that couples between the nodes indicating the three-dimensional coordinates of the different portions of the subject that are biologically coupled is included in the graph, at each time point.
When generating the graph, the information processing device 100 generates the graph so that the second edge that couples between the nodes indicating the three-dimensional coordinates of the specified any one portion at a certain time point and at another time point other than the certain time point is included in the graph, at each time point. The other time point is, for example, a time point immediately before the certain time point. The information processing device 100 associates the determined model with the second edge included in the graph.
The information processing device 100 corrects the specified three-dimensional coordinates of each portion of the subject, with reference to the graph. The information processing device 100 outputs the corrected three-dimensional coordinates of each portion of the subject. An output format is, for example, display on a display, print output to a printer, transmission to another computer, storage in a storage region, or the like. For example, the information processing device 100 transmits the corrected three-dimensional coordinates of each portion of the subject, to the client device 202. For example, the information processing device 100 is a server, a personal computer (PC), or the like.
The image capturing device 201 is a computer that images the subject. The image capturing device 201 includes a camera including a plurality of imaging elements and images the subject with the camera. The image capturing device 201 generates an image obtained by imaging the subject and transmits the image to the information processing device 100. The image capturing device 201 is, for example, a smartphone or the like. The image capturing device 201 may be, for example, a fixed point camera or the like. The image capturing device 201 may be, for example, a drone or the like.
The client device 202 receives the three-dimensional coordinates of each portion of the subject, from the information processing device 100. The client device 202 outputs the received three-dimensional coordinates of each portion of the subject so that the user can refer to them. The client device 202 displays, for example, the received three-dimensional coordinates of each portion of the subject, on a display. The client device 202 is, for example, a PC, a tablet terminal, a smartphone, or the like.
Although a case where the information processing device 100 is a different device from the image capturing device 201 has been described herein, the present embodiment is not limited to this. For example, there may be a case where the information processing device 100 has the functions of the image capturing device 201, and also operates as the image capturing device 201. Although a case where the information processing device 100 and the client device 202 are different devices has been described herein, the present embodiment is not limited to this. For example, there may be a case where the information processing device 100 has the functions of the client device 202, and also operates as the client device 202.
Next, a hardware configuration example of the information processing device 100 will be described with reference to
Here, the CPU 301 controls the entire information processing device 100. The memory 302 includes, for example, a read only memory (ROM), a random access memory (RAM), a flash ROM, or the like. Specifically, for example, the flash ROM or the ROM stores various programs, and the RAM is used as a work area for the CPU 301. The programs stored in the memory 302 are loaded into the CPU 301 to cause the CPU 301 to execute coded processing.
The network I/F 303 is coupled to the network 210 through a communication line and is coupled to another computer via the network 210. Then, the network I/F 303 takes control of an interface between the network 210 and the inside, and controls input and output of data to and from the other computer. For example, the network I/F 303 is a modem, a LAN adapter, or the like.
The recording medium I/F 304 controls reading and writing of data from and to the recording medium 305 under the control of the CPU 301. Examples of the recording medium I/F 304 include a disk drive, a solid state drive (SSD), a universal serial bus (USB) port, or the like. The recording medium 305 is a nonvolatile memory that stores data written under the control of the recording medium I/F 304. Examples of the recording medium 305 include a disk, a semiconductor memory, a USB memory, or the like. The recording medium 305 may be attachable to and detachable from the information processing device 100.
The display 306 displays data of a cursor, an icon, a toolbox, a document, an image, function information, or the like. The display 306 is a cathode ray tube (CRT), a liquid crystal display, an organic electroluminescence (EL) display, or the like, for example. The input device 307 has keys for inputting characters, numbers, various instructions, or the like, and inputs data. The input device 307 is a keyboard, a mouse, or the like, for example. The input device 307 may be a touch-panel input pad, a numeric keypad, or the like, for example.
The information processing device 100 may include a camera or the like, for example, in addition to the above components. Furthermore, the information processing device 100 may also include a printer, a scanner, a microphone, a speaker, or the like, for example, in addition to the above components. In addition, the information processing device 100 may include the plurality of recording medium I/Fs 304 and the plurality of recording media 305. Furthermore, the information processing device 100 does not need to include the display 306, the input device 307, or the like. Furthermore, the information processing device 100 does not need to include the recording medium I/F 304 and the recording medium 305.
Next, a hardware configuration example of the image capturing device 201 will be described with reference to
Here, the CPU 401 controls the entire image capturing device 201. The memory 402 includes, for example, a ROM, a RAM, a flash ROM, or the like. Specifically, for example, the flash ROM or the ROM stores various programs, and the RAM is used as a work area for the CPU 401. The programs stored in the memory 402 are loaded into the CPU 401 to cause the CPU 401 to execute coded processing.
The network I/F 403 is coupled to the network 210 through a communication line, and is coupled to another computer via the network 210. Then, the network I/F 403 takes control of an interface between the network 210 and the inside, and controls input and output of data to and from the other computer. For example, the network I/F 403 is a modem, a LAN adapter, or the like.
The recording medium I/F 404 controls reading and writing of data from and to the recording medium 405 under the control of the CPU 401. The recording medium I/F 404 is, for example, a disk drive, an SSD, a USB port, or the like. The recording medium 405 is a nonvolatile memory that stores data written under control of the recording medium I/F 404. The recording medium 405 is, for example, a disk, a semiconductor memory, a USB memory, or the like. The recording medium 405 may be attachable to and detachable from the image capturing device 201. The camera 406 includes a plurality of imaging elements and generates an image obtained by imaging an object with the plurality of imaging elements. The camera 406 is, for example, a camera for competitions. The camera 406 is, for example, a monitoring camera.
The image capturing device 201 may include, in addition to the above components, a keyboard, a mouse, a display, a printer, a scanner, a microphone, a speaker, or the like, for example. Furthermore, the image capturing device 201 may include the plurality of recording medium I/Fs 404 and the plurality of recording media 405. Furthermore, the image capturing device 201 does not need to include the recording medium I/F 404 and the recording medium 405.
Since a hardware configuration example of the client device 202 is specifically similar to the hardware configuration example of the information processing device 100 illustrated in
Next, a functional configuration example of the information processing device 100 will be described with reference to
For example, the storage unit 500 is implemented by a storage region such as the memory 302 or the recording medium 305 illustrated in
The acquisition unit 501 to the output unit 508 function as an example of a control unit. Specifically, for example, the acquisition unit 501 to the output unit 508 implement functions thereof by causing the CPU 301 to execute a program stored in the storage region such as the memory 302 or the recording medium 305 illustrated in
The storage unit 500 stores various types of information referred to or updated in the processing of each functional unit. For example, the storage unit 500 stores a plurality of images obtained by imaging a specific person from different angles at each of a plurality of consecutive time points. The angle indicates an imaging position. The image is acquired, for example, by the acquisition unit 501.
The storage unit 500 stores, for example, time-series data of skeleton information. The time-series data includes skeleton information at each of the plurality of consecutive time points. The skeleton information includes a position of each of a plurality of portions of the specific person. The portion is, for example, a joint. The portion is, for example, a neck, a head, a right shoulder and a left shoulder, a right elbow and a left elbow, a right hand and a left hand, a right knee and a left knee, a right foot and a left foot, or the like. The position is, for example, three-dimensional coordinates. The time-series data is acquired, for example, by the acquisition unit 501. The time-series data may be generated, for example, by the analysis unit 502.
The acquisition unit 501 acquires various types of information to be used for the processing of each functional unit. The acquisition unit 501 stores the acquired various types of information in the storage unit 500, or outputs the acquired various types of information to each functional unit. Furthermore, the acquisition unit 501 may output the various types of information stored in the storage unit 500 to each functional unit. The acquisition unit 501 acquires the various types of information based on an operation input by the user, for example. The acquisition unit 501 may receive various types of information from a device different from the information processing device 100, for example.
The acquisition unit 501 acquires, for example, the time-series data of the skeleton information of the subject. The skeleton information of the subject includes, for example, the position of each of the plurality of portions of the subject. Specifically, the acquisition unit 501 acquires the time-series data of the skeleton information of the subject, by receiving an input of the time-series data of the skeleton information of the subject, based on the operation input of the user. Specifically, the acquisition unit 501 may acquire the time-series data of the skeleton information of the subject by receiving the time-series data from another computer.
The acquisition unit 501 may acquire, for example, time-series data of skeleton information of a test subject in the past. The test subject may be, for example, the same as the subject. The skeleton information of the test subject includes, for example, a position of each of a plurality of portions of the test subject. Specifically, the acquisition unit 501 acquires the time-series data of the skeleton information of the test subject, by receiving an input of the time-series data of the skeleton information of the test subject, based on the operation input of the user. Specifically, the acquisition unit 501 may acquire the time-series data of the skeleton information of the test subject by receiving the time-series data from another computer.
The acquisition unit 501 may acquire, for example, a type of an operation of the test subject corresponding to each piece of the skeleton information in the time-series data of the skeleton information of the test subject in the past. The type of the operation is, for example, walking, running, jumping, sitting, lying, lateral rotation such as turning or spinning, or longitudinal rotation such as tumbling or a high bar movement, or the like. Specifically, the acquisition unit 501 acquires the type of the operation of the test subject, by receiving an input of the type of the operation of the test subject corresponding to each piece of the skeleton information in the time-series data of the skeleton information of the test subject in the past, based on the operation input of the user. Specifically, the acquisition unit 501 may acquire the type of the operation of the test subject corresponding to each piece of the skeleton information in the time-series data of the skeleton information of the test subject in the past, by receiving the type of the operation from another computer.
For example, the acquisition unit 501 acquires a plurality of images obtained by imaging the subject from different angles at each of the plurality of consecutive time points. In a case where the acquisition unit 501 does not acquire the time-series data of the skeleton information of the subject and the time-series data is generated by the analysis unit 502, the acquisition unit 501 acquires the plurality of images. As a result, the acquisition unit 501 can allow the analysis unit 502 to generate the time-series data of the skeleton information of the subject.
For example, the acquisition unit 501 may acquire a plurality of images obtained by imaging the test subject from different angles at each of the plurality of consecutive time points. In a case where the acquisition unit 501 does not acquire the time-series data of the skeleton information of the test subject and the time-series data is generated by the analysis unit 502, the acquisition unit 501 acquires the plurality of images. As a result, the acquisition unit 501 can allow the analysis unit 502 to generate the time-series data of the skeleton information of the test subject.
The acquisition unit 501 may accept a start trigger to start the processing of any functional unit. The start trigger is a predetermined operation input by the user, for example. The start trigger may be, for example, reception of predetermined information from another computer. The start trigger may be, for example, output of predetermined information by any one of the functional units.
For example, the acquisition unit 501 may receive acquisition of a plurality of images as a start trigger to start processing of the analysis unit 502. For example, the acquisition unit 501 may receive acquisition of the time-series data of the skeleton information of the test subject, as a start trigger to start processing of the training unit 503. For example, the acquisition unit 501 may receive acquisition of the time-series data of the skeleton information of the subject, as a start trigger to start processing of the specification unit 504, the determination unit 505, the generation unit 506, and the correction unit 507.
The analysis unit 502 generates time-series data of skeleton information of a predetermined person. The analysis unit 502 generates, for example, the time-series data of the skeleton information of the subject. Specifically, the analysis unit 502 estimates a position of each portion of the subject at each time point, based on the plurality of images obtained by imaging the subject from the different angles at each of the plurality of time points and generates skeleton information of the subject including the estimated position. Specifically, the analysis unit 502 generates the time-series data of the skeleton information of the subject, based on the generated skeleton information of the subject. As a result, the analysis unit 502 can tentatively specify the position of each portion of the subject at each time point and can obtain a correction target.
The analysis unit 502 may generate, for example, the time-series data of the skeleton information of the test subject. Specifically, the analysis unit 502 generates the skeleton information of the test subject at each time point, based on the plurality of images obtained by imaging the test subject from the different angles at each of the plurality of time points and generates the time-series data of the skeleton information of the test subject. The analysis unit 502 may add noise to the generated time-series data of the skeleton information of the test subject. The analysis unit 502 sets the skeleton information of the test subject as teacher information used to generate a training model. As a result, the analysis unit 502 can obtain the teacher information used to generate the training model.
The training unit 503 trains a first training model, based on teacher information including the position of each of the plurality of portions of the test subject. The first training model has a function that makes it possible to specify any one portion in an abnormal state regarding a position, from among a plurality of portions of the predetermined person, according to a feature amount regarding the skeleton information in the time-series data of the skeleton information of the predetermined person. The first training model has, for example, a function that makes it possible to determine whether or not each portion of the predetermined person is in the abnormal state regarding the position.
Specifically, the first training model has a function for calculating an index value indicating a magnitude of a probability that each portion of the predetermined person is in the abnormal state regarding the position. More specifically, the first training model outputs the index value indicating the magnitude of the probability that each portion of the predetermined person is in the abnormal state regarding the position, according to an input of the feature amount regarding the skeleton information. Specifically, the first training model is a neural network. As a result, the training unit 503 makes it possible to specify any one portion in the abnormal state regarding the position, from among the plurality of portions of the subject.
The training unit 503 trains a second training model, based on the teacher information including the position of each of the plurality of portions of the test subject. The second training model has a function that makes it possible to specify a type of an operation of the predetermined person corresponding to each piece of the skeleton information in the time-series data of the skeleton information of the predetermined person, according to the feature amount regarding the skeleton information in the time-series data of the skeleton information of the predetermined person.
Specifically, the second training model outputs an index value indicating certainty for each candidate that may be the type of the operation of the predetermined person corresponding to any piece of skeleton information in the time-series data, according to an input of the feature amount regarding the skeleton information in the time-series data. Specifically, the second training model is a neural network. As a result, the training unit 503 makes it possible to specify the type of the operation of the subject.
The specification unit 504 specifies a type of an operation of the subject corresponding to the skeleton information at the first time point in the acquired time-series data, based on the feature amount of the skeleton information in the acquired time-series data. For example, the specification unit 504 specifies the type of the operation of the subject corresponding to the skeleton information at the first time point, based on the feature amount regarding the skeleton information in the acquired time-series data of the skeleton information of the subject, using the trained second training model.
Specifically, the specification unit 504 calculates the index value indicating the certainty for each candidate that may be the type of the operation of the subject corresponding to the skeleton information at the first time point, by inputting the feature amount regarding the skeleton information in the time-series data of the skeleton information of the subject, into the second training model. Specifically, the specification unit 504 specifies the type of the operation of the subject corresponding to the skeleton information at the first time point, based on the calculated index value. More specifically, the specification unit 504 specifies a candidate having the largest calculated index value, as the type of the operation of the subject. As a result, the specification unit 504 can obtain a guideline for correcting the position of each of the plurality of portions of the subject. The specification unit 504 makes it possible to determine which portion of the subject preferably has its position corrected.
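As an illustrative sketch of this selection, the candidate with the largest index value may be picked as follows; the candidate labels and the score values are assumptions and do not correspond to an actual output of the second training model.

    import numpy as np

    # Candidate operation types (labels are assumptions for illustration).
    CANDIDATES = ["walking", "running", "jumping", "sitting", "lying",
                  "lateral rotation", "longitudinal rotation"]

    def specify_operation_type(index_values):
        # Pick the candidate having the largest index value (certainty).
        return CANDIDATES[int(np.argmax(np.asarray(index_values)))]

    # Suppose the second training model returned these certainties.
    scores = [0.05, 0.10, 0.70, 0.02, 0.03, 0.05, 0.05]
    print(specify_operation_type(scores))   # -> "jumping"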
The specification unit 504 specifies an abnormal portion in the abnormal state regarding the position, from among the plurality of portions of the subject. The specification unit 504 specifies the abnormal portion in the abnormal state regarding the position, for the skeleton information at the first time point in the acquired time-series data of the skeleton information of the subject, based on the feature amount regarding the skeleton information in the acquired time-series data of the skeleton information of the subject. For example, the specification unit 504 specifies the abnormal portion in the abnormal state regarding the position, for the skeleton information at the first time point, based on the feature amount regarding the skeleton information in the acquired time-series data of the skeleton information of the subject, using the trained first training model.
Specifically, the specification unit 504 calculates an index value indicating a magnitude of a probability that each portion of the subject is in the abnormal state, for the skeleton information at the first time point, by inputting the feature amount regarding the skeleton information in the time-series data of the skeleton information of the subject, into the first training model. Specifically, the specification unit 504 specifies the abnormal portion in the abnormal state regarding the position, for the skeleton information at the first time point, based on the calculated index value. More specifically, the specification unit 504 specifies a portion of which the calculated index value is equal to or more than a threshold as the abnormal portion in the abnormal state regarding the position, from among the plurality of portions of the subject. As a result, the specification unit 504 can obtain a guideline for correcting the position of each of the plurality of portions of the subject. The specification unit 504 makes it possible to determine which portion of the subject preferably has its position corrected.
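A minimal sketch of this threshold test is given below; the probability values and the threshold of 0.5 are assumptions for illustration.

    import numpy as np

    def specify_abnormal_portions(abnormality_probabilities, threshold=0.5):
        # Return indices of portions whose index value (abnormality
        # probability) is equal to or more than the threshold.
        p = np.asarray(abnormality_probabilities)
        return np.flatnonzero(p >= threshold).tolist()

    # Example: per-portion probabilities output by the first training model.
    probs = [0.02, 0.10, 0.85, 0.40, 0.63]
    print(specify_abnormal_portions(probs))   # -> [2, 4]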
The determination unit 505 determines a distribution model of the probability distribution that restricts the temporal change in the position of any one portion corresponding to the specified type of the operation, from among the plurality of portions, in the skeleton information at the first time point in the acquired time-series data. The distribution model is, for example, a model that restricts the temporal change in the position of any one portion corresponding to the specified type of the operation, according to the tendency of the motion of any one portion corresponding to the specified type of the operation. The tendency of the motion is, for example, tendency of an iso-position motion, a uniform motion, a uniform acceleration motion, or the like. As a result, the determination unit 505 can obtain the guideline for correcting the position of the portion specified by the specification unit 504.
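A sketch of how the determination unit 505 may map the specified type of the operation to a distribution model is shown below; the type names, parameter values, and the fallback entry are assumptions for illustration.

    # Mapping from the type of the operation to the distribution model that
    # restricts the temporal change in the position of the corresponding
    # portion (values are illustrative assumptions).
    DISTRIBUTION_MODELS = {
        "jumping": {"tendency": "uniform motion",
                    "mean_speed": 1.2, "speed_variance": 0.05},
        "lying":   {"tendency": "iso-position motion",
                    "mean_speed": 0.0, "speed_variance": 0.01},
    }

    def determine_distribution_model(operation_type):
        # Fall back to a loose uniform acceleration model for unknown types.
        return DISTRIBUTION_MODELS.get(
            operation_type,
            {"tendency": "uniform acceleration motion",
             "mean_speed": None, "speed_variance": 0.1})

    print(determine_distribution_model("lying"))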
The generation unit 506 generates a graph including a node indicating a position of each portion at each time point, the first edge, and the second edge. The first edge couples between the nodes indicating the positions of the different portions that are biologically connected, at each time point. The second edge couples between the nodes indicating the positions of any one portion corresponding to the specified type of the operation, at different time points. When generating the graph, the generation unit 506 associates the determined distribution model with the second edge. As a result, the generation unit 506 makes it possible to correct the skeleton information at the first time point in the time-series data of the skeleton information of the subject.
The generation unit 506 may generate the graph so as to further include, in the graph, a third edge that couples between nodes indicating positions of other portions other than any one portion corresponding to the specified type of the operation, from among the plurality of portions. For example, if the number of first edges coupled to each of the nodes indicating the positions of the other portions at different time points is one each, the generation unit 506 generates the graph that includes the third edge that couples between the nodes. As a result, the generation unit 506 makes it possible to accurately correct the skeleton information at the first time point in the time-series data of the skeleton information of the subject. For example, the generation unit 506 makes it possible to accurately correct the position of the other portion.
The generation unit 506 may generate the graph so as to further include, in the graph, the third edge that couples between the nodes indicating positions of other portions specified as abnormal portions, other than any one portion corresponding to the specified type of the operation, from among the plurality of portions. For example, if the number of first edges coupled to each of the nodes indicating the positions of the other portions at different time points is one each, the generation unit 506 generates the graph that includes the third edge that couples between the nodes. As a result, the generation unit 506 makes it possible to accurately correct the skeleton information at the first time point in the time-series data of the skeleton information of the subject. For example, the generation unit 506 makes it possible to accurately correct the position of the other portion determined as the abnormal portion.
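A minimal sketch of the graph generated by the generation unit 506 is given below; the bone list, the portion indices, and the plain-dictionary representation are assumptions for illustration.

    # One node per (time point, portion); first edges couple biologically
    # connected portions within a frame; second edges couple the portion
    # corresponding to the operation type across time and carry the
    # determined distribution model; third edges couple other (e.g. abnormal)
    # portions across time.
    BONES = [(0, 1), (1, 2), (2, 3)]   # assumed biologically connected pairs

    def generate_graph(num_timepoints, num_portions, restricted_portions,
                       abnormal_portions, distribution_model):
        nodes = [(t, j) for t in range(num_timepoints) for j in range(num_portions)]
        first_edges = [((t, a), (t, b))
                       for t in range(num_timepoints) for a, b in BONES]
        second_edges = [(((t - 1, j), (t, j)), distribution_model)
                        for t in range(1, num_timepoints) for j in restricted_portions]
        third_edges = [((t - 1, j), (t, j))
                       for t in range(1, num_timepoints)
                       for j in abnormal_portions if j not in restricted_portions]
        return {"nodes": nodes, "first_edges": first_edges,
                "second_edges": second_edges, "third_edges": third_edges}

    g = generate_graph(num_timepoints=3, num_portions=4,
                       restricted_portions=[1], abnormal_portions=[3],
                       distribution_model={"tendency": "uniform motion"})
    print(len(g["nodes"]), len(g["first_edges"]),
          len(g["second_edges"]), len(g["third_edges"]))   # 12 9 2 2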
The correction unit 507 corrects the skeleton information at the first time point in the time-series data of the skeleton information of the subject, based on the generated graph. The correction unit 507 corrects the skeleton information at the first time point in the time-series data of the skeleton information of the subject, for example, by optimizing the generated graph. As a result, the correction unit 507 makes it possible to accurately specify the position of each portion of the subject, in consideration of the type of the operation of the subject. The correction unit 507 makes it possible to accurately specify the position of each portion of the subject, in consideration of the magnitude of the probability that each portion of the subject is in the abnormal state.
The output unit 508 outputs a processing result of at least any one of the functional units. Examples of an output format include display on a display, print output to a printer, transmission to an external device by the network I/F 303, and storage in a storage region such as the memory 302 or the recording medium 305. As a result, the output unit 508 may make it possible to notify a user of the processing result of at least any one of the functional units and may promote improvement in convenience of the information processing device 100.
For example, the output unit 508 outputs the skeleton information at the first time point corrected by the correction unit 507. Specifically, the output unit 508 transmits the skeleton information at the first time point corrected by the correction unit 507, to the client device 202. Specifically, the output unit 508 displays the skeleton information at the first time point corrected by the correction unit 507, on the display. As a result, the output unit 508 makes it possible to use the position of each portion of the subject.
Next, an operation example of the information processing device 100 will be described with reference to
The information processing device 100 executes two-dimensional (2D) pose estimation processing, on each multi-viewpoint image 600, at each time point. The information processing device 100 generates a 2D heat map 601 indicating a distribution of an existence probability of each joint of the subject in each multi-viewpoint image 600, by executing the 2D pose estimation processing on each multi-viewpoint image 600, at each time point. The 2D heat map 601 includes, for example, a joint likelihood indicating the existence probability of any one joint of the subject, at each point in a 2D space corresponding to the multi-viewpoint image 600.
The information processing device 100 specifies 2D coordinates of the joint of the subject, in the multi-viewpoint image 600, based on the 2D heat map 601 indicating the distribution of the existence probability of each joint of the subject in each multi-viewpoint image 600, at each time point. A variance of the joint likelihood indicating the existence probability of the joint of the subject in the 2D heat map 601 can be treated as an index value representing accuracy of the specified 2D coordinates.
The information processing device 100 acquires arrangement information indicating the angle of each multi-viewpoint image 600, at each time point. The information processing device 100 specifies 3D coordinates of each joint of the subject, in a 3D space, by executing 3D pose estimation processing, based on the arrangement information and the 2D coordinates of each joint of the subject in each multi-viewpoint image 600, at each time point. The information processing device 100 generates a 3D skeleton inference result 602 including the specified 3D coordinates of each joint of the subject at each time point and generates time-series data of the 3D skeleton inference result 602.
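The 3D pose estimation step may be sketched as a linear triangulation from the 2D coordinates of a joint in the multi-viewpoint images and the arrangement information (camera projection matrices); the toy camera matrices below are assumptions and do not reflect an actual camera arrangement.

    import numpy as np

    def triangulate_joint(projection_matrices, points_2d):
        # Direct linear transform (DLT): each view contributes two linear
        # constraints on the homogeneous 3D point; the solution is the right
        # singular vector with the smallest singular value.
        rows = []
        for P, (u, v) in zip(projection_matrices, points_2d):
            rows.append(u * P[2] - P[0])
            rows.append(v * P[2] - P[1])
        _, _, vt = np.linalg.svd(np.stack(rows))
        X = vt[-1]
        return X[:3] / X[3]

    # Two toy cameras (3x4 projection matrices) observing a joint at
    # (0.2, -0.1, 1.0); the 2D coordinates are obtained by projection.
    P1 = np.hstack([np.eye(3), np.array([[0.0], [0.0], [5.0]])])
    P2 = np.hstack([np.eye(3), np.array([[1.0], [0.0], [5.0]])])
    X_true = np.array([0.2, -0.1, 1.0, 1.0])
    uv1 = P1 @ X_true; uv1 = uv1[:2] / uv1[2]
    uv2 = P2 @ X_true; uv2 = uv2[:2] / uv2[2]
    print(triangulate_joint([P1, P2], [uv1, uv2]))   # ~ [0.2, -0.1, 1.0]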
The information processing device 100 corrects the 3D skeleton inference result 602, by executing correction processing, on the time-series data of the 3D skeleton inference result 602. The information processing device 100 outputs time-series data of a corrected 3D skeleton inference result 603 in a usable form. The information processing device 100 outputs, for example, the time-series data of the corrected 3D skeleton inference result 603 to be referred to by the user.
The user executes predetermined analysis processing, based on the time-series data of the corrected 3D skeleton inference result 603. Specifically, a case is considered where the subject is a participant of an athletic meet. In this case, the analysis processing is, for example, scoring of a participant in a competition of the athletic meet. The user executes the analysis processing for scoring the participant, based on the time-series data of the corrected 3D skeleton inference result 603.
Specifically, a case is considered where the subject is an examinee of a medical institution that provides rehabilitation, a medical institution examinee who receives diagnosis regarding an exercise capacity such as a walking capacity, or the like. In this case, the analysis processing is, for example, rehabilitation effect determination, diagnosis of an exercise capacity or a health state, or the like. The user performs the rehabilitation effect determination of the examinee of the medical institution or diagnoses the exercise capacity or the health state of the medical institution examinee, based on the time-series data of the corrected 3D skeleton inference result 603.
The information processing device 100 may execute the above analysis processing, based on the time-series data of the corrected 3D skeleton inference result 603. The information processing device 100 outputs a result of executing the analysis processing, so that the user can refer to the result. The information processing device 100 may output the time-series data of the corrected 3D skeleton inference result 603 to the analysis unit 502 that executes the above analysis processing. For example, another computer other than the information processing device 100 includes the analysis unit 502. As a result, the information processing device 100 makes it possible to accurately execute the analysis processing.
Next, a specific example of the correction processing will be described with reference to
The information processing device 100 generates processed data 701, by adding noise to the original data 700. For example, the information processing device 100 generates the processed data 701, by changing 3D coordinates of at least any one of the plurality of joints of the test subject indicated by the original data 700 into 3D coordinates determined to be in the abnormal state. The abnormal state corresponds to, for example, a state where the 3D coordinates of the joint are erroneously estimated. Specifically, the abnormal state is jitter, inversion, swap, miss, or the like. As a result, the information processing device 100 can acquire time-series data of the processed data 701.
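The noise addition may be sketched as follows; the disturbance magnitudes, the frame and joint selection, and the left/right indices used for the swap are assumptions for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    def add_noise(original, kind="jitter"):
        # original: array of shape (T, J, 3) of ground-truth joint positions.
        data = original.copy()
        t = rng.integers(data.shape[0])            # frame to disturb
        if kind == "jitter":                       # small random perturbation
            j = rng.integers(data.shape[1])
            data[t, j] += rng.normal(scale=0.1, size=3)
        elif kind == "swap":                       # exchange a left/right pair
            left, right = 3, 4                     # assumed joint indices
            data[t, [left, right]] = data[t, [right, left]]
        elif kind == "miss":                       # replace with a far-away point
            j = rng.integers(data.shape[1])
            data[t, j] = rng.uniform(-5.0, 5.0, size=3)
        return data

    processed = add_noise(np.zeros((30, 17, 3)), kind="jitter")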
The information processing device 100 trains an abnormality determination deep neural network (DNN) 710 using the time-series data of the processed data 701. For example, the abnormality determination DNN 710 has a function for outputting an abnormality probability of each joint of the subject, for at least any one 3D skeleton inference result 602, according to an input of a feature amount of the 3D skeleton inference result 602 in the time-series data of the 3D skeleton inference result 602. The abnormality probability indicates a magnitude of a probability that the 3D coordinates of the joint of the subject are positionally in an abnormal state.
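One possible form of such a network is sketched below in PyTorch; the layer sizes, the flattened window of skeleton coordinates used as the feature amount, and the joint count are assumptions, not the disclosed configuration of the abnormality determination DNN 710.

    import torch
    import torch.nn as nn

    class AbnormalityDeterminationDNN(nn.Module):
        # Outputs a per-joint abnormality probability from a feature amount
        # of the 3D skeleton inference result.
        def __init__(self, num_joints=17, window=5, hidden=256):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(window * num_joints * 3, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
                nn.Linear(hidden, num_joints), nn.Sigmoid())

        def forward(self, features):
            # features: (batch, window * num_joints * 3)
            return self.net(features)

    model = AbnormalityDeterminationDNN()
    window_features = torch.randn(1, 5 * 17 * 3)
    print(model(window_features).shape)   # torch.Size([1, 17])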
For example, the abnormality determination DNN 710 may have a function for outputting the abnormality probability of each joint of the subject, in the entire time-series data, according to the input of the feature amount of the 3D skeleton inference result 602 in the time-series data of the 3D skeleton inference result 602. Next, description of
In
Here, a case has been described where the information processing device 100 specifies the abnormal joint using the abnormality determination DNN 710. However, the present embodiment is not limited to this. For example, there may be a case where the information processing device 100 specifies the abnormal joint on a rule basis. Specifically, the information processing device 100 may store a rule for calculating the abnormality probability of the joint, according to a magnitude of a difference between a feature amount regarding each joint and a threshold, in the 3D skeleton inference result 602. Specifically, it is considered that the information processing device 100 calculates the abnormality probability of each joint, with reference to the stored rule and specifies any one of the joints of which the calculated abnormality probability is equal to or more than the threshold, as the abnormal joint. Next, a specific example in which the information processing device 100 generates Factor Graph will be described with reference to
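A minimal sketch of such a rule-based calculation is given below; the choice of the frame-to-frame displacement as the feature amount, the thresholds, and the squashing function are assumptions for illustration.

    import numpy as np

    def rule_based_abnormality(positions, speed_threshold=0.5, prob_threshold=0.5):
        # positions: (T, J, 3). Derive the abnormality probability of each
        # joint in the last frame from how far its displacement exceeds a
        # threshold, and list the joints judged abnormal.
        displacement = np.linalg.norm(positions[-1] - positions[-2], axis=-1)
        excess = np.clip(displacement - speed_threshold, 0.0, None)
        probability = 1.0 - np.exp(-excess)        # squash into [0, 1)
        abnormal = np.flatnonzero(probability >= prob_threshold).tolist()
        return probability, abnormal

    pos = np.zeros((2, 5, 3))
    pos[1, 2] = [2.0, 0.0, 0.0]                    # joint 2 moves 2.0 in one frame
    print(rule_based_abnormality(pos)[1])          # -> [2]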
For example, the state estimation DNN 900 may have a function for outputting the type of the motion of the subject, in the entire time-series data, according to the input of the feature amount of the 3D skeleton inference result 602 in the time-series data of the 3D skeleton inference result 602.
The information processing device 100 includes a Factor Graph definition database (DB) 910. The Factor Graph definition DB 910 stores a template 911 of the Factor Graph, for each type of the motion of the subject. The template 911 is formed by, for example, the node indicating each joint of the subject, the first edge that couples between the nodes indicating the positions of the different joints that are biologically connected, and the second edge that couples between the nodes indicating the positions of the same joint at the different time points. The first edge may be associated with a constraint of a distance between joints. The distance between the joints is, for example, a length of a bone.
Specifically, the Factor Graph definition DB 910 stores a template 911 corresponding to a type of a motion “jump”, a template 911 corresponding to a type of a motion “lying”, or the like. For example, the second edge couples between the nodes indicating the positions of any joints corresponding to the type of the motion of the subject, for each type of the motion of the subject. In other words, which joints are coupled by the second edges differs, for example, for each type of the motion of the subject.
The second edge is associated with the distribution model. The second edge that couples between the nodes indicating the positions of any joints is associated with the distribution model indicating the probability distribution that restricts the temporal change in the position of any one joint, according to tendency of the motion of any one joint corresponding to the type of the motion. For example, if the type of the motion is “jump”, the tendency corresponds to a uniform linear motion. For example, if the type of the motion is “lying”, the tendency corresponds to the iso-position motion.
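A minimal sketch of such a definition DB and of selecting a template 911 according to the specified type of the motion is given below; the joint names, the dictionary representation, and the fallback entry are assumptions for illustration.

    # One template per type of the motion: which joints receive second edges
    # and which distribution model (Pairwise Term) is attached to them.
    FACTOR_GRAPH_DEFINITION_DB = {
        "jump":  {"second_edge_joints": ["left hip", "right hip"],
                  "pairwise_model": {"tendency": "uniform linear motion"}},
        "lying": {"second_edge_joints": "all",
                  "pairwise_model": {"tendency": "iso-position motion"}},
    }

    def select_template(motion_type):
        # Select the template for the motion type specified by the state
        # estimation DNN; fall back to no time-series constraint if unknown.
        return FACTOR_GRAPH_DEFINITION_DB.get(
            motion_type, {"second_edge_joints": [], "pairwise_model": None})

    print(select_template("jump"))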
The information processing device 100 specifies the type of the motion of the subject, in each 3D skeleton inference result 602, using the state estimation DNN 900. The information processing device 100 selects the template 911 corresponding to the type of the motion of the subject, in each 3D skeleton inference result 602, as Factor Graph to be used, with reference to the Factor Graph definition DB 910. Next, a specific example of the template 911 of the Factor Graph will be described with reference to
In the example in
Furthermore, the nodes indicating the position of the left hip joint of the subject at different time points are coupled to each other by the second edge 1001. Furthermore, the nodes indicating the position of the right hip joint of the subject at different time points are coupled to each other by the second edge 1001. Each second edge is associated with a distribution model of Pairwise Term indicating a time-series constraint corresponding to the uniform linear motion.
The Pairwise Term is, for example, g_t(x_{j,t−1}, x_{j,t}) ~ N(‖x_{j,t−1} − x_{j,t}‖ | v̂_j Δt, Σ̂_{vj}). The reference x_{j,t−1} is an estimated position of a joint at a time t−1. The reference x_{j,t} is an estimated position of the joint at a time t. The reference v̂_j is an average speed of the joint. The reference Δt is a unit time width. The reference Σ̂_{vj} is a velocity variance of the joint. Here, in a case where the type of the motion is “jump”, it is considered that a temporal change in a position of a joint in a trunk portion tends to be regular. Therefore, regarding the joint in the trunk portion of which the temporal change in the position is considered to be easily predicted, in a case where the type of the motion is “jump”, the template 911 can restrict the temporal change in the position, assuming the uniform linear motion. Next, description of
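A numerical sketch of evaluating this constraint (as a negative log of the Gaussian term, up to an additive constant) is given below; the average speed and variance values are assumptions for illustration.

    import numpy as np

    def pairwise_term_negative_log(x_prev, x_curr, mean_speed, speed_var, dt=1.0):
        # Negative log of N(||x_{j,t-1} - x_{j,t}|| | v_j*dt, var), up to a
        # constant: small when the displacement matches the expected uniform
        # motion, large otherwise.
        displacement = np.linalg.norm(np.asarray(x_curr) - np.asarray(x_prev))
        return 0.5 * (displacement - mean_speed * dt) ** 2 / speed_var

    # With an assumed average speed of 1.0 per frame, a displacement of 1.0 is
    # cheap while a displacement of 3.0 is heavily penalized.
    print(pairwise_term_negative_log([0, 0, 0], [1.0, 0, 0], 1.0, 0.05))   # 0.0
    print(pairwise_term_negative_log([0, 0, 0], [3.0, 0, 0], 1.0, 0.05))   # 40.0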
In the example in
Furthermore, the nodes indicating the position of the left hip joint of the subject at different time points are coupled to each other by the second edge 1101. Furthermore, the nodes indicating the position of the right hip joint of the subject at different time points are coupled to each other by the second edge 1101. Furthermore, the nodes indicating the position of the left knee joint of the subject at different time points are coupled to each other by the second edge 1101. Furthermore, the nodes indicating the position of the right knee joint of the subject at different time points are coupled to each other by the second edge 1101. Furthermore, the nodes indicating the position of the left leg joint of the subject at different time points are coupled to each other by the second edge 1101. Furthermore, the nodes indicating the position of the right leg joint of the subject at different time points are coupled to each other by the second edge 1101. Furthermore, the nodes indicating the position of the left foot of the subject at different time points are coupled to each other by the second edge 1101. Furthermore, the nodes indicating the position of the right foot of the subject at different time points are coupled to each other by the second edge 1101.
Furthermore, the nodes indicating the position of the left shoulder joint of the subject at different time points are coupled to each other by the second edge 1101. Furthermore, the nodes indicating the position of the right shoulder joint of the subject at different time points are coupled to each other by the second edge 1101. Furthermore, the nodes indicating the position of the left elbow joint of the subject at different time points are coupled to each other by the second edge 1101. Furthermore, the nodes indicating the position of the right elbow joint of the subject at different time points are coupled to each other by the second edge 1101. Furthermore, the nodes indicating the position of the left wrist of the subject at different time points are coupled to each other by the second edge 1101. Furthermore, the nodes indicating the position of the right wrist of the subject at different time points are coupled to each other by the second edge 1101.
Furthermore, the nodes indicating the position of the left hand of the subject at different time points are coupled to each other by the second edge 1101. Furthermore, the nodes indicating the position of the right hand of the subject at different time points are coupled to each other by the second edge 1101. In the example of
The Pairwise Term is, for example, g_t(x_{j,t−1}, x_{j,t}) ~ N(‖x_{j,t−1} − x_{j,t}‖ | 0, Σ̂_{xj}). The reference Σ̂_{xj} is a position variance of the joint. Here, in a case where the type of the motion is “lying”, it is considered that temporal changes in positions of joints in an entire body tend to be regular. Therefore, regarding the joints in the entire body of which the temporal changes in the positions are easily predicted, in a case where the type of the motion is “lying”, the template 911 can restrict the temporal changes in the positions, assuming the iso-position motion. Next, a specific example in which the information processing device 100 adds a time-series constraint to the selected Factor Graph will be described with reference to
For example, the nodes 1311 and 1312 are coupled by a first edge 1331. For example, the nodes 1312 and 1313 are coupled by a first edge 1332. For example, the nodes 1321 and 1322 are coupled by a first edge 1341. For example, the nodes 1322 and 1323 are coupled by a first edge 1342. For example, the first edge 1342 that couples the nodes 1322 and 1323 may be associated with Pairwise Term indicating a constraint of the bone length.
For example, the nodes 1312 and 1322 are coupled by a second edge 1351. The second edge 1351 is associated with, for example, the Pairwise Term indicating the time-series constraint, corresponding to the type of the motion of the subject. For example, the nodes 1311 and 1321 are coupled by a third edge 1361. The third edge 1361 may be associated with the Pairwise Term indicating the time-series constraint, for example.
The information processing device 100 may associate the node indicating the position of at least any one of the joints of the Factor Graph 1300 with Unary Term. The Unary Term is, for example, f(x_j) ~ N(x_j | x̂_j, Σ̂_{3D,j}). The reference x̂_j is a weighted sum of a joint likelihood of a 3D heat map obtained by integrating the joint likelihoods of the plurality of 2D heat maps. The reference Σ̂_{3D,j} is a variance of the joint likelihood of the 3D heat map obtained by integrating the joint likelihoods of the plurality of 2D heat maps.
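A minimal sketch of this Unary Term, under the assumption that the integrated 3D heat map has already been reduced to a likelihood-weighted mean position and a diagonal covariance per joint, is as follows.

```python
import numpy as np

def unary_cost(x_j, mean_3d, var_3d):
    """Negative log of N(x_j | x_hat_j, Sigma_hat_3D_j) with a diagonal covariance.
    mean_3d: likelihood-weighted joint position from the integrated 3D heat map.
    var_3d:  per-axis variance of that likelihood (assumed diagonal here)."""
    diff = np.asarray(x_j) - np.asarray(mean_3d)
    var = np.asarray(var_3d)
    return 0.5 * np.sum(diff ** 2 / var) + 0.5 * np.sum(np.log(2.0 * np.pi * var))
```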
The information processing device 100 may associate the node indicating the position of at least any one of the joints of the Factor Graph 1300 with Unary Term indicating a constraint of the abnormal joint, which acts to restrict the position of the joint according to the abnormality probability of the joint. In the example in
The information processing device 100 corrects the position of each joint at each time point, based on the Unary Term in the Factor Graph 1300 and the Pairwise Term. The information processing device 100 corrects the position of each joint at each time point, for example, by optimizing the Factor Graph 1300.
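The optimization itself is not limited to a particular algorithm; one possible realization, sketched here as an assumption, stacks the residuals of all Unary Terms and Pairwise Terms and minimizes them jointly with a nonlinear least-squares solver.

```python
import numpy as np
from scipy.optimize import least_squares

def optimize_positions(init_positions, unary, pairs):
    """init_positions: (T, J, 3) 3D joint positions from the inference result.
    unary: list of (t, j, mean, sigma) observation terms.
    pairs: list of (t1, j1, t2, j2, sigma) time-series / smoothness terms."""
    T, J, _ = init_positions.shape

    def residuals(flat):
        x = flat.reshape(T, J, 3)
        res = []
        for t, j, mean, sigma in unary:
            res.append((x[t, j] - mean) / sigma)
        for t1, j1, t2, j2, sigma in pairs:
            res.append((x[t1, j1] - x[t2, j2]) / sigma)
        return np.concatenate(res)

    sol = least_squares(residuals, init_positions.ravel())
    return sol.x.reshape(T, J, 3)
```

Minimizing the stacked residuals is equivalent to jointly maximizing the Gaussian Unary and Pairwise Terms over all joints and time points.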
As a result, the information processing device 100 can accurately correct the 3D skeleton inference result 602. The information processing device 100 can accurately specify the position of each joint at each time point. For example, even in a case where the subject performs a relatively high speed or relatively complicated motion such as gymnastics, the information processing device 100 can specify the position of each joint of the subject at each time point, with a relatively high degree of certainty.
Here, with reference to Reference Document 3, a comparative example 1 is considered in which the 3D coordinates of the joint of the subject are corrected using Factor Graph that does not include the Pairwise Term indicating the time-series constraint. In the comparative example 1, since it is not possible to restrict the temporal change in the position of the joint, there is a case where it is difficult to accurately correct the 3D coordinates of each joint of the subject and it is difficult to accurately specify the temporal change in the 3D coordinates of each joint of the subject.
On the other hand, the information processing device 100 can use the Factor Graph 1300 including the Pairwise Term indicating the time-series constraint. Therefore, the information processing device 100 can appropriately correct the 3D coordinates of each joint of the subject. For example, the information processing device 100 can appropriately correct the 3D coordinates of the joint of the subject at each time point, so that a temporal change from the 3D coordinates of the joint of the subject at a certain time point to the 3D coordinates of the joint of the subject at a next time point becomes a temporal change that a person is unlikely to intuitively perceive as an error.
Moreover, a comparative example 2 is considered in which the 3D coordinates of the joint of the subject are corrected using Factor Graph including Pairwise Term indicating a predetermined time-series constraint. However, in the comparative example 2, there is a case where it is difficult to accurately correct the three-dimensional coordinates of each joint of the subject and it is difficult to accurately specify a temporal change in the three-dimensional coordinates of each joint of the subject. In the comparative example 2, since it is not possible to dynamically change the Pairwise Term indicating the time-series constraint according to a state of the subject such as the type of the operation, it is difficult to accurately correct the three-dimensional coordinates of each joint of the subject.
On the other hand, the information processing device 100 can set the Factor Graph 1300, by selectively using templates 911 of a plurality of Factor Graphs including Pairwise Terms indicating different time-series constraints, according to the type of the operation of the subject. As a result, for example, the information processing device 100 can selectively use the Pairwise Terms indicating the time-series constraints corresponding to the iso-position motion, the uniform linear motion, the uniform acceleration motion, or the like according to the type of the operation of the subject. Furthermore, the information processing device 100 can couple the second edge corresponding to the Pairwise Term indicating the time-series constraint to the node indicating the 3D coordinates of a joint that differs according to the type of the operation of the subject.
Therefore, the information processing device 100 can appropriately correct the 3D coordinates of each joint of the subject. For example, the information processing device 100 can appropriately correct the 3D coordinates of the joint of the subject at each time point, so that a temporal change from the 3D coordinates of the joint of the subject at a certain time point to the 3D coordinates of the joint of the subject at a next time point becomes a temporal change that a person is unlikely to intuitively perceive as an error. Next, a specific example of a flow of data processing in the operation example will be described with reference to
The information processing device 100 generates a 2D skeleton inference result 1402, by executing 2D skeleton inference processing, on each of the plurality of camera images 1401, with reference to the 2D skeleton inference model 1410, at each time point. The 2D skeleton inference result 1402 includes, for example, 2D coordinates (x [pixel], y [pixel]) indicating a position of a joint and a likelihood indicating certainty of the position of the joint.
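For illustration only, a per-joint entry of such a 2D skeleton inference result may be held in a structure such as the following; the field names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Joint2D:
    joint_name: str
    x_px: float        # x coordinate in pixels
    y_px: float        # y coordinate in pixels
    likelihood: float  # certainty of the estimated position

# One 2D skeleton inference result is then a list of Joint2D entries per camera image.
```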
The information processing device 100 stores a 3D skeleton inference model 1420. The information processing device 100 stores, for example, a weight parameter that defines a neural network to be the 3D skeleton inference model 1420.
The information processing device 100 generates a 3D skeleton inference result 1403, by executing 3D skeleton inference processing, on the plurality of 2D skeleton inference results 1402, with reference to the 3D skeleton inference model 1420, at each time point. The 3D skeleton inference result 1403 includes, for example, 3D coordinates (x [mm], y [mm], z [mm]) indicating a position of a joint. The information processing device 100 generates time-series data 1404 obtained by integrating the 3D skeleton inference result 1403 at each time point. Next, description of
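As an illustrative sketch, the integration into the time-series data 1404 may be realized as follows, treating the 3D skeleton inference model as an opaque function; the data shapes are assumptions.

```python
import numpy as np

def build_time_series(frames_2d, infer_3d):
    """frames_2d: per time point, the 2D skeleton inference results of all cameras.
    infer_3d: the 3D skeleton inference model, treated as an opaque function that
    returns an array of (J, 3) joint coordinates in mm for one time point.
    Returns time-series data of shape (T, J, 3)."""
    results_3d = [np.asarray(infer_3d(per_camera)) for per_camera in frames_2d]
    return np.stack(results_3d, axis=0)
```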
As illustrated in
The information processing device 100 estimates a type of a motion of the subject, by executing motion state estimation processing, on the time-series data 1404 with reference to the motion state estimation model 1510 and generates a motion state estimation result 1501 including the estimated type of the motion of the subject.
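The motion state estimation processing is not limited to a particular feature or classifier; the following sketch assumes a simple per-joint mean-speed feature and an externally supplied classifier standing in for the motion state estimation model 1510.

```python
import numpy as np

def estimate_motion_state(time_series, classifier):
    """time_series: (T, J, 3) joint positions. classifier: stand-in for the motion
    state estimation model 1510, mapping a feature vector to a label such as 'lying'."""
    speeds = np.linalg.norm(np.diff(time_series, axis=0), axis=-1)  # (T-1, J) per-joint speeds
    features = speeds.mean(axis=0)                                  # mean speed per joint
    return classifier(features)
```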
The information processing device 100 stores a Factor Graph definition DB 1520. The Factor Graph definition DB 1520 stores a template of Factor Graph corresponding to a type of a motion, including Pairwise Term indicating a time-series constraint, for each type of the motion. The Pairwise Term indicates that, for example, a temporal change in the position of the joint corresponding to the type of the motion is restricted according to tendency of a motion of the subject corresponding to the type of the motion.
For example, the Factor Graph definition DB 1520 indicates the type of the motion, the type of the joint of the subject, and the tendency of the motion of the joint of the subject corresponding to the type of the motion, in association with each other. The tendency of the motion is, for example, an iso-position motion, a uniform linear motion, a uniform acceleration motion, or the like. The information processing device 100 selects the template of the Factor Graph corresponding to the estimated type of the motion of the subject included in the motion state estimation result 1501, with reference to the Factor Graph definition DB 1520, as Factor Graph to be used.
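For illustration, the Factor Graph definition DB 1520 and the template selection may be sketched as follows; the motion types, joint names, and tendencies listed are assumptions, not the actual contents of the DB.

```python
# Illustrative contents of the Factor Graph definition DB: for each motion type,
# the joints to be constrained and the tendency of their motion.
FACTOR_GRAPH_DEFINITION_DB = {
    "lying":   {"whole_body": "iso-position motion"},
    "walking": {"ankle": "uniform linear motion", "wrist": "uniform linear motion"},
    "running": {"ankle": "uniform acceleration motion"},
}

def select_template(motion_type):
    """Return the time-series constraints of the Factor Graph template that
    corresponds to the estimated type of the motion."""
    return FACTOR_GRAPH_DEFINITION_DB.get(motion_type, {})
```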
The information processing device 100 stores a bone length model 1530. The bone length model 1530 includes a parameter that defines the Pairwise Term indicating the constraint of the bone length. The parameter includes, for example, an average and a variance of the bone length. The information processing device 100 refers to the bone length model 1530 and adds the Pairwise Term indicating the constraint of the bone length to the selected Factor Graph.
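A minimal sketch of the Pairwise Term indicating the constraint of the bone length, using the average and variance from the bone length model 1530, is as follows; the function name is an assumption.

```python
import numpy as np

def bone_length_cost(x_a, x_b, mean_length, var_length):
    """Negative log of a Gaussian on the distance between two biologically connected
    joints; mean_length and var_length come from the bone length model 1530."""
    length = np.linalg.norm(np.asarray(x_a) - np.asarray(x_b))
    return (0.5 * (length - mean_length) ** 2 / var_length
            + 0.5 * np.log(2.0 * np.pi * var_length))
```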
The information processing device 100 corrects the position of each joint, by executing optimization processing, on the Factor Graph after the addition. The information processing device 100 generates a corrected 3D skeleton inference result 1502 including the corrected position of each joint. As a result, the information processing device 100 can accurately specify the position of each joint of the subject at each time point.
Next, an example of an overall processing procedure executed by the information processing device 100 will be described with reference to
Next, the information processing device 100 estimates a motion state of the subject, at each time point, based on the acquired time-series data of the three-dimensional skeleton inference result of the subject (step S1603). Then, the information processing device 100 selects Factor Graph corresponding to the estimated motion state of the subject, at each time point (step S1604).
Next, the information processing device 100 sets Pairwise Term along a time axis for a leaf node corresponding to a portion of which a likelihood is relatively small, in the selected Factor Graph (step S1605). Then, the information processing device 100 sets Unary Term=likelihood to a specific portion of the subject in the selected Factor Graph and sets Pairwise Term along the time axis for the specific portion of the subject (step S1606).
Next, the information processing device 100 corrects the time-series data of the three-dimensional skeleton inference result of the subject, by optimizing the Factor Graph (step S1607). Then, the information processing device 100 outputs the corrected time-series data of the three-dimensional skeleton inference result of the subject (step S1608). Thereafter, the information processing device 100 ends the overall processing.
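For illustration, steps S1603 to S1607 may be connected as in the following sketch, which reuses the helper functions sketched above; all names and parameter values are assumptions and are not the actual processing of the information processing device 100.

```python
def correct_skeleton_time_series(time_series, classifier, motion_sigma=0.1):
    """End-to-end sketch of steps S1603 to S1607.
    time_series: (T, J, 3) three-dimensional skeleton inference results."""
    T, J, _ = time_series.shape
    motion_type = estimate_motion_state(time_series, classifier)   # S1603
    constraints = select_template(motion_type)                     # S1604
    # S1605/S1606: unary terms tie each joint to its inferred position; for brevity,
    # time-series pairwise terms are added for all joints here rather than only for
    # the joints named in 'constraints'.
    unary = [(t, j, time_series[t, j], 1.0) for t in range(T) for j in range(J)]
    pairs = [(t - 1, j, t, j, motion_sigma) for t in range(1, T) for j in range(J)]
    return optimize_positions(time_series, unary, pairs)           # S1607
```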
As a result, the information processing device 100 can accurately correct the three-dimensional skeleton inference result of the subject. Therefore, the information processing device 100 can improve usefulness of the three-dimensional skeleton inference result of the subject. For example, the information processing device 100 can improve accuracy of the analysis processing based on the three-dimensional skeleton inference result of the subject.
Here, the information processing device 100 may switch some steps in the processing order in
As described above, according to the information processing device 100, it is possible to acquire the time-series data of the skeleton information including the position of each of the plurality of portions of the subject. According to the information processing device 100, it is possible to specify the type of the operation of the subject corresponding to the skeleton information at the first time point in the acquired time-series data, based on the feature amount of the skeleton information in the acquired time-series data. According to the information processing device 100, it is possible to determine the model of the probability distribution that restricts the temporal change in the position of any one portion of the plurality of portions, according to the tendency of the motion of the any one portion corresponding to the specified type of the operation, in the skeleton information at the first time point in the acquired time-series data. According to the information processing device 100, it is possible to generate the graph including the node indicating the position of each portion at each time point. According to the information processing device 100, in the graph, the first edge that couples between the nodes indicating the positions of the different portions that are biologically connected at each time point can be added. According to the information processing device 100, in the graph, the second edge that couples between the nodes indicating the positions of any one portion at the different time points can be added. According to the information processing device 100, in the graph, the determined model can be associated with the second edge. According to the information processing device 100, it is possible to correct the skeleton information at the first time point in the time-series data, based on the generated graph. As a result, the information processing device 100 can accurately correct the skeleton information at the first time point.
According to the information processing device 100, it is possible to determine the model of the probability distribution that restricts the temporal change in the position of any one portion in the skeleton information at the first time point, according to the tendency of the iso-position motion, the uniform linear motion, or the uniform acceleration motion, of any one portion corresponding to the specified type of the operation. As a result, the information processing device 100 can determine the model that makes it possible to appropriately correct the skeleton information at the first time point, according to the type of the operation.
According to the information processing device 100, it is possible to determine whether or not the number of first edges coupled to each of the nodes indicating the positions of the other portions at the different time points is one, regarding the other portions other than any one portion of the plurality of portions. According to the information processing device 100, if the number of first edges coupled to each of the nodes indicating the positions of the other portion at the different time points is one, regarding the other portion, it is possible to generate the graph so that the third edge that couples the nodes to each other is included in the graph. As a result, the information processing device 100 can increase the number of edges coupled to the node, which makes it possible to accurately correct the position of the other portion indicated by the node.
According to the information processing device 100, it is possible to specify the other portion, other than any one portion of the plurality of portions, that is in the abnormal state regarding the position. According to the information processing device 100, if the number of first edges coupled to each of the nodes indicating the positions of the other portion at the different time points is one, regarding the specified other portion, it is possible to generate the graph so that the third edge that couples the nodes to each other is included in the graph. As a result, the information processing device 100 can specify the other portion that should preferably be corrected, which makes it possible to accurately correct the position of the specified other portion.
Note that the information processing method described in the present embodiment may be implemented by a computer such as a PC or a workstation executing a program prepared in advance. The information processing program described in the present embodiment is executed by being recorded in a computer-readable recording medium and being read from the recording medium by the computer. The recording medium is a hard disk, a flexible disk, a compact disc (CD)-ROM, a magneto optical disc (MO), a digital versatile disc (DVD), or the like. In addition, the information processing program described in the present embodiment may be distributed via a network such as the Internet.
All examples and conditional language provided herein are intended for the pedagogical purposes of aiding the reader in understanding the invention and the concepts contributed by the inventor to further the art, and are not to be construed as limitations to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although one or more embodiments of the present invention have been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
This application is a continuation application of International Application PCT/JP2022/016364 filed on Mar. 30, 2022 and designated the U.S., the entire contents of which are incorporated herein by reference.
| | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | PCT/JP2022/016364 | Mar 2022 | WO |
| Child | 18885788 | | US |