The present disclosure relates to an equipment operation device and an equipment operation method.
There is an equipment operation device that receives an operation of operation target equipment by executing an operation application of the operation target equipment (see, for example, Patent Literature 1).
The equipment operation device includes a first peripheral image acquiring unit, a second peripheral image acquiring unit, a setting unit, and a signal output unit. The first peripheral image acquiring unit acquires first peripheral images that are images of respective peripheral regions of one or more pieces of operation target equipment, and stores the respective first peripheral images in association with the pieces of operation target equipment. The second peripheral image acquiring unit acquires a second peripheral image that is an image of a peripheral region of operation target equipment, captured when a user operates the operation target equipment. When there is a first peripheral image coincident with the second peripheral image among the plurality of first peripheral images, the setting unit sets a piece of operation target equipment corresponding to the first peripheral image as a piece of equipment to be actually operated. When there is no first peripheral image coincident with the second peripheral image among the plurality of first peripheral images, the setting unit receives an operation of selecting a piece of equipment to be actually operated by a user. The signal output unit outputs an operation signal to a piece of equipment to be actually operated by executing an operation application of the piece of equipment.
Patent Literature 1: JP 2012-119935 A
For example, in an individual's house, there may be a plurality of rooms that have the same shape and the same pattern of ceiling, floor, or walls. In each of such rooms, an air conditioner of the same model, for example, may be disposed as operation target equipment.
In such a case, since peripheral regions of the air conditioners disposed in the plurality of rooms are similar to each other, it may be difficult to distinguish images of the peripheral regions of the air conditioners disposed in the rooms from each other.
In the equipment operation device disclosed in Patent Literature 1, since the first peripheral images of the pieces of operation target equipment disposed in the plurality of rooms are similar to each other, the first peripheral image acquiring unit may store a plurality of mutually similar first peripheral images in association with the pieces of operation target equipment. In such a case, the setting unit may fail to detect a first peripheral image coincident with the second peripheral image among the plurality of first peripheral images, or may erroneously detect a first peripheral image not coincident with the second peripheral image. In addition, it may also be difficult for a user to determine which piece of operation target equipment among the pieces of operation target equipment associated with the respective first peripheral images is the piece of equipment to be actually operated. Under these circumstances, there is a problem that the signal output unit cannot execute an operation application of the piece of equipment to be actually operated in some cases.
The present disclosure has been made in order to solve the above problem, and an object of the present disclosure is to provide an equipment operation device and an equipment operation method capable of executing an operation application of operation target equipment appearing in a captured image even in a case where the captured image and a registered image that has been recorded are similar to each other, and a difference between these images is not detected.
An equipment operation device according to the present disclosure includes processing circuitry configured to: acquire, from a camera, image data indicating a captured image in which operation target equipment and a peripheral region of the operation target equipment appear; perform a process of detecting a difference between the captured image indicated by the acquired image data and a first registered image that has been recorded; record the captured image as a second registered image that is different from the first registered image when the difference is detected, and acquire a second captured image differentiated from the first registered image that has been recorded and record the second captured image as the second registered image that is different from the first registered image when the difference is not detected; and download an operation application of the operation target equipment appearing in the second registered image having been recorded.
According to the present disclosure, it is possible to execute an operation application of operation target equipment appearing in a captured image even in a case where the captured image and a registered image that has been recorded are similar to each other, and a difference between these images is not detected.
Hereinafter, in order to describe the present disclosure in more detail, embodiments for carrying out the present disclosure will be described with reference to the attached drawings.
A camera 2 captures an image of a range including the operation target equipment 1 and a peripheral region of the operation target equipment 1, and outputs image data indicating a captured image PG to the equipment operation device 3. The captured image PG may be a still image or a moving image.
The equipment operation device 3 includes a captured image acquiring unit 11, a difference detecting unit 12, a guidance presenting unit 13, an image recording unit 14, a download unit 15, and an operation receiving unit 16.
The captured image acquiring unit 11 is implemented by, for example, a captured image acquiring circuit 31.
The captured image acquiring unit 11 acquires image data indicating the captured image PG from the camera 2.
In addition, when a difference between the captured image PG and each of registered images RG1 to RGN that have been recorded is not detected by the difference detecting unit 12, the captured image acquiring unit 11 acquires from the camera 2, as a second captured image PG′, image data indicating a captured image in which the operation target equipment 1 to which an identification marker is pasted and a peripheral region of the operation target equipment 1 appear; this acquisition takes place after guidance is presented by the guidance presenting unit 13. N is an integer equal to or more than 1.
The captured image acquiring unit 11 outputs the image data indicating the captured image PG to each of the difference detecting unit 12 and the image recording unit 14.
The captured image acquiring unit 11 outputs image data indicating the second captured image PG′ to the image recording unit 14.
The difference detecting unit 12 is implemented by, for example, a difference detecting circuit 32.
The difference detecting unit 12 acquires image data indicating the captured image PG from the captured image acquiring unit 11, and acquires image data indicating a registered image RGn (n=1, . . . , N) that has been recorded from the image recording unit 14.
The difference detecting unit 12 performs a process of detecting a difference between the captured image PG and the registered image RGn that has been recorded. The process of detecting the difference is a process of comparing the captured image PG with the registered image RGn that has been recorded and determining whether or not there is a difference between these images.
The guidance presenting unit 13 is implemented by, for example, a guidance presenting circuit 33.
When the difference is not detected by the difference detecting unit 12, the guidance presenting unit 13 presents guidance that prompts a user to paste an identification marker.
As a method for presenting the guidance, the guidance presenting unit 13 may display the guidance on a display (not illustrated) or may output the guidance by voice from a speaker (not illustrated).
The image recording unit 14 is implemented by, for example, an image recording circuit 34.
When the difference is detected by the difference detecting unit 12, the image recording unit 14 acquires image data indicating the captured image PG from the captured image acquiring unit 11.
The image recording unit 14 records the captured image PG as a registered image RGn+1 (n=1, . . . , N).
When the difference is not detected by the difference detecting unit 12, the image recording unit 14 acquires, from the captured image acquiring unit 11, image data indicating the second captured image PG′ that is differentiated from the registered image RG1 that has been recorded.
The image recording unit 14 records the second captured image PG′ as the registered image RGn+1.
That is, when the difference is not detected by the difference detecting unit 12, the image recording unit 14 acquires, as the second captured image PG′, a captured image in which the operation target equipment 1 to which an identification marker is pasted and a peripheral region of the operation target equipment 1 appear, and records the captured image as the registered image RGn+1.
The download unit 15 is implemented by, for example, a download circuit 35.
For example, the download unit 15 downloads an operation application of the operation target equipment 1 appearing in the registered image RGn+1 that has been recorded by the image recording unit 14 from a server device 4 of a manufacturer of the operation target equipment 1 via a network.
The operation application of the operation target equipment 1 is stored in, for example, an internal memory of the download unit 15.
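The disclosure leaves the download mechanism itself open; purely as an illustration, the following Python sketch fetches an operation application package from a hypothetical manufacturer URL and keeps it in an in-memory dictionary standing in for the download unit 15's internal memory. The function name, URL parameter, and caching scheme are assumptions, not part of the disclosure.

```python
import urllib.request

# In-memory store standing in for the download unit 15's internal memory.
_app_store = {}

def download_operation_app(equipment_id: str, package_url: str) -> bytes:
    """Fetch the operation application package for one piece of equipment.

    `package_url` is a hypothetical distribution endpoint on the manufacturer's
    server device 4; the real distribution mechanism is not specified in the
    disclosure. Downloaded packages are cached in `_app_store`.
    """
    if equipment_id in _app_store:                      # already downloaded
        return _app_store[equipment_id]
    with urllib.request.urlopen(package_url, timeout=10) as response:
        package = response.read()                       # raw application bytes
    _app_store[equipment_id] = package                  # keep in internal memory
    return package
```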
The operation receiving unit 16 is implemented by, for example, an operation receiving circuit 36.
The operation receiving unit 16 receives selection of any one operation application from among one or more operation applications downloaded by the download unit 15.
The operation receiving unit 16 receives an operation of the operation target equipment 1 by executing an operation application selected by a user.
Here, each of the captured image acquiring circuit 31, the difference detecting circuit 32, the guidance presenting circuit 33, the image recording circuit 34, the download circuit 35, and the operation receiving circuit 36 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination thereof.
The constituent elements of the equipment operation device 3 are not limited to those implemented by dedicated hardware, and the equipment operation device 3 may be implemented by software, firmware, or a combination of software and firmware.
Software or firmware is stored as a program in a memory of a computer. Here, the computer means hardware that executes a program and corresponds to, for example, a central processing unit (CPU), a central processing device, a processing device, an arithmetic device, a microprocessor, a microcomputer, a processor, or a digital signal processor (DSP).
In a case where the equipment operation device 3 is implemented by software, firmware, or the like, a program for causing a computer to execute processing procedures in each of the captured image acquiring unit 11, the difference detecting unit 12, the guidance presenting unit 13, the image recording unit 14, the download unit 15, and the operation receiving unit 16 is stored in a memory 51. A processor 52 of the computer executes the program stored in the memory 51.
Next, an operation of the equipment operation device 3 will be described.
By operating the camera 2, a user captures an image of a range including the operation target equipment 1 and a peripheral region of the operation target equipment 1.
When the user taps the operation target equipment 1 recognized by the camera 2, the operation target equipment 1 is determined.
The camera 2 captures an image of a range including the operation target equipment 1 and a peripheral region of the operation target equipment 1, and outputs image data indicating a captured image PG to the equipment operation device 3.
The captured image acquiring unit 11 acquires image data indicating the captured image PG from the camera 2 (step ST1).
The captured image acquiring unit 11 outputs the image data indicating the captured image PG to each of the difference detecting unit 12 and the image recording unit 14.
The difference detecting unit 12 acquires image data indicating the captured image PG from the captured image acquiring unit 11, and acquires image data indicating registered images RG1 to RGN that have been recorded from the image recording unit 14.
By comparing the captured image PG with the registered image RGn (n=1, . . . , N) that has been recorded, the difference detecting unit 12 performs a process of detecting a difference between the captured image PG and the registered image RGn that has been recorded (step ST2).
As the process of detecting the difference, a known image comparing application that determines whether or not a plurality of images are the same image can be used. Examples of the known image comparing application include an application using a pattern matching method.
Specifically, the difference detecting unit 12 determines a similarity between the captured image PG and the registered image RGn that has been recorded. When the similarity is equal to or more than a threshold, the difference detecting unit 12 determines that there is “no difference” between the captured image PG and the registered image RGn. When the similarity is less than the threshold, the difference detecting unit 12 determines that there is a “difference” between the captured image PG and the registered image RGn. The threshold for determining the similarity may be stored in an internal memory of the difference detecting unit 12 or may be given from the outside of the equipment operation device 3.
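As a minimal sketch of the kind of similarity test described above, the following Python function measures a normalized cross-correlation between two same-sized grayscale images held as NumPy arrays and applies a threshold. The choice of normalized cross-correlation and the default threshold value are assumptions; the disclosure only requires some known image comparing application, such as one based on pattern matching, and any comparison that yields a similarity score would fit the same interface.

```python
import numpy as np

def has_difference(captured: np.ndarray, registered: np.ndarray,
                   threshold: float = 0.9) -> bool:
    """Return True when a 'difference' is detected between two same-sized
    grayscale images; a similarity at or above `threshold` means 'no difference'."""
    a = captured.astype(np.float64).ravel()
    b = registered.astype(np.float64).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    # Normalized cross-correlation in [-1, 1]; flat images are treated as equal.
    similarity = float(a @ b / denom) if denom > 0 else 1.0
    return similarity < threshold
```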
The difference detecting unit 12 outputs a result of the process of detecting the difference to each of the captured image acquiring unit 11, the guidance presenting unit 13, and the image recording unit 14.
The image recording unit 14 acquires the result of the process of detecting the difference from the difference detecting unit 12.
If the difference is detected by the difference detecting unit 12 (step ST3), the image recording unit 14 records the captured image PG as a registered image RGn+1.
The guidance presenting unit 13 acquires the result of the process of detecting the difference from the difference detecting unit 12.
If the difference is not detected by the difference detecting unit 12 (step ST3), the guidance presenting unit 13 presents guidance that prompts the user to paste an identification marker to the operation target equipment 1.
The identification marker may be a mark of any picture. For example, the identification marker may be included in the cardboard box in which the purchased operation target equipment 1 is packed.
When the captured image PG has already been recorded as the registered image RGn, the user to whom the guidance has been presented does not perform the operation of pasting the identification marker to the operation target equipment 1 in order to prevent double registration of the captured image PG.
In a case where the user performs the operation of pasting the identification marker to the operation target equipment 1, by operating the camera 2, the user captures an image of a range including the operation target equipment 1 to which the identification marker is pasted and a peripheral region of the operation target equipment 1.
The camera 2 treats, as the second captured image PG′, an image in which the operation target equipment 1 to which the identification marker is pasted and a peripheral region of the operation target equipment 1 appear, and outputs image data indicating the second captured image PG′ to the equipment operation device 3.
The captured image acquiring unit 11 acquires the result of the process of detecting the difference from the difference detecting unit 12.
When the difference is not detected by the difference detecting unit 12, the captured image acquiring unit 11 acquires the image data indicating the second captured image PG′ output from the camera 2 after the guidance is presented by the guidance presenting unit 13 (step ST6).
The captured image acquiring unit 11 outputs the image data indicating the second captured image PG′ to the image recording unit 14.
When the difference is not detected by the difference detecting unit 12, the image recording unit 14 acquires the image data indicating the second captured image PG′ from the captured image acquiring unit 11.
The image recording unit 14 records the second captured image PG′ as the registered image RGn+1 (step ST7).
When the captured image PG or the second captured image PG′ is recorded as the registered image RGn+1 by the image recording unit 14, the download unit 15 downloads an operation application of the operation target equipment 1 appearing in the registered image RGn+1, for example, from the server device 4 of the manufacturer of the operation target equipment 1 (step ST8).
The operation application of the operation target equipment 1 is stored in, for example, an internal memory of the download unit 15.
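Putting the steps of the first embodiment together, the sketch below shows one possible control flow from image acquisition through registration and application download. All of the objects it calls are hypothetical stand-ins for the camera 2 and the units described above, and the way results over several registered images are aggregated is an interpretation, as noted in the comments.

```python
def register_and_download(camera, registered_images, detector, guide, downloader):
    """One pass of the first embodiment's flow (roughly steps ST1 to ST8).

    `camera.capture()`, `detector.has_difference()`, `guide.prompt_marker()`,
    and `downloader.fetch()` are assumed interfaces standing in for the camera
    2, the difference detecting unit 12, the guidance presenting unit 13, and
    the download unit 15. A difference is treated as detected only when the
    captured image differs from every registered image already recorded, which
    is one possible reading of the disclosure.
    """
    pg = camera.capture()                                   # ST1: acquire PG
    difference_detected = all(detector.has_difference(pg, rg)
                              for rg in registered_images)  # ST2, ST3
    if difference_detected:
        registered_images.append(pg)                        # record PG as RGn+1
    else:
        guide.prompt_marker()                               # present guidance
        pg_marked = camera.capture()                        # ST6: acquire PG'
        registered_images.append(pg_marked)                 # ST7: record PG'
    downloader.fetch(registered_images[-1])                 # ST8: download app
```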
When a user operates the operation target equipment 1, the one or more operation applications downloaded by the download unit 15 are presented as selectable.
The operation receiving unit 16 receives selection of any one operation application from among the one or more selectable operation applications (step ST9).
The operation receiving unit 16 receives an operation of the operation target equipment 1 by executing the operation application selected by the user (step ST10).
In the above first embodiment, the equipment operation device 3 is configured to include: the captured image acquiring unit 11 that acquires, from the camera 2, image data indicating a captured image in which the operation target equipment 1 and a peripheral region of the operation target equipment 1 appear; and the difference detecting unit 12 that performs a process of detecting a difference between the captured image indicated by the image data acquired by the captured image acquiring unit 11 and a registered image that has been recorded. The equipment operation device 3 also includes: the image recording unit 14 that records the captured image as a registered image when the difference is detected by the difference detecting unit 12, acquires a second captured image differentiated from the registered image that has been recorded and records the second captured image as a registered image when the difference is not detected by the difference detecting unit 12; and the download unit 15 that downloads an operation application of the operation target equipment 1 appearing in the registered image recorded by the image recording unit 14. Therefore, the equipment operation device 3 can execute an operation application of operation target equipment appearing in a captured image even in a case where the captured image and a registered image that has been recorded are similar to each other, and a difference between these images is not detected.
In the equipment operation device 3 described above, the captured image acquiring unit 11 may acquire image data indicating a plurality of captured images PG from the camera 2.
The plurality of captured images PG are, for example, images obtained by capturing the same operation target equipment 1 from different directions.
The difference detecting unit 12 performs the process of detecting a difference between each of the plurality of captured images PG and the registered image RGn that has been recorded, whereby difference detecting accuracy is improved as compared with a case of performing a process of detecting a difference between one captured image PG and the registered image RGn that has been recorded.
In the equipment operation device 3 according to the first embodiment, the user performs the work of pasting an identification marker to the operation target equipment 1.
In a second embodiment, an equipment operation device 3 will be described in which when a difference is not detected by a difference detecting unit 12, an image recording unit 17 combines an illustration component indicating an identification marker with operation target equipment 1 appearing in a captured image PG, and thereby acquires an image in which the operation target equipment 1 with the identification marker and a peripheral region thereof appear as a second captured image.
The equipment operation device 3 according to the second embodiment includes the captured image acquiring unit 11, the difference detecting unit 12, an image recording unit 17, the download unit 15, and the operation receiving unit 16.
The image recording unit 17 is implemented by, for example, an image recording circuit 37.
When a difference is detected by the difference detecting unit 12, the image recording unit 17 records the captured image PG as a registered image RGn+1.
When a difference is not detected by the difference detecting unit 12, the image recording unit 17 combines an illustration component indicating an identification marker with the operation target equipment 1 appearing in the captured image PG, and thereby acquires an image in which the operation target equipment 1 with the identification marker and a peripheral region thereof appear as a second captured image PG′.
The image recording unit 17 records the second captured image PG′ as the registered image RGn+1.
Here, each of the captured image acquiring circuit 31, the difference detecting circuit 32, the image recording circuit 37, the download circuit 35, and the operation receiving circuit 36 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC, an FPGA, or a combination thereof.
The constituent elements of the equipment operation device 3 are not limited to those implemented by dedicated hardware, and the equipment operation device 3 may be implemented by software, firmware, or a combination of software and firmware.
In a case where the equipment operation device 3 is implemented by software, firmware, or the like, a program for causing a computer to execute processing procedures in each of the captured image acquiring unit 11, the difference detecting unit 12, the image recording unit 17, the download unit 15, and the operation receiving unit 16 is stored in the memory 51. The processor 52 executes the program stored in the memory 51.
Next, an operation of the equipment operation device 3 according to the second embodiment will be described.
When a difference is detected by the difference detecting unit 12, the image recording unit 17 records the captured image PG as a registered image RGn+1 similarly to the image recording unit 14 illustrated in
When a difference is not detected by the difference detecting unit 12, the image recording unit 17 combines an illustration component indicating an identification marker with the operation target equipment 1 appearing in the captured image PG. The illustration component indicating an identification marker may be stored in an internal memory of the image recording unit 17 or may be given from the outside of the equipment operation device 3. Image processing of combining the illustration component indicating an identification marker with the operation target equipment 1 is a known technique, and therefore a detailed description thereof is omitted.
The image recording unit 17 acquires, as the second captured image PG′, an image in which the operation target equipment 1 with an identification marker and a peripheral region thereof appear, and records the image as the registered image RGn+1.
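The compositing performed by the image recording unit 17 can be realized in many ways; as a hedged illustration, the sketch below pastes a small illustration component (a solid-colored square standing in for the identification marker) onto a color image held as a NumPy array at a given position of the operation target equipment 1. The marker artwork, the way its position is obtained, and the array layout are assumptions.

```python
import numpy as np

def composite_marker(captured: np.ndarray, equipment_xy: tuple,
                     marker: np.ndarray) -> np.ndarray:
    """Return a second captured image PG' by pasting a marker illustration onto
    the captured image PG at the position of the operation target equipment.

    `captured` is assumed to be an HxWx3 color image with the same channel
    layout as `marker`, and the marker is assumed to fit inside the image;
    how `equipment_xy` is detected is outside this sketch.
    """
    pg2 = captured.copy()
    top, left = equipment_xy
    h, w = marker.shape[:2]
    pg2[top:top + h, left:left + w] = marker  # overwrite the region with the marker
    return pg2

# A 20x20 solid-colored square as a stand-in identification marker illustration.
marker = np.full((20, 20, 3), (255, 0, 0), dtype=np.uint8)
```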
In the above second embodiment, the equipment operation device 3 is configured in such a manner that when a difference is not detected by the difference detecting unit 12, the image recording unit 17 combines an illustration component indicating an identification marker with the operation target equipment 1 appearing in a captured image, and thereby acquires an image in which the operation target equipment 1 with the identification marker and a peripheral region thereof appear as a second captured image. Therefore, the equipment operation device 3 can execute an operation application of operation target equipment appearing in a captured image even in a case where a captured image and a registered image that has been recorded are similar to each other, and a difference between these images is not detected, and can omit a user's work of pasting an identification marker to the operation target equipment 1.
In a third embodiment, an equipment operation device 3 including a grid adding unit 18 that adds a grid to a captured image PG or a second captured image PG′ will be described.
The equipment operation device 3 according to the third embodiment includes the captured image acquiring unit 11, a grid adding unit 18, the difference detecting unit 12, the guidance presenting unit 13, the image recording unit 14, the download unit 15, and the operation receiving unit 16.
The grid adding unit 18 is implemented by, for example, a grid adding circuit 38.
When acquiring image data indicating the captured image PG from the captured image acquiring unit 11, the grid adding unit 18 adds a grid to the captured image PG.
The grid adding unit 18 outputs a captured image PGG with a grid obtained by adding a grid to the captured image PG as the captured image PG to each of the difference detecting unit 12 and the image recording unit 14.
When acquiring image data indicating the second captured image PG′ from the captured image acquiring unit 11, the grid adding unit 18 adds a grid to the second captured image PG′.
The grid adding unit 18 outputs a captured image PGG′ with a grid obtained by adding a grid to the second captured image PG′ as the second captured image PG′ to the image recording unit 14.
Here, each of the captured image acquiring circuit 31, the grid adding circuit 38, the difference detecting circuit 32, the guidance presenting circuit 33, the image recording circuit 34, the download circuit 35, and the operation receiving circuit 36 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC, an FPGA, or a combination thereof.
The constituent elements of the equipment operation device 3 are not limited to those implemented by dedicated hardware, and the equipment operation device 3 may be implemented by software, firmware, or a combination of software and firmware.
In a case where the equipment operation device 3 is implemented by software, firmware, or the like, a program for causing a computer to execute processing procedures in each of the captured image acquiring unit 11, the grid adding unit 18, the difference detecting unit 12, the guidance presenting unit 13, the image recording unit 14, the download unit 15, and the operation receiving unit 16 is stored in the memory 51. The processor 52 executes the program stored in the memory 51.
Next, an operation of the equipment operation device 3 according to the third embodiment will be described.
When acquiring image data indicating the captured image PG from the captured image acquiring unit 11, the grid adding unit 18 adds a grid to the captured image PG. Since a process itself of adding a grid to the captured image PG is a known technique, detailed description thereof is omitted.
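Although the grid addition is treated here as a known technique, one minimal way to realize it for images held as NumPy arrays is sketched below; the grid spacing and line intensity are arbitrary assumptions, not values taken from the disclosure.

```python
import numpy as np

def add_grid(image: np.ndarray, spacing: int = 50, value: int = 255) -> np.ndarray:
    """Return a copy of `image` with grid lines drawn every `spacing` pixels.

    Works for grayscale (HxW) or color (HxWxC) uint8 images.
    """
    with_grid = image.copy()
    with_grid[::spacing, ...] = value      # horizontal grid lines
    with_grid[:, ::spacing, ...] = value   # vertical grid lines
    return with_grid
```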
The grid adding unit 18 outputs a captured image PGG with a grid obtained by adding a grid to the captured image PG as the captured image PG to each of the difference detecting unit 12 and the image recording unit 14.
When acquiring image data indicating the second captured image PG′ from the captured image acquiring unit 11, the grid adding unit 18 adds a grid to the second captured image PG′.
The grid adding unit 18 outputs a captured image PGG′ with a grid obtained by adding a grid to the second captured image PG′ as the second captured image PG′ to the image recording unit 14.
By comparing the captured image PGG with a grid with the registered image RGn (n=1, . . . , N) that has been recorded, the difference detecting unit 12 detects a difference between the captured image PGG with a grid and the registered image RGn that has been recorded. A grid is also added to the registered image RGn that has been recorded.
By adding a grid, for example, the size ratio between the operation target equipment 1 and an object present in its peripheral region becomes clear. Therefore, the detection accuracy of a difference between the captured image PGG with a grid and the registered image RGn with a grid is higher than the detection accuracy of a difference between the captured image PG without a grid and the registered image RGn without a grid.
When a difference is detected by the difference detecting unit 12, the image recording unit 14 records the captured image PGG with a grid as a registered image RGn+1.
When a difference is not detected by the difference detecting unit 12, the image recording unit 14 records, as the registered image RGn+1, the captured image PGG′ with a grid in which the operation target equipment 1 to which an identification marker is pasted appears.
In the above third embodiment, the equipment operation device 3 is configured to include the grid adding unit 18 that adds a grid to the captured image PG or the second captured image PG′. Therefore, the equipment operation device 3 can improve the detection accuracy of a difference between the captured image and a registered image that has been recorded.
In a fourth embodiment, an equipment operation device 3 will be described that includes a feature region designation receiving unit 19. The feature region designation receiving unit 19 receives designation of a feature region, which is a region of interest in a captured image PG acquired by the captured image acquiring unit 11, when the difference detecting unit 12 performs a process of detecting a difference between the captured image PG and a registered image RGn (n=1, . . . , N) that has been recorded.
The equipment operation device 3 according to the fourth embodiment includes the captured image acquiring unit 11, a feature region designation receiving unit 19, the difference detecting unit 12, the guidance presenting unit 13, the image recording unit 14, the download unit 15, and the operation receiving unit 16.
The feature region designation receiving unit 19 is implemented by, for example, a feature region designation receiving circuit 39.
The feature region designation receiving unit 19 receives designation of a feature region FRm that is a region of interest in a captured image PG acquired by the captured image acquiring unit 11 when the difference detecting unit 12 performs a process of detecting a difference between the captured image PG and a registered image RGn (n=1, . . . , N) that has been recorded. Here, m=1, . . . , M, and M is an integer equal to or more than 1.
The feature region designation receiving unit 19 outputs a captured image PGF in which the feature region FRm is designated to each of the difference detecting unit 12 and the image recording unit 14 as the captured image PG.
The feature region designation receiving unit 19 outputs a captured image PGF′ in which the feature region FRm is designated to the image recording unit 14 as a second captured image PG′.
In addition, when receiving designation of the plurality of feature regions FRm, the feature region designation receiving unit 19 receives designation of a priority order of each of the feature regions FRm.
The feature region designation receiving unit 19 outputs the captured image PGF in which the feature region FRm with a priority order is designated to each of the difference detecting unit 12 and the image recording unit 14 as the captured image PG.
The feature region designation receiving unit 19 outputs the captured image PGF′ in which the feature region FRm with a priority order is designated to the image recording unit 14 as the second captured image PG′.
Here, each of the captured image acquiring circuit 31, the feature region designation receiving circuit 39, the difference detecting circuit 32, the guidance presenting circuit 33, the image recording circuit 34, the download circuit 35, and the operation receiving circuit 36 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC, an FPGA, or a combination thereof.
The constituent elements of the equipment operation device 3 are not limited to those implemented by dedicated hardware, and the equipment operation device 3 may be implemented by software, firmware, or a combination of software and firmware.
In a case where the equipment operation device 3 is implemented by software, firmware, or the like, a program for causing a computer to execute processing procedures in each of the captured image acquiring unit 11, the feature region designation receiving unit 19, the difference detecting unit 12, the guidance presenting unit 13, the image recording unit 14, the download unit 15, and the operation receiving unit 16 is stored in the memory 51. The processor 52 executes the program stored in the memory 51.
Next, an operation of the equipment operation device 3 according to the fourth embodiment will be described.
The feature region designation receiving unit 19 receives designation of a feature region FRm (m=1, . . . , M) that is a region of interest when the difference detecting unit 12 performs a process of detecting a difference between the captured image PG and a registered image RGn (n=1, . . . , N) that has been recorded. Since a process itself of receiving the feature region FRm designated by a user is a known technique, detailed description thereof is omitted.
In addition, when receiving designation of the plurality of feature regions FRm, the feature region designation receiving unit 19 receives designation of a priority order of each of the feature regions FRm. Since a process itself of receiving a priority order designated by a user is a known technique, detailed description thereof is omitted.
Here, for convenience of description, it is assumed that the feature region FR1 has a first priority order and the feature region FR2 has a second priority order.
The feature region designation receiving unit 19 outputs the captured image PGF in which the feature region FRm with a priority order is designated to each of the difference detecting unit 12 and the image recording unit 14 as the captured image PG.
The feature region designation receiving unit 19 outputs the captured image PGF′ in which the feature region FRm with a priority order is designated to the image recording unit 14 as the second captured image PG′.
The difference detecting unit 12 acquires the captured image PG in which the feature region FRm with a priority order is designated from the feature region designation receiving unit 19.
By comparing the feature region FR1 having the first priority order in the captured image PGF with a region corresponding to the feature region FR1 in the registered image RGn (n=1, . . . , N) that has been recorded, the difference detecting unit 12 detects a difference between the feature region FR1 and the region corresponding to the feature region FR1.
When there is a difference between the feature region FR1 and the region corresponding to the feature region FR1, the difference detecting unit 12 ends the comparing process.
When there is no difference between the feature region FR1 and the region corresponding to the feature region FR1, by comparing the feature region FR2 having the second priority order with a region corresponding to the feature region FR2 in the registered image RGn that has been recorded, the difference detecting unit 12 detects a difference between the feature region FR2 and the region corresponding to the feature region FR2.
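As a sketch of the priority-ordered comparison described above, the following function walks the designated feature regions in priority order and stops as soon as one region differs. The way regions are represented as rectangles and the pluggable per-region comparison are assumptions; the per-region test could be, for example, the similarity function sketched in the first embodiment.

```python
def detect_difference_by_regions(captured, registered, regions, region_differs):
    """Compare designated feature regions in priority order, stopping at the
    first region that differs.

    `regions` is a list of (top, left, height, width) rectangles ordered from
    the first priority (e.g. FR1) downward, and `region_differs(a, b)` is any
    per-region comparison function.
    """
    for top, left, height, width in regions:
        a = captured[top:top + height, left:left + width]
        b = registered[top:top + height, left:left + width]
        if region_differs(a, b):
            return True    # a difference is found; the comparing process ends
    return False           # no difference in any designated feature region
```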
When a difference is detected by the difference detecting unit 12, the image recording unit 14 acquires the captured image PGF in which the feature region FRm is designated as the captured image PG and records the captured image PGF as the registered image RGn+1.
When a difference is not detected by the difference detecting unit 12, the image recording unit 14 acquires the captured image PGF′ in which the feature region FRm with a priority order is designated as the second captured image PG′, and records the captured image PGF′ as the registered image RGn+1.
In the above fourth embodiment, the equipment operation device 3 is configured to include the feature region designation receiving unit 19 that receives designation of a feature region that is a region of interest in a captured image when the difference detecting unit 12 performs the process of detecting a difference between the captured image and a registered image that has been recorded. Therefore, the difference detecting unit 12 can perform the process of detecting a difference by comparing a designated feature region with a corresponding region in the registered image that has been recorded.
Note that the present disclosure can freely combine the embodiments to each other, modify any constituent element in each of the embodiments, or omit any constituent element in each of the embodiments.
The present disclosure is suitable for an equipment operation device and an equipment operation method.
1: operation target equipment, 2: camera, 3: equipment operation device, 4: server device, 11: captured image acquiring unit, 12: difference detecting unit, 13: guidance presenting unit, 14: image recording unit, 15: download unit, 16: operation receiving unit, 17: image recording unit, 18: grid adding unit, 19: feature region designation receiving unit, 31: captured image acquiring circuit, 32: difference detecting circuit, 33: guidance presenting circuit, 34: image recording circuit, 35: download circuit, 36: operation receiving circuit, 37: image recording circuit, 38: grid adding circuit, 39: feature region designation receiving circuit, 51: memory, 52: processor
This application is a Continuation of International Patent Application No. PCT/JP2022/023228, filed on Jun. 9, 2022, the entire contents of which are incorporated herein by reference.
Related application data:
Parent: PCT/JP2022/023228, June 2022 (WO)
Child: 18961596 (US)