This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2016-011668, filed on Jan. 25, 2016, the entire contents of which are incorporated herein by reference.
The embodiment discussed herein is related to an information processing device, an information processing method, and a non-transitory computer-readable recording medium.
Currently, working fields are confronted with problems such as labor shortages and the training of skilled engineers, so it is sometimes difficult to deploy an expert in every working field. There exists a system to solve this problem, in which an expert gives instructions to a worker working in a remote location, while grasping information on that location, so that they coordinately perform a work. Moreover, the work can be performed more efficiently by the combined use of an Augmented Reality technology (AR technology), in which virtual world information is superimposed on an image of an actual environment captured by a camera and the resulting information is provided to the worker.
The remote support device 60 is operated by the indicator 1. The remote support device 60 generates a three-dimensional panorama image (3D panorama image) 4 from the image frame 2c transmitted from the worker terminal 50 in the remote location, and displays it. The indicator 1 grasps the situation of the working field in the remote location from the three-dimensional panorama image 4 displayed on the remote support device 60. The three-dimensional panorama image 4 is updated every time the image frame 2c is received.
The indicator 1 clicks, for example, a spot in the three-dimensional panorama image 4 to which an instruction is to be given. Position information on the position in the image frame 2c clicked by the indicator 1 and instruction information 2f that includes instruction contents 2g and the like are transmitted from the remote support device 60 to the worker terminal 50. When receiving the instruction information 2f, the worker terminal 50 causes the display device 21d to display the instruction contents 2g. The worker 2 refers to the instruction contents 2g displayed on the display device 21d to perform the work.
One example of a conventional process by which the three-dimensional panorama image 4 is generated will be explained. The conventional technology calculates position/posture information on the camera 21c that captured the image frame 2c on the basis of a Simultaneous Localization And Mapping technology (SLAM technology) or the like. The conventional technology generates a frustum-shaped three-dimensional image drawing object by using the position/posture information and a preliminarily acquired internal parameter of the camera 21c, and performs texture mapping of the image frame 2c on the base of the three-dimensional image drawing object. The conventional technology arranges the texture-mapped three-dimensional image drawing object in the three-dimensional space on the basis of the position/posture information on the camera 21c. The conventional technology repeatedly executes the aforementioned process to generate the three-dimensional panorama image 4.
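For illustration only, the following is a minimal Python sketch of the fixed-scale arrangement described above. The function names and the camera-to-world pose convention are assumptions of this sketch, not the conventional technology's actual implementation; the point is that every frustum base sits at a fixed unit depth regardless of how far the subject actually is.

```python
import numpy as np

def frustum_base_at_unit_depth(K, width, height):
    """Back-project the four image corners to depth 1 to obtain the base of
    the frustum-shaped drawing object (the apex is the camera center)."""
    K_inv = np.linalg.inv(K)
    corners_px = np.array([[0, 0, 1], [width, 0, 1],
                           [width, height, 1], [0, height, 1]], dtype=float)
    return (K_inv @ corners_px.T).T  # four corners on the plane z = 1

def place_in_world(points_cam, R, t):
    """Arrange camera-coordinate points in the shared three-dimensional
    space, assuming (R, t) is the camera-to-world position/posture."""
    return points_cam @ R.T + t

# Example: a 640x480 camera at the origin with identity posture.
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
base = frustum_base_at_unit_depth(K, 640, 480)
base_world = place_in_world(base, np.eye(3), np.zeros(3))
```

Because the base always lies at unit depth, the texture-mapped image has the same size for every frame; this fixed scale is what causes the mismatch discussed below.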
Patent Literature 1: Japanese Laid-open Patent Publication No. 2011-159162
However, the aforementioned conventional technology has a problem in that a panorama image is not generated appropriately.
For example, in the process of generating and arranging a three-dimensional image drawing object as the camera 21c moves, the conventional technology does not generate the three-dimensional image drawing object appropriately, and thus misalignment occurs in the three-dimensional panorama image.
According to an aspect of an embodiment, an information processing device includes a processor that executes a process including: acquiring information on a characteristic point of a subject, which is included in image information captured by a photographing device, and position/posture information of the photographing device; computing a distance from the photographing device to the subject based on the information on the characteristic point and the position/posture information; and changing a scale of image information that is arranged to generate a panorama image in accordance with the distance.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention, as claimed.
Preferred embodiments of the present invention will be explained with reference to accompanying drawings. It is not intended that this invention be limited to the embodiment described below.
A referential technology that generates a three-dimensional panorama image will be explained before explanation of the present embodiment. The referential technology to be explained hereinafter is not a conventional technology.
The scales of the three-dimensional image drawing objects 10a to 10d generated by the referential technology are decided on the basis of an internal parameter of the camera, and once decided, the scale is fixed. Therefore, when the three-dimensional image drawing objects 10a to 10d are arranged with the scales fixed, as in the referential technology, every image is arranged in a certain size at the position away from the camera along the eye direction by a unit length of the three-dimensional space. As a result, image mismatches occur among the three-dimensional image drawing objects 10a to 10d and the three-dimensional panorama image is not generated appropriately, so that the exact grasping of the working field by a remote supporter may be difficult.
In the referential technology, when a worker continues capturing only by rotational motion while suppressing translational motion of the camera, or performs capturing so that the distance from the camera to a capturing target matches a unit length of the three-dimensional space, the error of the three-dimensional panorama image can be reduced. However, this method places a large burden on the worker.
Next, a configuration of a system according to the present embodiment will be explained.
The worker terminal 100 is a terminal device that is worn by a worker who works in a working field.
The communication unit 110 is a processing unit that executes data communication with the remote support device 200 via the network 70. The communication unit 110 corresponds to, for example, a communication device. The controller 150 to be mentioned later exchanges information with the remote support device 200 via the communication unit 110.
The camera 120 is a camera to be worn by a worker. The camera 120 is connected to the worker terminal 100 by wireless communication or the like. The camera 120 is a compact camera such as a Head Mounted Camera (HMC) or a wearable Charge Coupled Device (CCD) camera. The camera 120 captures an image of its capturing region, and outputs information on the captured image to the worker terminal 100.
The display device 130 is a display device that displays information output from the controller 150. The display device 130 is a wearable display device, such as a Head Mounted Display (HMD), that can also input and output audio. For example, the display device 130 displays instruction information and the like from an indicator, which is transmitted from the remote support device 200.
The storage 140 stores characteristic point mapping information 141, position/posture information 142, image information 143, and geometric distance information 144. The storage 140 corresponds to a semiconductor memory element such as a Random Access Memory (RAM), a Read Only Memory (ROM), or a flash memory, and a storage device such as a Hard Disk Drive (HDD).
The characteristic point mapping information 141 is information in which each of a plurality of characteristic points included in the image information captured by the camera 120 is associated with the three-dimensional coordinate of that characteristic point.
The position/posture information 142 is information that indicates the position and the posture of the camera 120 at the timing when the camera 120 captures the image information 143.
The image information 143 is information on images captured by the camera 120. The geometric distance information 144 indicates the distance from the camera 120 to a subject.
The controller 150 includes an acquiring unit 151, a computing unit 152, a transmitting unit 153, and a display device controller 154. The controller 150 corresponds to an integrated device such as an Application Specific Integrated Circuit (ASIC) or a Field Programmable Gate Array (FPGA). Alternatively, the controller 150 corresponds to an electronic circuit such as a Central Processing Unit (CPU) or a Micro Processing Unit (MPU).
The acquiring unit 151 is a processing unit that acquires information on the characteristic points of a subject included in the image information captured by the camera 120, and position/posture information of the camera 120. The acquiring unit 151 registers the information on the characteristic points of the subject in the characteristic point mapping information 141. The acquiring unit 151 stores the position/posture information of the camera 120 in the storage 140 as the position/posture information 142. The acquiring unit 151 stores the image information captured by the camera 120 in the storage 140 as the image information 143.
One example of a process by which the acquiring unit 151 calculates the information on characteristic points will be explained. The acquiring unit 151 acquires image information from the camera 120 and extracts characteristic points from the image information. For example, the acquiring unit 151 executes an edge detecting process on the image information to extract the characteristic points.
The acquiring unit 151 compares the characteristic points of first image information captured by the camera 120 at time T1 with the characteristic points of second image information captured by the camera 120 at time T2 to associate identical characteristic points with each other. The acquiring unit 151 compares the characteristic amounts of the characteristic points and determines the combination of characteristic points whose difference in characteristic amount is minimum to be the same characteristic point. The characteristic amount of a characteristic point corresponds to the brightness distribution, the edge intensity, and the like around the characteristic point.
The acquiring unit 151 calculates the three-dimensional coordinate of a characteristic point on the basis of the coordinates of the same characteristic point included in the first image information and the second image information, and the principle of stereo matching. The acquiring unit 151 repeatedly executes the aforementioned process for each characteristic point, and calculates the three-dimensional coordinate of each characteristic point to acquire the information on the characteristic points.
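As a concrete illustration of the extraction, association, and stereo computation described above, the sketch below uses OpenCV's ORB features as a stand-in for the edge-based extraction (the exact extractor is not specified in this text); the 3x4 projection matrices P1 and P2 for times T1 and T2 are assumed to be known.

```python
import numpy as np
import cv2

def feature_points_3d(img1, img2, P1, P2):
    """Extract characteristic points, associate the pairs whose
    characteristic-amount difference is minimum, and triangulate."""
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    # Brute-force matching picks, for each descriptor, the counterpart
    # with the minimum difference (Hamming distance for ORB).
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches]).T  # 2xN
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches]).T  # 2xN
    # Principle of stereo matching: triangulate each associated pair.
    pts4d = cv2.triangulatePoints(P1, P2, pts1, pts2)
    return (pts4d[:3] / pts4d[3]).T  # Nx3 three-dimensional coordinates
```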
One example of a process by which the acquiring unit 151 calculates the position/posture information on the camera 120 will be explained. The acquiring unit 151 may estimate the position/posture information on the camera by a monocular SLAM function. For example, the acquiring unit 151 converts, on the basis of a conversion table, the three-dimensional coordinate of each characteristic point in the characteristic point mapping information 141 to a two-dimensional coordinate, thereby projecting each characteristic point onto the present image information captured by the camera 120. In the conversion table, the two-dimensional coordinate acquired from the three-dimensional coordinate of a characteristic point differs in accordance with the position/posture information of the camera 120.
The acquiring unit 151 searches for the position/posture information of the camera 120 that minimizes the error between each characteristic point on the image information and the corresponding projected characteristic point. The acquiring unit 151 adopts the position/posture information that minimizes the error as the present position/posture information on the camera 120.
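This pose search can be sketched as a nonlinear least-squares problem over the reprojection error; the formulation below is an assumption consistent with the description above, not the patent's exact procedure (OpenCV's cv2.solvePnP solves the same problem directly). The Rodrigues rotation-vector parameterization is an implementation choice of this sketch.

```python
import numpy as np
import cv2
from scipy.optimize import least_squares

def estimate_pose(map_pts3d, obs_pts2d, K, rvec0, tvec0):
    """Search the position/posture (rvec, tvec) that minimizes the error
    between observed characteristic points and projected map points."""
    def reprojection_error(x):
        rvec, tvec = x[:3], x[3:]
        proj, _ = cv2.projectPoints(map_pts3d, rvec, tvec, K, None)
        return (proj.reshape(-1, 2) - obs_pts2d).ravel()
    x0 = np.hstack([rvec0, tvec0])       # start from the previous pose
    result = least_squares(reprojection_error, x0)
    return result.x[:3], result.x[3:]    # rotation vector, translation
```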
The computing unit 152 is a processing unit that computes a geometric distance from the camera 120 to the subject on the basis of the characteristic point mapping information 141 and the position/posture information 142. The computing unit 152 writes information on the geometric distance as the geometric distance information 144. The computing unit 152 stores the geometric distance information 144 in the storage 140.
One example of a process by which the computing unit 152 computes the geometric distance will be explained. The set of the characteristic points included in the characteristic point mapping information 141 may be referred to as the “set P”. The computing unit 152 extracts, from the set “P”, a set “P′⊂P” of the characteristic points that can be observed by the camera 120.
The computing unit 152 may extract the characteristic points of the characteristic point mapping information 141 that are respectively associated with the characteristic points of the image information captured by the camera 120 as the set “P′”. In the following explanation of the computing unit 152, the image information captured by the camera 120 may be referred to as “image information C”. Alternatively, the computing unit 152 may project the set “P” onto the image information C and extract, as the set “P′”, the set of characteristic points whose projections exist inside the image information C.
The computing unit 152 computes the geometric distance “d” on the basis of a formula (1). In the formula (1), “MC” is the three-dimensional coordinate of the camera 120. The three-dimensional coordinate of the camera 120 is included in the position/posture information 142. “MP′” is a representative three-dimensional coordinate that is calculated from the three-dimensional coordinates of the characteristic points of the set “P′”. “MC” and “MP′” are expressed in the same coordinate system.
d=|MC−MP′| (1)
One example of “MP′” in the set “P′” will be explained. For example, the computing unit 152 may use the centroid of the three-dimensional coordinates of the characteristic points in the set “P′” as “MP′”.
The computing unit 152 may compute the geometric distance “d” every time it acquires image information from the camera 120, or at a previously set frequency.
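A short sketch of formula (1) follows. The representative coordinate “MP′” is taken here as the centroid of the observable characteristic points, one natural choice consistent with the description above; the dictionary-based map structure is an assumption of the sketch.

```python
import numpy as np

def geometric_distance(camera_pos, feature_map, observed_ids):
    """camera_pos: MC as a 3-vector; feature_map: {point id: 3D coordinate};
    observed_ids: ids of the characteristic points in the set P'."""
    p_prime = np.array([feature_map[i] for i in observed_ids])  # set P'
    m_p = p_prime.mean(axis=0)            # representative coordinate MP'
    return float(np.linalg.norm(camera_pos - m_p))  # d = |MC - MP'|
```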
The transmitting unit 153 is a processing unit that transmits, to the remote support device 200, an image frame that includes the characteristic point mapping information 141, the position/posture information 142, the image information 143, and the geometric distance information 144 stored in the storage 140. For example, the transmitting unit 153 transmits the image frame to the remote support device 200 every time the position/posture information 142 and the geometric distance information 144 are updated on the basis of updated image information 143. The transmitting unit 153 may also store information on an internal parameter of the camera 120 in the image frame.
The display device controller 154 is a processing unit that, when it receives instruction information from the remote support device 200, causes the display device 130 to display the received instruction information.
The remote support device 200 is a device that receives image frames from the worker terminal 100 to generate a three-dimensional panorama image. An indicator who uses the remote support device 200 refers to the three-dimensional panorama image and the like to grasp the situation of the field.
The communication unit 210 is a processing unit that executes data communication with the worker terminal 100 via the network 70. The communication unit 210 corresponds to, for example, a communication device. The controller 250 to be mentioned later exchanges information with the worker terminal 100 via the communication unit 210.
The input unit 220 is an input device that is for inputting various kinds of information to the remote support device 200. The input unit 220 corresponds to, for example, a keyboard, a mouse, a touch panel, etc. The indicator operates the input unit 220 to input various kinds of instruction information.
The display device 230 is a display device that displays information output from the controller 250. For example, the display device 230 displays information on a three-dimensional panorama image, which is output from the controller 250. The display device 230 corresponds to, for example, a liquid crystal display, a touch panel, etc.
The storage 240 includes a management table 241 and a panorama image table 242. The storage 240 corresponds to a semiconductor memory element such as a RAM, a ROM, or a flash memory, and a storage device such as a HDD.
The management table 241 is a table that stores an image frame transmitted from the worker terminal 100. As described above, the image frame includes the characteristic point mapping information 141, the position/posture information 142, the image information 143, and the geometric distance information 144.
The panorama image table 242 is a table that holds information on a plurality of three-dimensional image drawing objects that constitute a three-dimensional panorama image.
For example, a three-dimensional image drawing object A16 is generated on the basis of the record with the record number “R1001” in the management table 241. A three-dimensional image drawing object A26 is generated on the basis of the record with the record number “R1002” in the management table 241. A three-dimensional image drawing object A36 is generated on the basis of the record with the record number “R1003” in the management table 241. Although not illustrated here, the other three-dimensional image drawing objects are similarly associated with records in the management table 241.
The controller 250 includes a receiving unit 251, the panorama image generating unit 252, and a transmitting unit 253. The controller 250 corresponds to an integrated device such as an ASIC or an FPGA. Alternatively, the controller 250 corresponds to an electronic circuit such as a CPU or an MPU. The panorama image generating unit 252 is one example of the controller 250.
The receiving unit 251 is a processing unit that receives image frames from the worker terminal 100. Every time it receives an image frame, the receiving unit 251 associates the characteristic point mapping information, the position/posture information, the image information, and the geometric distance information included in the received image frame with a record number, and stores them in the management table 241.
The panorama image generating unit 252 calculates scale information on the basis of the geometric distance information of each record number stored in the management table 241. The panorama image generating unit 252 generates a plurality of three-dimensional image drawing objects on the basis of each piece of calculated scale information, and arranges the plurality of three-dimensional image drawing objects on the basis of the position/posture information to generate a three-dimensional panorama image. The panorama image generating unit 252 outputs the information on the three-dimensional panorama image to the display device 230 to display it.
One example of a process by which the panorama image generating unit 252 calculates the scale information on the basis of the geometric distance information will be explained. The panorama image generating unit 252 generates a frustum-shaped reference object on the basis of an internal parameter of the camera 120. The internal parameter of the camera 120 is expressed by a formula (2). The panorama image generating unit 252 is assumed to preliminarily acquire information on an internal parameter “K” from the worker terminal 100.
In the formula (2), “fx” and “fy” express a focal distance of the camera 120. For example, “fx” expresses the focal distance in the x-direction based on the position of the camera 120, and “fy” expresses the focal distance in the y-direction based on the position of the camera 120. Moreover, “s” expresses a skew of the camera 120, “α” expresses an aspect ratio of the camera 120, and “cx” and “cy” are coordinates that express the center of an image that is captured by the camera 120.
The panorama image generating unit 252 calculates an aspect ratio “r” of a reference object on the basis of a formula (3). The panorama image generating unit 252 derives an angle of view θ of a reference object on the basis of a formula (4).
The panorama image generating unit 252 generates the reference object 30, and then calculates scale information for each piece of geometric distance information. The panorama image generating unit 252 converts the unit length “da” to the geometric distance while keeping the aspect ratio “r” and the angle of view “θ” of the reference object 30 constant, thereby deforming the scale of the reference object 30. The reference object 30 whose scale has been changed is defined as the scale information.
As described above, the panorama image generating unit 252 enlarges or reduces the reference object on the basis of the geometric distance information to generate the object. The panorama image generating unit 252 stores information on the object, which includes the angle of view “θ”, the aspect ratio “r”, and the geometric distance information on the object, in the management table 241 as the scale information. The panorama image generating unit 252 executes texture mapping of the image information on the bottom face of the object to generate a three-dimensional image drawing object, and stores it in the panorama image table 242.
The panorama image generating unit 252 repeatedly executes the aforementioned process with regard to information on each record number in the management table 241, and thus generates a three-dimensional image drawing object whose scale is different for each piece of geometric distance information. The panorama image generating unit 252 stores each three-dimensional image drawing object in the panorama image table 242.
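The scale conversion can be sketched as below. Formulas (2) to (4) are not reproduced in this text, so standard pinhole relations are assumed here: the angle of view is derived from the focal distance and the image center, and the base size follows from stretching the frustum depth from the unit length to the geometric distance “d” while “r” and “θ” stay constant. The exact formulas may differ.

```python
import numpy as np

def scaled_object(fx, fy, cx, cy, d):
    """Deform the reference object to depth d while keeping the aspect
    ratio r and the angle of view theta constant (assumed relations)."""
    theta = 2.0 * np.arctan(cy / fy)   # vertical angle of view (assumed form)
    r = (cx / fx) / (cy / fy)          # aspect ratio of the base (assumed form)
    half_h = d * np.tan(theta / 2.0)   # base half-height grows with d
    half_w = r * half_h                # base half-width keeps the ratio r
    return {"depth": d, "base_width": 2 * half_w, "base_height": 2 * half_h}

# Example: the same camera yields a larger drawing object for a farther subject.
near = scaled_object(500.0, 500.0, 320.0, 240.0, d=1.0)
far = scaled_object(500.0, 500.0, 320.0, 240.0, d=3.0)  # base is 3x larger
```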
In the aforementioned example, the panorama image generating unit 252 converts the scale of the reference object by directly using the geometric distance notified from the worker terminal 100; however, the process is not limited thereto. For example, the panorama image generating unit 252 may calculate an optimum value of each geometric distance as described hereinafter, and may generate the three-dimensional image drawing object by using the optimum value of the geometric distance.
Step S11 will be explained. For convenience of explanation, attention is focused on a characteristic point “P6”, referred to as “pi”. A position “p′A” is the position at which a straight line passing through the characteristic point “pi” and the position 32A intersects with the image 31A. A position “p′B” is the position at which a straight line passing through the characteristic point “pi” and the position 32B intersects with the image 31B. A position “p′C” is the position at which a straight line passing through the characteristic point “pi” and the position 32C intersects with the image 31C. For example, the error between “p′A”, “p′B”, and “p′C” is expressed as “εi”. The panorama image generating unit 252 similarly specifies the error “ε” with regard to the other characteristic points “p1” to “p5”, “p7”, and “p8”.
Step S12 will be explained. The panorama image generating unit 252 searches for geometric distance values that minimize the total value “E(s)” of the errors “ε” to specify new scale information on the three-dimensional image drawing objects 30A to 30C.
With regard to a geometric distance set “D={di}” of a camera set “Γ={Ci} (i=1, 2, etc.)”, a method for deriving the optimum geometric distance set, for example the set that minimizes the error within the three-dimensional panorama image, will be explained hereinafter. Here, “Ci” corresponds to each position of the camera 120. The geometric distance set is defined by a formula (5).
D={di} (5)
The panorama image generating unit 252 extracts, with regard to each characteristic point “pj∈P (j=1, 2, etc.)” in the three-dimensional characteristic point map, a camera set “Γj⊂Γ” by which the characteristic point “pj” can be observed. The panorama image generating unit 252 may extract the set of cameras that have two-dimensional characteristic points respectively corresponding to the characteristic point “pj”, or may extract the set of cameras in which the projected point of the characteristic point “pj” exists in the image.
With regard to each three-dimensional point “pj∈P” of the three-dimensional characteristic point map, an error “εj(Dj)” relating to the geometric distance set “Dj⊂D”, which corresponds to the camera set “Γj”, is defined.
The definition of the error “εj(Dj)” based on the dispersion of the three-dimensional projected points will be explained as an example. First, consider the three-dimensional image drawing object of “Ci” that is deformed by the geometric distance “di∈Dj” corresponding to the camera “Ci∈Γj”. This three-dimensional image drawing object corresponds to the three-dimensional image drawing objects 30A to 30C and the like described above.
“Ri” and “ti” in the formulas (6) and (8) are the rotation matrix and the translation vector for converting the world coordinate system to the coordinate system of the camera “Ci”. Herein, “xi,j” is the two-dimensional projected point of “pj” for the normalized camera of the camera “Ci”. The normalized camera of the camera “Ci” is a camera in which the rotation matrix “Ri” and the internal parameter are third-order unit matrices and the translation vector “ti” is a zero vector. “Xi,j” in the formulas (7) and (8) is the three-dimensional coordinate of “p′i,j” in the coordinate system of the camera “Ci”. “RiT” is the transpose of “Ri”, and “W” is an arbitrary real number.
Next, the panorama image generating unit 252 calculates the dispersion “εj(Dj)” of the three-dimensional projected points “p′i,j” in the camera set “Γj” by formulas (9) and (10). The formula (9) derives the average of the three-dimensional projected points “p′i,j” in the camera set “Γj”.
Next, the sum of the errors “εj(Dj)” over all “pj∈P” is defined, as in a formula (11), as an energy function “E(D)” relating to the geometric distance set “D”. The panorama image generating unit 252 derives the geometric distance set that minimizes the energy function “E(D)”, as expressed by a formula (12).
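The following sketch shows one way the minimization of formulas (9) to (12) could be realized. Since formulas (6) to (10) are not reproduced in this text, the projection convention x_cam = R·x_world + t and the back-projection p′i,j = RiT(di·xi,j − ti) are assumptions consistent with the description above.

```python
import numpy as np
from scipy.optimize import minimize

def energy(D, cams, points, visible):
    """E(D): sum over map points pj of the dispersion eps_j(D_j) of the
    three-dimensional projected points p'_ij (formulas (9)-(11))."""
    total = 0.0
    for j, pj in enumerate(points):
        projected = []
        for i in visible[j]:                  # camera set Gamma_j
            R, t = cams[i]
            x = R @ pj + t                    # camera coordinates of pj
            x = x / x[2]                      # normalized projection x_ij
            X = D[i] * x                      # on the object deformed by d_i
            projected.append(R.T @ (X - t))   # world-coordinate point p'_ij
        projected = np.array(projected)
        total += np.sum((projected - projected.mean(axis=0)) ** 2)
    return total

def optimal_distances(D0, cams, points, visible):
    """Formula (12): the geometric distance set minimizing E(D)."""
    return minimize(energy, D0, args=(cams, points, visible)).x
```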
After deriving the geometric distance set, the panorama image generating unit 252 updates each piece of geometric distance information in the management table 241. Moreover, the panorama image generating unit 252 updates the scale information in the management table 241 and the three-dimensional image drawing objects in the panorama image table 242 on the basis of the updated geometric distance information.
The panorama image generating unit 252 arranges each of the three-dimensional image drawing objects in the panorama image table 242 on the basis of the position/posture information to generate a three-dimensional panorama image. For example, the panorama image generating unit 252 arranges the three-dimensional image drawing object A16 on the basis of the position/posture information A12. The panorama image generating unit 252 arranges the three-dimensional image drawing object A26 on the basis of the position/posture information A22. The panorama image generating unit 252 arranges the three-dimensional image drawing object A36 on the basis of the position/posture information A32. Similarly, the panorama image generating unit 252 arranges another three-dimensional image drawing object on the basis of the corresponding position/posture information. The panorama image generating unit 252 outputs information on the generated three-dimensional panorama image, and causes the display device 230 to display it.
The transmitting unit 253 is a processing unit that transmits the instruction information that is input by an indicator via the input unit 220, etc. to the worker terminal 100.
Next, one example of a processing procedure for the system according to the present embodiment will be explained.
The acquiring unit 151 associates a characteristic point in the previous image information with a characteristic point in the present image information (Step S103). The acquiring unit 151 estimates the position/posture of the camera 120 on the basis of the result of the association (Step S104). The acquiring unit 151 updates the characteristic point mapping information 141 and the position/posture information (Step S105).
The computing unit 152 of the worker terminal 100 computes a geometric distance on the basis of the characteristic point mapping information 141 and the position/posture information 142 (Step S106). The transmitting unit 153 of the worker terminal 100 transmits an image frame to the remote support device 200 (Step S107), and shifts to Step S101. For example, the image frame includes the characteristic point mapping information 141, the position/posture information 142, the image information 143, the geometric distance information 144, an internal parameter of the camera 120, etc.
The panorama image generating unit 252 of the remote support device 200 generates a reference object on the basis of an internal parameter of the camera 120 (Step S202). The panorama image generating unit 252 deforms the reference object on the basis of the geometric distance, and generates scale information (Step S203).
The panorama image generating unit 252 generates a three-dimensional image drawing object on the basis of each piece of the scale information (Step S204). The panorama image generating unit 252 arranges the three-dimensional image drawing object on the basis of each piece of the position/posture information to generate a three-dimensional panorama image (Step S205), and shifts to Step S201.
Next, effects of the system according to the present embodiment will be explained. The worker terminal 100 acquires information on the characteristic points included in the image information captured by the camera 120 and the position/posture information on the camera 120, computes the geometric distance from the camera 120 to a subject by using the acquired information, and notifies the remote support device 200 of the distance. The remote support device 200 changes the scale of the image information that is arranged in order to generate a three-dimensional panorama image in accordance with the geometric distance. For example, the remote support device 200 adjusts the scale of the image information in accordance with the geometric distance, and arranges the image information whose scale has been adjusted to generate the panorama image. Therefore, by employing the present embodiment, the three-dimensional panorama image can be generated more appropriately than with the conventional technology, the referential technology, and the like.
One example of the three-dimensional panorama image generated by the present embodiment and that generated by the referential technology will be explained.
The system according to the present embodiment changes the unit length of the reference object in accordance with the distance from the camera 120 to a subject while keeping the aspect ratio and the angle of view of the reference object constant, and adjusts the scale of the image information in accordance with the object that represents the changed reference object. Therefore, the three-dimensional panorama image can be generated efficiently.
The system according to the present embodiment arranges the respective three-dimensional image drawing objects, and adjusts the scale of each of the three-dimensional image drawing objects so that the error in the position of the same characteristic point included in each piece of the image information is minimized. Therefore, the occurrence of misalignment within the three-dimensional panorama image can be further reduced.
In the aforementioned embodiment, the process of generating the three-dimensional panorama image is shared by the worker terminal 100 and the remote support device 200; however, the configuration is not limited thereto. For example, the processing units that generate a three-dimensional panorama image may be consolidated in the worker terminal 100 or the remote support device 200.
For example, the panorama image generating unit 252 may be further arranged in the worker terminal 100 so that the worker terminal 100 generates the three-dimensional panorama image. Alternatively, the acquiring unit 151 and the computing unit 152 may be further arranged in the remote support device 200 so that the remote support device 200 generates the three-dimensional panorama image. The worker terminal 100 that further includes the panorama image generating unit 252, or the remote support device 200 that further includes the acquiring unit 151 and the computing unit 152, is one example of an information processing device.
The case in which the acquiring unit 151 of the worker terminal 100 according to the present embodiment calculates the three-dimensional coordinate of a characteristic point on the basis of the principle of stereo matching by using image information captured at different times has been explained; however, the process is not limited thereto. For example, the acquiring unit 151 may specify the three-dimensional coordinate of the characteristic point by using a distance sensor. The acquiring unit 151 may calculate the three-dimensional position of a characteristic point on the basis of the velocity of light and the time during which light emitted from the distance sensor toward the characteristic point is reflected by the characteristic point and returns to the distance sensor.
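For reference, the time-of-flight relation implied above is simple: the light travels to the characteristic point and back, so the one-way distance is half the round-trip time multiplied by the velocity of light.

```python
C = 299_792_458.0  # velocity of light [m/s]

def tof_distance(round_trip_seconds):
    """One-way distance from the round-trip time of the reflected light."""
    return C * round_trip_seconds / 2.0

# Example: a 10-nanosecond round trip corresponds to roughly 1.5 meters.
assert abs(tof_distance(10e-9) - 1.499) < 0.001
```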
Next, one example of a computer will be explained, which executes an information processing program that realizes the same functions as those of the worker terminal 100 and the remote support device 200 described in the aforementioned embodiment.
The hard disk drive 307 includes an acquiring program 307a, a calculating program 307b, and a generating program 307c. The CPU 301 reads the acquiring program 307a, the calculating program 307b, and the generating program 307c to expand them into the RAM 306.
The acquiring program 307a functions as the acquiring process 306a. The calculating program 307b functions as a calculating process 306b. The generating program 307c functions as a generating process 306c.
The process of the acquiring process 306a corresponds to the process of the acquiring unit 151. The process of the calculating process 306b corresponds to the process of the computing unit 152. The process of the generating process 306c corresponds to the process of the panorama image generating unit 252.
The acquiring program 307a, the calculating program 307b, and the generating program 307c need not be stored in the hard disk drive 307 in advance. For example, each program may be stored in a “portable physical medium” such as a flexible disk (FD), a Compact Disc-ROM (CD-ROM), a Digital Versatile Disc (DVD), a magneto-optical disk, or an Integrated Circuit card (IC card) that is inserted into the computer 300, and the computer 300 may read and execute each of the programs 307a to 307c.
According to an aspect of the embodiment, a panorama image can be appropriately generated.
All examples and conditional language recited herein are intended for pedagogical purposes of aiding the reader in understanding the invention and the concepts contributed by the inventor to further the art, and are not to be construed as limitations to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although the embodiment of the present invention has been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.