This patent application is based on and claims priority pursuant to 35 U.S.C. § 119(a) to Japanese Patent Application No. 2017-205739, filed on Oct. 25, 2017, in the Japan Patent Office, the entire disclosure of which is hereby incorporated by reference herein.
The present invention relates to an image processing apparatus, an imaging system, a communication system, an image processing method, and a recording medium.
For example, when an accident involving a vehicle (such as a car or an aircraft) occurs, images taken by a drive recorder are collected and used to investigate a possible cause of the accident or to take countermeasures. In the case of a large-size vehicle, sensors can be installed in various parts of the vehicle to obtain visual information around the vehicle body. In the case of a small-size vehicle, however, the places where a sensor can be attached are limited. It is thus desirable to have a compact imaging system capable of obtaining an image of the surroundings of the vehicle.
Example embodiments of the present invention include an image processing apparatus for processing a plurality of images captured by an image capturing device, the image capturing device including a plurality of imaging elements each of which captures an imaging area with a preset angle of view, imaging areas of at least two of the plurality of imaging elements overlapping with each other. The image processing apparatus includes: circuitry to: obtain the plurality of images captured by the image capturing device; convert at least one image of the plurality of images, to an image having an angle of view that is smaller than the preset angle of view; and combine the plurality of images including the at least one image that is converted, into a combined image.
Example embodiments of the present invention include an imaging system including the image processing apparatus and the image capturing device.
Example embodiments of the present invention include a communication system including the image processing apparatus and a communication terminal.
Example embodiments of the present invention include an image processing method performed by the image processing apparatus, and a recording medium storing a control program for performing the image processing method.
A more complete appreciation of the disclosure and many of the attendant advantages and features thereof can be readily obtained and understood from the following detailed description with reference to the accompanying drawings, wherein:
The accompanying drawings are intended to depict embodiments of the present invention and should not be interpreted to limit the scope thereof. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
In describing embodiments illustrated in the drawings, specific terminology is employed for the sake of clarity. However, the disclosure of this specification is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that have a similar function, operate in a similar manner, and achieve a similar result.
Referring to the drawings, embodiments of the present invention are described.
<<<Overall Configuration>>>
The terminal 10 is an information processing apparatus having a communication function and is, for example, a smart device such as a tablet, a smartphone, or a single board computer, or an information processing apparatus such as a personal computer (PC). Hereinafter, the case where the terminal 10A is a smart device and the terminal 10B is a PC will be described.
The imaging system 20, which is mounted on a mobile body 20M, is an information processing system having an image capturing function, an image processing function, and a communication function. The mobile body 20M is exemplified by, but not limited to, an automobile such as a construction machine, a forklift, a truck, a passenger car, or a two-wheeled vehicle, or a flying object such as a drone, a helicopter, or a small-size airplane.
The management system 50 is an information processing apparatus having a communication function and an image processing function. The management system 50 functions as a Web server, which transmits images to the terminal 10 in response to a request from the terminal 10 for display at the terminal 10.
The terminal 10A and the imaging system 20 are connected by wireless communication in compliance with Wireless Fidelity (Wi-Fi) or Bluetooth (Registered Trademark), or by wired communication via a Universal Serial Bus (USB) cable or the like. The terminal 10A is connected to the Internet 2I via Wi-Fi, a wireless local area network (LAN), or the like, or via a base station. With this configuration, the terminal 10A establishes communication with the management system 50 on the Internet 2I. Further, the terminal 10B is connected to the Internet 2I via a LAN. With this configuration, the terminal 10B establishes communication with the management system 50. Hereinafter, the Internet 2I, the LAN, and various wired and wireless communication paths are collectively referred to as a communication network 2.
<<Imaging System>>
The cameras 21A and 21B include various components such as imaging units 211A and 211B, and batteries, which are respectively accommodated in housings 219A and 219B. The camera 21A further includes a controller 215. Hereinafter, any arbitrary one of the imaging units 211A and 211B is referred to as the imaging unit 211.
The imaging units 211A and 211B include, respectively, imaging optical systems 212A and 212B, and imaging elements 213A and 213B such as a Charge Coupled Device (CCD) sensor or a Complementary Metal Oxide Semiconductor (CMOS) sensor. Hereinafter, any arbitrary one of the imaging elements 213A and 213B is referred to as the imaging element 213. Each of the imaging optical systems 212A and 212B is, for example, a fisheye lens composed of seven lens elements in six groups. The fisheye lens has a full angle of view larger than 180° (=360°/n, where n=2 is the number of optical systems). Preferably, the fisheye lens has an angle of view of 185° or larger, and more preferably, 190° or larger. The set of the imaging optical system 212A and the imaging element 213A, and the set of the imaging optical system 212B and the imaging element 213B, are each referred to as a wide-angle imaging optical system.
The cameras 21A and 21B are respectively secured to a holding plate 221 of the holder 22 by two of a plurality of screws 222. It is desirable that the holding plate 221 be rigid enough that it hardly deforms under external forces. The holding plate 221 is attached to a hook 223 by one of the screws 222 other than the screws securing the cameras 21A and 21B. The hook 223, which is an example of a mounting part, may have any shape as long as it can be attached to a desired location on the mobile body 20M.
The optical elements (the lenses, prisms, filters, and aperture stops) of the two imaging optical systems 212A and 212B are disposed for the respective imaging elements 213A and 213B. The positions of the optical elements of the imaging optical systems 212A and 212B are determined by the holder 22, such that the optical central axis OP passing through each of the imaging optical systems 212A and 212B is orthogonal to the center of the light receiving area of the corresponding imaging element 213A or 213B. Further, the light receiving areas of the imaging elements 213A and 213B form the imaging planes of the corresponding fisheye lenses.
In this example, the imaging optical systems 212A and 212B have substantially the same specifications, and are disposed so as to face in opposite directions such that their respective optical central axes OP coincide with each other. The imaging elements 213A and 213B each convert a distribution of the received light into image signals, and sequentially output image frames (frame data) to an image processing circuit on the controller 215. The controller 215 then transfers the images (the image frames) captured by the imaging elements 213A and 213B to the image processing board 23, at which the images are combined into an image having a solid angle of 4π steradians (hereinafter referred to as the “spherical image”). The spherical image is obtained by capturing images of all directions that can be seen from an image capturing point. A spherical video image is generated from a set of consecutive frames of spherical images. Hereinafter, a process of generating a spherical image and a spherical video image will be described. However, this process can be replaced with a process of generating a so-called panorama image and panorama video image, obtained by capturing images over 360 degrees in only the horizontal plane.
<<Hardware Configuration>>
The image processing board 23 includes a Central Processing Unit (CPU) 101, a Read Only Memory (ROM) 102, a Random Access Memory (RAM) 103, a Solid State Drive (SSD) 104, a medium interface (I/F) 105, a network I/F 107, a user I/F 108, and a bus line 110.
The CPU 101 controls the entire operation of the image processing board 23. The ROM 102 stores various programs that operate on the image processing board 23. The RAM 103 is used as a work area for the CPU 101. The SSD 104 stores data used by the CPU 101 in executing various programs. The SSD 104 can be replaced with any nonvolatile memory such as a Hard Disk Drive (HDD). The medium I/F 105 is an interface circuit for reading out information stored in a recording medium 106 such as an external memory, or writing information to the recording medium 106. The network I/F 107 is an interface circuit that enables the image processing board 23 to communicate with other devices via the communication network 2. The user I/F 108 is an interface circuit that provides image information to a user or receives operation inputs from the user. The user I/F 108 allows the image processing board 23 to connect with, for example, a liquid crystal display or an organic electroluminescence (EL) display equipped with a touch panel, or a keyboard or a mouse. The bus line 110 is an address bus or a data bus for electrically connecting the above-described elements.
Since the hardware configuration of each of the terminal 10 and the management system 50 is the same as the hardware configuration of the image processing board 23 described above, redundant description thereof is omitted.
The CPU 252 controls operation of respective elements in the camera 21. The ROM 254 stores control programs and various parameters described in codes that are interpretable by the CPU 252. The image processing block 256 is connected to the imaging elements 213A and 213B, and receives the image signals of the images captured by the imaging elements 213A and 213B. The image processing block 256 includes an Image Signal Processor (ISP) and the like, and performs shading correction, Bayer interpolation, white balance correction, gamma correction, and the like on the image signals input from the imaging elements 213A and 213B.
The video compression block 258 is a codec block that compresses or decompresses video according to a video coding standard such as MPEG-4 AVC/H.264. The DRAM 272 provides a storage area for temporarily storing data when various signal processing and image processing are applied. The sensor 276 measures a physical quantity, such as velocity, acceleration, angular velocity, angular acceleration, or magnetic direction, which results from a movement of the imaging element 213. For example, the sensor 276 may be an acceleration sensor that detects acceleration components along three axes, which are used to detect the vertical direction for performing zenith correction on the spherical image.
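For illustration only, the sketch below shows one common way a three-axis acceleration reading can be turned into tilt angles usable for such zenith correction, assuming the camera is nearly stationary so that gravity dominates the measurement; the function name and axis conventions are assumptions and do not form part of the embodiment.

```python
import math

def estimate_tilt(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Estimate roll and pitch (in radians) from a 3-axis accelerometer reading.

    Assumes the device is nearly static, so the measured acceleration is
    dominated by gravity. Axis conventions are illustrative only.
    """
    roll = math.atan2(ay, az)                    # rotation about the front-back axis
    pitch = math.atan2(-ax, math.hypot(ay, az))  # rotation about the left-right axis
    return roll, pitch

# Example: a reading with the camera tilted slightly forward
roll, pitch = estimate_tilt(0.17, 0.02, 9.79)
print(math.degrees(roll), math.degrees(pitch))
```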
The controller 215 of the camera 21 further includes an external memory I/F 262, a Universal Serial Bus (USB) I/F 266, a serial block 268, and a video output I/F 269. To the external memory I/F 262, an external memory 274 is connected. The external memory I/F 262 controls reading and writing to the external memory 274 such as a memory card inserted in a memory card slot. To the USB I/F 266, a USB connector 278 is connected. The USB I/F 266 controls USB communication with an external device such as a personal computer connected via the USB connector 278. The serial block 268 is connected with a wireless Network Interface Card (NIC) 280, and controls serial communication with an external device such as a personal computer. The video output I/F 269 is an interface for connecting the controller 215 with the image processing board 23.
While in this embodiment referring to
<<Functional Configuration>>
Next, a functional configuration of the imaging system 20 is described according to an embodiment.
When the power of the camera 21 is turned on, the control program for the camera 21 is loaded, for example, from the ROM 254 to a main memory such as the DRAM 272. The CPU 252 controls operation of each part in the camera 21 according to the program loaded into the main memory, while temporarily saving data necessary for control on the memory. Accordingly, the camera 21 performs functions and operations as described below.
The camera 21 includes an image capturing unit 2101, a video encoder 2102, an image manager 2103, and a transmitter 2109. The camera 21 further includes a storage unit 2100 implemented by the ROM 254, DRAM 272, or external memory 274.
The image capturing unit 2101, which is implemented by the imaging element 213, captures a still image or a video. The video encoder 2102, which is implemented by the video compression block 258, encodes (compresses) or decodes (decompresses) the video. The image manager 2103, which is implemented by instructions of the CPU 252, stores the image data in the memory in association with projection transformation information for management. The transmitter 2109, which is implemented by instructions of the CPU 252 and the video output I/F 269, controls communication with the image processing board 23.
The image processing board 23 includes a projection transformation information manager 2301, a conversion unit 2302, a displaying unit 2303, and a transmitter and receiver 2309. Further, the image processing board 23 includes a storage unit 2300, implemented by the ROM 102, the RAM 103, or the SSD 104.
The projection transformation information manager 2301, which is implemented by instructions of the CPU 101, manages projection transformation information of an image that is captured by the camera 21. The conversion unit 2302, which is implemented by instructions of the CPU 101, converts an angle of view of each image in a set of images, to generate a set of images each applied with projection transformation. The conversion unit 2302 then performs texture mapping with the set of images applied with projection transformation, onto a unit sphere, to generate a spherical image. The displaying unit 2303, which is implemented by instructions of the CPU 101 and a displaying function of the user I/F 108, displays the spherical image that is generated by combining the set of images. The transmitter and receiver 2309, which is implemented by instructions of the CPU 101 and the network I/F 107, controls communication with other devices.
<<Concept>>
Next, a concept used for generating a spherical image from images captured by the camera 21 will be described. First, the direction in which the imaging system 20 captures images will be described.
The direction of the camera 21 can be represented by angles of (Yaw, Pitch, Roll) with reference to a direction that the lens (imaging optical system 212A, 212B) of the camera 21A faces, which is defined as a reference direction.
The camera 21 acquires data of (Yaw, Pitch, Roll) for each imaging optical system as imaging direction data, to determine a positional relationship of each imaging optical system, and transmits the imaging direction data to the image processing board 23 together with the captured image data. Accordingly, the image processing board 23 can determine a positional relationship between the captured images (fisheye images) captured by the respective cameras 21 and convert the captured images into the spherical image.
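As a rough illustration (not the actual data format of the camera 21), imaging direction data accompanying each frame might be bundled with the image data as follows; all field names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class ImagingDirection:
    yaw: float    # degrees, rotation about the vertical axis
    pitch: float  # degrees, rotation about the lateral axis
    roll: float   # degrees, rotation about the optical central axis

@dataclass
class CapturedFrame:
    camera_id: str
    timestamp_ms: int
    jpeg_bytes: bytes
    direction: ImagingDirection
    projection: dict = field(default_factory=dict)  # e.g. focal length, angle of view

# Two lenses facing opposite directions: the rear lens is rotated 180 degrees in yaw.
front = ImagingDirection(yaw=0.0, pitch=0.0, roll=0.0)
rear = ImagingDirection(yaw=180.0, pitch=0.0, roll=0.0)
```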
Next, conversion from a fisheye image to a spherical image will be described. The fisheye lens in this embodiment is based, for example, on the equidistant projection method, in which the image height h of a formed image is determined by the focal length f and the incident angle φ with respect to the optical central axis, as expressed by the following equation (1):

h=f×φ (1)
Other projection models include the central projection method (h=f·tan φ), the stereographic projection method (h=2f·tan(φ/2)), the equisolid angle projection method (h=2f·sin(φ/2)), and the orthographic projection method (h=f·sin φ). In any of these methods, the image height h of a formed image is determined based on the incident angle φ with respect to the optical central axis and the focal length f. Further, in the present embodiment, it is assumed that a so-called circumferential fisheye lens, having an image circle diameter that is smaller than the image diagonal, is adopted. Here, the image diagonal refers to the diagonal of the rectangular light receiving area of the imaging element 213.
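For reference, the projection models listed above can be written as small functions that return the image height h for a focal length f and an incident angle φ given in radians; this is a generic sketch, not code from the embodiment.

```python
import math

def equidistant(f: float, phi: float) -> float:      # equation (1): h = f * phi
    return f * phi

def central(f: float, phi: float) -> float:          # h = f * tan(phi)
    return f * math.tan(phi)

def stereographic(f: float, phi: float) -> float:    # h = 2f * tan(phi / 2)
    return 2 * f * math.tan(phi / 2)

def equisolid_angle(f: float, phi: float) -> float:  # h = 2f * sin(phi / 2)
    return 2 * f * math.sin(phi / 2)

def orthographic(f: float, phi: float) -> float:     # h = f * sin(phi)
    return f * math.sin(phi)

# In every model, the image height h grows with the incident angle phi and the focal length f.
```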
The description of converting from the fisheye image to the spherical image continues.
The three-dimensional coordinates (x, y, z) of a point P′ on the unit sphere, corresponding to a pixel of the fisheye image having an incident angle φ and an azimuth angle a, are calculated by the following equations (2-1) to (2-3):

x=sin φ×cos a (2-1)

y=sin φ×sin a (2-2)

z=cos φ (2-3)
The coordinates of the point P′ calculated by the equations (2-1) to (2-3) are further rotated using the imaging direction data, to correspond to a direction that the camera 21 was facing during image capturing. Accordingly, the directions defined in
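A minimal sketch of equations (2-1) to (2-3), followed by a rotation that applies the imaging direction data, is shown below. Because equations (3-1) to (3-3) are not reproduced in this text, the yaw-pitch-roll order and axis conventions of the rotation are assumptions for illustration only.

```python
import numpy as np

def to_unit_sphere(phi: float, a: float) -> np.ndarray:
    """Equations (2-1) to (2-3): incident angle phi and azimuth a to (x, y, z)."""
    return np.array([
        np.sin(phi) * np.cos(a),  # (2-1)
        np.sin(phi) * np.sin(a),  # (2-2)
        np.cos(phi),              # (2-3)
    ])

def rotate(p: np.ndarray, yaw: float, pitch: float, roll: float) -> np.ndarray:
    """Rotate the point to match the direction the camera faced (angles in radians)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about the z axis
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about the y axis
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about the x axis
    return Rz @ Ry @ Rx @ p

# Example: a point seen by the rear-facing lens, rotated 180 degrees in yaw.
p = to_unit_sphere(np.radians(30), np.radians(45))
p_rear = rotate(p, yaw=np.pi, pitch=0.0, roll=0.0)
```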
From the equations (3-1) to (3-3), the following equation (4) is obtained. By using the equation (4), the fisheye image is applied with perspective projection transformation according to the imaging direction.
The hemispherical images captured by the imaging elements 213A and 213B each have an angle of view θ that is wider than 180°. To convert each hemispherical image into an image having an angle of view of 180°, the relationship between the image height h and the incident angle φ is expressed by the following equation (5), in place of equation (1):

h=f×φ×180°/θ (5)
As described above, using the above-described equations, a table for converting from the fisheye images to a three-dimensional spherical image can be created. In the present embodiment, in order to perform such conversion on the image processing board 23, projection transformation information such as imaging direction data and projection transformation data is added to the image data as supplementary data, and transmitted from the camera 21 to the image processing board 23.
<<Processing>>
Next, processing performed by the imaging system 20 will be described. First, a process of transmitting image data from the camera 21 to the image processing board 23 will be described.
As the cameras 21A and 21B of the imaging system 20 start capturing images, the image capturing unit 2101 stores, in the storage unit 2100, the image data of each captured fisheye image, the imaging direction data indicating the imaging direction, and the projection transformation data. Hereinafter, a case where the captured image is a video will be described. The image data includes frame data of a video encoded by the video encoder 2102. The image manager 2103 associates the stored image data with the projection transformation information (imaging direction data and projection transformation data) (S11). This association is performed, for example, based on a time when each piece of data was stored, or based on a flag that may be added at the time of capturing the image.
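One simple way to realize the association in S11 is to key both the frame data and the projection transformation information by capture time (or by a shared flag) and pair the closest entries; the record structures and the tolerance below are illustrative assumptions.

```python
def associate(frames: list[dict], metadata: list[dict],
              tolerance_ms: int = 15) -> list[tuple[dict, dict]]:
    """Pair each stored frame with the projection transformation information
    whose capture time is closest, within a tolerance. Illustrative only."""
    pairs = []
    for frame in frames:
        candidates = [m for m in metadata
                      if abs(m["timestamp_ms"] - frame["timestamp_ms"]) <= tolerance_ms]
        if candidates:
            best = min(candidates,
                       key=lambda m: abs(m["timestamp_ms"] - frame["timestamp_ms"]))
            pairs.append((frame, best))
    return pairs
```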
The transmitter 2109 of the camera 21 reads the projection transformation information from the storage unit 2100 (S12), and transmits the projection transformation information of each image data to the image processing board 23 (S13).
Further, the transmitter 2109 reads the frame data of the video, processed by the video encoder 2102 (S14), and transmits the frame data to the image processing board 23 (S15). The transmitter 2109 determines whether the transmitted frame data is the last frame (S16). When it is the last frame (“YES” at S16), the operation ends. When it is not the last frame (“NO” at S16), the operation returns to S14, and the transmitter 2109 repeats the process of reading frame data and the process of transmitting until the last frame is processed.
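The read-and-transmit loop of S14 to S16 can be pictured as the short sketch below; read_next_frame and send_frame are hypothetical helpers standing in for the storage unit 2100 and the video output I/F 269.

```python
def stream_video(read_next_frame, send_frame) -> None:
    """Repeat S14 (read) and S15 (transmit) until the last frame is sent (S16)."""
    while True:
        frame, is_last = read_next_frame()   # S14: read the next encoded frame
        send_frame(frame)                    # S15: transmit it to the image processing board
        if is_last:                          # S16: "YES" ends the operation
            break
```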
Next, operation to be performed by the image processing board 23 that has received the image data and the projection transformation information will be described.
The transmitter and receiver 2309 of the image processing board 23 receives projection transformation information (imaging direction data and projection transformation data), which has been transmitted by the camera 21 (see S13) for each image data (S21). The projection transformation information manager 2301 of the image processing board 23 stores the projection transformation information for each image data that has been received in the storage unit 2300 (S22).
The transmitter and receiver 2309 of the image processing board 23 starts receiving the image data of the fisheye images, transmitted by the camera 21 (see S15) (S23). The storage unit 2300, serving as an image buffer, stores the received image data.
The conversion unit 2302 of the image processing board 23 reads out a set of image data stored in the storage unit 2300. The set of image data is frame data of a video captured by the cameras 21A and 21B, each piece of which is associated with the same time information indicating that the images are taken at the same time. As described above, as an alternative to the time information, the flag may be used to indicate the images to be included in the same set. The conversion unit 2302 of the image processing board 23 also reads the projection transformation information associated with the set of image data from the storage unit 2300.
Each image (image frame) in the set of image data has an angle of view of 180° or more. The conversion unit 2302 converts each image of the set of image data to have an angle of view of 180°, and performs texture mapping of the converted set of image data onto a unit sphere using the projection transformation information, to generate a spherical image (S24). For the process of converting an image having an angle of view wider than 180° to an image having an angle of view of 180°, any one of the above-described methods may be performed. In this case, the conversion unit 2302 obtains the angle “a” and the image height “h” for each pixel in the fisheye image, obtains φ for each pixel from the projection transformation data included in the projection transformation information for that image data using the equation (5), and applies the equations (2-1) to (2-3) to calculate the coordinates (x, y, z) of each pixel.
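As a rough sketch of the per-pixel computation in S24, the fragment below maps a fisheye pixel to unit-sphere coordinates. It assumes the equidistant model of equation (1) for the captured image, reads equation (5) as rescaling the incident angle so that the lens's full angle of view θ fits into 180°, and uses an assumed image center (cx, cy) and pixel-unit focal length f; none of these specifics are prescribed by the embodiment.

```python
import numpy as np

def fisheye_pixel_to_sphere(u, v, cx, cy, f, theta_deg):
    """Map a fisheye pixel (u, v) to unit-sphere coordinates (x, y, z).

    Assumptions (for illustration only):
    - (cx, cy) is the image center on the optical central axis,
    - f is the focal length in pixels and the lens follows equation (1),
    - theta_deg is the full angle of view of the fisheye lens (wider than 180 degrees).
    """
    dx, dy = u - cx, v - cy
    a = np.arctan2(dy, dx)               # azimuth angle "a" of the pixel
    h = np.hypot(dx, dy)                 # image height "h" of the pixel
    # Rescale the incident angle so that the field of theta_deg degrees
    # fits into 180 degrees (one reading of equation (5)).
    phi = (h / f) * (180.0 / theta_deg)
    x = np.sin(phi) * np.cos(a)          # equation (2-1)
    y = np.sin(phi) * np.sin(a)          # equation (2-2)
    z = np.cos(phi)                      # equation (2-3)
    return x, y, z
```

Applying such a mapping to every pixel of both fisheye images, and then rotating the results according to the imaging direction data, yields the points used for texture mapping onto the unit sphere.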
The conversion unit 2302 stores the converted frame data of a spherical image in the storage unit 2300 (S25). The conversion unit 2302 determines whether the converted image data is the last frame in the video (S26), and the operation ends when it is the last frame (“YES”). When it is determined at S26 that the frame is not the last frame (“NO”), the conversion unit 2302 repeats S24 to S25 of applying texture mapping and storing frame data of a spherical image for the remaining frames.
Through the operation described above, the image processing board 23 obtains video data of the spherical image, which may be displayed by the displaying unit 2303.
Next, a modified example of the imaging system 20 is described, while only focusing on some differences from the above-described embodiment.
In modified example A, the imaging system 20′ includes four cameras 21A, 21B, 21C, and 21D.
In this modified example A, the imaging direction data in the imaging system 20′ is determined, based on assumption that the optical central axis direction in the cameras 21A and 21B is set to the Roll axis, the optical central axis direction in the cameras 21C and 21D is set to the Pitch axis, and the direction perpendicular to the Roll axis and the Pitch axis is set to the Yaw axis.
h=f×φ×90°/θ (6)
The above-described modified example A is similar to the above-described embodiment, except that four images are captured and that each image is converted into an image having an angle of view of 90°, instead of 180°, using the equation (6).
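Generalizing equations (5) and (6), when n cameras share the sphere, each converted image covers 360°/n of angle of view. The sketch below expresses this relationship; the rescaling of the incident angle is one reading of the equations and is shown for illustration only.

```python
def target_angle_of_view(num_cameras: int) -> float:
    """Angle of view each converted image should cover so that the sum is 360 degrees."""
    return 360.0 / num_cameras

def rescale_incident_angle(phi: float, theta_deg: float, num_cameras: int) -> float:
    """Compress an incident angle captured by a lens whose full angle of view is
    theta_deg (wider than 360/num_cameras degrees) into the per-camera share of
    the sphere. One reading of equations (5) and (6); illustrative only."""
    return phi * target_angle_of_view(num_cameras) / theta_deg

print(target_angle_of_view(2))  # 180.0, as in equation (5)
print(target_angle_of_view(4))  # 90.0, as in equation (6)
```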
Next, another modified example of the imaging system 20 is described, while only focusing on some differences from the above-described embodiment.
The functional configuration of the camera 21 is similar to that of the camera 21 in the above-described embodiment. The transmitter and receiver 2309 of the image processing board 23 is also the same in function as in the above-described embodiment.
The terminal 10A includes a transmitter and receiver 1009A. The transmitter and receiver 1009A, which is implemented by instructions of the CPU 101 and the network I/F 107, controls communication with other devices.
The management system 50 includes a projection transformation information manager 5001, a conversion unit 5002, and a transmitter and receiver 5009. The management system 50 further includes a storage unit 5000, implemented by the ROM 102, the RAM 103, or the SSD 104. The projection transformation information manager 5001, which is implemented by instructions of the CPU 101, manages projection transformation information of an image that is captured by the camera 21. The conversion unit 5002, which is implemented by instructions of the CPU 101, converts an angle of view of each image in a set of images, to generate a set of images each applied with projection transformation. The conversion unit 5002 then performs texture mapping with the set of images applied with projection transformation, onto a unit sphere, using the projection transformation information, to generate a spherical image. The transmitter and receiver 5009, which is implemented by instructions of the CPU 101 and the network I/F 107, controls communication with other devices.
The terminal 10B includes a transmitter and receiver 1009B, an acceptance unit 1001, and a displaying unit 1002. The transmitter and receiver 1009B, which is implemented by instructions of the CPU 101 and the network I/F 107, controls communication with other devices. The acceptance unit 1001, which is implemented by instructions of the CPU 101, accepts an operation input by the user through a touch panel or the like via the user I/F 108. The displaying unit 1002, which is implemented by instructions of the CPU 101 and a displaying function of the user I/F 108, displays images on a display.
In response to reception of the projection transformation information from the camera 21, the transmitter and receiver 2309 of the image processing board 23 transmits the received projection transformation information to the terminal 10A (S32).
In response to reception of the projection transformation information from the image processing board 23, the transmitter and receiver 1009A of the terminal 10A transmits the received projection transformation information to the management system 50 (S33). The transmitter and receiver 5009 of the management system 50 receives the projection transformation information transmitted from the terminal 10A.
The cameras 21A and 21B each transmit frame data of the captured video to the image processing board 23, in a substantially similar manner as described above referring to S14 to S16.
In response to reception of frame data of the video transmitted from the camera 21, the transmitter and receiver 2309 of the image processing board 23 transmits the received frame data of the video to the terminal 10A (S42).
In response to reception of the frame data of the video transmitted from the image processing board 23, the transmitter and receiver 1009A of the terminal 10A transmits the frame data of the video to the management system 50 (S43). The transmitter and receiver 5009 of the management system 50 receives the frame data of the video transmitted by the terminal 10A.
Using the received projection transformation information and the frame data of the video, the management system 50 generates video data of a spherical image, in a substantially similar manner as described above referring to S21 to S26, and stores the generated video data in the storage unit 5000.
As the user of the terminal 10B inputs an operation for requesting displaying of the spherical image, the acceptance unit 1001 receives a request for the spherical image. The transmitter and receiver 1009B of the terminal 10B transmits a request for the spherical image to the management system 50 (S61).
The transmitter and receiver 5009 of the management system 50 receives the request for the spherical image transmitted from the terminal 10B. In response to this request, the transmitter and receiver 5009 of the management system 50 reads the video data of the spherical image from the storage unit 5000, and transmits the read video data to the terminal 10B.
The transmitter and receiver 1009B of the terminal 10B receives the video data of the spherical image, transmitted from the management system 50. The displaying unit 1002 displays (reproduces), on the display, the spherical image based on the received video data (S71).
According to one or more embodiments, the imaging system 20 includes a plurality of cameras 21A and 21B each of which captures an image with a preset angle of view, such as an angle of view wider than 180 degrees. Here, the sum of the angles of view of the cameras 21A and 21B is greater than 360 degrees. Accordingly, the imaging area of one camera 21 overlaps with that of the other camera 21. The image processing board 23 processes the images of all surroundings, taken by the cameras 21. Specifically, the conversion unit 2302 of the image processing board 23 converts at least one image, from among the plurality of images captured by the plurality of cameras 21, into an image having a predetermined angle of view smaller than the original angle of view. The conversion unit 2302 then combines a plurality of images including the at least one converted image, to generate a spherical image. With this conversion processing, when combining a plurality of images to generate an image of all surroundings, loss of information on overlapping areas of the plurality of images can be prevented.
Especially when the images are being captured for the purpose of monitoring, it is important that no information be lost.
The conversion unit 2302 of the image processing board 23 converts the angle of view so that the sum of the angles of view of the plurality of images acquired by the plurality of cameras 21A and 21B becomes 360°. As a result, when the image processing board 23 combines a plurality of images to generate a spherical image, no overlapping areas of the plurality of images are generated such that a loss in image can be prevented.
The imaging system 20 includes two cameras 21A and 21B. The conversion unit 2302 of the image processing board 23 converts the image having an angle of view wider than 180°, which is acquired by each of the two cameras 21A and 21B, into an image having an angle of view of 180°. Accordingly, even when the installation space for the camera is small like in the case of a small-size construction machine, as long as the imaging system 20 with the two cameras 21A and 21B can be installed, surroundings of a target, such as the construction machine, can be captured.
The camera 21A and the camera 21B are arranged so as to face in opposite directions while keeping a predetermined distance therebetween, such that different directions can be captured at substantially the same time. If the cameras 21A and 21B were disposed at the same location, there could be areas that are not captured due to a blind spot caused by the vehicle or the like. By placing the cameras at a predetermined distance from each other, an image of the surroundings can be sufficiently captured.
In the modified example A of the embodiment, the imaging system 20 includes four cameras 21A, 21B, 21C, and 21D. The conversion unit 2302 of the image processing board 23 converts an image having an angle of view wider than 90°, acquired by each of the four cameras 21A, 21B, 21C, and 21D, into an image having an angle of view of 90°. As a result, even if there is an area that cannot be captured due to the blind spot with the two cameras 21A and 21B, by installing the four cameras 21A, 21B, 21C, and 21D, an image of all surroundings can be captured.
As described above for the case of two cameras, two of the camera 21A, the camera 21B, the camera 21C, and the camera 21D are arranged so as to face in opposite directions while keeping a predetermined distance from each other, so that different directions can be captured at substantially the same time.
Any one of the programs for controlling the terminal 10, the imaging system 20, and the management system 50 may be stored in a computer-readable recording medium, in a file format installable or executable by a general-purpose computer, for distribution. Examples of such a recording medium include, but are not limited to, a compact disc-recordable (CD-R), a digital versatile disc (DVD), and a Blu-ray disc. In addition, a memory storing any one of the above-described control programs, such as a recording medium including a CD-ROM or an HDD, may be provided in the form of a program product to a user within a certain country or outside that country.
The terminals 10, the imaging system 20, and the management system 50 in any one of the above-described embodiments may be configured by a single computer or a plurality of computers to which divided portions (functions) are arbitrarily allocated.
Each of the functions of the described embodiments may be implemented by one or more processing circuits or circuitry. Processing circuitry includes a programmed processor, as a processor includes circuitry. A processing circuit also includes devices such as an application specific integrated circuit (ASIC), digital signal processor (DSP), field programmable gate array (FPGA), and conventional circuit components arranged to perform the recited functions.
The above-described embodiments are illustrative and do not limit the present invention. Thus, numerous additional modifications and variations are possible in light of the above teachings. For example, elements and/or features of different illustrative embodiments may be combined with each other and/or substituted for each other within the scope of the present invention.
For example, processing to combine the images may be performed in various ways, such as by integrating one image with another image, mapping one image on another image entirely or partly, laying one image over another image entirely or partly. That is, as long as the user can perceive a plurality of images being displayed on a display as one image, processing to combine the images is not limited to this disclosure.
For example, the method of combining images may be performed in a substantially similar manner as described in U.S. Patent Application Publication No. 2014/0071227A1, the entire disclosure of which is hereby incorporated by reference herein.