The present invention relates to a display technique based on circular images such as an omnidirectional image, a fisheye image and the like.
Among cameras used in capturing systems for monitoring or surveillance applications, there are cameras equipped with an optical system capable of omnidirectional image capturing by means of a mirror or the like, or with an optical system such as a fisheye lens, in order to monitor a wide area with a single unit or to reduce blind spots. Omnidirectional images and fisheye images obtained through such optical systems are circular (including annular) images with large distortion.
Incidentally, Japanese Patent Application Laid-Open No. 2003-303335 discloses a technique of segmenting and cutting open a circular image along a line passing through the center of the circular image, and converting the obtained image into a rectangular wide-angle projected image by an image process such as distortion correction. Besides, Japanese Patent Application Laid-Open No. 2010-68071 discloses a technique of defining a cutting-open position for segmentation and then, when a face area is detected, setting a new cutting-open position instead of the defined one such that the face area is located at the center of the converted wide-angle projected image.
Conventionally, the cutting-open position is selected from predetermined options (for example, in units of 90° from the horizontal line passing through the center). However, in a capturing system for a monitoring application in which the camera and the displaying apparatus are apart from each other, there is a problem that it is difficult for a user of the camera to set an optimum cutting-open position. The method described in Japanese Patent Application Laid-Open No. 2010-68071 does not select the cutting-open position from predetermined options but determines it based on a detection result of a face area. Thus, in this method, there is a problem that the determined cutting-open position is not necessarily the optimum one for the user. The present invention provides a technique capable of setting the cutting-open position that the user desires.
According to one aspect of the present invention, there is provided an image processing apparatus for generating at least one rectangular image converted from an omnidirectional image captured by an omnidirectional camera, the image processing apparatus comprising: a display controlling unit configured to display, together with the rectangular image, a mark indicating a reference direction of the omnidirectional camera at a corresponding position of the rectangular image; a changing unit configured to change a cutting-open position of the omnidirectional camera in accordance with a user operation for changing the position at which the mark is displayed; and a converting unit configured to convert an image generated by dividing the omnidirectional image based on the cutting-open position changed by the changing unit, into the at least one rectangular image.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings. Incidentally, it should be noted that each of the embodiments to be described below shows an example in which the present invention is concretely carried out, and is one specific embodiment of the constitution and configuration described in the claims.
Initially, a functional configuration example of a capturing system according to the present embodiment will be described with reference to a block diagram of
First, the camera unit 1 will be described. An optical lens 11 has an optical system capable of performing omnidirectional capturing by a mirror or the like. Light which is incident from the outside through the optical lens 11 is received by an image capturing sensor unit 12.
The image capturing sensor unit 12 is an image capturing element such as a CCD (charge-coupled device) or CMOS (complementary metal-oxide semiconductor) sensor, which converts the light received through the optical lens 11 into a digital image signal by photoelectric conversion. The converted digital image signal is input to a development processing unit 13.
The development processing unit 13 performs various image processes such as a pixel interpolating process, a color converting process and the like on the digital image signal output from the image capturing sensor unit 12, thereby generating color images such as an RGB image, a YUV image and the like as captured images. Here, when the optical lens 11 has an optical system capable of omnidirectional capturing, the captured image obtained by forming the entire light received through the optical lens 11 as an image on the image capturing sensor unit 12 is an omnidirectional image. Besides, when the optical lens 11 has an optical system capable of wide-angle capturing, the captured image obtained in the same manner is a fisheye image. Both the omnidirectional image and the fisheye image are circular images. An example of the circular image is illustrated in
A camera controlling unit 14 entirely controls the operation of the camera unit 1 including the image capturing sensor unit 12 and the development processing unit 13. For example, the camera controlling unit 14 comprises a processor such as a CPU (central processing unit), and a memory for holding computer programs and data used by the processor for performing processes. In this case, the processor performs the process using a computer program or data stored in the memory, so that the camera controlling unit 14 entirely controls the operation of the camera unit 1 including the image capturing sensor unit 12 and the development processing unit 13. Besides, the camera controlling unit 14 controls the operations of the image capturing sensor unit 12 and the development processing unit 13 in accordance with various instructions sent from the image processing apparatus 2.
Next, the image processing apparatus 2 will be described. An image converting unit 21 sets, as the cutting-open position, either the diameter in a direction designated by the user or the diameter in an initial direction in the circular image sent from the camera unit 1. The image converting unit 21 cuts open (segments or divides) the circular image at the cutting-open position to generate two semicircular images, and performs a projective transformation on each of the two semicircular images, thereby generating two rectangular images. Both of the two generated rectangular images are wide-angle projected images equivalent to images captured using a wide-angle lens, and each of the rectangular images has an angle of view of 180 degrees. In the circular image of
The storing unit 23 is a memory which can store the wide-angle projected image and the circular image output from the image converting unit 21, information (for example, setting information of the image processing apparatus 2) to be treated as known information by the image processing apparatus 2, and the like.
The UI unit 22 performs display control of various user interfaces, such as a user interface including the wide-angle projected image sent from the image converting unit 21, a user interface for creating later-described map data, and the like. Besides, the UI unit 22 receives an operation input from the user with respect to the displayed user interface.
A controlling unit 24 entirely controls the operations of the image processing apparatus 2 which comprises the image converting unit 21, the UI unit 22 and the storing unit 23. Besides, the controlling unit 24 sends various instructions (for example, instructions for changing the capturing direction, angle of view and focal position of the camera unit 1, and an instruction for setting the exposure time) to the camera controlling unit 14. The camera controlling unit 14 controls the camera unit 1 according to the instructions from the controlling unit 24.
Next, a process of generating a wide-angle projected image according to a desired cutting-open position from the circular image obtained from the camera unit 1 by the image processing apparatus 2 and presenting the generated wide-angle projected image to the user will be described with reference to
In S101, the image converting unit 21 obtains the circular image output from the camera unit 1. Then, in S102, the image converting unit 21 cuts open the circular image obtained in S101 at the cutting-open position to generate the two semicircular images, and performs the projective transformation to each of the two semicircular images to generate the two wide-angle projected images. Here, a transforming process from the circular image to the wide-angle projected image will be described with reference to
When the circular image on the left side of
The center of the circular image corresponds to the pixel corresponding to the vicinity of the optical axis of the optical lens 11. Here, “a pixel position P of the pixel corresponding to the vicinity of the optical axis of the optical lens 11” in the circular image sent from the camera unit 1 is previously registered in the storing unit 23. This registration is performed, for example, at the time of manufacturing the image processing apparatus 2. The controlling unit 24 reads “the pixel position P” registered in the storing unit 23, and sets the read position in the image converting unit 21. The image converting unit 21 sets the line segment passing through the pixel position P as the cutting-open position, generates the two semicircular images by segmenting the circular image received from the camera unit 1 at the cutting-open position, and performs the projective transformation to each of the two semicircular images, thereby generating the two rectangular images.
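The conversion from a semicircular image to a rectangular image can be sketched as an inverse polar lookup from each output pixel back into the circular image. The following is an illustration only, not the projective transformation of the actual apparatus: it assumes an equidistant fisheye mapping with nearest-neighbor sampling, and the function name `unwrap_half` and its parameters are hypothetical.

```python
import numpy as np

def unwrap_half(circ, cx, cy, radius, theta0, out_w, out_h):
    # Inverse polar lookup: each output column corresponds to an azimuth in
    # the half-circle [theta0, theta0 + pi] measured from the cutting-open
    # line; each output row corresponds to a radius, with the rim of the
    # circle at the top row and the center (optical axis) at the bottom.
    out = np.zeros((out_h, out_w) + circ.shape[2:], dtype=circ.dtype)
    for v in range(out_h):
        r = radius * (1.0 - v / out_h)
        for u in range(out_w):
            phi = theta0 + np.pi * u / out_w
            # Image y axis points down, so a counterclockwise angle
            # subtracts from y.
            x = int(round(cx + r * np.cos(phi)))
            y = int(round(cy - r * np.sin(phi)))
            if 0 <= x < circ.shape[1] and 0 <= y < circ.shape[0]:
                out[v, u] = circ[y, x]
    return out
```

Applying this once with theta0 and once with theta0 + π would yield the two rectangular images, each covering an angle of view of 180 degrees.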
In the present embodiment, it is assumed that “a line segment in the horizontal direction passing through the pixel position P” is registered in the storing unit 23 as an initial position of the cutting-open position. Incidentally, the initial position of the cutting-open position is registered, for example, at the time of manufacturing the image processing apparatus 2. Then, by operating the UI unit 22, the user can instruct the image processing apparatus 2 to change the cutting-open position (i.e., rotation of “the line segment passing through the pixel position P” (diameter)). The cutting-open position is defined by a counterclockwise rotation angle θ (degrees) from the initial position of the cutting-open position. The definition of the cutting-open position is illustrated in
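Under these definitions, the endpoints of the cutting-open line follow from the pixel position P, the circle radius and the rotation angle θ. The sketch below is illustrative only; the function name is hypothetical, and it assumes that image y coordinates increase downward, so a counterclockwise rotation subtracts from y.

```python
import math

def cut_line_endpoints(px, py, radius, theta_deg):
    # The cutting-open position is the diameter through pixel position P,
    # rotated counterclockwise by theta degrees from the horizontal
    # initial position (theta = 0).
    t = math.radians(theta_deg)
    dx = radius * math.cos(t)
    dy = -radius * math.sin(t)  # y axis points down in image coordinates
    return (px + dx, py + dy), (px - dx, py - dy)
```

At θ = 0 this reduces to the horizontal line segment through P registered as the initial position.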
Returning to
Besides, the UI unit 22 displays indicators for notifying the user of the directions to which the first wide-angle projected image and the second wide-angle projected image are respectively directed. A display example by the UI unit 22 in S103 is illustrated in
On the lower side of the first wide-angle projected image 601, in order to notify the user to which position in an orientation direction (horizontal direction) of the first wide-angle projected image 601 a direction preset as the reference direction of the camera unit 1 corresponds, a mark 610 is displayed at a position corresponding to the reference direction of the camera unit 1 in the orientation direction of the first wide-angle projected image 601.
Besides, on the lower side of the first wide-angle projected image 601, in order to notify the user to which position in the orientation direction of the first wide-angle projected image 601 a direction in which a target “AREA A” is located corresponds, a mark 611 is displayed at a position corresponding to the direction in which the target “AREA A” is located in the orientation direction of the first wide-angle projected image 601.
Besides, on the lower side of the first wide-angle projected image 601, in order to notify the user to which position in the orientation direction of the first wide-angle projected image 601 a direction in which a target “ENTRANCE 1” is located corresponds, a mark 612 is displayed at a position corresponding to the direction in which the target “ENTRANCE 1” is located in the orientation direction of the first wide-angle projected image 601.
Besides, on the lower side of the first wide-angle projected image 601, in order to notify the user to which position in the orientation direction of the first wide-angle projected image 601 a direction in which a target “AREA B” is located corresponds, a mark 613 is displayed at a position corresponding to the direction in which the target “AREA B” is located in the orientation direction of the first wide-angle projected image 601.
That is, it is possible by the marks 611 to 613 to notify that the targets “AREA A”, “ENTRANCE 1” and “AREA B” are respectively located in the directions indicated by the respective marks 611 to 613 within the range of the orientation direction of the first wide-angle projected image 601.
On the lower side of the second wide-angle projected image 602, in order to notify the user to which position in the orientation direction of the second wide-angle projected image 602 a direction in which a target “AREA C” is located corresponds, a mark 621 is displayed at a position corresponding to the direction in which the target “AREA C” is located in the orientation direction of the second wide-angle projected image 602.
Besides, on the lower side of the second wide-angle projected image 602, in order to notify the user to which position in the orientation direction of the second wide-angle projected image 602 a direction in which a target “AREA D” is located corresponds, a mark 622 is displayed at a position corresponding to the direction in which the target “AREA D” is located in the orientation direction of the second wide-angle projected image 602.
Besides, on the lower side of the second wide-angle projected image 602, in order to notify the user to which position in the orientation direction of the second wide-angle projected image 602 a direction in which a target “ENTRANCE 2” is located corresponds, a mark 623 is displayed at a position corresponding to the direction in which the target “ENTRANCE 2” is located in the orientation direction of the second wide-angle projected image 602.
That is, it is possible by the marks 621 to 623 to notify that the targets “AREA C”, “AREA D” and “ENTRANCE 2” are respectively located in the directions indicated by the respective marks 621 to 623 within the range of the orientation direction of the second wide-angle projected image 602.
As just described, each of the marks 610 to 613 and 621 to 623 serves to indicate the capturing direction as a substitute for an indicator of the compass directions (north, south, east and west), and the display position of each mark is determined based on the reference direction previously set by the user and the direction to each target. The determination of which of the marks 610 to 613 and 621 to 623 to display on the lower side of the first and second wide-angle projected images and the determination of the display positions thereof will be described.
In the storing unit 23, information defining positional relationships between the camera unit 1 and the targets is previously registered. For example, as one example of the information like this, map data exemplified in
Incidentally, the objective angles of the objects 751 to 756 are handled as the objective angles of the targets “AREA A”, “ENTRANCE 1”, “AREA B”, “AREA C”, “AREA D” and “ENTRANCE 2”.
As just described, in the map data, it is defined in which angle direction from the reference direction of the camera unit 1 each object is located as viewed from the position of the camera unit 1.
When θ=0, the UI unit 22 disposes the mark 610 at the center position in the orientation direction of the first wide-angle projected image 601. Further, the UI unit 22 disposes the mark 611 at the position separated from the center position in the orientation direction of the first wide-angle projected image 601 by a distance corresponding to an objective angle A of the target “AREA A”. As illustrated in
Thus, when it is assumed that the center position in the orientation direction of the first wide-angle projected image is Pc and the objective angle of the object is Δ, the horizontal arrangement position Pt of the mark corresponding to the object is given by calculating “Pt=Pc+(θ−Δ)×k” (expression 1). Here, “k” is a coefficient representing the number of pixels corresponding to a rotation angle of 1°. Such a display position may be calculated by the UI unit 22 or may be calculated by the controlling unit 24.
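Expression 1 translates directly into code. The helper below is a hypothetical sketch, and the numeric values in its usage are examples only (a 1920-pixel-wide image spanning 180° would give k ≈ 10.7 pixels per degree; k = 10 is used here for round numbers).

```python
def mark_position(pc, theta, delta, k):
    # Expression 1: Pt = Pc + (theta - delta) * k
    #   pc:    center pixel column in the orientation direction
    #   theta: current cutting-open rotation angle in degrees
    #   delta: objective angle of the object in degrees (0 = reference direction)
    #   k:     pixels per degree of rotation angle
    return pc + (theta - delta) * k
```

For example, with pc = 960, θ = 0, k = 10, an object at objective angle Δ = 30 is marked at Pt = 660, i.e., to the left of the reference-direction mark.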
Returning to
In S105, the controlling unit 24 decides whether or not the user operates the UI unit 22 to perform a cutting-open position changing operation. For example, as illustrated in
As a result of the decision by the controlling unit 24, when the cutting-open position changing operation is performed, the process proceeds to S106. On the other hand, when the cutting-open position changing operation is not performed, the process proceeds to S102.
In S106, the image converting unit 21 rotates the current cutting-open position with the pixel position P as the center by the rotation angle Δθ decided by the controlling unit 24 in the rotation direction D decided by the controlling unit 24, thereby changing the cutting-open position. Then, the process returns to S102.
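The update of S106 can be sketched as follows, assuming (as an illustrative convention, not stated in the embodiment) that the cutting-open angle is kept within [0, 360) with counterclockwise as the positive direction; the function name and the direction encoding are hypothetical.

```python
def update_cut_angle(theta, delta_theta, direction):
    # S106: rotate the current cutting-open position about pixel position P
    # by delta_theta degrees in the decided rotation direction.
    sign = 1 if direction == "ccw" else -1
    return (theta + sign * delta_theta) % 360  # wrap into [0, 360)
```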
When the process proceeds from S106 to S102, the image converting unit 21 cuts open the circular image obtained in S101 at the cutting-open position changed in S106 to generate the two semicircular images, and generates the two wide-angle projected images from the two semicircular images. Thereafter, in S103, the UI unit 22 displays the first wide-angle projected image and the second wide-angle projected image in the same manner as described above, and also displays the above indicators respectively below the first wide-angle projected image and the second wide-angle projected image. The display of the indicators is carried out by the following process.
The UI unit 22 calculates the horizontal arrangement position Pt for each object defined in the map data, and sets, as target objects, the objects for which the obtained horizontal arrangement positions Pt fall within the range from one end to the other end in the orientation direction of the first wide-angle projected image. Then, the UI unit 22 disposes the mark of each target object at the position Pt obtained for the relevant target object. As for the mark corresponding to the reference direction, in a case where the position Pt obtained by calculating the above expression 1 with Δ=0 is within the range from one end to the other end in the orientation direction of the first wide-angle projected image, the mark corresponding to the reference direction is displayed at the position indicated by the horizontal arrangement position Pt. The same operations are likewise applied to the second wide-angle projected image.
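This selection process can be sketched by evaluating expression 1 for every object in the map data and keeping only the marks whose positions fall inside the image; the function name, the data layout, and the numeric values below are hypothetical.

```python
def visible_marks(objective_angles, pc, theta, k, width):
    # Evaluate expression 1 (Pt = pc + (theta - delta) * k) for every
    # object defined in the map data, and keep only the objects whose
    # mark position Pt falls between one end (0) and the other end
    # (width) of the wide-angle projected image.  The reference-direction
    # mark is handled identically with delta = 0.
    marks = []
    for name, delta in objective_angles.items():
        pt = pc + (theta - delta) * k
        if 0 <= pt <= width:
            marks.append((name, pt))
    return marks
```

Running the same routine against the angular range of the second wide-angle projected image yields the marks for that image.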
The above method of obtaining the display position of each of the marks is merely an example. Namely, any method may be used as long as it can notify the user to which position in the orientation direction (horizontal direction) of the first wide-angle projected image 601 the reference direction of the camera unit 1 and each target correspond.
Besides, in the present embodiment, the name of the target is used as the mark corresponding to the target, but the present invention is not limited to this. Namely, any mark may be used as long as it represents information on the target. Besides, in the present embodiment, as the mark indicating the reference direction of the camera unit 1, the arrow mark pointing to the reference direction is used, but the mark is not limited to this. Namely, any mark may be used as long as it points to the reference direction of the camera unit 1.
<Modification 1>
In the first embodiment, the changing operation of the cutting-open position is performed on the first wide-angle projected image, but the present invention is not limited to this. Namely, the changing operation may be performed on the second wide-angle projected image, or may be performed on the area where the mark is displayed.
<Modification 2>
In the first embodiment, the image processing apparatus 2 generates and displays the two rectangular images from the circular image received from the camera unit 1. However, without generating the rectangular image from the circular image received from the camera unit 1, as illustrated in
<Modification 3>
The above map data described in the first embodiment can be created by the user using the UI unit 22. An initial state of a GUI (graphical user interface) for generating the map data displayed by the UI unit 22 is illustrated in
<Modification 4>
In the first embodiment, the center of the circular image is the pixel corresponding to the vicinity of the optical axis of the optical lens 11, but the pixel corresponding to the vicinity of the optical axis of the optical lens 11 may be deviated from the center of the circular image.
In the first embodiment, the user changes the cutting-open position by operating the UI unit 22, but the present invention is not limited to this. Namely, the controlling unit 24 may perform the change according to various conditions. For example, when the controlling unit 24 analyzes the circular image and detects an area with few objects such as persons, a cutting-open position passing through the relevant area may be set. That is, the cutting-open position may be automatically set by the image processing apparatus 2 in accordance with images and events. The event may be received from the outside or may be generated by the controlling unit 24 according to a result of the image analysis.
In the first embodiment, the image converting unit 21 receives the circular image from the camera unit 1. However, the image converting unit may receive a rectangular image obtained by cutting out a part of an image captured through an optical system capable of omnidirectional image capturing by a mirror or the like, a fisheye lens, or the like. In this case, the image converting unit 21 may cut open and convert the rectangular image in the same manner as that in the first embodiment.
In the following embodiments and modifications including the present embodiment, differences from the first embodiment will be described. Namely, it is assumed that other constitutions, configurations, operations and the like are the same as those in the first embodiment unless otherwise mentioned. In the present embodiment, a process according to a flowchart of
In S203, the UI unit 22 displays the map data in addition to the contents displayed in S103. For example, the display screen as illustrated in
In S204, the controlling unit 24 decides whether or not the user operates the UI unit 22 to input an end instruction. As a result of the decision, when the end instruction is input, the processing according to the flowchart of
In S205, the controlling unit 24 decides whether or not there is an operation input by the user using the UI unit 22 to rotate the direction reference mark 701 in the map data. As a result of such decision, when there is the operation input to rotate the direction reference mark 701, the process proceeds to S206. On the other hand, when there is no operation input to rotate the direction reference mark 701, the process returns to S102. An initial value of the rotation angle of the direction reference mark 701 is “0” (the state facing right upward as illustrated in
A display example of the map data according to the present embodiment is illustrated in
In S206, the controlling unit 24 obtains the position Pt of the mark 610 corresponding to the direction reference mark 701 by substituting the rotation angle θb of the direction reference mark 701 for Δ in the above expression 1. After the process proceeds from S206 to S102, the mark 610 is displayed at the position Pt obtained in S206. A change of the position of the mark 610 before and after the rotation of the direction reference mark 701 is illustrated in
<Modification 1>
In the second embodiment, the image processing apparatus 2 generates and displays the two rectangular images from the circular image received from the camera unit 1. However, without generating the rectangular image from the circular image received from the camera unit 1, as illustrated in
<Modification 2>
In the second embodiment, the timing of switching the display position of the mark 610 corresponding to the direction reference mark 701 is not limited to a specific timing. For example, the switching timing may be a timing of image display of a next frame, or may be a timing before the timing of image display of the next frame.
<Modification 3>
In the second embodiment, the display position of the mark 610 corresponding to the direction reference mark 701 is changed according to the rotation operation of the direction reference mark 701 on the map data. Conversely, by changing the display position of the mark 610, the rotation angle of the direction reference mark 701 on the map data may be changed according to the above expression 1.
In the present embodiment, a process according to a flowchart of
In S306, the image converting unit 21 changes the cutting-open position by setting the rotation angle θb of the direction reference mark 701 to θ. Then, the process returns to S102. At this time, since θb=Δ=θ is given, the display position of the mark 610 corresponding to the direction reference mark 701 does not change.
Changes in display of the wide-angle projected image and the indicator before and after the rotation of the direction reference mark 701 are illustrated in
<Modification 1>
In the third embodiment, the image processing apparatus 2 generates and displays the two rectangular images from the circular image received from the camera unit 1. However, without generating the rectangular image from the circular image received from the camera unit 1, as illustrated in
In the present embodiment, a process according to a flowchart of
In S403, the UI unit 22 displays the circular image obtained from the camera unit 1 in addition to the contents displayed as above in S103. For example, the display screen as illustrated in
In S404, the controlling unit 24 decides whether or not the user has operated the UI unit 22 to input an end instruction. As a result of the decision, when the end instruction is input, the processing according to the flowchart of
In S405, the controlling unit 24 decides whether or not the user operates the UI unit 22 to change the cutting-open position on the circular image. For example, as illustrated in
As a result of the decision, when the changing operation of the cutting-open position has been performed, the process proceeds to S406. On the other hand, when the changing operation of the cutting-open position is not performed, the process returns to S102.
In S406, the image converting unit 21 changes the cutting-open position by rotating the current cutting-open position with the pixel position P as the center by a rotation angle corresponding to the movement amount of the mark 1100 in a rotation direction corresponding to the movement direction of the mark 1100.
Changes in display of the wide-angle projected image and the indicator before and after the rotation of the cutting-open position due to the operation of the mark 1100 are illustrated in
<Modification 1>
In the fourth embodiment, the cutting-open position is changed by moving the mark 1100. However, the cutting-open position may also be changed by moving the marks 1101 to 1106 or one point in the circular image in the same manner.
Some or all of the above embodiments and modifications may be combined as appropriate. Besides, some or all of the above embodiments and modifications may be selectively used.
In each of the above embodiments and modifications, the image converting unit 21 is included in the image processing apparatus 2. However, the image converting unit 21 may be included in the camera unit 1. In this case, the camera unit 1 generates the two wide-angle projected images from the circular image based on the cutting-open position notified from the image processing apparatus 2, and sends the generated wide-angle projected images to the image processing apparatus 2. In addition to the wide-angle projected images, the camera unit 1 may send the circular image to the image processing apparatus 2. The image processing apparatus 2 displays the wide-angle projected images and the circular image received from the camera unit 1.
In the above embodiments and modifications, the operations have been described as operations performed by the user for, for example, changing the cutting-open position and the direction reference mark 701. It should be noted that these operations are examples. Namely, the processes of, for example, changing the cutting-open position and the direction reference mark 701 may be instructed to the image processing apparatus 2 by other operation methods.
Functional units constituting the image processing apparatus 2 illustrated in
A CPU 2601 performs various processes using computer programs and data stored in a RAM (random access memory) 2602. Thus, the CPU 2601 entirely controls the operations of the computer apparatus, and performs or controls each of the above processes described as being performed by the image processing apparatus 2.
The RAM 2602 has an area for storing computer programs and data loaded from a ROM (read only memory) 2603 and an external storing device 2606, and an area for storing data received from the outside through an I/F (interface) 2607. Further, the RAM 2602 has a working area to be used when the CPU 2601 performs various processes. Thus, the RAM 2602 can appropriately provide various areas.
The ROM 2603 stores computer programs and data that need not be rewritten, such as computer programs and data relating to activation of the computer apparatus, setting data of the computer apparatus, and the like.
An operating unit 2604 is constituted by user interfaces such as a mouse, a keyboard and the like. By operating the operating unit 2604, the user can input various instructions to the CPU 2601. For example, the operating unit 2604 realizes the user operation accepting function of the above UI unit 22.
A displaying unit 2605 is constituted by a CRT (cathode ray tube), a liquid crystal screen or the like. The displaying unit 2605 can display the processing results of the CPU 2601 with images, characters and the like. For example, the displaying unit 2605 realizes the displaying function of the above UI unit 22. Incidentally, a touch panel screen may be constituted by integrating the operating unit 2604 and the displaying unit 2605. In this case, the touch panel screen realizes the functions of the above UI unit 22.
The external storing device 2606 is a large-capacity information storing device typified by a hard disk drive device. In the external storing device 2606, an OS (operating system) as well as computer programs and data for causing the CPU 2601 to execute or control the above processes performed by the image processing apparatus 2 are stored. The computer programs stored in the external storing device 2606 include, for example, a computer program for causing the CPU 2601 to realize the functions of the controlling unit 24, the UI unit 22 and the image converting unit 21. The data stored in the external storing device 2606 include what has been described as known information above (such as the map data). The computer programs and data stored in the external storing device 2606 are loaded into the RAM 2602 as appropriate under the control of the CPU 2601, and are processed by the CPU 2601. Incidentally, the RAM 2602 and the external storing device 2606 described above realize the functions of the above storing unit 23.
The I/F 2607 functions as a communication interface for performing data communication with an external device such as the camera unit 1. For example, the computer apparatus performs data communication with the camera unit 1 through the I/F 2607.
The CPU 2601, the RAM 2602, the ROM 2603, the operating unit 2604, the displaying unit 2605, the external storing device 2606 and the I/F 2607 are all connected to a bus 2608. Incidentally, the constitution illustrated in
Likewise, each functional unit constituting the camera unit 1 may be provided by hardware, or a part of each functional unit may be implemented by software. In the latter case, for example, computer programs and data for realizing the functions of the development processing unit 13 and the camera controlling unit 14 by the processor are stored in a memory of the camera unit 1. Then, by performing the processes using the computer programs and data with the processor, the functions of these functional units can be realized.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2018-043488, filed Mar. 9, 2018, which is hereby incorporated by reference herein in its entirety.