This application is based upon and claims the benefit of priority from the prior Japanese Patent Application No. 2014-052208 filed in Japan on Mar. 14, 2014; the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to a frame interpolation device, a frame interpolation method and a recording medium.
There is a known technique for inserting interpolated frames between frames, thereby smoothing a video. An interpolated frame generation device typically generates an interpolated frame based on motion vectors acquired by searching the video.
An occlusion region often occurs in a video in which a moving object is shot. The “occlusion region” is a region where the background or the like behind an object temporarily cannot be seen because the object overlaps it. Image information on the background or the like behind the object is lost in the occlusion region, and thus a correct motion vector often cannot be acquired. When no correct motion vector can be acquired, the interpolated frame generated by the interpolated frame generation device becomes an unnatural image.
A frame interpolation device according to an embodiment includes a motion vector interpolation unit that, based on motion vectors indicating motions of an image between two frames and on a temporal position of an interpolated frame inserted between the two frames, calculates interpolated motion vectors indicating motions of images between the interpolated frame and the two frames, and assigns the calculated interpolated motion vectors to the interpolated frame per unit region; a motion-compensated image generation unit that generates a forward motion-compensated image based on image information on the forward frame of the two frames and the interpolated motion vectors, and a backward motion-compensated image based on image information on the backward frame of the two frames and the interpolated motion vectors; and an interpolated frame generation unit that generates the interpolated frame by averaging corresponding regions of the forward motion-compensated image and the backward motion-compensated image with different weights between a normal region, to which one interpolated motion vector or one pair of interpolated motion vectors is assigned per unit region, and a non-normal region, which is configured of at least one of a collided region, to which a plurality of interpolated motion vectors or a plurality of pairs of interpolated motion vectors are assigned per unit region, and a vacant region, to which no interpolated motion vector is assigned.
The present embodiment will be described below with reference to the drawings. In the drawings, the same reference numerals are assigned to the same or like components.
The control unit 110 is configured of a processing device such as a processor. The control unit 110 operates according to a program stored in a ROM (Read Only Memory) or RAM (Random Access Memory) (not illustrated) thereby to control the respective units in the frame interpolation device 100.
The storage unit 120 is configured of a data readable/writable storage device such as a DRAM (Dynamic Random Access Memory), an SRAM (Static Random Access Memory), a semiconductor memory, or a hard disk. The storage unit 120 includes various storage areas such as a frame memory area 121, a motion vector storage area 122 and an interpolated frame storage area 123. The frame memory area 121 stores a video signal acquired by the input unit 130 therein. The motion vector storage area 122 stores a search result generated by the motion estimation unit 140 therein. The interpolated frame storage area 123 stores an interpolated frame generated by the motion compensation unit 150 therein.
The input unit 130 is configured of an input interface such as a serial interface or a parallel interface. The input unit 130 stores an input video signal into the frame memory area 121. The video signal is assumed to be configured of input frames at intervals of 0.1 second, for example. In the following description, for ease of understanding, frame numbers F[0], F[1], F[2], . . . are assumed to be assigned to the input frames in temporal order.
The motion estimation unit 140 performs motion estimation on the input frames. The search method may use the block matching method or the gradient method, for example. The motion estimation unit 140 stores a search result, such as motion vectors and evaluation values, in the motion vector storage area 122.
The motion compensation unit 150 operates according to a program stored in the ROM or RAM (not illustrated) thereby to realize various operations including “motion compensation processing.” The motion compensation processing generates an interpolated frame based on a search result of the motion estimation unit 140. The motion compensation unit 150 stores the generated interpolated frame in the interpolated frame storage area 123. The motion compensation unit 150 may realize its function by one processor or in cooperation with a plurality of processors.
The output unit 160 is configured of an output interface such as a serial interface or a parallel interface. The output unit 160 outputs an interpolated frame stored in the interpolated frame storage area 123.
The operations of the frame interpolation device 100 will be described below.
The operations of the frame interpolation device 100 are divided into the “motion estimation processing” performed by the motion estimation unit 140 and the “motion compensation processing” performed by the motion compensation unit 150. The motion estimation processing will be described first.
When ordered by the control unit 110 to start the motion estimation processing, the motion estimation unit 140 starts the processing. The motion estimation unit 140 searches for a motion of the image between two input frames. In the following description, it is assumed that the frame numbers of the two input frames to be searched are designated by the control unit 110 together with the order to start the motion estimation processing. Here, the frame numbers designated by the control unit 110 are assumed to be F[n−1] and F[n].
The motion estimation unit 140 divides the base frame F[n] into a plurality of blocks (S102). The block size is arbitrary.
The motion estimation unit 140 selects one unsearched block as a block to be searched from the base frame F[n] (S103).
Subsequently, the motion estimation unit 140 searches the reference frame F[n−1] for a part similar to the block to be searched (S104). The search method may be a well-known method such as the block matching method or the gradient method, or may be a search method uniquely improved by a device manufacturer. After the search, the motion estimation unit 140 acquires an evaluation value of the part similar to the block to be searched. The evaluation value indicates a degree of coincidence between the block to be searched and the similar part. The search range of the motion estimation unit 140 need not necessarily be the entire reference frame F[n−1]. The search range may be a certain range in the reference frame F[n−1], for example a preset range around the coordinate corresponding to the block to be searched.
The operation in step S104 will be described herein by use of a specific example.
First, attention is paid to the block (a) in the base frame F[n] illustrated in the drawing.
Attention is then paid to the block (b) in the base frame F[n]. Part of the character “T” moving from right to left is drawn in the block (b). The image of the block (b) completely matches the image of the reference frame F[n−1] (block (f)) at the position moved from the block (b) by 8 pixels in the positive direction on the X axis. Thus, the motion estimation unit 140 determines that the part similar to the block (b) is the image of the block (f).
Subsequently, attention is paid to the block (c) in the base frame F[n]. Part of the background hidden by the character “T” is drawn in the block (c). The background part in the block (c) is occluded behind the character “T” in the reference frame F[n−1]. Thus, image information on the background part in the block (c) is not present in the reference frame F[n−1]. However, the block (c) partially matches the image of the block (g) in the reference frame F[n−1]. Thus, the motion estimation unit 140 determines that the part similar to the block (c) is the image of the block (g).
Finally, attention is paid to the block (d) in the base frame F[n]. Part of the wavy line on the background is drawn in the block (d). In the image of the reference frame F[n−1] at the same coordinate as the block (d) (block (g)), part of the background is hidden by the character “T”, and the image does not completely match the image of the block (d). However, most of the images of the block (d) and the block (g) (the background parts other than the character “T”) almost match each other. Thus, the motion estimation unit 140 determines that the part similar to the block (d) is the image of the block (g).
The motion estimation unit 140 generates a motion vector of the block to be searched based on the search result in S104 (S105).
The expression form of a motion vector is not limited to a specific form, and various expression forms may be used. For example, a motion vector may be expressed in a coordinate form.
Subsequently, the motion estimation unit 140 determines whether the search of all the blocks in the base frame F[n] is completed (S106). When the search of all the blocks is not completed (S106: No), the motion estimation unit 140 repeats the processes in S103 to S106 until the search of all the blocks is completed. When the search of all the blocks is completed (S106: Yes), the motion estimation unit 140 proceeds to S107.
The motion estimation unit 140 associates the frame number of the base frame F[n] and the evaluation values with the motion vectors of all the blocks, stores them in the motion vector storage area 122 (S107), and then terminates the processing.
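By way of illustration only, the search in S102 to S107 can be sketched as exhaustive block matching with a sum-of-absolute-differences (SAD) criterion over grayscale frames. The embodiment leaves the concrete search method open, so the block size, the search range, the use of SAD, and the use of the negated SAD as the evaluation value below are all assumptions of this sketch (the embodiment only requires that a larger evaluation value indicate a higher degree of coincidence):

```python
import numpy as np

def estimate_motion(base, ref, block=8, search=16):
    """Exhaustive block matching: for each block of the base frame F[n],
    find the most similar part of the reference frame F[n-1] within
    +/-search pixels. Returns per-block motion vectors (dy, dx) and
    evaluation values (negated SAD, so larger means a better match)."""
    h, w = base.shape
    rows, cols = h // block, w // block
    vectors = np.zeros((rows, cols, 2), dtype=np.int32)
    scores = np.full((rows, cols), -np.inf)
    for i in range(rows):
        for j in range(cols):
            y0, x0 = i * block, j * block
            target = base[y0:y0 + block, x0:x0 + block].astype(np.int64)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y1, x1 = y0 + dy, x0 + dx
                    if y1 < 0 or x1 < 0 or y1 + block > h or x1 + block > w:
                        continue  # candidate lies outside the reference frame
                    cand = ref[y1:y1 + block, x1:x1 + block].astype(np.int64)
                    score = -np.abs(target - cand).sum()
                    if score > scores[i, j]:
                        scores[i, j] = score
                        vectors[i, j] = (dy, dx)
    return vectors, scores
```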
The motion compensation processing performed by the motion compensation unit 150 will be described below.
When ordered by the control unit 110 to start the motion compensation processing, the motion compensation unit 150 starts the processing. The motion compensation unit 150 generates interpolated frames to be inserted between two input frames based on the motion vectors generated in the motion estimation processing. In the following description, it is assumed that the two input frames (input frames F[n] and F[n−1]) between which an interpolated frame is to be inserted and the interpolated frame I[n][m] to be inserted are designated by the control unit 110 together with the order to start the motion compensation processing. m is an integer of 1 or more.
Specifically, the motion vector interpolation unit 151 calculates the backward motion vector MVa and the forward motion vector MVb based on the following Equation (1) and Equation (2). In the following Equations, MV indicates a motion vector between the input frames F[n] and F[n−1], and T indicates a temporal position of the interpolated frame I[n][1] relative to the two input frames.
MVa=MV×T (1)
MVb=−MV×(1−T) (2)
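A minimal sketch of Equations (1) and (2), assuming motion vectors are held as two-component arrays and that T lies between 0 and 1:

```python
import numpy as np

def interpolate_vectors(mv, t):
    """Split a motion vector MV between the two input frames into the pair
    (MVa, MVb) for an interpolated frame at temporal position t (0 < t < 1).
    MVa is the backward motion vector and MVb the forward motion vector."""
    mv = np.asarray(mv, dtype=np.float64)
    mva = mv * t            # Equation (1)
    mvb = -mv * (1.0 - t)   # Equation (2)
    return mva, mvb
```

For example, an interpolated frame temporally midway between the two input frames has T = 0.5, giving MVa = 0.5 × MV and MVb = −0.5 × MV.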
The backward motion vector MVa and the forward motion vector MVb will be described herein by way of a specific example.
The motion vector interpolation unit 151 assigns the interpolated motion vectors calculated in S211 to the interpolated frame I[n][1] (S212). At this time, the motion vector interpolation unit 151 assigns the interpolated motion vectors to the interpolated frame I[n][1] per unit region. The unit region may be configured of a plurality of pixels or may be configured of one pixel. The following description will be made assuming that a unit region is made of one pixel.
The motion vector interpolation unit 151 selects any one pair of interpolated motion vectors from among the assigned pairs of interpolated motion vectors for each collided pixel (S213). At this time, the motion vector interpolation unit 151 selects the interpolated motion vectors based on the evaluation value calculated by the motion estimation unit 140. Specifically, the motion vector interpolation unit 151 selects the interpolated motion vectors generated based on the motion vector MV with the largest evaluation value as interpolated motion vectors for the collided pixel.
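The selection in S213 amounts to taking the candidate with the maximum evaluation value. A sketch, assuming each colliding candidate pair is stored together with the evaluation value of the motion vector MV from which it was generated:

```python
def select_for_collided_pixel(candidates):
    """candidates: list of (evaluation_value, (mva, mvb)) entries assigned
    to the same collided pixel. Keep the pair whose originating motion
    vector MV has the largest evaluation value (S213)."""
    return max(candidates, key=lambda c: c[0])[1]
```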
The motion vector interpolation unit 151 assigns a pair of interpolated motion vectors to each of the vacant pixels (S214). At this time, the motion vector interpolation unit 151 assumes, as the interpolated motion vectors for a vacant pixel, the interpolated motion vectors which are most frequently assigned to the pixels in a preset range around the vacant pixel. For example, the motion vector interpolation unit 151 acquires the interpolated motion vectors assigned to the pixels in the 63×63-pixel range around the vacant pixel. At this time, if no interpolated motion vector is assigned to a pixel in the range, that pixel is ignored, and the interpolated motion vectors are acquired from only the pixels assigned with interpolated motion vectors. The motion vector interpolation unit 151 then extracts the most frequent interpolated motion vectors having the same value from among the acquired pairs of interpolated motion vectors, and assumes the extracted interpolated motion vectors as the interpolated motion vectors for the vacant pixel.
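A sketch of the vacant-pixel filling in S214, assuming the assigned vectors are held in a sparse mapping keyed by coordinate and that each vector pair is a hashable tuple; the 63×63 window follows the example above:

```python
from collections import Counter

def fill_vacant_pixel(assigned, y, x, radius=31):
    """Assign to the vacant pixel (y, x) the interpolated-vector pair that
    occurs most frequently among the assigned pixels in the surrounding
    63x63 window (S214). `assigned` maps (y, x) -> (mva, mvb) tuples and
    contains entries only for pixels that already have vectors."""
    counts = Counter()
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            pair = assigned.get((y + dy, x + dx))
            if pair is not None:          # unassigned pixels are ignored
                counts[pair] += 1
    if not counts:
        return None                       # no assigned neighbor in the range
    return counts.most_common(1)[0][0]
```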
When completing the processing of interpolating the motion vectors in S210, the motion compensation unit 150 performs a motion-compensated image generation processing (S220). The motion-compensated image generation processing is performed by the motion-compensated image generation unit 152. The motion-compensated image generation unit 152 generates motion-compensated images based on the interpolated motion vectors generated in the motion vector interpolation processing. A “motion-compensated image” is an image used in the interpolated frame generation processing, and is configured of a forward motion-compensated image and a backward motion-compensated image.
The motion-compensated image generation unit 152 generates a forward motion-compensated image based on the image information on the input frame F[n] forward of the interpolated frame I[n][1] and the forward interpolated motion vectors MVb (S222). Specifically, the motion-compensated image generation unit 152 generates the forward motion-compensated image by attaching the pixel at the head of the arrow of each forward interpolated motion vector MVb to the originating coordinate of the arrow.
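A sketch of this generation step, assuming grayscale frames as numpy arrays, a dense per-pixel field of interpolated vectors, and nearest-integer rounding with border clipping (the embodiment does not specify sub-pixel handling). The same routine can produce the backward motion-compensated image by passing F[n−1] and the backward interpolated motion vectors MVa:

```python
import numpy as np

def motion_compensate(frame, vectors):
    """Build a motion-compensated image: each pixel of the interpolated
    frame takes the pixel of `frame` found at the head of its interpolated
    vector, rounded to the nearest integer coordinate and clipped to the
    frame boundary. `vectors` has shape (H, W, 2) holding (dy, dx) per
    pixel, e.g. MVb for the forward image or MVa for the backward image."""
    h, w = frame.shape[:2]
    out = np.empty_like(frame)
    for y in range(h):
        for x in range(w):
            dy, dx = vectors[y, x]
            sy = min(max(int(round(y + dy)), 0), h - 1)
            sx = min(max(int(round(x + dx)), 0), w - 1)
            out[y, x] = frame[sy, sx]
    return out
```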
When completing the generation of the motion-compensated images in S220, the motion compensation unit 150 performs an interpolated frame generation processing (S230). The interpolated frame generation processing is performed by the interpolated frame generation unit 153.
First, the interpolated frame generation unit 153 selects one pixel which has not been assigned a pixel value from among the plurality of pixels configuring the interpolated frame I[n][1] (S232).
The interpolated frame generation unit 153 determines whether the pixel selected in S232 (which will be denoted as “selected pixel” below) is a normal pixel (S233). When the selected pixel is a normal pixel (S233: Yes), the interpolated frame generation unit 153 proceeds to S236. When the selected pixel is not a normal pixel (S233: No), that is, when the selected pixel is a collided pixel or a vacant pixel, the interpolated frame generation unit 153 proceeds to S234.
When the selected pixel is not a normal pixel (S233: No), the interpolated frame generation unit 153 calculates a degree of confidence indicating the possibility that the selected pixel is in an occlusion region (S234). The occlusion region is a region where an object overlaps the background or another object behind it, so that the background or the other object temporarily cannot be seen.
Specifically, the interpolated frame generation unit 153 calculates a degree of confidence Ap or a degree of confidence Aq based on the following Equation (3) or Equation (4). The degree of confidence Ap is a degree of confidence when the selected pixel is a vacant pixel, and the degree of confidence Aq is a degree of confidence when the selected pixel is a collided pixel.
Ap=Rp (3)
Aq=Rq (4)
Here, Rp is a rate of the vacant pixels occupying the neighboring pixels around the selected pixel, and Rq is a rate of the collided pixels occupying the neighboring pixels around the selected pixel. The neighboring pixels are positioned in a preset range determined with reference to the selected pixel.
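A sketch of Equations (3) and (4), assuming a per-pixel label map and a square neighborhood. Whether the selected pixel itself counts among the neighboring pixels is not specified in the embodiment, and this sketch includes it; a radius of 2 corresponds to the 5×5 range mentioned later as the embodiment's neighborhood:

```python
import numpy as np

def occlusion_confidence(labels, y, x, radius=2):
    """Compute the degrees of confidence for the selected pixel (y, x) as
    the rates of vacant and collided pixels among its neighbors
    (Equations (3) and (4)). `labels` is an (H, W) array of per-pixel
    classes, assumed here as 0 = normal, 1 = vacant, 2 = collided."""
    h, w = labels.shape
    y0, y1 = max(y - radius, 0), min(y + radius + 1, h)
    x0, x1 = max(x - radius, 0), min(x + radius + 1, w)
    window = labels[y0:y1, x0:x1]
    n = window.size
    rp = np.count_nonzero(window == 1) / n   # rate of vacant pixels
    rq = np.count_nonzero(window == 2) / n   # rate of collided pixels
    return rp, rq   # Ap = Rp and Aq = Rq in the embodiment
```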
Subsequently, the interpolated frame generation unit 153 calculates a correction weight coefficient based on the calculated degree of confidence Ap or degree of confidence Aq (S235). The correction weight coefficient is used for averaging the non-normal regions in the two motion-compensated images. The interpolated frame generation unit 153 calculates a correction weight coefficient based on a reference weight coefficient Wr. Specifically, the interpolated frame generation unit 153 shifts the weight toward either the forward motion-compensated image or the backward motion-compensated image with reference to the value of the reference weight coefficient Wr thereby to calculate a correction weight coefficient.
Generally, the vacant region P is assumed to be a background region which is gradually hidden by a moving object. Thus, it is assumed that when the image of the vacant region P is generated based on the backward image F[n−1], a more natural image can be generated. On the other hand, the collided region Q is assumed to be a region where the background hidden by a moving object is appearing. Thus, it is assumed that when the image of the collided region Q is generated based on the forward image F[n], a more natural image can be generated. Accordingly, when the selected pixel is a vacant pixel, the interpolated frame generation unit 153 shifts the weight toward the backward motion-compensated image with reference to the reference weight coefficient Wr, and when the selected pixel is a collided pixel, it shifts the weight toward the forward motion-compensated image with reference to the reference weight coefficient Wr.
In this case, in order to smooth a change in an appearing afterimage, the interpolated frame generation unit 153 linearly changes the weight toward 0 or 1 with reference to the reference weight coefficient Wr. Specifically, the interpolated frame generation unit 153 calculates the correction weight coefficient Wp or the correction weight coefficient Wq based on the following Equation (5) or Equation (6). The correction weight coefficient Wp is used when the selected pixel is a vacant pixel, and the correction weight coefficient Wq is used when the selected pixel is a collided pixel.
Wp=Wr×(1−Ap) (5)
Wq=Wr×(1−Aq)+Aq (6)
The interpolated frame generation unit 153 weight-averages the pixels at the same coordinate between the backward motion-compensated image and the forward motion-compensated image thereby to calculate a pixel value V of the selected pixel in the interpolated frame I[n][1] (S236). Specifically, the interpolated frame generation unit 153 calculates the pixel value V of the selected pixel in the interpolated frame I[n][1] based on the following Equation (7) to Equation (9). Equation (7) is used when the selected pixel is a vacant pixel, and Equation (8) is used when the selected pixel is a collided pixel. Equation (9) is used when the selected pixel is a normal pixel.
V=Va×(1−Wp)+Vb×Wp (7)
V=Va×(1−Wq)+Vb×Wq (8)
V=Va×(1−Wr)+Vb×Wr (9)
Va is a pixel value of the selected pixel in the backward motion-compensated image and Vb is a pixel value of the selected pixel in the forward motion-compensated image. When completing the calculation of the pixel value V, the interpolated frame generation unit 153 assigns the pixel value V to the selected pixel in the interpolated frame.
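Equations (5) through (9) combine into a single per-pixel blending rule. A sketch, with Va and Vb as the backward and forward motion-compensated pixel values:

```python
def blend_pixel(va, vb, wr, kind, ap=0.0, aq=0.0):
    """Weight-average the backward pixel value `va` and the forward pixel
    value `vb` (S235-S236). `wr` is the reference weight coefficient,
    `kind` distinguishes normal, vacant and collided pixels, and ap/aq
    are the degrees of confidence from Equations (3) and (4)."""
    if kind == "vacant":
        wp = wr * (1.0 - ap)               # Equation (5)
        return va * (1.0 - wp) + vb * wp   # Equation (7)
    if kind == "collided":
        wq = wr * (1.0 - aq) + aq          # Equation (6)
        return va * (1.0 - wq) + vb * wq   # Equation (8)
    return va * (1.0 - wr) + vb * wr       # Equation (9), normal pixel
```

With Ap = 1 the result reduces to V = Va (the backward motion-compensated image alone), and with Aq = 1 to V = Vb (the forward motion-compensated image alone), which matches the weight shifts described above; with Ap = 0 or Aq = 0, Equations (7) and (8) both reduce to Equation (9).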
Subsequently, the interpolated frame generation unit 153 determines whether all the pixels are averaged (S237). When all the pixels have not been averaged (S237: No), the interpolated frame generation unit 153 returns to S232 and repeats S232 to S237 until all the pixels are averaged. When all the pixels are averaged (S237: Yes), the interpolated frame generation unit 153 proceeds to S238.
When all the pixels are averaged (S237: Yes), the interpolated frame generation unit 153 stores the interpolated frame I[n][1] assigned with the pixel values in the interpolated frame storage area 123 (S238).
When completing the storage of the interpolated frame, the motion compensation unit 150 terminates the motion compensation processing. The control unit 110 transmits the interpolated frame I[n][1] stored in the interpolated frame storage area 123 to an external device as needed.
According to the present embodiment, the image is averaged by use of different weights between the normal regions and the non-normal regions, and thus an unnatural afterimage occurring in an occlusion region, particularly an unnatural afterimage occurring near a boundary between a moving object and the background, can be made lighter. Additionally, the frame interpolation device 100 shifts the weight used for averaging the vacant regions toward the backward motion-compensated image, and thus can generate a natural image in the region where the background is gradually hidden in the occlusion region. Further, the frame interpolation device 100 shifts the weight used for averaging the collided regions toward the forward motion-compensated image, and thus can generate a natural image in the region where the background appears in the occlusion region.
The frame interpolation device 100 calculates a correction weight coefficient based on the rate of vacant pixels or collided pixels occupying the neighboring pixels around the selected pixel. More specifically, a correction weight coefficient is calculated based on the degree of confidence Ap, calculated from the rate of vacant pixels occupying the neighboring pixels, or the degree of confidence Aq, calculated from the rate of collided pixels occupying the neighboring pixels. Small vacant regions or collided regions may be dispersed in an image of a video in which the objects or the background do not move perfectly in parallel. When the selected pixel is a vacant pixel or collided pixel dispersed in the image, and thus not a pixel in an occlusion region, the degree of confidence has a low value, and consequently the correction weight coefficient has a value close to the reference weight coefficient. Therefore, in such a case, the frame interpolation device 100 averages the selected pixels of the two motion-compensated images with a weight coefficient close to the reference weight coefficient, so that a pixel with a remarkably different value from the neighboring pixel values is less likely to occur in the interpolated frame. Consequently, the frame interpolation device 100 can generate a more natural interpolated frame.
The above embodiment is merely exemplary, and various modifications and applications can be made thereto. For example, the rate Rp of the vacant pixels occupying the neighboring pixels is acquired as the degree of confidence Ap in the above embodiment, but the value of the degree of confidence Ap need not match the rate Rp. For example, when the rate Rp is larger than a preset value s, the interpolated frame generation unit 153 may assume the selected pixel to be a pixel in the occlusion region and set the degree of confidence Ap at 1.
Further, when the rate Rp is smaller than a preset value d, the frame interpolation device 100 may assume the selected pixel to be not a pixel in the occlusion region but a vacant pixel dispersed in the image, and set the degree of confidence Ap at 0. The interpolated frame generation unit 153 then calculates the correction weight coefficient Wp based on Equation (5). Thereby, when the selected pixel is likely to be a vacant pixel dispersed in the image, not a pixel in the occlusion region, the pixel value calculated by use of the reference weight coefficient Wr can be assumed as the value of the selected pixel in the interpolated frame, and thus the pixel value of the selected pixel can be close to the neighboring pixel values. Consequently, the interpolated frame generation unit 153 can generate a more natural interpolated frame.
In the above embodiment, the rate Rq of the collided pixels occupying the neighboring pixels is acquired as the degree of confidence Aq, but the value of the degree of confidence Aq need not match the rate Rq. For example, when the rate Rq is larger than the preset value s, the frame interpolation device 100 may assume the selected pixel to be a pixel in the occlusion region and set the degree of confidence Aq at 1. The interpolated frame generation unit 153 then calculates the correction weight coefficient Wq based on Equation (6). Thereby, when the selected pixel is likely to be a pixel in the occlusion region, the pixels of the forward motion-compensated image can be assumed as the pixels of the interpolated frame, and thus the interpolated frame generation unit 153 can make an afterimage occurring in the interpolated frame lighter. Consequently, the interpolated frame generation unit 153 can generate a more natural interpolated frame.
When the rate Rq is smaller than the preset value d, the frame interpolation device 100 may assume the selected pixel to be a collided pixel dispersed in the image, not a pixel in the occlusion region, and set the degree of confidence Aq at 0. The interpolated frame generation unit 153 then calculates the correction weight coefficient Wq based on Equation (6). Thereby, when the selected pixel is likely to be a collided pixel dispersed in the image, not a pixel in the occlusion region, the pixel value calculated by use of the reference weight coefficient Wr can be assumed as the value of the selected pixel in the interpolated frame, and thus the pixel value of the selected pixel can be closer to the neighboring pixel values. Consequently, the interpolated frame generation unit 153 can generate a more natural interpolated frame.
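These modifications with the preset values s and d amount to clamping the raw rate before it is used as the degree of confidence. A sketch of this modification (the threshold values here are placeholders; the embodiment calls them merely preset values):

```python
def clamped_confidence(rate, s=0.8, d=0.2):
    """Modification of Equations (3) and (4): treat the selected pixel as
    certainly inside an occlusion region when the rate exceeds s, and as
    a stray (dispersed) pixel when the rate falls below d."""
    if rate > s:
        return 1.0   # confidently an occlusion-region pixel
    if rate < d:
        return 0.0   # confidently a dispersed pixel; Wr is used unchanged
    return rate      # otherwise keep the raw rate as in the embodiment
```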
The neighboring pixels are assumed to be in the 5×5 square range around the selected pixel in the above embodiment, but the range of the neighboring pixels is not limited to the 5×5 square range. The range of the neighboring pixels is not limited to a square range, and may be a quadrangular range such as a rectangular or rhombic range around the selected pixel, for example. Further, the range of the neighboring pixels is not limited to a quadrangular shape, and may be a circular or oval range, for example. Furthermore, the selected pixel need not be positioned at the center of the range; for example, the selected pixel may be positioned at an end of the range.
In the above embodiment, the interpolated frame generation unit 153 shifts the weight toward the backward motion-compensated image when the selected pixel is a vacant pixel, but the shift direction is not limited to the backward direction. The interpolated frame generation unit 153 may shift the weight toward the forward motion-compensated image as needed depending on the nature of the video.
In the above embodiment, the interpolated frame generation unit 153 shifts the weight toward the forward motion-compensated image when the selected pixel is a collided pixel, but the shift direction is not limited to the forward direction. The interpolated frame generation unit 153 may shift the weight toward the backward motion-compensated image as needed depending on the nature of the video.
In the above embodiment, the input frames used for generating an interpolated frame are assumed to be the input frames F[n−1] and F[n] immediately before and immediately after the interpolated frame, but the input frames used for generating an interpolated frame are not limited to the immediately preceding and following input frames. The input frames may be separated from the interpolated frame by two or more frames.
In the above embodiment, the interpolated frame generation unit 153 calculates a reference weight coefficient Wr based on the temporal position T of the interpolated frame, but the interpolated frame generation unit 153 may calculate a reference weight coefficient Wr without using the temporal position T. For example, the interpolated frame generation unit 153 may set the reference weight coefficient Wr at 0.5 uniformly, and simply average the normal regions uniformly.
In the above embodiment, the description has been made assuming that a unit region is made of one pixel, but the unit region may be configured of a plurality of pixels. In this case, the vacant pixel, the collided pixel and the non-normal pixel can be denoted as vacant unit region, collided unit region and non-normal unit region, respectively. The vacant unit region is a concept including a vacant pixel, and the collided unit region is a concept including a collided pixel. The non-normal unit region is a concept including a non-normal pixel.
In the above embodiment, the motion compensation unit 150 completes the generation of the motion-compensated images (a backward motion-compensated image and a forward motion-compensated image) in the motion-compensated image generation processing and then performs the interpolated frame generation processing, but the motion compensation unit 150 may perform the interpolated frame generation processing before completing the generation of the motion-compensated images. The motion compensation unit 150 may generate an interpolated frame by repeating the generation of a motion-compensated image in a certain range (a motion-compensated image for one block, for example) and the generation of an interpolated frame in a certain range (an interpolated frame for one block, for example). The motion compensation unit 150 may also generate an interpolated frame by repeating the generation of a motion-compensated image for one pixel and the generation of an interpolated frame for one pixel.
The above embodiment has been described assuming that the interpolated motion vectors are configured of a pair of interpolated motion vectors including a backward interpolated motion vector MVa and a forward interpolated motion vector MVb, but the interpolated motion vectors need not necessarily be configured of a pair. The interpolated motion vectors may be configured of only one of a backward interpolated motion vector MVa and a forward interpolated motion vector MVb. In this case, the other interpolated motion vector may be specified as needed based on the information on the one held interpolated motion vector and the information on the motion vector MV.
The above embodiment has been described assuming that a video into which the frame interpolation device 100 inserts interpolated frames is a slow-motion playing video, but a video into which the frame interpolation device 100 inserts interpolated frames is not limited to a slow-motion playing video. For example, a video into which the frame interpolation device 100 inserts interpolated frames may be a video played at a normal speed.
In the above embodiment, the frame interpolation device 100 is configured to output generated interpolated frames to an external device, but the frame interpolation device 100 may be configured to include a playing function and to output a video generated based on generated interpolated frames and input frames to a display device. In this case, the frame interpolation device 100 may be configured to be able to output a video signal from the output unit 160, or may be configured to include a display unit for displaying a video and to output a video to the display unit. Of course, the frame interpolation device 100 may not include a video playing function and may only generate interpolated frames.
The frame interpolation device 100 can be assumed as a product such as a TV, a recorder, a personal computer, a fixed-line telephone, a cell phone, a smartphone, a tablet terminal, a PDA (Personal Digital Assistant) or a game machine. Alternatively, the frame interpolation device 100 can be assumed as a component mounted on a product, such as a semiconductor or a semiconductor circuit board.
The frame interpolation device 100 according to the present embodiment may be realized by a dedicated system or may be realized by a typical computer system. For example, a program for performing the above operations may be stored in a computer readable storage medium such as an optical disk, a semiconductor memory, a magnetic tape or a flexible disk and distributed, and the program may be installed in a computer to perform the above processing, thereby configuring the frame interpolation device 100. The program may be stored in a disk device provided in a server device on a network such as the Internet and downloaded to a computer. The above functions may be realized in cooperation with the OS (Operating System) and application software. In this case, the components other than the OS may be stored in a medium and distributed, or the components other than the OS may be stored in a server device and downloaded to a computer.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel apparatus and methods described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the apparatus and methods described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.