METHOD AND APPARATUS FOR TRAINING FAKE IMAGE DISCRIMINATIVE MODEL

Information

  • Patent Application
  • Publication Number
    20230298332
  • Date Filed
    July 27, 2022
  • Date Published
    September 21, 2023
  • CPC
    • G06V10/82
    • G06V10/764
    • G06V10/7747
    • G06V10/778
  • International Classifications
    • G06V10/82
    • G06V10/778
    • G06V10/774
    • G06V10/764
Abstract
A method and an apparatus for training a fake image discriminative model according to an embodiment of the present disclosure include generating one or more fake images for a real image by selecting one or more encoding layers and one or more decoding layers from a generator network of an autoencoder structure, generating a training image set based on the one or more fake images, and training a classifier for discriminating a fake image by using the training image set.
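The core step in the abstract and in claims 1, 3, and 4 below is to pass a real image through only part of an autoencoder-style generator: the first k encoding layers and the k symmetric decoding layers, with k possibly chosen anew for each image. The following is a minimal sketch of that idea, not the filed implementation; PyTorch, the layer types, channel widths, and the depth-sampling policy are all illustrative assumptions.

    # Minimal sketch of partial encode/decode fake image generation (claims 1, 3, 4).
    # PyTorch, layer counts, and channel widths are illustrative assumptions.
    import random
    import torch
    import torch.nn as nn

    class PartialAutoencoder(nn.Module):
        def __init__(self, channels=(3, 32, 64, 128)):
            super().__init__()
            # Symmetric encoder/decoder stacks; depths and widths are placeholders.
            self.encoders = nn.ModuleList(
                nn.Sequential(nn.Conv2d(c_in, c_out, 3, stride=2, padding=1), nn.ReLU())
                for c_in, c_out in zip(channels[:-1], channels[1:])
            )
            self.decoders = nn.ModuleList(
                nn.Sequential(nn.ConvTranspose2d(c_out, c_in, 4, stride=2, padding=1), nn.ReLU())
                for c_in, c_out in zip(channels[:-1], channels[1:])
            )

        def forward(self, x, depth=None):
            # Arbitrarily pick how many encoding layers to use for this image (claim 3),
            # then decode with the symmetric decoding layers in reverse order (claim 4).
            if depth is None:
                depth = random.randint(1, len(self.encoders))
            for enc in self.encoders[:depth]:
                x = enc(x)
            for dec in list(self.decoders[:depth])[::-1]:
                x = dec(x)
            return x

    # Usage: each call may select a different depth, so one real image can yield
    # several distinct fake images.
    generator = PartialAutoencoder()
    real = torch.rand(1, 3, 64, 64)
    fake = generator(real)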
Claims
  • 1. A method for training a fake image discriminative model, the method comprising: generating one or more fake images for a real image by selecting one or more encoding layers and one or more decoding layers from a generator network of an autoencoder structure; generating a training image set based on the one or more fake images; and training a classifier for discriminating a fake image by using the training image set.
  • 2. The method of claim 1, wherein the training image set includes the one or more fake images.
  • 3. The method of claim 1, wherein the generating of the one or more fake images includes generating the one or more fake images by arbitrarily selecting the one or more encoding layers and the one or more decoding layers whenever each of the one or more fake images is generated.
  • 4. The method of claim 1, wherein the generating of the one or more fake images includes generating the one or more fake images by sequentially selecting one or more encoding layers from an input layer from among a plurality of encoding layers included in the autoencoder, and selecting one or more decoding layers symmetric to the one or more selected encoding layers from among a plurality of decoding layers included in the autoencoder.
  • 5. The method of claim 1, wherein the generating of the one or more fake images includes generating the one or more fake images by performing anti-aliasing on an output of at least one of the one or more decoding layers.
  • 6. The method of claim 1, wherein the generating of the training image set includes generating one or more synthetic images by using the one or more fake images, and the training image set includes the one or more synthetic images.
  • 7. The method of claim 6, wherein the generating of the one or more fake images includes generating a plurality of fake images for the real image, and the one or more synthetic images include an image generated by combining two or more fake images among the plurality of fake images.
  • 8. The method of claim 6, wherein the one or more synthetic images include an image generated by combining at least one of the one or more fake images with another real image.
  • 9. The method of claim 6, wherein the training image set further includes a real image generated by combining the real image and another real image.
  • 10. The method of claim 9, wherein the training includes training the classifier to classify the one or more synthetic images as fake and classify the real image generated by the combining as real.
  • 11. An apparatus for training a fake image discriminative model, the apparatus comprising: a fake image generator configured to generate one or more fake images for a real image by selecting one or more encoding layers and one or more decoding layers from a generator network of an autoencoder structure; a training image set generator configured to generate a training image set based on the one or more fake images; and a trainer configured to train a classifier for discriminating a fake image by using the training image set.
  • 12. The apparatus of claim 11, wherein the training image set includes the one or more fake images.
  • 13. The apparatus of claim 11, wherein the fake image generator is configured to generate the one or more fake images by arbitrarily selecting the one or more encoding layers and the one or more decoding layers whenever each of the one or more fake images is generated.
  • 14. The apparatus of claim 11, wherein the fake image generator is configured to generate the one or more fake images by sequentially selecting one or more encoding layers from an input layer from among a plurality of encoding layers included in the autoencoder, and selecting one or more decoding layers symmetric to the one or more selected encoding layers from among a plurality of decoding layers included in the autoencoder.
  • 15. The apparatus of claim 11, wherein the fake image generator is configured to generate the one or more fake images by performing anti-aliasing on an output of at least one of the one or more decoding layers.
  • 16. The apparatus of claim 11, wherein the fake image generator is configured to generate one or more synthetic images by using the one or more fake images, and the training image set includes the one or more synthetic images.
  • 17. The apparatus of claim 16, wherein the fake image generator is configured to generate a plurality of fake images for the real image, and the one or more synthetic images include an image generated by combining two or more fake images among the plurality of fake images.
  • 18. The apparatus of claim 16, wherein the one or more synthetic images include an image generated by combining at least one of the one or more fake images with another real image.
  • 19. The apparatus of claim 16, wherein the training image set further includes a real image generated by combining the real image and another real image.
  • 20. The apparatus of claim 19, wherein the trainer is configured to train the classifier to classify the one or more synthetic images as fake and classify the real image generated by the combining as real.
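Claims 5 through 9 (and their apparatus counterparts 15 through 18) add two further steps: a low-pass (anti-aliasing) operation on a decoding layer's output, and "synthetic" training images built by combining fake images with each other or with real images. The sketch below illustrates one possible reading under the same assumed PyTorch conventions as the sketch after the Abstract; the 3x3 average blur, the convex-combination blend, and the label convention are assumptions, not taken from the claims.

    # Sketch of anti-aliasing (claim 5) and image combination (claims 6-9).
    # The blur kernel, blending rule, and label encoding are assumptions.
    import torch
    import torch.nn.functional as F

    def anti_alias(x):
        # Per-channel 3x3 average blur as a stand-in low-pass filter applied to a
        # decoding layer's output (claim 5).
        c = x.shape[1]
        kernel = torch.full((c, 1, 3, 3), 1.0 / 9.0, device=x.device)
        return F.conv2d(x, kernel, padding=1, groups=c)

    def blend(a, b, alpha=0.5):
        # Pixel-wise convex combination used to "combine" two images (claims 7-9).
        return alpha * a + (1.0 - alpha) * b

    # Dummy tensors stand in for generator outputs and additional real images.
    real = torch.rand(1, 3, 64, 64)
    another_real = torch.rand(1, 3, 64, 64)
    fake_a = anti_alias(torch.rand(1, 3, 64, 64))
    fake_b = anti_alias(torch.rand(1, 3, 64, 64))

    # Training image set (claims 2, 6-10): label 1 = fake, 0 = real.
    training_set = [
        (fake_a, 1),                        # plain fake image
        (blend(fake_a, fake_b), 1),         # fake + fake synthetic image (claim 7)
        (blend(fake_a, another_real), 1),   # fake + real synthetic image (claim 8)
        (blend(real, another_real), 0),     # real + real combination kept real (claim 9)
    ]

One plausible motivation, not stated explicitly in the claims, is that the real/real combination gives the classifier examples in which blending artifacts alone are not evidence of forgery.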
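Finally, claims 10 and 20 recite training the classifier so that synthetic and fake images are classified as fake while the real/real combination is classified as real. The following minimal sketch shows one such binary training step; the classifier architecture, optimizer, and binary cross-entropy loss are illustrative assumptions.

    # Sketch of classifier training (claims 10, 20); architecture and loss are assumptions.
    import torch
    import torch.nn as nn

    classifier = nn.Sequential(
        nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
        nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 1),
    )
    optimizer = torch.optim.Adam(classifier.parameters(), lr=1e-4)
    loss_fn = nn.BCEWithLogitsLoss()

    def train_step(images, labels):
        # images: (N, 3, H, W) batch drawn from the training image set;
        # labels: 1 for fake/synthetic images, 0 for real or real+real images.
        logits = classifier(images).squeeze(1)
        loss = loss_fn(logits, labels.float())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

    # Usage with a dummy batch standing in for the generated training image set.
    batch, labels = torch.rand(8, 3, 64, 64), torch.randint(0, 2, (8,))
    train_step(batch, labels)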
Priority Claims (1)
  • Number
    10-2022-0032768
  • Date
    Mar 2022
  • Country
    KR
  • Kind
    national