• vane@lemmy.world · 2 days ago

    Also ban photo editors, then, because they're an even lower barrier to entry. These are not models in the first place but VAEs that modify a model's output. To use one you need some understanding of how to connect the correct VAE to the correct model, so you have to set up tools for it or write code. And if you can set up those tools, you can just as well use them to train your own VAE on a small set of pictures of any person, on modest hardware, in under an hour.

    Not that I support generating naked pictures of people, but this is how it works and has always worked. It's not the models! https://en.wikipedia.org/wiki/Variational_autoencoder
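
    For what it's worth, here's a minimal sketch of what "connecting the correct VAE to the correct model" looks like in practice, assuming the Hugging Face diffusers library; the checkpoint names are just well-known public examples, nothing specific to this story:

        import torch
        from diffusers import AutoencoderKL, StableDiffusionPipeline

        # A VAE is only the encoder/decoder between pixel space and latent
        # space; it is not a generative model on its own.
        vae = AutoencoderKL.from_pretrained(
            "stabilityai/sd-vae-ft-mse", torch_dtype=torch.float16
        )

        # It has to be attached to a base model whose latent space it matches.
        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5", vae=vae, torch_dtype=torch.float16
        ).to("cuda")

        image = pipe("a watercolor painting of a lighthouse").images[0]
        image.save("out.png")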

    • FooBarrington@lemmy.world · 2 days ago

      Why would they be VAEs instead of LoRAs?

      It’s also incredibly simple to use these things, you don’t need any technical knowledge.

      • vane@lemmy.world · 2 days ago

        They can also be LoRAs; that doesn't change the fact that those aren't models either. A VAE is just easier to train. Also, calling something simple is very subjective.
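
        If it helps, a rough sketch of the LoRA case, again assuming diffusers; the LoRA checkpoint is a well-known public example. The point either way is that both pieces attach to an existing base model rather than being models themselves:

            import torch
            from diffusers import StableDiffusionPipeline

            pipe = StableDiffusionPipeline.from_pretrained(
                "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
            ).to("cuda")

            # A LoRA is a small set of low-rank weight deltas patched onto the
            # base model's UNet and text encoder, not a standalone model.
            pipe.load_lora_weights("latent-consistency/lcm-lora-sdv1-5")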

  • ZDL@lazysoci.al · 3 days ago

    Wait, you mean techie nerds created new technology without thinking the consequences through, and it was used to degrade, harass, and otherwise treat women like shit?

    This has never, not even once, happened before!

    • SW42@lemmy.world · 3 days ago

      Why do you assume it's women? It could also be men. It isn't, but it could be ;)

      • ZDL@lazysoci.al · 3 days ago

        Because I live in reality, not in some bizarre cloud-cuckoo land.