
People’s websites get pummeled by LLM companies scraping the web. I bet they don’t scrape I2P, though. Would using eepsites be a remedy?

  • ianhclark510@lemmy.blahaj.zone · 2 days ago

    I mean, feel free to correct me, but there is nothing inherent to I2P that would stop an AI agent from scraping content, any more than on the visible internet.

    • onlinepersona@programming.dev (OP) · 2 days ago

      It’s the speed. I2P isn’t fast (at the moment). Unless they run nodes of their own, they can’t increase the network’s speed. By forcing them onto I2P, they’d have to become good citizens in order to consume anything at all.

  • Multiplexer@discuss.tchncs.de · 2 days ago

    You could also just set up password protection for access.
    It would have a similar effect on your overall visitor count and is even more effective against potential scrapers.
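
    The password-protection idea above can be sketched as plain HTTP Basic Auth. This is a minimal illustration using only the Python standard library; the credentials and the `AuthHandler` name are made up for the example, not anything from the thread:

    ```python
    import base64
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical credentials for illustration only; use real secrets in practice.
    USERNAME = "reader"
    PASSWORD = "letmein"
    EXPECTED = "Basic " + base64.b64encode(f"{USERNAME}:{PASSWORD}".encode()).decode()

    class AuthHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Scrapers without the right Authorization header get a 401 and no content.
            if self.headers.get("Authorization") != EXPECTED:
                self.send_response(401)
                self.send_header("WWW-Authenticate", 'Basic realm="site"')
                self.end_headers()
                return
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"hello, authenticated visitor\n")

    if __name__ == "__main__":
        # Serve on localhost:8000; anonymous crawlers see only 401 responses.
        HTTPServer(("127.0.0.1", 8000), AuthHandler).serve_forever()
    ```

    In practice you would put this kind of check in the web server itself (e.g. Basic Auth in the reverse proxy) rather than in application code, but the gating logic is the same.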