Like, the issue with modern AI is the data centers and central control, no? How feasible would an AI be whose code is FOSS and that is trained and run in a decentralized way?

  • JASN_DE@feddit.org · 17 hours ago

    Technically? Not a problem. The reason most of them run in data centers is the massive amount of computational and therefore also electrical power you need to run a somewhat useful model.

    Even worse when you need to initially train them. That’ll really hit the wallet.

    There is (by now) a vast selection of models you can easily run at home without any outside connection, as long as you have reasonable hardware to run them.
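
    For instance, here is a minimal sketch of what "running a model at home" can look like, assuming the llama-cpp-python package and a locally downloaded quantized GGUF file (the model path below is a placeholder):

    ```python
    # Minimal sketch: run a quantized open-weight model entirely offline
    # with llama-cpp-python. The model path is a placeholder for whatever
    # GGUF file you have downloaded; no network connection is needed at runtime.
    from llama_cpp import Llama

    llm = Llama(model_path="./models/some-7b-model.Q4_K_M.gguf", n_ctx=2048)

    out = llm("Q: Why do large models usually run in data centers? A:", max_tokens=128)
    print(out["choices"][0]["text"])
    ```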

    • village604@adultswim.fan · 16 hours ago

      Based on OP's comments in the rest of the thread, they're talking about a Folding@home-type system of distributed volunteer compute, not a locally run LLM.
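
      To illustrate the idea, here is a toy sketch of Folding@home-style training: simulated volunteer nodes each compute gradients on their own data shard, and a coordinator averages those gradients before updating the shared model. This is a simplified PyTorch illustration of the concept, not how any real decentralized-training project implements it; the model, data, and node count are all made up.

      ```python
      # Toy illustration of volunteer-compute ("Folding@home-style") training.
      # Each simulated node computes gradients on its private data shard;
      # the coordinator averages them and applies one update to the shared model.
      import torch
      import torch.nn as nn

      torch.manual_seed(0)
      model = nn.Linear(10, 1)                      # stand-in for a shared model
      loss_fn = nn.MSELoss()
      optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

      # Pretend each "node" holds a private shard of training data.
      shards = [(torch.randn(32, 10), torch.randn(32, 1)) for _ in range(4)]

      for step in range(100):
          grads = []
          for x, y in shards:                       # work a real system would farm out to volunteers
              model.zero_grad()
              loss_fn(model(x), y).backward()
              grads.append([p.grad.clone() for p in model.parameters()])

          # Coordinator: average the gradients returned by all nodes, then update.
          optimizer.zero_grad()
          for i, p in enumerate(model.parameters()):
              p.grad = torch.stack([g[i] for g in grads]).mean(dim=0)
          optimizer.step()
      ```

      In a real system the hard parts are exactly what this sketch skips: moving gradients or weights over slow home connections, trusting untrusted volunteers, and keeping nodes in sync.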