>>/10638/
It took approximately 5 minutes for >>/10638/ to show up (after pressing "New Reply").
>>/10575/
>>/10599/
> GraphQL for AR
While researching, before I found that page, I came across the video linked below.
How do you run AI workloads, like generative AI or large language models, locally? There are roughly 4 levels:
1. Casual computing: no stellar results
2. Personally paying for a powerful array of computers
3. Collaborative/community-based computing, distributed over a large geographical area
4. Paid solutions: centralized or decentralized
The first two levels can only go so far. The third level is OK, but when it's done for free, there's much less motivation to participate. Some evidence for this: Filecoin vs. free storage methods (community-based or otherwise). Filecoin stores exabytes of data because there is a motivation to store it: money. Paid centralized solutions are OK too; Amazon and other big tech companies have huge data centers. However, if you read about edge computing at https://wikipedia.vern.cc/wiki/Edge_computing you will see this text:
> The world's data is expected to grow 61 percent to 175 zettabytes by 2025.[16] According to research firm Gartner, around 10 percent of enterprise-generated data is created and processed outside a traditional centralized data center or cloud. By 2025, the firm predicts that this figure will reach 75 percent.[17] The increase of IoT devices at the edge of the network is producing a massive amount of data — storing and using all that data in cloud data centers pushes network bandwidth requirements to the limit.
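Back-of-envelope check on that quote: if 61 percent growth lands at 175 ZB, the implied starting point works out like this (just arithmetic on the quoted figures, nothing authoritative):

```python
# Quoted projection: data grows 61% to reach 175 zettabytes by 2025.
projected_zb = 175
growth = 0.61

# Implied baseline before that growth: x * (1 + 0.61) = 175
baseline_zb = projected_zb / (1 + growth)
print(f"implied baseline: {baseline_zb:.1f} ZB")  # roughly 108.7 ZB
```

So the quote implies roughly 109 ZB of data at the time of writing, i.e. dozens of Filecoins' worth of growth in a few years.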
Therefore, decentralized solutions may be the future, for both storage and compute. I heard about a decentralized physical infrastructure network (DePIN) used to take on GPU/compute workloads: a cryptocurrency called Render. I don't know if this specifically is any good for AI/ML or 3D rendering or whatever, but it's an interesting general idea - "DePin Crypto Ecosystem Overview (Clore, Render, Arweave)":
https://invidious.private.coffee/watch?v=sEqTji_qMUs
I looked at the "Pony Preservation Project" (AI general/project) in 4chan /mlp/ some time ago, and apparently they use some "level 3" community-computing setup. I don't know the specifics. At that level, it's limited by the number of enthusiasts who want to get involved. With level 4 (at least for storage), more users/systems get involved because money is in the mix. So compare the number of enthusiasts for whatever the data/compute is about regardless of money vs. the number of entities involved when money is added to the equation. With storage, the paid system probably totally beats the free system; I'm guessing the same is true for compute.
Comparing "public clouds":
. archive.org: <1 exabyte
. BitTorrent: on the exabyte scale (or larger?), I'd guess; interested in more specific guesstimates/research
. Filecoin: about 22 exabytes - https://filfox.info/en
. Non-Filecoin IPFS: petabyte scale, maybe/probably exabyte scale, also interested in more specific numbers
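Putting the two figures that are actually stated above on one scale (treating archive.org's "<1 exabyte" as an upper bound, not a measurement; the BitTorrent and non-Filecoin IPFS numbers are too fuzzy to include):

```python
# SI units, in bytes.
PB = 10 ** 15  # petabyte
EB = 10 ** 18  # exabyte

filecoin = 22 * EB        # ~22 EB per https://filfox.info/en
archive_org_max = 1 * EB  # "<1 exabyte"

# Even against its upper bound, archive.org is over an order of
# magnitude smaller than Filecoin's stored data.
ratio = filecoin / archive_org_max
print(f"Filecoin stores at least {ratio:.0f}x archive.org's upper bound")
```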