hbmmaster:

it’s weird how often people are recycling arguments against NFTs to argue against AI art. like, the reasons NFTs were bad were really easy to understand, but that doesn’t mean you can just transplant that understanding onto the new trendy tech thing! it’s a completely different technology.

in particular, I keep seeing people make points about the power consumption of AI art. that argument made a lot of sense for NFTs! cryptocurrency wastes a lot of energy on purpose; this is the thing that’s supposed to give it its value. this is extremely bad for the environment. there, nice solid argument, perfectly reasonable.

but like, you can’t just take that same exact argument unaltered and apply it to AI art. it is technically true that training a large model on millions of labeled images consumes a significant amount of energy, but that energy cost is a one-time thing. once the model already exists, using it consumes a completely ordinary amount of electricity, as far as the power consumption of software is concerned. it would be a bad idea for hundreds of thousands of people to all rent out server space to train dozens of cutting edge generative models each, but that’s simply not the reality of the situation.

like, sure, it’s not an insignificant amount of power consumption, and it certainly doesn’t have a positive environmental impact, but it’s not actually any worse for the environment than any other popular internet-based service. it only makes sense to call this particularly wasteful if you already believe that generative AI models themselves are a bad thing to put a moderate amount of resources into, which means this is a really ineffective way to try to convince someone who likes to use these AI models to stop.

uselessservo:

My PhD work is related to physical acceleration of machine learning processes, and a counterintuitive thing I’ve stumbled across is that inference (i.e. running the model once it’s trained) can actually be comparable to, or much greater than, training in total computing cost, purely because you infer many, many more times than you train.

However, there is a growing discrepancy in this research effort. The absolute majority of the research done when it comes to ML carbon emissions is focused on the model training. At the same time, multiple sources estimate that when the ML deployment pipeline is considered as a whole, inference consumes the majority of compute resources, accounting for anything from 70 to 90% [1, 2, 3, 4].

[overview from https://www.seldon.io/the-environmental-impact-of-ml-inference]

You only have to train ChatGPT once, but then you have to service all the people chatting to it, and that’s a lot of matrices to multiply. So while NFTs are of course worse in terms of compute use per useful output, because the whole point of crypto is doing useless computation to prove that you can, I don’t think environmental concerns about generative AI are unfounded, especially as the models explode in size.
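to make the “inference adds up” point concrete, here’s a toy back-of-envelope sketch. every number in it is a made-up placeholder chosen for easy arithmetic, not a measurement of any real model; the point is just the shape of the comparison (one-time cost vs. per-query cost times volume):

```python
# Toy break-even calculation: how long until cumulative inference energy
# matches the one-time training cost? All three constants below are
# hypothetical placeholders, not real measurements.

TRAINING_ENERGY_KWH = 1_000_000   # hypothetical one-time training cost
ENERGY_PER_QUERY_KWH = 0.003      # hypothetical cost of serving one query
QUERIES_PER_DAY = 10_000_000      # hypothetical usage volume

# Energy spent on inference each day, and days until it equals training.
daily_inference_kwh = ENERGY_PER_QUERY_KWH * QUERIES_PER_DAY
breakeven_days = TRAINING_ENERGY_KWH / daily_inference_kwh

print(f"inference per day: {daily_inference_kwh:,.0f} kWh")
print(f"training cost matched after ~{breakeven_days:.1f} days of serving")
```

with these particular made-up numbers the serving side catches up to the training side in about a month, which is why the per-deployment picture ends up dominated by inference even though each individual query is cheap.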

Agreed still on the general message - the power consumption is likely not the most important argument against generative AI! But it is still a factor, and a growing one.

hbmmaster:

that is a fair point, the per-use energy cost being low still adds up quite a lot when the service is being used this frequently. as you said though it’s still not really comparable to the environmental costs involved with crypto. the numbers I’d consider most reasonable to compare it to here would be the environmental impact of hosting the servers for a popular online video game.