

Sorry, you’re right. I meant that training the LLM is what uses lots of energy; I guess that’s not the end user’s fault.
won’t ruin your career
Granted, but it will still suck up a fuck ton of coal-produced electricity.
I think I do. Might be an illusion, though.
deleted by creator
classifier model which labels the ads
Not exactly my idea; I read about it somewhere when the news that YT was supposedly serving server-side ads started to spread.
You could record the video several times, check for differences between the streams, and cut those segments out (rough sketch of the idea below). It might be more resource-intensive in network and storage, but I think it’s still cheaper than a neural network hogging the GPU.
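A minimal sketch of that idea, just to make it concrete: the file names (take1.mp4, take2.mp4) and the fixed-size chunking are assumptions of mine, and a real tool would align on the stream’s own segment boundaries (HLS/DASH segments) rather than byte offsets, since differing ad lengths would shift everything after the first ad. The point is only that hashing and diffing two downloads is cheap compared to running a classifier model.

```python
import hashlib
from pathlib import Path

# Compare in 1 MiB chunks; a real tool would split on the stream's own
# segment boundaries (HLS/DASH) instead of fixed byte offsets.
CHUNK_BYTES = 1 << 20


def chunk_hashes(path: Path) -> list[str]:
    """Hash fixed-size chunks of a recorded stream."""
    hashes = []
    with path.open("rb") as f:
        while chunk := f.read(CHUNK_BYTES):
            hashes.append(hashlib.sha256(chunk).hexdigest())
    return hashes


def common_chunks(recording_a: Path, recording_b: Path) -> list[int]:
    """Return indices of chunks identical in both recordings.

    Intuition: the actual video is the same in every download, while
    server-side injected ads differ between downloads, so chunks that
    match across recordings are (probably) the real content.
    """
    a, b = chunk_hashes(recording_a), chunk_hashes(recording_b)
    return [i for i, (ha, hb) in enumerate(zip(a, b)) if ha == hb]


if __name__ == "__main__":
    # Hypothetical file names for two separate downloads of the same video.
    keep = common_chunks(Path("take1.mp4"), Path("take2.mp4"))
    print(f"{len(keep)} chunks matched across both recordings")
```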
To your first question, nope, I have no idea how much energy it takes to index the web in the traditional way (e.g. MapReduce). But I think it’s been pretty clear in recent years that training AI consumes more energy (so much that the big corporations are investing in nuclear energy; I think there was an article about companies giving up on meeting their 2030 [or 2050?] carbon emission goals, but I couldn’t find it).
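For a sense of what “the traditional way” means here, a toy, single-machine sketch of the MapReduce pattern for building an inverted index (the URLs and page text are made up); it’s essentially tokenizing and grouping, which is a very different kind of workload from a GPU training run:

```python
from collections import defaultdict

# Toy corpus standing in for crawled pages (URLs and contents are made up).
pages = {
    "https://example.com/a": "cats and dogs",
    "https://example.com/b": "dogs chase cats",
}


def map_phase(url: str, text: str):
    """Map: emit (term, url) pairs for every word on the page."""
    for term in text.lower().split():
        yield term, url


def reduce_phase(pairs):
    """Reduce: group URLs by term to build the inverted index."""
    index = defaultdict(set)
    for term, url in pairs:
        index[term].add(url)
    return index


if __name__ == "__main__":
    pairs = (pair for url, text in pages.items() for pair in map_phase(url, text))
    index = reduce_phase(pairs)
    print(index["cats"])  # both URLs mention "cats"
```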
About the second… I agree with you, but I also think the problem is much bigger and more complex than that.