Day 109

Been observing vibe coding, and doing some of my own.

The backbone of the new AI boom will be processing power. I don’t need to know much about this, but it’s worth taking a quick look at things.

So, for instance, do OpenAI or any of the other AI companies release information on their power consumption?

Is there any official power consumption data?

Does the hardware vary between enterprise companies, or is it pretty much standardized?

I suppose at this point you have two ways of looking at it:

1 – How good are micro-level language frameworks going to get? i.e. models running on low-powered devices
2 – People are going to get lazier and overuse AI for everything, which will create a radical increase in energy demand.

Turns out that, in the main, the energy used to actually generate a single ChatGPT response is fairly negligible.
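A quick back-of-envelope calculation shows why the per-response cost feels negligible. The ~0.3 Wh per query figure below is a commonly cited rough estimate, not official data from OpenAI, so treat it purely as an assumption:

```python
# Back-of-envelope energy cost of chatbot usage.
# WH_PER_QUERY is an assumed figure (commonly cited estimate, not official data).
WH_PER_QUERY = 0.3        # assumed watt-hours per response
QUERIES_PER_DAY = 20      # hypothetical heavy personal usage

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY
yearly_kwh = daily_wh * 365 / 1000

print(f"Daily: {daily_wh:.1f} Wh, yearly: {yearly_kwh:.2f} kWh")
# For scale: a 10 W LED bulb left on for an hour uses 10 Wh,
# more than this hypothetical user's entire day of queries.
```

The individual numbers are tiny; the open question is what happens when they're multiplied across billions of daily queries plus the energy cost of training.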
