Had a call yesterday with the team at HeyGen.com
They generate avatars for use in influencer marketing. Some key notes:
- They have an API you can use
- Multilingual
- Pre-generated ones can basically say whatever you want
- You can speak to one of these avatars and it will talk back to you in real time, not pre-generated

During the meeting they were also able to take a four-second clip of me from the video call, render an avatar from it, and make me say anything. The mouth movements and voice weren't great, but for a four-second sample it was good.
It's easy to see that we'll soon be interacting with these agents on websites, and as you talk to them, they will present the data needed to complete your goal alongside the conversation.
I imagine operating systems will just have these built in. Back to 'Clippy'!

High-quality human avatars, driven in real time by LLMs you can talk to, will be the future of e-commerce.
It was always going to happen, and it's been tried before, but now the tech looks like it's finally converging.
Websites will just have a shopping assistant that talks to you and suggests products, so you won't have to do as much searching, or even typing (which might be a good thing for those of us who've coded for years) …
Psychologically, humans will connect more with a human-looking avatar helping them complete their goals.
You will talk to it, the LLM runs off, uses RAG over the items available for sale, and comes back with a solution to present to the user. If it's clothing, the avatar may just change outfits to demonstrate it.
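The retrieval step described above can be sketched roughly like this. This is a minimal, hypothetical example: the catalogue, the item names, and the keyword-overlap scoring are all my own stand-ins, and a real shopping assistant would retrieve over vector embeddings of product descriptions rather than raw tag matching before handing the results to the LLM.

```python
# Hypothetical sketch: score each catalogue item by keyword overlap
# with the shopper's request, return the best matches, and let the
# avatar's LLM present them. Real systems would use embeddings.

CATALOGUE = [
    {"name": "Linen summer shirt", "tags": ["clothing", "shirt", "summer", "linen"]},
    {"name": "Waterproof hiking jacket", "tags": ["clothing", "jacket", "waterproof"]},
    {"name": "Ultralight 14-inch laptop", "tags": ["laptop", "electronics", "lightweight"]},
]

def retrieve(query: str, catalogue: list[dict], top_k: int = 2) -> list[dict]:
    """Rank items by how many words from the query appear in their tags."""
    words = set(query.lower().split())
    scored = [(len(words & set(item["tags"])), item) for item in catalogue]
    scored = [pair for pair in scored if pair[0] > 0]  # drop non-matches
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [item for _, item in scored[:top_k]]

# The matched items become the context the avatar talks about
# (or, for clothing, changes into).
matches = retrieve("lightweight laptop", CATALOGUE)
```

The point isn't the scoring function, it's the loop: user speaks, retrieval narrows the catalogue, and the avatar presents only what's relevant.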
Further into the future, the avatar might interact with the product itself, like holding a laptop. I'm unsure exactly how, but I imagine eventually all products will be showcased in 3D so the avatar can work with them somehow.
Interesting times.
Every day I see something that makes me both sad and excited, depending on what perception I want to take!
Vercel AI Chatbot
Carrying on from yesterday, I managed to switch the local chatbot to OpenAI, so now I have the local UI up and running and hooked up to OpenAI's API. I can also change the colour scheme now.

Ok, that’s it for today’s R&D update.
Will also have some news on the DXP project soon.
We'll also go through n8n workflows that push content from WordPress out to the socials.