3 Things to Start LLMO
We recently read the article SEO is Dead, Long Live LLMO. As the use of LLMs is an inevitable trend, doing LLMO (large language model optimization) is increasingly important for marketers. Beyond the in-context learning prompt techniques discussed in that article, we think that getting your content included in an LLM training data set is actually pretty important for LLMO. Based on our understanding of LLMs, we suggest these three things to start LLMO.
Aim for the Long Term
Training and updating the models behind existing LLM platforms requires a large amount of computing power and is processed in batches. Getting your content into an LLM training set won't be faster than getting your site indexed.
But do those models actually update? Surely they do. A stale model cannot answer everybody's new questions, or else the prompts needed to compensate will grow too large to handle. One more thing: those GPU systems weighing as much as four elephants, with 144 TB of memory, will surely accelerate the training and updating of LLM models.
Setting a long-term aim defines your timing, your attitude, and your strategy for getting along with LLMs. You don't refresh your prompts every day in ChatGPT or Bard. The LLM models will update eventually; focus on the content you manage and produce, not on things beyond your control.
Create New Links
There is already a flood of content created by LLMs and other AI models, not to mention the content generated by manual copy and paste. So why would your content prevail? The key to unlocking the LLM door is to create new links, i.e., to create content describing new relationships between existing things, or to create answers for new questions.
For example, we can easily relate Apple with the iPhone, but if you write reasonably about apples and skiing equipment, you create a new link. The reason behind this lies in the attention mechanism, which is the foundation of almost every state-of-the-art LLM. The new relationship you present results in different embeddings and attention scores, and therefore fits better into the training set.
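The embedding intuition above can be sketched in a few lines of Python. This is a toy illustration, not a real LLM pipeline: the three-dimensional vectors below are made-up values standing in for learned embeddings, which in practice have hundreds or thousands of dimensions. The point is only that an already familiar pairing (apple and iPhone) sits close together in embedding space, while a novel pairing (apple and skiing) does not, so content connecting the latter carries new information.

```python
import math

# Hypothetical toy word vectors for illustration only; real embeddings
# are learned by the model and are far higher-dimensional.
embeddings = {
    "apple":  [0.9, 0.1, 0.2],
    "iphone": [0.8, 0.2, 0.3],
    "skiing": [0.1, 0.9, 0.8],
}

def cosine_similarity(a, b):
    """Cosine similarity: close to 1.0 means strongly related directions."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

common = cosine_similarity(embeddings["apple"], embeddings["iphone"])
novel = cosine_similarity(embeddings["apple"], embeddings["skiing"])

print(f"apple vs iphone: {common:.2f}")  # high: a well-known relationship
print(f"apple vs skiing: {novel:.2f}")   # low: a link your content could create
```

With these toy numbers the familiar pair scores much higher than the novel pair, which is exactly why content that bridges the low-similarity pair adds something the training set doesn't already have.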
Don’t Forget SEO
Today, many data centers in America are actually located along the first transcontinental railroad, built in the 1860s. That was over 150 years ago, yet it still shapes today's technologies like big data and cloud computing.
If we agree that SEO is bound to be replaced by LLMO, consider that the data used to train LLMs are largely the same pages that search engines favor. LLMs are still at an emerging stage, and the training data they need are still out there on the web.
So, don’t forget about SEO.