Data architecture for investigating LLMOps on Google Cloud microservices
Here is a data architecture for investigating LLMOps on Google Cloud microservices, with LangchainAI as the catalyst.
This was of course built with LLM coding assistance, so it's much quicker to prototype. It all runs on Google Cloud Run and Pub/Sub, scales from zero up to very large workloads, and makes it easy to swap out LLMs and vector stores.
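To make the Cloud Run and Pub/Sub part concrete, here is a minimal sketch of a Cloud Run service receiving Pub/Sub push messages over HTTP. The envelope and base64 `data` field follow the standard Pub/Sub push format; the endpoint path and the `process_message` function are hypothetical stand-ins for wherever the LangChain pipeline would be invoked.

```python
# Minimal Cloud Run handler for Pub/Sub push subscriptions (sketch).
# process_message is a hypothetical placeholder for the LangChain-driven work.
import base64
import json

from flask import Flask, request

app = Flask(__name__)


def process_message(payload: dict) -> None:
    # Hypothetical: hand the decoded event to the LLM/vector-store pipeline here.
    print(f"processing event: {payload}")


@app.route("/", methods=["POST"])
def pubsub_push():
    envelope = request.get_json()
    if not envelope or "message" not in envelope:
        return "Bad Request: no Pub/Sub message received", 400

    message = envelope["message"]
    raw = message.get("data")
    payload = json.loads(base64.b64decode(raw).decode("utf-8")) if raw else {}
    process_message(payload)

    # Returning a 2xx acknowledges the message; Pub/Sub retries on other codes.
    return "", 204


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

Because Pub/Sub only needs an HTTP 2xx to acknowledge, Cloud Run can scale the service to zero when idle and fan out instances as message volume grows.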
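The "swap out LLMs and vector stores" part falls out of LangChain's common interfaces: callers only see a chat model and a vector store, and small factory functions pick the concrete implementation from configuration. A minimal sketch follows; the environment variables, model names, and collection name are hypothetical placeholders, not part of the original architecture.

```python
# Sketch of configuration-driven swapping of LLMs and vector stores via LangChain.
# USE_VERTEX, VECTORSTORE, the model names, and the collection name are hypothetical.
import os

from langchain_core.documents import Document


def build_llm():
    """Return a chat model; the concrete provider is chosen by configuration."""
    if os.environ.get("USE_VERTEX", "false") == "true":
        from langchain_google_vertexai import ChatVertexAI
        return ChatVertexAI(model_name="gemini-1.5-pro")  # hypothetical model name
    from langchain_openai import ChatOpenAI
    return ChatOpenAI(model="gpt-4o-mini")  # hypothetical model name


def build_vectorstore(embeddings):
    """Return a vector store; implementations swap without touching callers."""
    if os.environ.get("VECTORSTORE", "chroma") == "chroma":
        from langchain_chroma import Chroma
        return Chroma(collection_name="docs", embedding_function=embeddings)
    from langchain_community.vectorstores import FAISS
    seed = [Document(page_content="placeholder document")]
    return FAISS.from_documents(seed, embeddings)
```

Downstream code calls `build_llm()` and `build_vectorstore(...)` and never imports a specific provider, which is what makes switching between LLMs and vector stores a configuration change rather than a code change.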