What if we trained small #LLMs that are experts in one small, tight domain and don't require a lot of resources to run?
And what if they also had the ability to remotely pass other questions back and forth to a more general LLM?
For example, what if there was an LLM trained specifically for your car as a "concierge", controlling the media player, air conditioning, seats, mirrors, etc. through human language? It could run on a few chips distributed throughout your car, powered by the car's battery.
Then, if you suddenly asked a question about Nietzschean philosophy, the little LLM would remotely pass it along to a "big brother", like ChatGPT, in the cloud, and then relay the returned answer.
© 2025 Praveen Puri