Since my earlier testing of Chatbots on Android years ago, a lot has changed, so I’m revisiting how an Android phone/tablet user can interact with LLMs today. There are many more options now, ranging from cloud services all the way down to on-device models.
All of the options in this section are thin apps that simply pass your query up into the cloud. On the positive side, this is often the fastest and most featureful approach. On the negative side, you lose all privacy when conversing with a corporate cloud.

After deploying private Gemma, Mistral, and Qwen LLMs on my home LAN (Ollama running on each PC, with a single instance of OpenWebUI fronting them all), I went looking for a mobile app to reach those home LLMs from my phone. I found Conduit, which I connected to OpenWebUI via Tailscale on my Unraid server.
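Under the hood, Conduit is essentially talking to OpenWebUI’s OpenAI-compatible API over the tailnet. Here is a minimal Kotlin sketch of roughly that same request, assuming OpenWebUI’s /api/chat/completions endpoint; the tailnet hostname, port, model tag, and API key are placeholders for my setup.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Minimal sketch: send one chat message to OpenWebUI's OpenAI-compatible
// endpoint over the tailnet (roughly what Conduit does on my behalf).
// Hostname, port, model tag, and API key are placeholders for my setup.
fun main() {
    val apiKey = System.getenv("OPENWEBUI_API_KEY")
        ?: error("Set OPENWEBUI_API_KEY to an OpenWebUI API key")

    val body = """
        {
          "model": "gemma3:12b",
          "messages": [{"role": "user", "content": "Hello from my phone"}]
        }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://unraid.my-tailnet.ts.net:3000/api/chat/completions"))
        .header("Authorization", "Bearer $apiKey")
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    // Print the raw JSON reply; a real client would parse out
    // choices[0].message.content.
    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body())
}
```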

The real future of LLMs is likely on the edge devices themselves as phones and tablets get more powerful hardware. Even now, my mid-range Google Pixel 9a can run local LLMs quite effectively.
The AI Edge Gallery app from Google is more of a playground demonstration than a real Chatbot app. Google is mainly trying to attract developers who want to include AI in their apps without requiring a network connection or cloud service.
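To show the kind of integration Google is courting, here is a rough Kotlin sketch using the MediaPipe LLM Inference task from the AI Edge stack (Gradle artifact com.google.mediapipe:tasks-genai); the model path and file name are placeholders, and exact builder options vary by release.

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Rough sketch of fully on-device generation with the MediaPipe LLM Inference
// task. The model file must already be on the device; the path below is a
// placeholder for whatever model you have downloaded.
fun runLocalPrompt(context: Context, prompt: String): String {
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath("/data/local/tmp/llm/gemma3-1b-it-int4.task")
        .setMaxTokens(512)
        .build()

    // Loads the model and runs inference entirely on the phone:
    // no network connection or cloud service involved.
    val llm = LlmInference.createFromOptions(context, options)
    try {
        return llm.generateResponse(prompt)
    } finally {
        llm.close()
    }
}
```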

Similarly, the LEAP models in the Apollo app seem to be a developer demonstration, aimed at driving adoption and integration of the underlying on-device SDK.

PocketPal seems to be the leader among true on-device Chatbots, offering a wide selection of free models.

On the other hand, ChatboxAI mentions a “Free” version, but I couldn’t seem to get to it; I was only ever shown license sales pages…
