Install
openclaw skills install android-node

Turn any Android phone into an inference compute node for your AI agent.
Provisions Android phones as Ollama inference endpoints. Any phone running Termux becomes a worker your agent can route jobs to — no cloud, no subscription, no special hardware. The phone on your desk, the spare in your drawer, all of them.
phone_nodes.py manages discovery, health checks, and failover.

On the phone (in Termux), run the provisioning script:

curl -s https://albionwakes.com/phone_setup.sh | bash
bash ~/start_node.sh
python3 phone_nodes.py register myphone 192.168.1.42
python3 phone_nodes.py status
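Under the hood, `register` presumably just writes an entry into the JSON registry described below. A minimal sketch of that behavior — the function name, signature, and defaults here are illustrative assumptions, not the skill's actual code:

```python
import json
from pathlib import Path

# Assumed registry location, per the skill's docs.
REGISTRY = Path.home() / "albion_memory" / "phone_nodes.json"

def register(name: str, ip: str, model: str = "qwen2.5:0.5b",
             port: int = 11434, path: Path = REGISTRY) -> dict:
    """Add (or replace) a node entry in the JSON registry."""
    path.parent.mkdir(parents=True, exist_ok=True)
    nodes = json.loads(path.read_text()) if path.exists() else []
    entry = {"name": name, "url": f"http://{ip}:{port}",
             "model": model, "enabled": True}
    # Drop any stale entry with the same name, then append the new one.
    nodes = [n for n in nodes if n["name"] != name] + [entry]
    path.write_text(json.dumps(nodes, indent=2))
    return entry
```

Port 11434 is Ollama's default listen port, which is why the registry URLs below use it.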
import phone_nodes

# In your provider dispatch:
elif provider == 'phone':
    return phone_nodes.call(messages)
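`phone_nodes.call` ultimately talks to the Ollama HTTP API running on the phone. A self-contained sketch of a single-node request against Ollama's `/api/chat` endpoint — the function name and timeout are illustrative, not the skill's real signature:

```python
import json
import urllib.request

def call_node(url: str, model: str, messages: list[dict],
              timeout: float = 120.0) -> str:
    """Send a non-streaming chat request to one node's Ollama server."""
    body = json.dumps({"model": model, "messages": messages,
                       "stream": False}).encode()
    req = urllib.request.Request(f"{url}/api/chat", data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        # Ollama returns the assistant turn under "message".
        return json.loads(resp.read())["message"]["content"]
```

A generous timeout matters here: a 0.5B model on a phone is fast, but first-token latency after a cold start can be long.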
Stored in ~/albion_memory/phone_nodes.json:
[
{"name": "pixel6", "url": "http://192.168.1.42:11434", "model": "qwen2.5:0.5b", "enabled": true}
]
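Failover can be as simple as walking this list in order and taking the first enabled node that answers a health probe (e.g. a quick GET to Ollama's `/api/tags`). A sketch with the probe injected as a callable so the selection logic is testable offline — `pick_node` is a hypothetical helper, not part of the skill:

```python
from typing import Callable, Optional

def pick_node(nodes: list[dict],
              is_healthy: Callable[[str], bool]) -> Optional[dict]:
    """Return the first enabled node whose health probe succeeds, else None."""
    for node in nodes:
        if node.get("enabled") and is_healthy(node["url"]):
            return node
    return None
```

Disabled nodes are skipped outright, so flipping `"enabled": false` in the registry is enough to drain a phone without deleting its entry.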
qwen2.5:0.5b — 394MB, runs on any phone with 1GB+ free RAM. Fast.
Swap for qwen2.5:1.5b (1GB) or llama3.2:1b (1.3GB) if the phone has headroom.
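Swapping models is a pull on the phone plus a registry edit: run `ollama pull qwen2.5:1.5b` in Termux, then point the node's entry at the new tag. A hypothetical updated registry entry:

```json
[
  {"name": "pixel6", "url": "http://192.168.1.42:11434", "model": "qwen2.5:1.5b", "enabled": true}
]
```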
phone_nodes.py — node registry, health checker, inference caller
setup.sh — Termux provisioning script (run on phone)