feat(opencode): add oh-my-openagent plugin with omo config for ollama-cloud/glm-5.1
Configure oh-my-openagent (omo) plugin for multi-agent orchestration using ollama-cloud and local llama-swap providers. Primary model is ollama-cloud/glm-5.1 with fallback chains. Add runtime fallback, background task concurrency limits, and disable incompatible agents (hephaestus, multimodal-looker).
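The resulting top-level config described above looks roughly like this. Only the `$schema`, `plugin`, `provider`, and `llama-local` entries appear in the diff below; the `model` key and the `ollama-cloud` provider entry are assumptions inferred from the commit title, and the omo-specific settings (fallback chains, background-task concurrency limits, disabled agents) are plugin configuration not reproduced here:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "plugin": ["oh-my-openagent"],
  "model": "ollama-cloud/glm-5.1",
  "provider": {
    "ollama-cloud": {
      "name": "Ollama Cloud"
    },
    "llama-local": {
      "name": "Llama.cpp (zix790prors RTX 4070 Ti)"
    }
  }
}
```

This is a sketch of the shape of the file, not the committed content; the diff hunk below shows only the first few lines of the actual config.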
@@ -1,5 +1,6 @@
 {
   "$schema": "https://opencode.ai/config.json",
+  "plugin": ["oh-my-openagent"],
   "provider": {
     "llama-local": {
       "name": "Llama.cpp (zix790prors RTX 4070 Ti)",