{"skill":{"slug":"litellm-vertex-codex","displayName":"Src","summary":"Configure OpenAI Codex CLI to use Vertex AI Gemini models via LiteLLM. A guide for translating strict Codex requests for Gemini compatibility.","tags":{"latest":"1.0.0"},"stats":{"comments":0,"downloads":42,"installsAllTime":0,"installsCurrent":0,"stars":0,"versions":1},"createdAt":1777302928911,"updatedAt":1777303309414},"latestVersion":{"version":"1.0.0","createdAt":1777302928911,"changelog":"Initial release: Guide for integrating OpenAI Codex CLI with Vertex AI Gemini models via LiteLLM.\n\n- Provides step-by-step instructions for configuring Codex CLI to use Gemini models through LiteLLM as a protocol translation proxy.\n- Includes details on prerequisite tools, LiteLLM `config.yaml` setup, and aliasing models for Codex compatibility.\n- Outlines Codex CLI configuration, emphasizing correct use of the `responses` wire API.\n- Notes requirements for setting an `OPENAI_API_KEY` environment variable.\n- Supplies verification steps and troubleshooting tips for common integration issues.","license":"MIT-0"},"metadata":{"os":null,"systems":null},"owner":{"handle":"bhrum","userId":"s175dc010k455d80ce2x9fwajd854ak9","displayName":"bhrum","image":"https://avatars.githubusercontent.com/u/222737880?v=4"},"moderation":{"isSuspicious":true,"isMalwareBlocked":false,"verdict":"suspicious","reasonCodes":["suspicious.llm_suspicious"],"summary":"Detected: suspicious.llm_suspicious","engineVersion":"v2.4.0","updatedAt":1777303309414}}