The LLM proxy is running. Send POST requests to /v1/responses or /v1/chat/completions.
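
As an illustration, a minimal client call against the chat-completions endpoint might look like the sketch below. The host/port (`localhost:8000`), the model name, and the OpenAI-style request schema are all assumptions; substitute whatever address and payload format your proxy actually uses.

```python
import json
from urllib.request import Request, urlopen

# Assumed proxy address -- replace with the host/port your proxy reports.
BASE_URL = "http://localhost:8000"

# OpenAI-style chat payload; the proxy is assumed to accept this schema.
payload = {
    "model": "gpt-4o-mini",  # hypothetical model name
    "messages": [{"role": "user", "content": "Hello"}],
}

def build_request(path: str) -> Request:
    """Build a POST request for one of the proxy's endpoints."""
    return Request(
        BASE_URL + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("/v1/chat/completions")
# response = urlopen(req)  # uncomment once the proxy is reachable
```

The same `build_request` helper works for `/v1/responses`; only the path (and, depending on the proxy, the payload shape) changes.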