
AI Gateway 18.3: Bedrock GPT-OSS (120B) fails with stopSequences unsupported error

Summary

When using GitLab 18.3 (self-managed) with AI Gateway 18.3, Duo Chat fails when calling the Bedrock GPT-OSS 120B model: Bedrock returns 400 Bad Request, indicating that the model does not support the stopSequences field.

Environment

  • GitLab: 18.3 (self-managed, Ultimate)
  • AI Gateway: 18.3
  • Model: bedrock/openai.gpt-oss-120b-1:0 (AWS us-west-2)
  • Feature: Duo Chat (ReAct agent)

Steps to Reproduce

  1. Enable Duo Chat via AI Gateway 18.3.
  2. Configure AWS Bedrock GPT-OSS 120B (bedrock/openai.gpt-oss-120b-1:0).
  3. Ask Duo Chat any question (example: "How do I estimate story points?").
  4. Observe the AI Gateway logs; a standalone reproduction outside the gateway is sketched below.
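A minimal standalone reproduction outside the gateway, as a sketch only: it assumes LiteLLM is installed, AWS credentials for us-west-2 are configured, and that the stop value shown is representative of what the Duo Chat ReAct agent sends (the exact value is an assumption):

# Hypothetical standalone reproduction; the stop value "Observation:" is an
# assumption about what the ReAct agent sends, not taken from gateway code.
import litellm

try:
    litellm.completion(
        model="bedrock/openai.gpt-oss-120b-1:0",
        messages=[{"role": "user", "content": "How do I estimate story points?"}],
        stop=["Observation:"],        # LiteLLM maps this to Bedrock stopSequences
        aws_region_name="us-west-2",
    )
except litellm.BadRequestError as exc:
    # Expected to surface the same BedrockException about stopSequences.
    print(exc)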

Actual Behavior

AI Gateway returns a 200 to the client, but internally Bedrock responds with 400 Bad Request. Error log excerpt:

{
  "exception_message": "litellm.BadRequestError: BedrockException - {\"message\":\"This model doesn't support the stopSequences field. Remove stopSequences and try again.\"}"
}

Expected Behavior

The request should complete successfully, or the AI Gateway should handle parameters that a given model does not support (e.g. by omitting stopSequences for GPT-OSS 120B).
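One possible mitigation on the LiteLLM side, sketched here rather than confirmed as a gateway fix: LiteLLM can drop parameters a provider rejects when drop_params is enabled, globally or per call. Whether its Bedrock mapping treats stop as droppable for this particular model is unverified, so the explicit additional_drop_params line below is an assumption about how to force it:

import litellm

# Sketch only: whether LiteLLM's param dropping covers this exact case is unverified.
litellm.drop_params = True  # globally drop OpenAI params the target provider rejects

response = litellm.completion(
    model="bedrock/openai.gpt-oss-120b-1:0",
    messages=[{"role": "user", "content": "How do I estimate story points?"}],
    stop=["Observation:"],            # the parameter Bedrock rejects for this model
    drop_params=True,                 # per-call variant of the same switch
    additional_drop_params=["stop"],  # explicitly name the parameter to drop (assumption)
    aws_region_name="us-west-2",
)

Note that the ReAct agent presumably relies on stop sequences to cut generation off at tool-call boundaries, so silently dropping them may change agent behaviour rather than fully resolve the issue.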

Notes

  • This error occurs reproducibly with GPT-OSS 120B (Bedrock).
  • Other Bedrock models may not have this limitation.
  • Possibly related to how LiteLLM maps the OpenAI stop parameter to Bedrock's stopSequences; the boto3 sketch below can help confirm whether the restriction is on the Bedrock side.
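To check whether the restriction comes from Bedrock itself rather than from LiteLLM's mapping, the Converse API can be called directly. A sketch, assuming boto3 with us-west-2 credentials and taking the modelId from the LiteLLM string with the bedrock/ prefix removed:

import boto3

client = boto3.client("bedrock-runtime", region_name="us-west-2")

kwargs = dict(
    modelId="openai.gpt-oss-120b-1:0",
    messages=[{"role": "user", "content": [{"text": "How do I estimate story points?"}]}],
)

# Without stopSequences: expected to succeed.
client.converse(**kwargs)

# With stopSequences (the field LiteLLM populates from the OpenAI stop parameter):
# if the limitation is on the Bedrock side, this should fail with the same
# "This model doesn't support the stopSequences field" validation error.
client.converse(**kwargs, inferenceConfig={"stopSequences": ["Observation:"]})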