Guillaume Salou committed
Commit ac55773 · unverified · 1 Parent(s): 2fac9ff

fix: route bedrock/ model ids via LiteLLM Bedrock adapter (#89)


_resolve_llm_params treated any non-anthropic/openai prefix as an HF
router model and wrapped it as 'openai/<id>' pointed at
router.huggingface.co. 'bedrock/...' therefore fell through to the HF
router, which rejected it with 'model does not exist'. Add a dedicated
bedrock/ branch that passes the model id through as-is so LiteLLM's
Converse adapter picks it up with the standard AWS env creds.
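
For context, a hedged sketch of the pre-fix fallthrough described above. The old code is not shown in this diff, so the helper name, structure, and the exact api_base value are assumptions reconstructed from the commit message:

```python
# Illustrative reconstruction of the old behaviour (not the actual code).
def _resolve_llm_params_old(model_name: str) -> dict:
    if model_name.startswith(("anthropic/", "openai/")):
        return {"model": model_name}
    # Every other prefix was assumed to be an HF router model and wrapped
    # as 'openai/<id>'; 'bedrock/...' landed here, and the HF router
    # rejected it with 'model does not exist'.
    return {
        "model": f"openai/{model_name}",
        "api_base": "https://router.huggingface.co/v1",  # assumed exact URL
    }
```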

Files changed (1)
  1. agent/core/llm_params.py +8 -0
agent/core/llm_params.py CHANGED
@@ -154,6 +154,14 @@ def _resolve_llm_params(
             params["output_config"] = {"effort": level}
         return params
 
+    if model_name.startswith("bedrock/"):
+        # LiteLLM routes ``bedrock/...`` through the Converse adapter, which
+        # picks up AWS credentials from the standard env vars
+        # (``AWS_ACCESS_KEY_ID`` / ``AWS_SECRET_ACCESS_KEY`` / ``AWS_REGION``).
+        # The Anthropic thinking/effort shape is not forwarded through Converse
+        # the same way, so we leave it off for now.
+        return {"model": model_name}
+
     if model_name.startswith("openai/"):
         params = {"model": model_name}
         if reasoning_effort:
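
After the fix, a 'bedrock/' model id flows straight through to LiteLLM. A minimal usage sketch, assuming LiteLLM's standard completion API; the model id is an example, not one taken from this repo:

```python
import litellm

# Per the comment in the diff above, credentials come from the standard
# AWS env vars (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY / AWS_REGION),
# so no LLM-specific auth setup is needed here.
params = {"model": "bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0"}  # example id

# The 'bedrock/' prefix is passed through as-is, so LiteLLM routes the
# call to its Bedrock Converse adapter instead of the HF router.
response = litellm.completion(
    messages=[{"role": "user", "content": "ping"}],
    **params,
)
print(response.choices[0].message.content)
```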