feat(proxy): add NO_OPENAPI env var to disable /openapi.json endpoint#25696

Merged
krrish-berri-2 merged 2 commits into BerriAI:litellm_oss_staging_04_14_2026_p1 from lovek629:feature/no-openapi-env-var-25538
Apr 15, 2026

Conversation

Contributor

@lovek629 lovek629 commented Apr 14, 2026

Relevant issues

Fixes #25538

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/proxy_unit_tests/test_proxy_utils.py file
  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible, it only solves 1 specific problem
  • I have requested a Greptile review by commenting @greptileai and received a Confidence Score of at least 4/5 before requesting a maintainer review

Delays in PR merge?

If you're seeing a delay in your PR being merged, ping the LiteLLM Team on Slack (#pr-review).

CI (LiteLLM team)

  • Branch creation CI run
  • CI run for the last commit
  • Merge / cherry-pick CI run

Screenshots / Proof of Fix

This feature adds a new env var, NO_OPENAPI=true, which disables the /openapi.json endpoint by passing openapi_url=None to the FastAPI constructor. No screenshot is needed, as this is a configuration-only change following the existing NO_DOCS and NO_REDOC pattern.

Type

🆕 New Feature

Changes

  • Added _get_openapi_url() helper function in litellm/proxy/utils.py following the exact same pattern as the existing _get_docs_url() and _get_redoc_url() functions
  • Added openapi_url=_get_openapi_url() to the FastAPI() constructor in litellm/proxy/proxy_server.py
  • Added _get_openapi_url to the import block in proxy_server.py
  • Added 4 parametrized tests in tests/proxy_unit_tests/test_proxy_utils.py covering default, custom URL, disabled, and false cases


vercel bot commented Apr 14, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project | Deployment | Actions | Updated (UTC)
litellm | Ready | Preview, Comment | Apr 14, 2026 1:47pm


@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.

@greptile-apps
Contributor

greptile-apps bot commented Apr 14, 2026

Greptile Summary

Adds a NO_OPENAPI env var (plus an OPENAPI_URL override) to disable or relocate the /openapi.json endpoint, implemented via a new _get_openapi_url() helper that follows the same pattern as the existing _get_docs_url() and _get_redoc_url() helpers.

All remaining findings are P2 style issues (two missing blank lines and one out-of-order import) that will surface as Black/lint CI failures. Run `uv run black .` and reorder the import alphabetically before the next CI run.

Confidence Score: 5/5

  • Safe to merge after running uv run black . to fix formatting — no logic or security issues.
  • All findings are P2 style/formatting issues (missing blank lines, import order) that Black and CI will catch. The implementation is a clean copy of the established _get_docs_url/_get_redoc_url pattern with correct str_to_bool null-handling and adequate test coverage.
  • No files require special attention — formatting fixes only.

Important Files Changed

Filename Overview
litellm/proxy/utils.py Adds _get_openapi_url() following the exact same pattern as _get_docs_url() and _get_redoc_url(). Logic is correct. Minor: only one blank line precedes the function definition; Black requires two between top-level definitions.
litellm/proxy/proxy_server.py Imports _get_openapi_url and passes it to the FastAPI constructor as openapi_url=. The import is inserted out of alphabetical order (after _get_redoc_url instead of before _get_projected_spend_over_limit). Functionally correct.
tests/proxy_unit_tests/test_proxy_utils.py Adds 4 parametrized unit tests matching the pattern of existing _get_docs_url / _get_redoc_url tests. All mock-only, no network calls. Minor: one blank line before the new parametrize decorator; Black requires two.
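The four parametrized cases mentioned above (default, custom URL, disabled, false) could be sketched like this. The helper is inlined so the snippet is self-contained; names and exact assertions are illustrative, not the merged test code.

```python
import os
from typing import Optional

import pytest


def _get_openapi_url() -> Optional[str]:
    # Inlined sketch of the helper under test (see the PR description).
    custom_url = os.getenv("OPENAPI_URL")
    if custom_url:
        return custom_url
    if (os.getenv("NO_OPENAPI") or "").strip().lower() == "true":
        return None
    return "/openapi.json"


@pytest.mark.parametrize(
    "env, expected",
    [
        ({}, "/openapi.json"),                          # default
        ({"OPENAPI_URL": "/spec.json"}, "/spec.json"),  # custom URL
        ({"NO_OPENAPI": "true"}, None),                 # disabled
        ({"NO_OPENAPI": "false"}, "/openapi.json"),     # explicit false
    ],
)
def test_get_openapi_url(env, expected):
    # Save and restore the real environment around each case.
    saved = {k: os.environ.pop(k, None) for k in ("OPENAPI_URL", "NO_OPENAPI")}
    try:
        os.environ.update(env)
        assert _get_openapi_url() == expected
    finally:
        for key, value in saved.items():
            if value is not None:
                os.environ[key] = value
```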

Flowchart

%%{init: {'theme': 'neutral'}}%%
flowchart TD
    A[FastAPI startup] --> B[_get_openapi_url]
    B --> C{OPENAPI_URL set?}
    C -- Yes --> D[Return custom URL]
    C -- No --> E{NO_OPENAPI=true?}
    E -- Yes --> F[Return None\ndisables /openapi.json]
    E -- No --> G[Return '/openapi.json'\ndefault]
    D --> H[FastAPI openapi_url param]
    F --> H
    G --> H

Reviews (1): Last reviewed commit: "test(proxy): add tests for _get_openapi_..."

@codspeed-hq
Contributor

codspeed-hq bot commented Apr 14, 2026

Merging this PR will not alter performance

✅ 16 untouched benchmarks


Comparing lovek629:feature/no-openapi-env-var-25538 (4848767) with main (e64d98f)

Open in CodSpeed

Comment thread litellm/proxy/utils.py

return "/"

def _get_openapi_url() -> Optional[str]:
Contributor


P2 Missing blank line — Black formatting will fail CI

There is only one blank line between _get_docs_url and _get_openapi_url. Black (and PEP 8) requires two blank lines between top-level function definitions, matching the existing _get_redoc_url / _get_docs_url spacing. Since Black is enforced in CI (make lint), this will block the pipeline.

Suggested change:

def _get_openapi_url() -> Optional[str]:

Add a second blank line before this def so the gap matches the rest of the file.

Comment thread litellm/proxy/proxy_server.py

_get_docs_url,
_get_projected_spend_over_limit,
_get_redoc_url,
_get_openapi_url,
Contributor


P2 Import out of alphabetical order

_get_openapi_url is placed after _get_redoc_url, but alphabetically (o before p before r) it should come before _get_projected_spend_over_limit.

Suggested change:

_get_openapi_url,
_get_projected_spend_over_limit,
_get_redoc_url,

Note: If this suggestion doesn't match your team's coding style, reply to this and let me know. I'll remember it for next time!

Comment thread tests/proxy_unit_tests/test_proxy_utils.py

result = _get_docs_url()
assert result == expected_url

@pytest.mark.parametrize(
Contributor


P2 Missing blank line before test function

Only one blank line separates test_get_docs_url from this new test. Black requires two blank lines between top-level definitions. Add a second blank line above this decorator to match the rest of the file and pass CI.

@codecov

codecov bot commented Apr 14, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.


@krrish-berri-2 krrish-berri-2 changed the base branch from main to litellm_oss_staging_04_14_2026_p1 April 15, 2026 01:53
@krrish-berri-2 krrish-berri-2 merged commit 9a8daab into BerriAI:litellm_oss_staging_04_14_2026_p1 Apr 15, 2026
45 of 51 checks passed
krrish-berri-2 pushed a commit that referenced this pull request Apr 16, 2026
…ne (#25777)

* feat(proxy): add NO_OPENAPI env var to disable /openapi.json endpoint (#25696)

* feat(proxy): add NO_OPENAPI env var to disable /openapi.json endpoint - Fixes #25538

* test(proxy): add tests for _get_openapi_url

---------

Co-authored-by: Progressive-engg <lov.kumari55@gmail.com>

* feat(prometheus): add api_provider label to spend metric (#25693)

* feat(prometheus): add api_provider label to spend metric

Add `api_provider` to `litellm_spend_metric` labels so users can
build Grafana dashboards that break down spend by cloud provider
(e.g. bedrock, anthropic, openai, azure, vertex_ai).

The `api_provider` label already exists in UserAPIKeyLabelValues and
is populated from `standard_logging_payload["custom_llm_provider"]`,
but was not included in the spend metric's label list.

* add api_provider to requests metric + add test

Address review feedback:
- Add api_provider to litellm_requests_metric too (same call-site as
  spend metric, keeps label sets in sync)
- Add test_api_provider_in_spend_and_requests_metrics following the
  existing pattern in test_prometheus_labels.py

* fix: ensure `litellm_metadata` is attached to `pre_call` guardrail to align with `post_call` guardrail (#25641)

* fix: ensure `litellm_metadata` is attached to pre_call to align with post_call

* refactor: remove unused BaseTranslation._ensure_litellm_metadata

* refactor: module level imports for ensure_litellm_metadata and CodeQL

* fix: update based off of Codex comment

* revert: undo usage of `_guardrail_litellm_metadata`

* feat: add pricing entry for openrouter/google/gemini-3.1-flash-lite-preview (#25610)

* fix(bedrock): skip synthetic tool injection for json_object with no schema (#25740)

When response_format={"type": "json_object"} is sent without a JSON
schema, _create_json_tool_call_for_response_format builds a tool with an
empty schema (properties: {}). The model follows the empty schema and
returns {} instead of the actual JSON the caller asked for.

This patch:
- Skips synthetic json_tool_call injection when no schema is provided.
  The model already returns JSON when the prompt asks for it.
- Fixes finish_reason: after _filter_json_mode_tools strips all
  synthetic tool calls, finish_reason stays "tool_calls" instead of
  "stop". Callers (like the OpenAI SDK) misinterpret this as a pending
  tool invocation.

json_schema requests with an explicit schema are unchanged.

Co-authored-by: Claude <noreply@anthropic.com>

* fix(utils): allowed_openai_params must not forward unset params as None

`_apply_openai_param_overrides` iterated `allowed_openai_params` and
unconditionally wrote `optional_params[param] = non_default_params.pop(param, None)`
for each entry. If the caller listed a param name but did not actually
send that param in the request, the pop returned `None` and `None` was
still written to `optional_params`. The openai SDK then rejected it as
a top-level kwarg:

    AsyncCompletions.create() got an unexpected keyword argument 'enable_thinking'

Reproducer (from #25697):

    allowed_openai_params = ["chat_template_kwargs", "enable_thinking"]
    body = {"chat_template_kwargs": {"enable_thinking": False}}

Here `enable_thinking` is only present nested inside
`chat_template_kwargs`, so the helper should forward
`chat_template_kwargs` and leave `enable_thinking` alone. Instead it
wrote `optional_params["enable_thinking"] = None`.

Fix: only forward a param if it was actually present in
`non_default_params`. Behavior is unchanged for the happy path (param
sent → still forwarded), and the explicit `None` leakage is gone.

Adds a regression test exercising the helper in isolation so the test
does not depend on any provider-specific `map_openai_params` plumbing.

Fixes #25697

---------

Co-authored-by: lovek629 <59618812+lovek629@users.noreply.github.com>
Co-authored-by: Progressive-engg <lov.kumari55@gmail.com>
Co-authored-by: Ori Kotek <ori.k@codium.ai>
Co-authored-by: Alexander Grattan <51346343+agrattan0820@users.noreply.github.com>
Co-authored-by: Mohana Siddhartha Chivukula <103447836+iamsiddhu3007@users.noreply.github.com>
Co-authored-by: Amiram Mizne <amiramm@users.noreply.github.com>
Co-authored-by: Claude <noreply@anthropic.com>
Sameerlite pushed a commit that referenced this pull request Apr 16, 2026
…ne (#25777)

(squashed commit message identical to the one above)

Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[Feature]: Add NO_OPENAPI env var to disable /openapi.json endpoint

5 participants