
Make custom_ops import optional in native runners and improve error message #18857

Draft
Copilot wants to merge 2 commits into main from copilot/fix-qwen3-5-runner-error

Conversation

Copilot AI (Contributor) commented Apr 13, 2026

Summary

Running the native Python runners (e.g., the Qwen 3.5 examples) fails with a cryptic "AssertionError: Expected 1 library but got 0" when custom_ops_aot_lib is not built. The custom-op Meta kernels registered by this import are only needed for export tracing, not for pybindings inference.

  • extension/llm/custom_ops/custom_ops.py: Improve assertion message to include the search path and actionable guidance (-DEXECUTORCH_BUILD_KERNELS_LLM_AOT=ON or pybind preset).
  • examples/models/llama/runner/native.py, examples/models/llama3_2_vision/runner/native.py: Wrap the custom_ops import in try/except, following the same pattern as kernels/portable/__init__.py and kernels/quantized/__init__.py.
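The optional-import change described above can be sketched roughly as follows. This is a hypothetical illustration, not the PR's exact diff; the module path follows the repository layout named in the summary, and the exception handling details are assumptions.

```python
# Sketch of the optional-import pattern, mirroring the approach used in
# kernels/portable/__init__.py. The custom_ops import registers Meta
# kernels that are only needed for export tracing.
try:
    from executorch.extension.llm.custom_ops import custom_ops  # noqa: F401
except Exception:
    # custom_ops_aot_lib was not built (the import raises AssertionError
    # in that case). Pybindings inference ships its own kernel
    # implementations, so the runner can continue without it.
    pass
```

Catching a broad `Exception` (rather than only `ImportError`) is deliberate in this sketch, because the reported failure surfaces as an `AssertionError` raised while the module is being imported.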

Test plan

Verified the native runner no longer crashes on import when custom_ops_aot_lib is not built. When custom_ops is needed (e.g., during export) and the library is missing, the error now reads:

AssertionError: Expected 1 custom_ops_aot_lib library but got 0 (searched in .../extension/llm/custom_ops).
If building from source, re-build with -DEXECUTORCH_BUILD_KERNELS_LLM_AOT=ON or use the pybind cmake preset.
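The improved assertion could be structured roughly as below. This is a minimal sketch that assumes the library is discovered by globbing the package directory; the helper name `find_custom_ops_aot_lib` and its signature are hypothetical, not taken from the PR.

```python
from pathlib import Path


def find_custom_ops_aot_lib(package_dir: Path) -> Path:
    """Locate the single custom_ops_aot_lib shared library, failing with
    an actionable message if it is missing (hypothetical helper)."""
    libs = list(package_dir.glob("**/*custom_ops_aot_lib*"))
    assert len(libs) == 1, (
        f"Expected 1 custom_ops_aot_lib library but got {len(libs)} "
        f"(searched in {package_dir}).\n"
        "If building from source, re-build with "
        "-DEXECUTORCH_BUILD_KERNELS_LLM_AOT=ON or use the pybind cmake preset."
    )
    return libs[0]
```

Including the searched path and the exact CMake flag in the message is what turns the original one-line assertion into actionable guidance.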

@pytorch-bot

pytorch-bot bot commented Apr 13, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/18857

Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEV

There is 1 currently active SEV. If your PR is affected, please view it below:

⚠️ 12 Awaiting Approval

As of commit 7d04704 with merge base fe71bd4:

AWAITING APPROVAL - The following workflows need approval before CI can run:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla bot added the CLA Signed label Apr 13, 2026. (This label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed.)
@github-actions

This PR needs a release notes: label

If your change should be included in the release notes (i.e. would users of this library care about this change?), please use a label starting with release notes:. This helps us keep track and include your important work in the next release notes.

To add a label, you can comment to pytorchbot, for example
@pytorchbot label "release notes: none"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.

…e runners

Make the custom_ops import optional in native runners since it is not
needed for pybindings inference. Also improve the assertion error message
in custom_ops.py to give actionable guidance when the library is missing.

Fixes the error when running Qwen 3.5 examples without building with
EXECUTORCH_BUILD_KERNELS_LLM_AOT=ON.

Agent-Logs-Url: https://github.com/pytorch/executorch/sessions/6d0bbec5-f0cd-4307-9d9e-3703e499b4ab

Co-authored-by: kirklandsign <107070759+kirklandsign@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Fix error when running Qwen 3.5 examples" to "Make custom_ops import optional in native runners and improve error message" Apr 13, 2026
Copilot AI requested a review from kirklandsign April 13, 2026 23:44

Labels

CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed.

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Error occurred when running Qwen 3.5 examples

2 participants