
[Compiler] Add reserved_launch_param_names to Backend ABC#1970

Open
hinriksnaer wants to merge 6 commits intopytorch:mainfrom
hinriksnaer:backend-abstraction-pr2

Conversation

@hinriksnaer
Collaborator

@hinriksnaer hinriksnaer commented Apr 7, 2026

Builds on the backend registry to eliminate a hardcoded backend string check in device_function.py.

Each backend's kernel launch mechanism reserves certain parameter names (e.g., Triton's `run()` uses `grid`, `num_warps`, `num_stages`). These names cannot be used as kernel variables. Previously, this was handled by a hardcoded list plus an `if backend_name == "tileir"` branch that added TileIR-specific names.

This PR adds reserved_launch_param_names() as a static method on the Backend ABC so each backend declares its own reserved names. The registry aggregates them via all_reserved_launch_param_names() to ensure kernel portability across backends.
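A minimal sketch of the shape this describes. The names `Backend`, `reserved_launch_param_names()`, and `all_reserved_launch_param_names()` come from the PR; the registry structure, the `TileIRBackend` class, and its specific reserved names are assumptions for illustration only.

```python
# Hedged sketch, not the actual Helion/PyTorch implementation.
from abc import ABC


class Backend(ABC):
    @staticmethod
    def reserved_launch_param_names() -> list[str]:
        """Names the backend's launch mechanism reserves; kernel
        variables must not shadow them. Default: none."""
        return []


class TritonBackend(Backend):
    @staticmethod
    def reserved_launch_param_names() -> list[str]:
        # Triton's run() consumes these keyword arguments.
        return ["grid", "num_warps", "num_stages"]


class TileIRBackend(Backend):
    @staticmethod
    def reserved_launch_param_names() -> list[str]:
        # Hypothetical TileIR-specific names, for illustration only.
        return ["grid", "cluster_dims"]


# Assumed registry: the real one presumably discovers backends dynamically.
_REGISTRY = [TritonBackend, TileIRBackend]


def all_reserved_launch_param_names() -> set[str]:
    """Aggregate reserved names across all registered backends so
    generated kernel variables stay portable across backends."""
    names: set[str] = set()
    for backend in _REGISTRY:
        names.update(backend.reserved_launch_param_names())
    return names
```

With this shape, `device_function.py` can query the aggregate instead of branching on a backend name string.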

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Meta Open Source bot. label Apr 7, 2026
@hinriksnaer hinriksnaer requested review from fulvius31 and jansel April 7, 2026 17:36
@hinriksnaer hinriksnaer force-pushed the backend-abstraction-pr2 branch from 40c295c to 16b8a93 Compare April 8, 2026 16:20
...

@property
def experimental(self) -> bool:
Contributor

Should go in prior PR?

(e.g., Triton's ``run()`` method uses ``grid``, ``num_warps``,
``num_stages``, etc.).
"""
return []
Contributor

frozenset might be better here.
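The reviewer's suggestion might look like the sketch below (signatures assumed): returning a `frozenset` makes the reserved names immutable, cheap to union, and safe from accidental mutation by callers.

```python
# Hedged sketch of the frozenset variant; all names beyond those
# mentioned in the thread are assumptions.
from abc import ABC


class Backend(ABC):
    @staticmethod
    def reserved_launch_param_names() -> frozenset[str]:
        return frozenset()


class TritonBackend(Backend):
    @staticmethod
    def reserved_launch_param_names() -> frozenset[str]:
        return frozenset({"grid", "num_warps", "num_stages"})


def all_reserved_launch_param_names(
    backends: list[type[Backend]],
) -> frozenset[str]:
    # frozenset.union accepts any iterables, so aggregation is one expression.
    return frozenset().union(
        *(b.reserved_launch_param_names() for b in backends)
    )
```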

Contributor

Hrm, this looks like a dupe of the prior PR; something is off.

Collaborator Author

This is an extension of the previous PR, so the diffs would disappear on merge. Let me know if there is a different convention for introducing changes that depend on unmerged ones; maybe both PRs should not target upstream?
