## Description
Torch-MLIR implements lowerings for a large number of torch.aten operations, with the implementations spread across various conversion passes (e.g., Torch → Linalg, Torch → StableHLO, decompositions, etc.).
However, there doesn't seem to be a straightforward or centralized way to quickly determine:
- Which aten ops are currently fully supported (lowered end-to-end without known issues)
- Which are partially supported or have limitations
- Which are still missing or unimplemented
This makes it challenging to evaluate torch-mlir's compatibility with specific PyTorch models when using it as a frontend for inference compilation pipelines.
## Motivation
We are exploring torch-mlir as the core frontend for downstream inference deployment workflows. A clear picture of the current aten op coverage would greatly help with:
- Model compatibility assessment
- Deciding whether custom decompositions or op extensions are needed
- Planning long-term integration and contributions around torch-mlir
## Questions / Suggestions
Are there any recommended ways (existing or planned) to get an overview of supported aten ops in torch-mlir? For example:
- Is there a maintained coverage table, list, status matrix, or documentation page that tracks implemented aten ops?
- Is it possible to query the supported ops programmatically (e.g., via some introspection in torch_ods_gen.py, a Python API, torch-mlir-opt flags, or the op registry)?
- If no such mechanism exists yet, would the project consider adding a centralized "aten op coverage overview" document or tool? This would be extremely valuable for downstream users and adopters.
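As one possible interim workaround (not an official torch-mlir API, just a heuristic sketch): since the generated TableGen file `GeneratedTorchOps.td` appears to define each registered aten op as a `Torch_Op<"aten.…", …>` record, a small script could scrape the op names from it. Note this only tells you which ops have a *registered* torch dialect op, not which ones actually lower successfully through a given backend path. The record pattern and file layout below are assumptions about the current repo structure:

```python
import re

# Heuristic: in GeneratedTorchOps.td, ops are assumed to appear as
# records of the form `def ... : Torch_Op<"aten.xxx", [...]> { ... }`.
# Registration of an op does NOT imply a working end-to-end lowering.
TD_OP_PATTERN = re.compile(r'Torch_Op<"(aten\.[A-Za-z0-9_.]+)"')

def registered_aten_ops(td_text: str) -> list[str]:
    """Extract the (deduplicated, sorted) aten op names from TableGen text."""
    return sorted(set(TD_OP_PATTERN.findall(td_text)))

# Small inline sample mimicking the assumed .td record shape, so the
# script is self-contained; in practice you would read the real file.
sample = '''
def Torch_AtenTanhOp : Torch_Op<"aten.tanh", [AllowsTypeRefinement]> {}
def Torch_AtenReluOp : Torch_Op<"aten.relu", [AllowsTypeRefinement]> {}
def Torch_AtenTanh_Op : Torch_Op<"aten.tanh", [AllowsTypeRefinement]> {}
'''

print(registered_aten_ops(sample))  # ['aten.relu', 'aten.tanh']
```

In a real checkout this would be run over `include/torch-mlir/Dialect/Torch/IR/GeneratedTorchOps.td` (path assumed from the repo layout); cross-referencing the result against the conversion-pass sources would still be needed to judge actual lowering support.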
Thank you for the excellent work on torch-mlir — looking forward to any pointers, workarounds, or future plans!