Arm backend: Add experimental support for new TOSAQuantizer #18100
meta-codesync[bot] merged 13 commits into pytorch:main
Conversation
Allows initializing TOSA/EthosU/Vgf quantizers with use_composable_quantizer=True to use a new implementation of the quantizer following the Cortex-M quantizer. See pytorch#17701 for more details.
- Creates a new temporary TOSAQuantizer API layer for switching between the two versions.
- Adds a TOSAQuantizationConfig encapsulating TOSA-specific qspec requirements for certain ops.
- Adds quantizer_support.py for defining which operators are supported by the quantizer.
- Aligns mark_node_as_annotated in the Cortex-M backend with TOSAQuantizer behaviour.
- Updates the quantizer reporter to handle TOSA qspecs, as they are dynamically created.

Signed-off-by: Adrian Lundell <adrian.lundell@arm.com> Change-Id: Icbca66ff86e6f78ffa1c8dcec55e17c25f97d8ca
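A minimal sketch of the temporary switching layer described above: a thin TOSAQuantizer front that dispatches to either the legacy implementation or the new composable one based on the use_composable_quantizer flag. Everything here except the flag name is an illustrative stand-in, not the actual executorch API.

```python
# Hypothetical sketch of a flag-switched quantizer API layer. The two
# implementation classes are placeholders; only use_composable_quantizer
# comes from the PR description.

class _LegacyTOSAQuantizer:
    # Stands in for the existing TOSA quantizer implementation.
    def annotate(self, model):
        return f"legacy-annotated:{model}"


class _ComposableTOSAQuantizer:
    # Stands in for the new implementation following the Cortex-M
    # composable quantizer design.
    def annotate(self, model):
        return f"composable-annotated:{model}"


def TOSAQuantizer(*args, use_composable_quantizer=False, **kwargs):
    """Return the selected quantizer implementation.

    Defaults to the legacy path so existing callers are unaffected;
    flipping the default to True is the planned later migration step.
    """
    impl = (
        _ComposableTOSAQuantizer
        if use_composable_quantizer
        else _LegacyTOSAQuantizer
    )
    return impl(*args, **kwargs)


q = TOSAQuantizer(use_composable_quantizer=True)
print(q.annotate("mobilenet"))  # composable-annotated:mobilenet
```

A front like this lets callers opt in per call site while the old path remains the default, so the eventual default flip is a one-line change.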
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/18100
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures as of commit 0cd9609 with merge base c81126e.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Signed-off-by: Adrian Lundell <adrian.lundell@arm.com> Change-Id: Id81e0c39d13a94a749206441fce60664c80a0af8
Hi @SS-JIA / @digantdesai, this adds a file, do you want/need to check this?
Fails unrelated |
backends/arm/quantizer/__init__.py
Outdated
    # Lazily import heavy quantizer classes to avoid circular imports with
    # Cortex-M quantization configs.
    _LAZY_EXPORTS = {
This is a workaround, since the import situation is a bit messy with imports crossing between the Cortex-M and Arm backends. The idea is to clean this up once things have stabilized; I didn't want to move things around in this commit, to keep the diff cleaner. I hope this is OK.
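The lazy-export pattern hinted at in the snippet above is commonly done with a module-level `__getattr__` (PEP 562): the heavy import only happens on first attribute access, which breaks import cycles. A self-contained sketch, using stdlib names as stand-ins for the quantizer classes:

```python
# Sketch of lazy module exports via PEP 562 __getattr__. In the real
# __init__.py the table would map quantizer class names to their modules;
# a stdlib example is used here so the sketch runs on its own.
import importlib
import sys

# Exported name -> (module path, attribute name).
_LAZY_EXPORTS = {
    "sqrt": ("math", "sqrt"),
}


def __getattr__(name):
    # Invoked only when normal module attribute lookup fails, so the
    # target module is imported on first access, not at import time.
    try:
        module_path, attr = _LAZY_EXPORTS[name]
    except KeyError:
        raise AttributeError(f"module {__name__!r} has no attribute {name!r}")
    value = getattr(importlib.import_module(module_path), attr)
    globals()[name] = value  # cache so later lookups skip __getattr__
    return value


# Demonstration: attribute access on the module object triggers __getattr__.
this_module = sys.modules[__name__]
print(this_module.sqrt(9.0))  # 3.0
```

Because the resolved object is cached in `globals()`, the `__getattr__` hook fires at most once per name, so steady-state access cost matches a normal import.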
I like the overall direction and the Cortex-M quantizer reuse, as well as the priority ordering for the composition. One thing I would say: add an e2e test with both the Ethos and Cortex-M quantizers and run the test on FVP.
- Spelling errors
- Buck fixes
- E2E model test on FVP with new quantizer

Signed-off-by: Adrian Lundell <adrian.lundell@arm.com> Change-Id: I43dadbcec22b3b28c65a1c9708790881dd3868a2
I hope this is what you had in mind for the test @digantdesai; we have also successfully run all tests internally with the new quantizer set to true. The idea is that we can let you and others try this out and give feedback, and then there should be minimal disruption when we flip the flag to True as the default.
Signed-off-by: Adrian Lundell <adrian.lundell@arm.com> Change-Id: I63601333a968eee27011bec4f906964f01ea71b9
- Moves all common base classes/helper functions to arm_quantizer_utils
- Registers Cortex-M qspecs outside of the quantizer_reporter
- Buck file updates
- Docstring formatting

Signed-off-by: Adrian Lundell <adrian.lundell@arm.com> Change-Id: I2ea13ca9b9f02ce4b4e56c2a4fee1fd86b2a3ac5
Ignore MYPY errors for now, since the errors are only typing-related and the code is known to be functional. Signed-off-by: Adrian Lundell <adrian.lundell@arm.com> Change-Id: I638927a57a7101433534542152ae40ea2290e8f4
rascani
left a comment
Ok, I've finally gotten it all the rest of the way through. Thank you for sorting out the lazy imports, that resolved a lot of it. There were still a few more Buck rules that needed fixing up, but there were also usages of some Quantizer APIs that no longer existed.
I think this should do it.
digantdesai
left a comment
Review automatically exported from Phabricator review in Meta.
Signed-off-by: Adrian Lundell <adrian.lundell@arm.com> Change-Id: I9620b6022bf6c528948b90e0e757967dc889dccb
Signed-off-by: Adrian Lundell <adrian.lundell@arm.com> Change-Id: I9ebccb94e71bdb124649f62bdcd4ff31cfb9057e
cc @digantdesai @freddan80 @per @zingo @oscarandersson8218 @mansnils @Sebastian-Larsson @robell