DRP-AI TVM

The Renesas TVM is the extension package of the Apache TVM deep learning compiler for Renesas DRP-AI accelerators, powered by EdgeCortix MERA™. The TVM is a software framework that translates neural networks to run on Renesas MPUs. While the DRP-AI Translator (see the section above) can translate ONNX models for the DRP-AI hardware, it is limited to the AI operations it supports, which restricts the number of supported AI models. The TVM translator expands the number of supported AI models for the RZ/V processors (currently the RZ/V2MA). The TVM translates ONNX models by delegating parts of the generated output between the DRP-AI and the CPU.

This TVM software framework is based on the Apache TVM. It includes Python support libraries and sample scripts. The Python scripts follow the Apache TVM framework API (see the Apache TVM documentation).
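
As an illustration, the following sketch shows the generic Apache TVM steps that such scripts build on: importing an ONNX model into Relay and compiling it for a CPU target. The file name, input name, and shape are placeholder assumptions, and the DRP-AI-specific code generation performed by the rzv_drp-ai_tvm scripts is not shown.

 # Minimal sketch of the generic Apache TVM flow; file name, input name, and
 # shape are assumptions. The DRP-AI-specific code generation is handled by
 # the scripts in the rzv_drp-ai_tvm repository and is not shown here.
 import onnx
 import tvm
 from tvm import relay

 onnx_model = onnx.load("model.onnx")        # placeholder ONNX file
 shape_dict = {"input": (1, 3, 224, 224)}    # assumed input name and shape

 # Import the ONNX graph into Relay, TVM's intermediate representation.
 mod, params = relay.frontend.from_onnx(onnx_model, shape_dict)

 # Compile the model; "llvm" targets the host CPU. For the RZ/V board's
 # Cortex-A CPU a cross-compilation target such as
 # "llvm -mtriple=aarch64-linux-gnu" would be used instead.
 with tvm.transform.PassContext(opt_level=3):
     lib = relay.build(mod, target="llvm", params=params)
 lib.export_library("model_cpu.so")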

  • The TVM provides the following:
    • A wider range of supported AI networks that can run on the DRP-AI and the CPU.
    • Translation of AI models from ONNX files.
    • Translation of AI models from PyTorch .pt saved models. (For other supported AI software frameworks, see Apache TVM.)
    • Translation of models to run on the CPU only, which allows models to run on RZ/G MPUs (see the sketch after this list).
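
The following sketch illustrates the PyTorch and CPU-only paths using the standard Apache TVM API. The model, input shape, and cross-compiler are illustrative assumptions and are not taken from the Renesas sample scripts.

 # Sketch of importing a traced PyTorch model and building a CPU-only library.
 # TVM's PyTorch frontend consumes TorchScript, so the model (here a
 # placeholder torchvision model rather than a saved .pt file) is traced first.
 import torch
 import torchvision
 import tvm
 from tvm import relay

 model = torchvision.models.resnet18(pretrained=True).eval()   # placeholder model
 input_shape = (1, 3, 224, 224)
 scripted = torch.jit.trace(model, torch.randn(input_shape))

 # Relay's PyTorch frontend takes the traced module and (name, shape) pairs.
 mod, params = relay.frontend.from_pytorch(scripted, [("input0", input_shape)])

 # CPU-only build for an AArch64 device such as an RZ/G MPU.
 target = "llvm -mtriple=aarch64-linux-gnu"
 with tvm.transform.PassContext(opt_level=3):
     lib = relay.build(mod, target=target, params=params)
 lib.export_library("model_cpu_only.so", cc="aarch64-linux-gnu-gcc")  # assumed cross compiler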

Official RZ/V2MA TVM Translator GitHub repo: https://github.com/renesas-rz/rzv_drp-ai_tvm


Currently Supported Renesas MPUs

  • RZ/V2MA
  • RZ/V2M

Generated Output

  • DRP-AI + CPU
  • CPU only (see the deployment sketch below)
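
As a sketch of using the CPU-only output, the compiled library can be loaded with the Apache TVM graph executor as shown below; the file, input name, and shape are placeholder assumptions. The DRP-AI + CPU output is deployed on the board following the instructions in the rzv_drp-ai_tvm repository.

 # Sketch of running a CPU-only compiled library with the TVM graph executor;
 # file name, input name, and shape are assumptions.
 import numpy as np
 import tvm
 from tvm.contrib import graph_executor

 lib = tvm.runtime.load_module("model_cpu.so")   # library produced by relay.build()
 dev = tvm.cpu(0)
 module = graph_executor.GraphModule(lib["default"](dev))

 # Feed a dummy input, run inference, and read back the first output tensor.
 module.set_input("input", np.random.rand(1, 3, 224, 224).astype("float32"))
 module.run()
 out = module.get_output(0).numpy()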

Getting Started

Implementation Examples

  • YOLOv5