* Download and link the Buck2 build, Android NDK, and MediaTek ExecuTorch Libraries from the MediaTek Backend Readme ([link](https://github.com/pytorch/executorch/tree/main/backends/mediatek/scripts#prerequisites)).
* MediaTek Dimensity 9300 (D9300) chip device
* Desired Llama 3 model weights. You can download them on HuggingFace ([Example](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct)).
* Download the NeuroPilot Express SDK from the [MediaTek NeuroPilot Portal](https://neuropilot.mediatek.com/resources/public/npexpress/en/docs/npexpress) (coming soon); a sketch of installing the SDK wheels follows this list:
   - `libneuronusdk_adapter.mtk.so`: This universal SDK contains the implementation required for executing target-dependent code on the MediaTek chip.
   - `libneuron_buffer_allocator.so`: This utility library is designed for allocating DMA buffers necessary for model inference.
   - `mtk_converter-8.8.0.dev20240723+public.d1467db9-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl`: This library preprocesses the model into a MediaTek representation.
   - `mtk_neuron-8.2.2-py3-none-linux_x86_64.whl`: This library converts the model to binaries.
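
The two `.whl` packages are ordinary Python wheels and can be installed with `pip`. The commands below are a sketch: the filenames match the versions listed above and may differ for newer SDK releases, and the wheels target Python 3.10 (`cp310`).

```bash
# Install the NeuroPilot Express SDK wheels into the active Python 3.10 environment.
# Adjust the filenames to match the SDK release you downloaded.
pip3 install mtk_neuron-8.2.2-py3-none-linux_x86_64.whl
pip3 install mtk_converter-8.8.0.dev20240723+public.d1467db9-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl

# The two shared libraries are used at runtime on the device; keep them in a known
# location so they can be pushed to the phone later.
ls libneuronusdk_adapter.mtk.so libneuron_buffer_allocator.so
```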
## Setup ExecuTorch
In this section, we will set up the ExecuTorch repo using Conda for environment management. Make sure Conda is available on your system (or follow the instructions to install it [here](https://anaconda.org/anaconda/conda)). The commands below were run on Linux (CentOS).
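
A minimal sketch of that setup is shown below, assuming a fresh clone of the ExecuTorch repository; the environment name is arbitrary, and the exact dependency-install script may differ between ExecuTorch releases.

```bash
# Create and activate a Python 3.10 Conda environment
# (3.10 matches the cp310 SDK wheels listed in the prerequisites).
conda create -n et_mtk python=3.10 -y
conda activate et_mtk

# Clone ExecuTorch and initialize its submodules.
git clone https://github.com/pytorch/executorch.git
cd executorch
git submodule update --init --recursive

# Install ExecuTorch's Python dependencies; the script name may vary by release.
./install_requirements.sh
```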