Commit 1dfacd6

Songhao Jia authored and facebook-github-bot committed

restructure

Summary: This diff is a preparation for bundled program documentation, including:
1. Move bundled program documentation from tutorial/ to sdk/.
2. Remove the Bento notebook and put its content directly in the .md file.
3. Elementary updates to bring it in line with the new API.

Differential Revision: D49550056

1 parent 57c328c commit 1dfacd6

File tree

1 file changed: +186 −1 lines changed


docs/website/docs/tutorials/bundled_program.md renamed to docs/website/docs/sdk/07_bundled_program.md

Lines changed: 186 additions & 1 deletion
@@ -9,7 +9,192 @@ Overall procedure can be broken into two stages, and in each stage we are suppor
## Emit stage

This stage mainly focuses on the creation of a BundledProgram and dumping it out to disk as a flatbuffer file. The main procedure is as follows:

1. Create a model and emit its executorch program.
2. Construct a BundledConfig to record all the information that needs to be bundled.
3. Generate a BundledProgram using the emitted program and the BundledConfig.
4. Serialize the BundledProgram and dump it out to disk.
### Step 1: Create a model and emit its executorch program

This is not the part BundledProgram focuses on, so we just give an example here without detailed API usage. Most of the example is borrowed from bundled_program/tests/common.py:
```python
import torch

from executorch import exir
from executorch.exir import ExecutorchBackendConfig
from executorch.exir.passes import MemoryPlanningPass, ToOutVarPass


class SampleModel(torch.nn.Module):
    """An example model with multiple methods. Each method has multiple inputs and a single output."""

    def __init__(self) -> None:
        super().__init__()
        self.a: torch.Tensor = 3 * torch.ones(2, 2, dtype=torch.int32)
        self.b: torch.Tensor = 2 * torch.ones(2, 2, dtype=torch.int32)

    def encode(self, x: torch.Tensor, q: torch.Tensor) -> torch.Tensor:
        z = x.clone()
        torch.mul(self.a, x, out=z)
        y = x.clone()
        torch.add(z, self.b, out=y)
        torch.add(y, q, out=y)
        return y

    def decode(self, x: torch.Tensor, q: torch.Tensor) -> torch.Tensor:
        y = x * q
        torch.add(y, self.b, out=y)
        return y


method_names = ["encode", "decode"]
model = SampleModel()

capture_inputs = {
    m_name: (
        (torch.rand(2, 2) - 0.5).to(dtype=torch.int32),
        (torch.rand(2, 2) - 0.5).to(dtype=torch.int32),
    )
    for m_name in method_names
}

# Trace to FX Graph and emit the program
program = (
    exir.capture_multiple(model, capture_inputs)
    .to_edge()
    .to_executorch()
    .program
)
```
### Step 2: Construct BundledConfig

BundledConfig is a class under `executorch/bundled_program/config.py` that contains all the information that needs to be bundled for model verification. Here's the constructor API for creating a BundledConfig:
```python
class BundledConfig:
    def __init__(
        self,
        method_names: List[str],
        inputs: List[List[Any]],
        expected_outputs: List[List[Any]],
    ) -> None:
        """Construct the config given inputs and expected outputs.

        Args:
            method_names: Names of all methods to be verified in the program.
            inputs: All input sets to be tested, for all methods. Each element
                of `inputs` holds all the input sets to be run on the method in
                the program with the corresponding name. Each individual input
                set should contain every input the eager model's inference
                function (matching the corresponding execution plan) requires
                for a one-time execution.

            expected_outputs: Expected outputs for the inputs sharing the same
                index. The size of `expected_outputs` should match the size of
                `inputs` and of the provided `method_names`.
        """
```
Here's an example of creating a BundledConfig for the SampleModel above:
```python
from executorch.bundled_program.config import BundledConfig

# Number of input sets to be verified
n_input = 10

# All input sets to be verified for all execution plans.
inputs = [
    # The list below contains all input sets for a single execution plan (inference method).
    [
        # Each list below is an individual input set.
        # The number of inputs, and the dtype and size of each input, follow the Program's spec.
        [
            (torch.rand(2, 2) - 0.5).to(dtype=torch.int32),
            (torch.rand(2, 2) - 0.5).to(dtype=torch.int32),
        ]
        for _ in range(n_input)
    ]
    for _ in range(len(program.execution_plan))
]

# Expected outputs align with inputs.
expected_outputs = [
    [[getattr(model, m_name)(*x)] for x in inputs[i]]
    for i, m_name in enumerate(method_names)
]

bundled_config = BundledConfig(
    method_names, inputs, expected_outputs
)
```
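The nesting above is easy to get wrong: `inputs` is indexed first by method, then by test case, then by individual input. Here is a dependency-free sketch of those shape invariants; the string stand-ins and the counts are purely illustrative, not part of the real API:

```python
# Illustrative stand-ins: 2 methods, 3 test cases per method, 2 inputs per case.
method_names = ["encode", "decode"]
n_input = 3

inputs = [
    # One entry per method: the list of test cases for that method.
    [["x_tensor", "q_tensor"] for _ in range(n_input)]
    for _ in method_names
]
expected_outputs = [
    # One entry per method: one (single-element) output list per test case.
    [["y_tensor"] for _ in range(n_input)]
    for _ in method_names
]

# Invariants implied by the BundledConfig docstring:
assert len(inputs) == len(method_names) == len(expected_outputs)
assert all(len(cases) == n_input for cases in inputs)
assert all(len(outs) == n_input for outs in expected_outputs)
```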
### Step 3: Generate BundledProgram

To create a BundledProgram, we provide `create_bundled_program` under `executorch/bundled_program/core.py`, which generates a BundledProgram by bundling the emitted executorch program with the BundledConfig:
```python
def create_bundled_program(
    program: Program,
    bundled_config: BundledConfig,
) -> BundledProgram:
    """
    Args:
        program: The program to be bundled.
        bundled_config: The config to be bundled.
    """
```
Example:
```python
from executorch.bundled_program.core import create_bundled_program

bundled_program = create_bundled_program(program, bundled_config)
```
### Step 4: Serialize BundledProgram to flatbuffer

To serialize the BundledProgram so the runtime APIs can use it, we provide two APIs, both under `executorch/bundled_program/serialize/__init__.py`.

Serialize BundledProgram to flatbuffer:
```python
def serialize_from_bundled_program_to_flatbuffer(
    bundled_program: BundledProgram,
) -> bytes
```
Deserialize flatbuffer to BundledProgram:
```python
def deserialize_from_flatbuffer_to_bundled_program(
    flatbuffer: bytes
) -> BundledProgram
```
Example:
```python
from executorch.bundled_program.serialize import (
    serialize_from_bundled_program_to_flatbuffer,
    deserialize_from_flatbuffer_to_bundled_program,
)

serialized_bundled_program = serialize_from_bundled_program_to_flatbuffer(bundled_program)
regenerate_bundled_program = deserialize_from_flatbuffer_to_bundled_program(serialized_bundled_program)
```
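The "dump it out to the disk" part of Step 4 is plain Python file I/O, since the serialized form is just `bytes`. A minimal sketch, where the file name `bundled_program.bp` and the placeholder payload are illustrative stand-ins rather than anything the library prescribes:

```python
import os
import tempfile

# Stand-in for the bytes returned by serialize_from_bundled_program_to_flatbuffer().
serialized_bundled_program = b"\x00\x01\x02\x03"

path = os.path.join(tempfile.gettempdir(), "bundled_program.bp")

# Write the flatbuffer to disk in binary mode.
with open(path, "wb") as f:
    f.write(serialized_bundled_program)

# Read it back later, e.g. before handing the bytes to
# deserialize_from_flatbuffer_to_bundled_program().
with open(path, "rb") as f:
    loaded = f.read()

assert loaded == serialized_bundled_program
```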
## Runtime Stage

This stage mainly focuses on executing the model with the bundled inputs and comparing the model's output with the bundled expected output. We provide multiple APIs to handle the key parts of it.
