
Commit 5bff24e

Olivia-liu authored and facebook-github-bot committed
print graph break in llama -- a different if-else branch (#2493)
Summary:
Pull Request resolved: #2493

D54969119 added the print to only one of the if branches (lines 315-319), but the other branch is also used.

Reviewed By: cccclai

Differential Revision: D55033430

fbshipit-source-id: 4b42fa750245555e353838d2bcca86b689b16c26
1 parent 98f679f commit 5bff24e

2 files changed, +5 −1 lines changed

examples/models/llama2/builder.py

Lines changed: 5 additions & 0 deletions
@@ -302,6 +302,11 @@ def to_backend(
                     assert self.edge_manager is not None
                     self.edge_manager = self.edge_manager.to_backend(p)
                     if self.verbose:
+                        logging.info(
+                            print_delegated_graph(
+                                self.edge_manager.exported_program().graph_module
+                            )
+                        )
                         logging.info(f"Applied partitioners: {key}")
             elif isinstance(partitioner, Partitioner):
                 assert self.edge_manager is not None
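
For orientation, here is an illustrative sketch (not the actual builder code) of the if/elif structure this hunk touches: after this change, both delegation branches log the delegated graph when verbose mode is on, matching what D54969119 already added to the elif isinstance(partitioner, Partitioner) branch. The function name apply_partitioners is hypothetical, and the executorch.exir.backend.utils import path is assumed from the file location.

import logging

# Import path assumed from the file location exir/backend/utils.py.
from executorch.exir.backend.utils import print_delegated_graph


def apply_partitioners(edge_manager, partitioner, verbose: bool):
    """Illustrative only: delegate to backend(s) and, when verbose, log the
    resulting graph with its lowered submodules instead of printing it."""
    if isinstance(partitioner, dict):
        # Branch patched by this commit: one partitioner per delegate key.
        for key, p in partitioner.items():
            edge_manager = edge_manager.to_backend(p)
            if verbose:
                logging.info(
                    print_delegated_graph(
                        edge_manager.exported_program().graph_module
                    )
                )
                logging.info(f"Applied partitioners: {key}")
    else:
        # Single-Partitioner branch, already covered by D54969119.
        edge_manager = edge_manager.to_backend(partitioner)
        if verbose:
            logging.info(
                print_delegated_graph(
                    edge_manager.exported_program().graph_module
                )
            )
    return edge_manager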

exir/backend/utils.py

Lines changed: 0 additions & 1 deletion
@@ -258,7 +258,6 @@ def print_delegated_graph(graph_module: torch.fx.GraphModule) -> str:
                 graph_format_str += (
                     f"{indent * 3}{node_in_lowered_module.format_node()}\n"
                 )
-    print(graph_format_str)
     return graph_format_str
 
 
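Because the internal print() is gone, print_delegated_graph() now only builds and returns the formatted string, and the caller chooses where it goes. A minimal usage sketch, assuming the executorch.exir.backend.utils import path (derived from the file location) and a hypothetical helper name:

import logging
from pathlib import Path

import torch

# Import path assumed from the file location exir/backend/utils.py.
from executorch.exir.backend.utils import print_delegated_graph


def dump_delegated_graph(graph_module: torch.fx.GraphModule, out_path: Path) -> str:
    """Hypothetical helper: capture the delegated-graph dump once, then route it
    to logging (as builder.py does) and to a file, with nothing on stdout."""
    graph_str = print_delegated_graph(graph_module)
    logging.info(graph_str)
    out_path.write_text(graph_str)
    return graph_str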