Commit 42764ed
[Executorch][custom ops] Change lib loading logic to account for package dir
Resolving the library location from the source file alone (here, custom_ops.py) can, and does, yield the wrong location depending on where custom_ops is imported from. If custom_ops is imported from another source file inside the extension folder, e.g. builder.py in extension/llm/export, then custom_ops resolves to the copy installed in site-packages (the pip package). But if it is imported from, say, examples/models/llama/source_transformations/quantized_kv_cache.py (as in the next PR), it appears to resolve to the source location; in one CI job this is /pytorch/executorch. The library search then happens in whichever directory the file path resolves to, which fails when that is the source location. This PR changes the lookup to resolve to the package location instead. Differential Revision: [D66385480](https://our.internmc.facebook.com/intern/diff/D66385480/) [ghstack-poisoned]
1 parent 27dde5e commit 42764ed

File tree

1 file changed: +11 -1 lines changed

extension/llm/custom_ops/custom_ops.py

Lines changed: 11 additions & 1 deletion
@@ -23,7 +23,17 @@
         op2 = torch.ops.llama.fast_hadamard_transform.default
         assert op2 is not None
     except:
-        libs = list(Path(__file__).parent.resolve().glob("libcustom_ops_aot_lib.*"))
+        import glob
+
+        import executorch
+
+        executorch_package_path = executorch.__path__[0]
+        logging.info(f"Looking for libcustom_ops_aot_lib.so in {executorch_package_path}")
+        libs = list(
+            glob.glob(
+                f"{executorch_package_path}/**/libquantized_ops_aot_lib.*", recursive=True
+            )
+        )
         assert len(libs) == 1, f"Expected 1 library but got {len(libs)}"
         logging.info(f"Loading custom ops library: {libs[0]}")
         torch.ops.load_library(libs[0])
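The change above swaps file-relative resolution (`Path(__file__).parent`) for package-relative resolution (`executorch.__path__[0]`), so the lookup always lands in the installed package directory regardless of which source tree the importing file lives in. A minimal sketch of the same pattern, using the stdlib `logging` package as a stand-in for `executorch` and globbing for `.py` files instead of the AOT shared library:

```python
import glob
import logging  # stdlib package used here as a stand-in for `executorch`

# Resolve the directory of the *installed* package. Unlike __file__-based
# resolution, this does not depend on where the importing file itself lives.
package_path = logging.__path__[0]

# Recursively search the whole package tree for matching files, as the
# commit does for the libquantized_ops_aot_lib.* shared library.
matches = glob.glob(f"{package_path}/**/*.py", recursive=True)
assert len(matches) > 0
print(f"searched {package_path}, found {len(matches)} files")
```

The `**` wildcard with `recursive=True` is what lets the library sit anywhere under the package root, rather than requiring it next to the importing module.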
