Commit 3dfaaa6

mikekgfb authored and malfet committed

update meta info (#355)

* update meta info
* meta info updates

1 parent a1e77d2 · commit 3dfaaa6

File tree

3 files changed: +11 / -29 lines


docs/ACKNOWLEDGEMENTS.md

Lines changed: 11 additions & 7 deletions

@@ -10,19 +10,23 @@
 (both ideas and code) from his repo. You can never go wrong by
 following Andrej's work.
 
-* Bert Maher and his [llama2.so](https://github.com/bertmaher/llama2.so),
-  which built on Andrej's llama2.c and closed the loop on Llama models with
-  AOTInductor.
+* Michael Gschwind, Bert Maher, Scott Wolchok, Bin Bao, Chen Yang,
+  Huamin Li and Mu-Chu Li, who built the first version of nanogpt (`DSOGPT`)
+  with AOT Inductor, proving that AOTI can be used to build efficient
+  LLMs, and DSOs are a viable distribution format for models.
+  [nanoGPT](https://github.com/karpathy/nanoGPT).
+
+* Bert Maher and his
+  [llama2.so](https://github.com/bertmaher/llama2.so), which built on
+  Andrej's llama2.c and on DSOGPT to close the loop on Llama models
+  with AOTInductor.
 
 * Christian Puhrsch, Horace He, Joe Isaacson and many more for their
   many contributions in Accelerating GenAI models in the *"Anything,
   Fast!"* pytorch.org blogs, and, in particular, Horace He for [GPT,
   Fast!](https://github.com/pytorch-labs/gpt-fast), which we have
   directly adopted (both ideas and code) from his repo.
 
-* Bert Maher, Scott Wolchok, Bin Bao, Chen Yang, Huamin Li and Mu-Chu
-  Li for great collaborations in building AOTInductor for CPU including
-  for [nanoGPT](https://github.com/karpathy/nanoGPT).
-
 * Mobius Labs as the authors of the HQQ quantization algorithms
   included in this distribution.
+
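The new acknowledgement above credits DSOGPT with demonstrating that AOT Inductor can compile an LLM ahead of time into a DSO that works as a distribution artifact. As a rough illustration of that idea (not code from this commit), here is a minimal sketch assuming the experimental `torch._export.aot_compile` / `torch._export.aot_load` entry points, which are private APIs and have moved between PyTorch releases:

```python
# A minimal sketch, not from this commit: AOT Inductor compiles a model
# ahead of time into a dynamic shared object (DSO) that can be shipped
# and loaded without the original Python model definition.
# NOTE: torch._export.aot_compile / aot_load are experimental, private
# APIs; their location and signatures have changed across releases.
import torch


class TinyModel(torch.nn.Module):
    """Stand-in for a real LLM; the compile-and-ship mechanism is the same."""

    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(16, 16)

    def forward(self, x):
        return torch.nn.functional.relu(self.linear(x))


with torch.no_grad():
    example_inputs = (torch.randn(1, 16),)
    # Ahead-of-time compile to a .so; this file is the artifact you distribute.
    so_path = torch._export.aot_compile(TinyModel(), example_inputs)
    # Later, possibly in a separate process: load the DSO and run inference.
    compiled = torch._export.aot_load(so_path, device="cpu")
    print(compiled(*example_inputs).shape)
```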
runner-et/LICENSE

Lines changed: 0 additions & 22 deletions
This file was deleted.

A third file was renamed without changes.
