
fix a set union bug #2827


Closed

@cccclai wants to merge 1 commit from the export-D55680296 branch

Conversation

@cccclai (Contributor) commented Apr 3, 2024

Summary:
This fixes a bug: the original set needs to be unioned with the new set produced on each iteration of the for loop, but the original code simply overwrote it. Tested with the Core ML + Llama code:

python3 -m examples.models.llama2.export_llama --coreml --use_kv_cache

Differential Revision: D55680296
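For reference, a minimal sketch of the bug pattern described above (names are hypothetical; the actual ExecuTorch code differs). Plain assignment inside the loop discards everything accumulated so far, while an in-place union preserves it:

```python
partitions = [{"a", "b"}, {"b", "c"}, {"d"}]

# Buggy: each iteration overwrites the accumulated set.
tagged_nodes = set()
for partition in partitions:
    tagged_nodes = set(partition)  # previous contents are lost

assert tagged_nodes == {"d"}  # only the last partition survives

# Fixed: union each new set into the accumulator.
tagged_nodes = set()
for partition in partitions:
    tagged_nodes |= set(partition)  # equivalently: tagged_nodes.update(partition)

assert tagged_nodes == {"a", "b", "c", "d"}
```

`|=` and `set.update()` are interchangeable here; both mutate the accumulator in place rather than rebinding the name.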

@pytorch-bot (bot) commented Apr 3, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/2827

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 1f15d1e with merge base be618c2:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label (managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed) on Apr 3, 2024
@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D55680296

cccclai added a commit to cccclai/executorch-1 that referenced this pull request on Apr 3, 2024 (commit message identical to the summary above).
@cccclai force-pushed the export-D55680296 branch from 495cfd0 to d4e6ccd on April 3, 2024 at 05:51
@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D55680296

cccclai added two more commits to cccclai/executorch-1 that referenced this pull request on Apr 3, 2024 (same commit message).
The updated commit message matches the summary above, adding "Reviewed By: angelayi" (Differential Revision: D55680296).
@cccclai force-pushed the export-D55680296 branch from d4e6ccd to 1f15d1e on April 3, 2024 at 18:01
@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D55680296

cccclai added four commits to cccclai/executorch-1 that referenced this pull request on Apr 3, 2024 (each with the same commit message, including "Reviewed By: angelayi").
@facebook-github-bot (Contributor)

This pull request has been merged in a25dea6.

kirklandsign pushed a commit to kirklandsign/executorch that referenced this pull request Apr 4, 2024

Pull Request resolved: pytorch#2827 (commit message identical to the summary above)

Reviewed By: angelayi

Differential Revision: D55680296

fbshipit-source-id: 906d508a276a91bd4b1caa295ce547e9686612ad
Labels: CLA Signed, fb-exported, Merged