Handle multiple memory IDs using pid #2974

Closed
wants to merge 3 commits

Conversation

skrtskrtfb
Contributor

Summary:
Handle multiple memory IDs by assigning each one to its own process (pid) in the trace view. This is the simplest approach, and because the timestamps line up across processes, it stays easy to compare allocations across memory spaces.

An alternate solution I attempted was to lay the memory spaces out side by side, separated by a horizontal gap. That quickly became cluttered: it was hard to tell which allocation belonged to which memory space, so I think the per-process layout above is the easier one to read.
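
Not part of this PR's diff; purely to illustrate the layout described above, here is a minimal sketch of how per-memory-ID process lanes can be expressed in Chrome Trace Event JSON, the format that viewers such as Perfetto and chrome://tracing consume. The allocation records, names, and output file are made up for the example.

```python
import json

# Hypothetical allocation records: (memory_id, name, start_us, duration_us, size_bytes).
allocations = [
    (1, "conv1.weight", 0, 120, 4096),
    (1, "conv1.out", 40, 200, 8192),
    (2, "scratch", 0, 300, 16384),
]

trace_events = []
for mem_id, name, start_us, dur_us, size in allocations:
    trace_events.append({
        "name": name,
        "ph": "X",        # "complete" event: start time plus duration
        "pid": mem_id,    # one process lane per memory ID
        "tid": 0,
        "ts": start_us,   # timestamps share one clock, so lanes line up across processes
        "dur": dur_us,
        "args": {"size_bytes": size, "memory_id": mem_id},
    })

# Metadata events give each process lane a readable name in the viewer.
for mem_id in sorted({a[0] for a in allocations}):
    trace_events.append({
        "name": "process_name",
        "ph": "M",
        "pid": mem_id,
        "args": {"name": f"Memory ID {mem_id}"},
    })

with open("memory_trace.json", "w") as f:
    json.dump({"traceEvents": trace_events}, f, indent=2)
```

Opened in a trace viewer, each memory ID then appears as its own process with aligned timestamps, which is the effect described above.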

Reviewed By: kimishpatel, hsharma35

Differential Revision: D55494986

pytorch-bot bot commented Apr 10, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/2974

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 55ec176 with merge base 0f5794e:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label Apr 10, 2024
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D55494986

skrtskrtfb pushed a commit to skrtskrtfb/executorch that referenced this pull request Apr 16, 2024

skrtskrtfb pushed a commit to skrtskrtfb/executorch that referenced this pull request Jul 23, 2024

skrtskrtfb pushed a commit to skrtskrtfb/executorch that referenced this pull request Jul 25, 2024
skrtskrtfb pushed a commit to skrtskrtfb/executorch that referenced this pull request Jul 25, 2024

skrtskrtfb pushed a commit to skrtskrtfb/executorch that referenced this pull request Jul 30, 2024
@Olivia-liu self-requested a review July 30, 2024 20:45

skrtskrtfb pushed a commit to skrtskrtfb/executorch that referenced this pull request Jul 30, 2024
skrtskrtfb pushed a commit to skrtskrtfb/executorch that referenced this pull request Jul 30, 2024

skrtskrtfb pushed a commit to skrtskrtfb/executorch that referenced this pull request Jul 31, 2024

skrtskrtfb pushed a commit to skrtskrtfb/executorch that referenced this pull request Jul 31, 2024

skrtskrtfb pushed a commit to skrtskrtfb/executorch that referenced this pull request Jul 31, 2024

skrtskrtfb pushed a commit to skrtskrtfb/executorch that referenced this pull request Aug 2, 2024
skrtskrtfb and others added 3 commits August 5, 2024 15:09
Differential Revision: D55455168

@facebook-github-bot
Contributor

This pull request has been merged in f52d8ab.

Labels
CLA Signed, fb-exported, Merged