[SYCL][ROCm] Fix context destruction on AMD #4104

Merged 1 commit into intel:sycl on Jul 15, 2021

Conversation

npmiller
Contributor

This piece of code makes the context being destroyed "current" by pushing it so that it can be synchronized, and then pops it again.

However, on AMD the synchronization is neither supported nor necessary, so these steps can be skipped. In addition, this was causing issues because HIP on AMD uses a single context under the hood, so `hipCtxt == current` is always true, which meant this piece of code was popping the context without pushing it first, and that returned an error code.

Simply skipping this step on AMD should be fine.
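
A minimal sketch of the logic described above, using the HIP context API (`hipCtxGetCurrent`, `hipCtxPushCurrent`, `hipCtxSynchronize`, `hipCtxPopCurrent`, `hipCtxDestroy`). The helper name and the compile-time `__HIP_PLATFORM_AMD__` guard are illustrative assumptions, not the exact plugin code:

```cpp
#include <hip/hip_runtime.h>

// Illustrative helper mirroring the context-release path described above;
// error checking is omitted for brevity.
hipError_t destroy_context(hipCtx_t hipCtxt) {
#if !defined(__HIP_PLATFORM_AMD__)
  // NVIDIA path: make the context being destroyed "current" so it can be
  // synchronized, then restore whatever was current before.
  hipCtx_t current = nullptr;
  hipCtxGetCurrent(&current);
  if (hipCtxt != current)
    hipCtxPushCurrent(hipCtxt);

  hipCtxSynchronize();

  hipCtxGetCurrent(&current);
  if (hipCtxt == current)
    hipCtxPopCurrent(&current);
#else
  // AMD path: synchronization is neither supported nor needed, and because a
  // single context is used under the hood, the pop above would fail without a
  // matching push, so the whole push/synchronize/pop sequence is skipped.
#endif
  return hipCtxDestroy(hipCtxt);
}
```

With such a guard in place, the AMD build falls straight through to destroying the context, which matches the "simply skipping this step on AMD" intent described above.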

@npmiller requested a review from smaslov-intel as a code owner on July 14, 2021
@bader merged commit 6042d3a into intel:sycl on Jul 15, 2021
@bader added the hip label (Issues related to execution on HIP backend.) on Aug 4, 2021