GPU type 3 in Python interface #588

Draft · wants to merge 3 commits into master

Conversation

janden (Collaborator) commented Oct 28, 2024

Adds a type 3 interface in Python for the GPU code.
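For orientation, here is a minimal sketch of what calling the new type-3 GPU transform from Python might look like, assuming the PR mirrors the CPU `finufft` plan interface (the `Plan`/`setpts`/`execute` names, the type-3 convention of passing the dimension in place of mode counts, and the keyword names are assumptions here, not taken from the PR diff):

```python
import cupy as cp
import cufinufft

M, N = 1000, 1200                          # number of source / target points
x = 2 * cp.pi * cp.random.rand(M) - cp.pi  # nonuniform source locations
c = (cp.random.standard_normal(M)
     + 1j * cp.random.standard_normal(M))  # complex source strengths
s = 50 * cp.random.standard_normal(N)      # nonuniform target frequencies

# Type 3 in 1D: f[k] = sum_j c[j] * exp(i * isign * s[k] * x[j]).
# Assumed signature: for type 3 the second argument is the dimension.
plan = cufinufft.Plan(3, 1, dtype="complex128", eps=1e-9)
plan.setpts(x, s=s)
f = plan.execute(c)
```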

ahbarnett (Collaborator) commented:

From Joakim's email:

I've been able to get a type 3 GPU interface working in Python, including tests (docs still have to be updated). This is all in PR #588. However, some of the tests fail in an interesting way: I get very large values (e.g. on the order of 1e35) for the output when running in single precision with more than 4000 points (for input and output) in 2D and 3D. However, this only happens if I've previously run a similar transform in double precision (a few times). I've tried to reproduce this with the C++ interface but have had no luck so far, so it might be something in the Python layer. At any rate, I will continue to investigate and report back.

Best,

Joakim
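A hedged sketch of a reproduction along the lines the email describes: run a 2D type-3 transform in double precision a few times, then the same transform in single precision with more than 4000 points, and check for the huge (~1e35) outputs. The `run_type3_2d` helper is hypothetical, and the `Plan`/`setpts`/`execute` calls follow the same assumed interface as the sketch above:

```python
import cupy as cp
import cufinufft

def run_type3_2d(n_pts, dtype):
    # Hypothetical helper: one 2D type-3 transform at the given precision.
    real = cp.float64 if dtype == "complex128" else cp.float32
    x = (2 * cp.pi * cp.random.rand(n_pts) - cp.pi).astype(real)
    y = (2 * cp.pi * cp.random.rand(n_pts) - cp.pi).astype(real)
    s = (50 * cp.random.standard_normal(n_pts)).astype(real)
    t = (50 * cp.random.standard_normal(n_pts)).astype(real)
    c = (cp.random.standard_normal(n_pts)
         + 1j * cp.random.standard_normal(n_pts)).astype(dtype)
    plan = cufinufft.Plan(3, 2, dtype=dtype)   # assumed type-3 signature
    plan.setpts(x, y, s=s, t=t)
    return plan.execute(c)

for _ in range(3):                     # a few double-precision runs first
    run_type3_2d(5000, "complex128")

f = run_type3_2d(5000, "complex64")    # single precision, > 4000 points
print(cp.abs(f).max())                 # reportedly blows up to ~1e35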
