Hi, very nice work!!
I'm currently building a simple imitation-learning architecture on top of this project. However, I ran into a problem while creating my dataset: every time I call `data = next(self.scenario_generator)`, GPU memory usage grows rapidly (from about 1 GB to 20 GB+), and I'm not sure how to resolve it.
I would appreciate any help. Thanks!
Here is my data-processing code. I already tried approaches like `torch.from_numpy(jax.device_get(log_trajectory.yaw))` to handle the scenario data.