PyTorch has recently released tensordict, which provides a series of utilities for efficiently processing batches of heterogeneous tensor types.
It makes it easy to index multiple tensors at the same time, manipulate their shapes, split them, and so on. It also lets you work efficiently with nested data structures. Have a look at the README, the docs, and the tutorial.
For context, using TensorDict with replay buffers in TorchRL gave us at least an order-of-magnitude speed-up during data collection.
Would it be possible to consider a version of the dataloader that would output a TensorDict, instead of a dictionary as in the tutorial?
vmoens changed the title from "[Feature Request] Make torch dataloader support TensorDict" to "[Feature Request] Make the torch dataloader support TensorDict" on Nov 25, 2022.