Loading and running inference with a model on multiple GPUs

#6
by V1shwanath - opened

When loading the model and running inference on multiple GPUs, I get the following error:

RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cuda:1!
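This error usually means the input tensors live on a different device than the (sharded) model's first layer, so the forward pass mixes `cuda:0` and `cuda:1` tensors. A minimal sketch of the usual fix, assuming a PyTorch model (e.g. one loaded with `device_map="auto"`, where `model-id` stands in for the actual checkpoint): move the inputs to the model's device before calling it.

```python
import torch

# Stand-in for the real model; with transformers you would load it as e.g.
#   model = AutoModelForCausalLM.from_pretrained("model-id", device_map="auto")
model = torch.nn.Linear(4, 2)

# Device of the model's first parameters (with transformers, `model.device`
# gives the device the inputs should be placed on).
device = next(model.parameters()).device

x = torch.randn(1, 4)   # inputs start on CPU by default
x = x.to(device)        # move inputs to the model's device before the forward pass
out = model(x)
```

With a sharded `transformers` model this is typically written as `inputs = tokenizer(text, return_tensors="pt").to(model.device)` before calling `model.generate(**inputs)`.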
