The transition_parser in spaCy is not compatible with the use of CUDA for inference #13462
I am facing an issue when running a spaCy-based pipeline using the `en_core_web_trf:3.7.3` model, whereby the `transition_parser` seems to be placing tensors on the CPU instead of the GPU, as can be seen in the logs below. I tried multiple fixes, such as calling `torch.set_default_device("cuda:0")` and `torch.set_default_dtype`, but these do not seem to work.

How to reproduce the behaviour
This error is encountered using the model in an MLServer deployment. It is a bit difficult to provide reproduction code here.
Your Environment