small fix

svlandeg 2020-08-27 19:56:52 +02:00
parent 8cde6ccb7d
commit aa9e0c9c39


@@ -323,11 +323,11 @@ for details and system requirements.
 Load and wrap a transformer model from the
 [HuggingFace `transformers`](https://huggingface.co/transformers) library. You
-can any transformer that has pretrained weights and a PyTorch implementation.
-The `name` variable is passed through to the underlying library, so it can be
-either a string or a path. If it's a string, the pretrained weights will be
-downloaded via the transformers library if they are not already available
-locally.
+can use any transformer that has pretrained weights and a PyTorch
+implementation. The `name` variable is passed through to the underlying library,
+so it can be either a string or a path. If it's a string, the pretrained weights
+will be downloaded via the transformers library if they are not already
+available locally.
 In order to support longer documents, the
 [TransformerModel](/api/architectures#TransformerModel) layer allows you to pass
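
To make the documented behaviour concrete, here is a minimal sketch of how a `name` value is resolved when passed through to the underlying HuggingFace `transformers` library; the model name `bert-base-uncased` is only an illustrative assumption and is not part of this commit.

```python
from transformers import AutoModel, AutoTokenizer

# `name` may be a model identifier on the HuggingFace Hub or a local path.
# "bert-base-uncased" is only an illustrative choice.
name = "bert-base-uncased"

# If `name` is a string and the weights are not already cached locally,
# the transformers library downloads them; a local path is loaded as-is.
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)
```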