The model registry refactor of the Tok2Vec function broke loading of models
trained with the previous function, because the model tree was slightly
different. Specifically, the new function wrote:
concatenate(norm, prefix, suffix, shape)
to build the embedding layer. In the previous implementation, I had used
the operator-overloading shortcut:
( norm | prefix | suffix | shape )
This actually gets mapped to nested binary concatenations, giving
something like:
concatenate(norm, concatenate(prefix, concatenate(suffix, shape)))
This is a different tree, so iterating over the layers visits them in a
different order, and the weights were loaded incorrectly.
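The mismatch can be illustrated with a minimal sketch. The `Layer` class, `concatenate` helper, and `walk` method below are hypothetical stand-ins, not the actual thinc API; they only demonstrate how an n-ary `concatenate(...)` call and the chained `|` operator build differently shaped trees, so a depth-first walk over the layers yields different sequences:

```python
class Layer:
    """Hypothetical stand-in for a model layer (not the real thinc API)."""

    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

    def __or__(self, other):
        # ( a | b ) maps to a binary concatenate node, so chaining the
        # operator produces a nested tree of two-input nodes.
        return Layer("concatenate", [self, other])

    def walk(self):
        # Depth-first iteration: the order in which weights would be
        # matched up against a saved model.
        yield self
        for child in self.children:
            yield from child.walk()


def concatenate(*layers):
    # Explicit n-ary concatenate: a single node with all inputs as children.
    return Layer("concatenate", list(layers))


norm, prefix, suffix, shape = (
    Layer(n) for n in ("norm", "prefix", "suffix", "shape")
)

flat = concatenate(norm, prefix, suffix, shape)
nested = norm | prefix | suffix | shape  # chained binary concatenates

print([layer.name for layer in flat.walk()])
# one concatenate node, then the four inputs
print([layer.name for layer in nested.walk()])
# three concatenate nodes interleaved with the inputs
```

The flat tree has a single `concatenate` node with four children, while the chained-operator tree has three nested two-input nodes. Walking the two trees therefore visits a different number of layers in a different order, which is why weights saved under one shape cannot be loaded into the other.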
Files changed:

- __init__.py
- _wire.py
- common.py
- tok2vec.py