spaCy/spacy/ml
Matthew Honnibal 9e210fa7fd
Fix tok2vec structure after model registry refactor (#4549)
The model registry refactor of the Tok2Vec function broke loading of models
trained with the previous function, because the model tree was slightly
different. Specifically, the new function wrote:

    concatenate(norm, prefix, suffix, shape)

to build the embedding layer. In the previous implementation, I had used
the operator-overloading shortcut:

    ( norm | prefix | suffix | shape )

Because Python evaluates `|` left to right, each `|` is a binary
operation, so this maps to nested pairwise calls, giving something like:

    concatenate(concatenate(concatenate(norm, prefix), suffix), shape)

This is a different tree, so the layers are iterated in a different
order and the weights were loaded into the wrong layers.
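The difference can be demonstrated with a minimal sketch. This is not Thinc's actual implementation; the `Layer` class, its `walk` method, and the toy `concatenate` helper here are all hypothetical, invented only to show how a flat n-ary call and Python's left-to-right `|` chaining yield trees that iterate differently.

```python
class Layer:
    """Toy stand-in for a model node (hypothetical, not Thinc's API)."""

    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

    def __or__(self, other):
        # The operator shortcut: each `|` wraps exactly two operands,
        # so a chain builds a nested binary tree.
        return Layer("concatenate", [self, other])

    def walk(self):
        # Depth-first traversal: the order in which weights would be
        # matched up when loading a saved model.
        yield self
        for child in self.children:
            yield from child.walk()


def concatenate(*layers):
    # The explicit function: one node with all operands as direct
    # children, i.e. a flat n-ary tree.
    return Layer("concatenate", layers)


norm, prefix, suffix, shape = (
    Layer(n) for n in ("norm", "prefix", "suffix", "shape")
)

flat = concatenate(norm, prefix, suffix, shape)
nested = norm | prefix | suffix | shape  # ((norm | prefix) | suffix) | shape

print([layer.name for layer in flat.walk()])
# ['concatenate', 'norm', 'prefix', 'suffix', 'shape']
print([layer.name for layer in nested.walk()])
# ['concatenate', 'concatenate', 'concatenate', 'norm', 'prefix', 'suffix', 'shape']
```

The two traversals visit a different number of nodes in a different order, so weights saved against one tree land on the wrong layers when loaded into the other, which is exactly the mismatch this commit fixes by restoring the original tree shape.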
2019-10-28 23:59:03 +01:00
__init__.py Tidy up and auto-format 2019-10-28 12:43:55 +01:00
_wire.py Refactor Tok2Vec to use architecture registry (#4518) 2019-10-25 22:28:20 +02:00
common.py Tidy up and auto-format 2019-10-28 12:43:55 +01:00
tok2vec.py Fix tok2vec structure after model registry refactor (#4549) 2019-10-28 23:59:03 +01:00