While developing v2.1, I ran a bunch of hyper-parameter search experiments to find settings that performed well for spaCy's NER and parser. I ended up changing the default Adam settings from beta1=0.9, beta2=0.999, eps=1e-8 to beta1=0.8, beta2=0.8, eps=1e-5. This gave a small improvement in accuracy (about 0.4%).

Months later, I ran the models with Prodigy, which uses beam-search decoding even when the model has been trained with a greedy objective. The new models performed terribly... So, wtf? After a couple of days of debugging, I figured out that the new optimizer settings were causing the model to converge to solutions where the top-scoring class often had a score of around -80. The variance of the weights had gone up enormously. I guess I needed to update the L2 regularisation as well?

Anyway, let's just revert the change: if the optimizer is finding such extreme solutions, that seems bad, and not nearly worth the small improvement in accuracy. I'm currently training a slate of models to verify the accuracy change is minimal. Once the training is complete, we can merge this. (A minimal sketch of the Adam update with both settings is included after the checklist below.)

<!--- Provide a general summary of your changes in the title. -->

## Description

<!--- Use this section to describe your changes. If your changes required testing, include information about the testing environment and the tests you ran. If your test fixes a bug reported in an issue, don't forget to include the issue number. If your PR is still a work in progress, that's totally fine – just include a note to let us know. -->

### Types of change

<!-- What type of change does your PR cover? Is it a bug fix, an enhancement or new feature, or a change to the documentation? -->

## Checklist

<!--- Before you submit the PR, go over this checklist and make sure you can tick off all the boxes. [] -> [x] -->

- [x] I have submitted the spaCy Contributor Agreement.
- [x] I ran the tests, and all new and existing tests passed.
- [x] My changes don't require a change to the documentation, or if they do, I've added all required information.
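For reference, here is a minimal NumPy sketch of the standard Adam update rule, just to make the two settings above concrete. This is not spaCy's actual optimizer (the real one lives in thinc); the function, the toy gradient and all names are illustrative only.

```python
# Generic Adam update step, for illustration only (NOT spaCy's thinc optimizer).
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; returns new weights and updated moment estimates."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (running mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (running mean of squared gradients)
    m_hat = m / (1 - beta1 ** t)              # bias correction for step t
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# The standard defaults this PR reverts to, vs. the v2.1 experimental settings:
standard = dict(beta1=0.9, beta2=0.999, eps=1e-8)
experimental = dict(beta1=0.8, beta2=0.8, eps=1e-5)

# Toy example: same gradient, two different settings.
grad = np.array([0.1, -0.2, 0.05])
w = np.zeros(3)
m = np.zeros_like(w)
v = np.zeros_like(w)
w_std, _, _ = adam_step(w, grad, m, v, t=1, **standard)
w_exp, _, _ = adam_step(w, grad, m, v, t=1, **experimental)
```

The only point of the sketch is that beta2 controls how long a memory the second-moment estimate has (roughly 1/(1-beta2) updates, so ~5 with 0.8 vs ~1000 with 0.999), and eps floors the denominator of the per-parameter step; this PR simply restores the standard values.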