diff --git a/.github/contributors/bintay.md b/.github/contributors/bintay.md
new file mode 100644
index 000000000..c64f2dba5
--- /dev/null
+++ b/.github/contributors/bintay.md
@@ -0,0 +1,106 @@
+# spaCy contributor agreement
+
+This spaCy Contributor Agreement (**"SCA"**) is based on the
+[Oracle Contributor Agreement](http://www.oracle.com/technetwork/oca-405177.pdf).
+The SCA applies to any contribution that you make to any product or project
+managed by us (the **"project"**), and sets out the intellectual property rights
+you grant to us in the contributed materials. The term **"us"** shall mean
+[ExplosionAI GmbH](https://explosion.ai/legal). The term
+**"you"** shall mean the person or entity identified below.
+
+If you agree to be bound by these terms, fill in the information requested
+below and include the filled-in version with your first pull request, under the
+folder [`.github/contributors/`](/.github/contributors/). The name of the file
+should be your GitHub username, with the extension `.md`. For example, the user
+example_user would create the file `.github/contributors/example_user.md`.
+
+Read this agreement carefully before signing. These terms and conditions
+constitute a binding legal agreement.
+
+## Contributor Agreement
+
+1. The term "contribution" or "contributed materials" means any source code,
+object code, patch, tool, sample, graphic, specification, manual,
+documentation, or any other material posted or submitted by you to the project.
+
+2. With respect to any worldwide copyrights, or copyright applications and
+registrations, in your contribution:
+
+    * you hereby assign to us joint ownership, and to the extent that such
+    assignment is or becomes invalid, ineffective or unenforceable, you hereby
+    grant to us a perpetual, irrevocable, non-exclusive, worldwide, no-charge,
+    royalty-free, unrestricted license to exercise all rights under those
+    copyrights. This includes, at our option, the right to sublicense these same
+    rights to third parties through multiple levels of sublicensees or other
+    licensing arrangements;
+
+    * you agree that each of us can do all things in relation to your
+    contribution as if each of us were the sole owners, and if one of us makes
+    a derivative work of your contribution, the one who makes the derivative
+    work (or has it made) will be the sole owner of that derivative work;
+
+    * you agree that you will not assert any moral rights in your contribution
+    against us, our licensees or transferees;
+
+    * you agree that we may register a copyright in your contribution and
+    exercise all ownership rights associated with it; and
+
+    * you agree that neither of us has any duty to consult with, obtain the
+    consent of, pay or render an accounting to the other for any use or
+    distribution of your contribution.
+
+3. With respect to any patents you own, or that you can license without payment
+to any third party, you hereby grant to us a perpetual, irrevocable,
+non-exclusive, worldwide, no-charge, royalty-free license to:
+
+    * make, have made, use, sell, offer to sell, import, and otherwise transfer
+    your contribution in whole or in part, alone or in combination with or
+    included in any product, work or materials arising out of the project to
+    which your contribution was submitted, and
+
+    * at our option, to sublicense these same rights to third parties through
+    multiple levels of sublicensees or other licensing arrangements.
+
+4. Except as set out above, you keep all right, title, and interest in your
+contribution. The rights that you grant to us under these terms are effective
+on the date you first submitted a contribution to us, even if your submission
+took place before the date you sign these terms.
+
+5. You covenant, represent, warrant and agree that:
+
+    * Each contribution that you submit is and shall be an original work of
+    authorship and you can legally grant the rights set out in this SCA;
+
+    * to the best of your knowledge, each contribution will not violate any
+    third party's copyrights, trademarks, patents, or other intellectual
+    property rights; and
+
+    * each contribution shall be in compliance with U.S. export control laws and
+    other applicable export and import laws. You agree to notify us if you
+    become aware of any circumstance which would make any of the foregoing
+    representations inaccurate in any respect. We may publicly disclose your
+    participation in the project, including the fact that you have signed the SCA.
+
+6. This SCA is governed by the laws of the State of California and applicable
+U.S. Federal law. Any choice of law rules will not apply.
+
+7. Please place an “x” on one of the applicable statements below. Please do NOT
+mark both statements:
+
+    * [x] I am signing on behalf of myself as an individual and no other person
+    or entity, including my employer, has or will have rights with respect to my
+    contributions.
+
+    * [ ] I am signing on behalf of my employer or a legal entity and I have the
+    actual authority to contractually bind that entity.
+
+## Contributor Details
+
+| Field                          | Entry                |
+|------------------------------- | -------------------- |
+| Name                           | Ben Taylor           |
+| Company name (if applicable)   |                      |
+| Title or role (if applicable)  |                      |
+| Date                           | October 2, 2019      |
+| GitHub username                | bintay               |
+| Website (optional)             | bentaylor.xyz        |
diff --git a/README.md b/README.md
index a51f6ca86..18ec75b62 100644
--- a/README.md
+++ b/README.md
@@ -14,7 +14,7 @@ It's commercial open-source software, released under the MIT license.
 💫 **Version 2.2 out now!**
 [Check out the release notes here.](https://github.com/explosion/spaCy/releases)

-[![Azure Pipelines]()](https://dev.azure.com/explosion-ai/public/_build?definitionId=8)
+[![Azure Pipelines]()](https://dev.azure.com/explosion-ai/public/_build?definitionId=8)
 [![Travis Build Status]()](https://travis-ci.org/explosion/spaCy)
 [![Current Release Version](https://img.shields.io/github/release/explosion/spacy.svg?style=flat-square&logo=github)](https://github.com/explosion/spaCy/releases)
 [![pypi Version](https://img.shields.io/pypi/v/spacy.svg?style=flat-square&logo=pypi&logoColor=white)](https://pypi.org/project/spacy/)
@@ -22,7 +22,7 @@ It's commercial open-source software, released under the MIT license.
 [![Python wheels](https://img.shields.io/badge/wheels-%E2%9C%93-4c1.svg?longCache=true&style=flat-square&logo=python&logoColor=white)](https://github.com/explosion/wheelwright/releases)
 [![PyPi downloads](https://img.shields.io/pypi/dm/spacy?style=flat-square&logo=pypi&logoColor=white)](https://pypi.org/project/spacy/)
 [![Conda downloads](https://img.shields.io/conda/dn/conda-forge/spacy?style=flat-square&logo=conda-forge&logoColor=white)](https://anaconda.org/conda-forge/spacy)
-[![Model downloads](https://img.shields.io/github/downloads/explosion/spacy-models/total?style=flat-square&label=model+downloads)](https://github.com/explosion/spacy-models)
+[![Model downloads](https://img.shields.io/github/downloads/explosion/spacy-models/total?style=flat-square&label=model+downloads)](https://github.com/explosion/spacy-models/releases)
 [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg?style=flat-square)](https://github.com/ambv/black)
 [![spaCy on Twitter](https://img.shields.io/twitter/follow/spacy_io.svg?style=social&label=Follow)](https://twitter.com/spacy_io)

diff --git a/spacy/tests/vocab_vectors/test_vectors.py b/spacy/tests/vocab_vectors/test_vectors.py
index 4226bca3b..4b2e171a6 100644
--- a/spacy/tests/vocab_vectors/test_vectors.py
+++ b/spacy/tests/vocab_vectors/test_vectors.py
@@ -277,9 +277,9 @@ def test_vocab_prune_vectors():
     _ = vocab["dog"]  # noqa: F841
     _ = vocab["kitten"]  # noqa: F841
     data = numpy.ndarray((5, 3), dtype="f")
-    data[0] = 1.0
-    data[1] = 2.0
-    data[2] = 1.1
+    data[0] = [1.0, 1.2, 1.1]
+    data[1] = [0.3, 1.3, 1.0]
+    data[2] = [0.9, 1.22, 1.05]
     vocab.set_vector("cat", data[0])
     vocab.set_vector("dog", data[1])
     vocab.set_vector("kitten", data[2])
diff --git a/spacy/vectors.pyx b/spacy/vectors.pyx
index 3c238fe2d..75716617c 100644
--- a/spacy/vectors.pyx
+++ b/spacy/vectors.pyx
@@ -303,8 +303,8 @@ cdef class Vectors:
             self._unset.erase(self._unset.find(row))
         return row

-    def most_similar(self, queries, *, batch_size=1024):
-        """For each of the given vectors, find the single entry most similar
+    def most_similar(self, queries, *, batch_size=1024, n=1, sort=True):
+        """For each of the given vectors, find the n most similar entries
         to it, by cosine.

         Queries are by vector. Results are returned as a `(keys, best_rows,
@@ -314,15 +314,17 @@ cdef class Vectors:

         queries (ndarray): An array with one or more vectors.
         batch_size (int): The batch size to use.
-        RETURNS (tuple): The most similar entry as a `(keys, best_rows, scores)`
+        n (int): The number of entries to return for each query.
+        sort (bool): Whether to sort the n entries returned by score.
+        RETURNS (tuple): The most similar entries as a `(keys, best_rows, scores)`
             tuple.
         """
         xp = get_array_module(self.data)

         vectors = self.data / xp.linalg.norm(self.data, axis=1, keepdims=True)

-        best_rows = xp.zeros((queries.shape[0],), dtype='i')
-        scores = xp.zeros((queries.shape[0],), dtype='f')
+        best_rows = xp.zeros((queries.shape[0], n), dtype='i')
+        scores = xp.zeros((queries.shape[0], n), dtype='f')
         # Work in batches, to avoid memory problems.
         for i in range(0, queries.shape[0], batch_size):
             batch = queries[i : i+batch_size]
@@ -331,12 +333,19 @@ cdef class Vectors:
             # vectors e.g. (10000, 300)
             # sims    e.g. (1024, 10000)
             sims = xp.dot(batch, vectors.T)
-            best_rows[i:i+batch_size] = sims.argmax(axis=1)
-            scores[i:i+batch_size] = sims.max(axis=1)
+            best_rows[i:i+batch_size] = xp.argpartition(sims, -n, axis=1)[:,-n:]
+            scores[i:i+batch_size] = xp.partition(sims, -n, axis=1)[:,-n:]
+
+            if sort:
+                sorted_index = xp.arange(scores.shape[0])[:,None],xp.argsort(scores, axis=1)[:,::-1]
+                scores[i:i+batch_size] = scores[sorted_index]
+                best_rows[i:i+batch_size] = best_rows[sorted_index]
+
         xp = get_array_module(self.data)
         row2key = {row: key for key, row in self.key2row.items()}
         keys = xp.asarray(
-            [row2key[row] for row in best_rows if row in row2key], dtype="uint64")
+            [[row2key[row] for row in best_rows[i] if row in row2key]
+                    for i in range(len(queries)) ], dtype="uint64")
         return (keys, best_rows, scores)

     def from_glove(self, path):
diff --git a/spacy/vocab.pyx b/spacy/vocab.pyx
index c0d835553..67317a9ac 100644
--- a/spacy/vocab.pyx
+++ b/spacy/vocab.pyx
@@ -324,10 +324,10 @@ cdef class Vocab:
         syn_keys, syn_rows, scores = self.vectors.most_similar(toss, batch_size=batch_size)
         remap = {}
         for i, key in enumerate(keys[nr_row:]):
-            self.vectors.add(key, row=syn_rows[i])
+            self.vectors.add(key, row=syn_rows[i][0])
             word = self.strings[key]
-            synonym = self.strings[syn_keys[i]]
-            score = scores[i]
+            synonym = self.strings[syn_keys[i][0]]
+            score = scores[i][0]
             remap[word] = (synonym, score)
         link_vectors_to_models(self)
         return remap
diff --git a/website/docs/api/vectors.md b/website/docs/api/vectors.md
index ae62d8cfc..3588672db 100644
--- a/website/docs/api/vectors.md
+++ b/website/docs/api/vectors.md
@@ -303,6 +303,29 @@ vectors, they will be counted individually.
 | ----------- | ---- | ------------------------------------ |
 | **RETURNS** | int  | The number of all keys in the table. |

+## Vectors.most_similar {#most_similar tag="method"}
+
+For each of the given vectors, find the `n` most similar entries to it, by
+cosine. Queries are by vector. Results are returned as a
+`(keys, best_rows, scores)` tuple. If `queries` is large, the calculations are
+performed in chunks, to avoid consuming too much memory. You can set the
+`batch_size` to control the size/space trade-off during the calculations.
+
+> #### Example
+>
+> ```python
+> queries = numpy.asarray([numpy.random.uniform(-1, 1, (300,))])
+> most_similar = nlp.vocab.vectors.most_similar(queries, n=10)
+> ```
+
+| Name         | Type      | Description                                                         |
+| ------------ | --------- | ------------------------------------------------------------------- |
+| `queries`    | `ndarray` | An array with one or more vectors.                                   |
+| `batch_size` | int       | The batch size to use. Defaults to `1024`.                           |
+| `n`          | int       | The number of entries to return for each query. Defaults to `1`.     |
+| `sort`       | bool      | Whether to sort the entries returned by score. Defaults to `True`.   |
+| **RETURNS**  | tuple     | The most similar entries as a `(keys, best_rows, scores)` tuple.     |
+
 ## Vectors.from_glove {#from_glove tag="method"}

 Load [GloVe](https://nlp.stanford.edu/projects/glove/) vectors from a directory.
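Below is a minimal usage sketch of the extended `Vectors.most_similar` API. It is not part of the patch itself; it assumes the patch above has been applied to spaCy v2.2, and the table contents, key names, and query values are invented purely for illustration.

```python
import numpy
from spacy.vectors import Vectors

# Build a tiny 3-entry vector table. The keys and values here are toy data,
# chosen only to make the nearest-neighbour ordering easy to eyeball.
data = numpy.asarray(
    [[1.0, 1.2, 1.1], [0.3, 1.3, 1.0], [0.9, 1.22, 1.05]], dtype="f"
)
vectors = Vectors(data=data, keys=["cat", "dog", "kitten"])

# One query vector; ask for the two most similar entries, best first.
queries = numpy.asarray([[0.95, 1.21, 1.08]], dtype="f")
keys, best_rows, scores = vectors.most_similar(queries, n=2, sort=True)

# With n > 1 every result is 2D: one row per query, one column per neighbour.
# `keys` holds uint64 hash values, `best_rows` holds row indices into the
# table, and `scores` holds the corresponding cosine similarities.
print(keys.shape, best_rows.shape, scores.shape)  # (1, 2) (1, 2) (1, 2)
```

With the default `n=1` the same `(n_queries, n)` layout is kept, which is why `Vocab.prune_vectors` in the diff above now reads `syn_keys[i][0]`, `syn_rows[i][0]`, and `scores[i][0]` instead of the previous scalar lookups.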