When you generate new lemmatisations of text, do you use the uniqueTokens?
The corrections I'm compiling are not context- or text-specific; they are corrections to the unique tokens / Morpheus output itself. For example: verbs lemmatised with the wrong preverb.
If I make pull requests against the uniqueTokens, will those improvements be picked up the next time you run lemmatisation on actual texts? Or will they be overwritten unless I also make a pull request against the Morpheus data?
I am compiling a list of lemmatisation errors, mostly through applying my own rule-based morphology parsing.
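To illustrate the kind of check I'm running, here is a rough sketch of how proposed corrections could be diffed against a uniqueTokens-style token-to-lemma mapping before submission. The data format and field layout shown are assumptions for illustration only, not the repository's actual schema:

```python
# Hypothetical sketch: compare proposed corrections against a
# uniqueTokens-style mapping of surface token -> lemma.
# The dict-based format here is an assumption, not the real schema.

def find_conflicts(unique_tokens, corrections):
    """Return tokens whose current lemma differs from the proposed lemma."""
    conflicts = {}
    for token, proposed_lemma in corrections.items():
        current = unique_tokens.get(token)
        if current is not None and current != proposed_lemma:
            conflicts[token] = (current, proposed_lemma)
    return conflicts

# Example entries (invented): a form lemmatised under the simplex verb
# when it should fall under the compound with its preverb.
unique_tokens = {"ἀπῆλθον": "ἔρχομαι"}
corrections = {"ἀπῆλθον": "ἀπέρχομαι"}
print(find_conflicts(unique_tokens, corrections))
```

The output of the diff would then be the list of entries worth turning into a pull request.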
What should the process be for submitting corrections?