Assuming you have a working Python environment, you can simply install it using:

```shell
pip install spacy_fastlang
```
The library exports a pipeline component called `language_detector` that will set two spaCy extensions:

- `doc._.language` = ISO code of the detected language, or `xx` as a fallback
- `doc._.language_score` = confidence of the detection
```python
import spacy
import spacy_fastlang  # noqa: F401  # pylint: disable=unused-import  # registers the component

nlp = spacy.load("...")
nlp.add_pipe("language_detector")
doc = nlp(en_text)

doc._.language == "..."
doc._.language_score >= ...
```
Check the tests to see more examples and the available options.
Everything is under the `MIT` license, except the default model, which is distributed by Facebook under the Creative Commons Attribution-ShareAlike License 3.0.