# multiclass-classification

Here are 566 public repositories matching this topic...

In this project, we use a pre-trained model called BERT to classify our text. BERT (Bidirectional Encoder Representations from Transformers) is an open-source ML framework for Natural Language Processing. Developed by Google, this pre-trained model is known for producing state-of-the-art results across a wide variety of NLP tasks.

  • Updated Dec 4, 2023
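For multiclass classification, a BERT-style model typically feeds its pooled output through a linear head that produces one raw score (logit) per class; a softmax then turns those logits into probabilities and the highest-probability class is the prediction. Below is a minimal, library-free sketch of that final step only; the label names and logit values are hypothetical, not taken from the project above.

```python
import math

def softmax(logits):
    """Convert raw class scores (logits) into probabilities that sum to 1."""
    m = max(logits)  # subtract the max logit for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict(logits, labels):
    """Return the label with the highest softmax probability."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return labels[best], probs

# Hypothetical classes and per-class logits from a classification head
labels = ["sports", "politics", "tech"]
logits = [0.3, 2.1, -1.0]

label, probs = predict(logits, labels)
print(label)  # politics
```

In practice the logits would come from a fine-tuned model (e.g. a sequence-classification head configured with `num_labels` equal to the number of classes), and training would use a cross-entropy loss over these softmax probabilities.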
