# Freakdivi – BERT Query Router

## Model Description
A BERT-based sequence classification model that routes natural-language queries into predefined categories.
The model encodes each query with `bert-base-uncased` and feeds the `[CLS]` embedding to a scikit-learn MLP classifier.
This repository contains:

- `mlp_query_classifier.joblib` – trained MLP classifier
- `scaler_query_classifier.joblib` – feature scaler used on BERT embeddings
- `label_encoder_query_classifier.joblib` – maps class indices → string labels
- `inference.py` – handler used by Hugging Face Inference Endpoints
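
The snippet below is a minimal sketch of how these artifacts could fit together at inference time. It assumes the scaler is applied to the raw `[CLS]` embedding before the MLP (the authoritative preprocessing lives in `inference.py`), that the `.joblib` files have been downloaded locally, and that `route_query` and the example query are illustrative names only.

```python
import joblib
import torch
from transformers import AutoTokenizer, AutoModel

# BERT encoder described above
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")
bert.eval()

# Serialized scikit-learn artifacts shipped in this repository
mlp = joblib.load("mlp_query_classifier.joblib")
scaler = joblib.load("scaler_query_classifier.joblib")
label_encoder = joblib.load("label_encoder_query_classifier.joblib")

def route_query(query: str) -> str:
    """Encode a query with BERT and classify its [CLS] embedding."""
    inputs = tokenizer(query, return_tensors="pt", truncation=True, padding=True)
    with torch.no_grad():
        outputs = bert(**inputs)
    # [CLS] embedding: first token of the last hidden state, shape (1, 768)
    cls_embedding = outputs.last_hidden_state[:, 0, :].numpy()
    # Scale features as done at training time (assumption), then classify
    features = scaler.transform(cls_embedding)
    class_idx = mlp.predict(features)[0]
    return label_encoder.inverse_transform([class_idx])[0]

print(route_query("How do I reset my password?"))  # hypothetical example query
```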
⚠️ TODO: Replace the task + label descriptions below with your actual ones.

## Task
Multi-class text classification / query routing
Given an input query, the model predicts one of N categories, such as:
| ID | Label | Description |
|---|---|---|
| 0 | LABEL_0 | TODO: short description of label 0 |
| 1 | LABEL_1 | TODO: short description of label 1 |
| 2 | LABEL_2 | TODO: short description of label 2 |
| 3 | LABEL_3 | TODO: add/remove rows as needed |
You can get the exact list of labels by loading `label_encoder_query_classifier.joblib` in code.
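
A minimal sketch, assuming the file is a standard scikit-learn `LabelEncoder` serialized with `joblib` and available locally:

```python
import joblib

# Load the serialized label encoder shipped with this repository
label_encoder = joblib.load("label_encoder_query_classifier.joblib")

# classes_ holds the string labels in index order (class i -> classes_[i])
for idx, label in enumerate(label_encoder.classes_):
    print(idx, label)
```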