# Alirector-bart

## Introduction
This is the bart-large model checkpoint released with the paper "Alirector: Alignment-Enhanced Chinese Grammatical Error Corrector".

GitHub: https://github.com/yanghh2000/Alirector
## Usage
Please follow the instructions in our GitHub repository to install the required Python packages first, then use the model as follows:
```python
from transformers import BartForConditionalGeneration, BertTokenizer
import torch

model_path = "yanghh7/Alirector-bart"
tokenizer = BertTokenizer.from_pretrained(model_path)
model = BartForConditionalGeneration.from_pretrained(model_path).cuda()

while True:
    source = input("输入句子:")  # prompt: "Input a sentence:"
    model_inputs = tokenizer(
        source,
        return_tensors='pt',
        return_token_type_ids=False,
    ).to(model.device)
    with torch.no_grad():
        output = model.generate(**model_inputs)
    response = tokenizer.batch_decode(output.detach().cpu(), skip_special_tokens=True)[0]
    # BertTokenizer inserts spaces between decoded Chinese tokens; strip them.
    response = response.replace(' ', '')
    print(response)
```
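As an illustration (not part of the official repository), the corrected sentence can be diffed against the source to surface exactly which edits the corrector made. The `show_edits` helper below is a hypothetical sketch built on Python's standard-library `difflib`:

```python
import difflib

def show_edits(source: str, corrected: str) -> list[tuple[str, str, str]]:
    """Return (op, source_span, corrected_span) triples describing the
    character-level edits between the source and corrected sentences."""
    matcher = difflib.SequenceMatcher(a=source, b=corrected)
    edits = []
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        if op != 'equal':  # keep only the spans that actually changed
            edits.append((op, source[i1:i2], corrected[j1:j2]))
    return edits

# Example (English for illustration; the model itself corrects Chinese):
print(show_edits("He go to school", "He goes to school"))
```

This makes it easy to inspect model behavior sentence by sentence instead of eyeballing two strings.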
## Citation

```bibtex
@inproceedings{yang-quan-2024-alirector,
    title = "Alirector: Alignment-Enhanced {C}hinese Grammatical Error Corrector",
    author = "Yang, Haihui and Quan, Xiaojun",
    booktitle = "Findings of the Association for Computational Linguistics: ACL 2024",
    year = "2024",
}
```
Base model: OpenMOSS-Team/bart-large-chinese