Add ONNX export w/ `optimum-onnx`
#3
by alvarobartt (HF Staff)
Description
This PR adds the ONNX weights, exported with `optimum-onnx` as follows:
from optimum.onnxruntime import ORTModelForSequenceClassification
model = ORTModelForSequenceClassification.from_pretrained("Roblox/roblox-pii-classifier", export=True)
model.save_pretrained("onnx/")
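Equivalently, the export can be run from the command line with Optimum's CLI (a sketch; assumes `optimum` is installed with its ONNX exporter extras):

```shell
# Export the model to ONNX into the onnx/ directory
optimum-cli export onnx --model Roblox/roblox-pii-classifier onnx/
```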
The exported model can then be used from Python as follows (among many other options that leverage ONNX):
from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForSequenceClassification
model = ORTModelForSequenceClassification.from_pretrained("Roblox/roblox-pii-classifier", revision="refs/pr/3", subfolder="onnx/")
tokenizer = AutoTokenizer.from_pretrained("Roblox/roblox-pii-classifier", revision="refs/pr/3")
pipe = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(pipe("He never went out without a book under his arm"))
Thanks @jasonxie-rblx 🤗
xbian-rblx changed pull request status to merged