An Empirical Comparison of Pre-Trained Models of Source Code
Paper: arXiv:2302.04026
This is an unofficial Hugging Face port of C-BERT, including only the masked language modeling head used for pretraining. The weights come from "An Empirical Comparison of Pre-Trained Models of Source Code" (arXiv:2302.04026). Please cite the original authors if you use this model in academic work.
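As a minimal sketch of how the checkpoint could be used, assuming it is compatible with the standard `transformers` masked-LM classes and hosted under a hypothetical repo ID (replace `"your-username/cbert"` with the actual Hub path):

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Hypothetical repo ID -- substitute the actual Hub path for this checkpoint.
model_name = "your-username/cbert"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Fill in a masked token in a snippet of C source code.
code = f"int main() {{ {tokenizer.mask_token} 0; }}"
inputs = tokenizer(code, return_tensors="pt")
outputs = model(**inputs)

# Report the top prediction for the masked position (e.g. "return").
mask_index = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = outputs.logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```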