BERT small (124M parameters) foundation model pre-trained with a masked language modeling objective on protein amino acid sequences from the MolCrawl dataset.
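The snippet below loads the tokenizer and model and fills in a single masked residue in a short example sequence; the sequence itself is only illustrative.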
from transformers import AutoModelForMaskedLM, AutoTokenizer
import torch

model = AutoModelForMaskedLM.from_pretrained("kojima-lab/molcrawl-protein-sequence-bert-small")
tokenizer = AutoTokenizer.from_pretrained("kojima-lab/molcrawl-protein-sequence-bert-small")

# Predict a masked amino acid.
# Use tokenizer.mask_token instead of a hardcoded "[MASK]":
# BERT-style tokenizers vary ("[MASK]", "<mask>", etc.).
if tokenizer.mask_token is None:
    raise ValueError("This tokenizer has no mask_token; masked LM inference is not supported.")
prompt = "MKTAYIAK{MASK}RQISFVKSHFSRQ".replace("{MASK}", tokenizer.mask_token)

inputs = tokenizer(prompt, return_tensors="pt")
# Locate the position of the mask token in the encoded input.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]

with torch.no_grad():
    outputs = model(**inputs)
logits = outputs.logits

# Take the highest-scoring token at the masked position and splice it back in.
predicted_token_id = logits[0, mask_index].argmax(dim=-1)
predicted_token = tokenizer.decode(predicted_token_id)
result = prompt.replace(tokenizer.mask_token, predicted_token)
print(f"Predicted: {result}")
Training pipeline, configuration files, and data preparation scripts are available in the MolCrawl GitHub repository: https://github.com/mmai-framework-lab/MolCrawl
This model is released under the Apache-2.0 license.
If you use this model, please cite:
@misc{molcrawl_protein_sequence_bert_small,
  title={molcrawl-protein-sequence-bert-small},
  author={{RIKEN}},
  year={2026},
  publisher={Hugging Face},
  url={https://huggingface.co/kojima-lab/molcrawl-protein-sequence-bert-small}
}