molcrawl-protein-sequence-bert-small

Model Description

BERT small (86.5M parameters) foundation model pre-trained on protein amino acid sequences from the MolCrawl dataset.

  • Model Type: bert
  • Data Type: Protein
  • Training Date: 2026-04-24

Usage

from transformers import AutoModelForMaskedLM, AutoTokenizer
import torch

model = AutoModelForMaskedLM.from_pretrained("kojima-lab/molcrawl-protein-sequence-bert-small")
tokenizer = AutoTokenizer.from_pretrained("kojima-lab/molcrawl-protein-sequence-bert-small")

# Predict masked amino acid
# Use tokenizer.mask_token instead of hardcoded "[MASK]":
# BERT-style tokenizers vary ("[MASK]", "<mask>", etc.)
if tokenizer.mask_token is None:
    raise ValueError("This tokenizer has no mask_token; masked LM inference is not supported.")
prompt = "MKTAYIAK{MASK}RQISFVKSHFSRQ".replace("{MASK}", tokenizer.mask_token)
inputs = tokenizer(prompt, return_tensors="pt")
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]

with torch.no_grad():
    outputs = model(**inputs)
logits = outputs.logits

predicted_token_id = logits[0, mask_index].argmax(dim=-1)
# decode() may pad the token with spaces, so strip before substituting
predicted_token = tokenizer.decode(predicted_token_id).strip()
result = prompt.replace(tokenizer.mask_token, predicted_token)
print(f"Predicted: {result}")
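Rather than taking only the argmax, it is often useful to inspect the k most likely residues at the masked position with torch.topk. The sketch below shows the mechanics on a synthetic logits tensor and a toy five-residue vocabulary (both illustrative assumptions); in real use, mask_logits would be logits[0, mask_index] from the model call above, and the vocabulary comes from the tokenizer.

```python
import torch

# Toy stand-ins: in real use, take the vocabulary from the tokenizer and
# the mask-position logits from model(**inputs).logits.
vocab = ["A", "C", "D", "E", "G"]  # hypothetical 5-token vocabulary
mask_logits = torch.tensor([0.1, 2.3, -0.5, 1.7, 0.4])  # logits at the mask position

# Convert logits to probabilities and take the 3 most likely residues
probs = torch.softmax(mask_logits, dim=-1)
top = torch.topk(probs, k=3)
for p, idx in zip(top.values.tolist(), top.indices.tolist()):
    print(f"{vocab[idx]}: {p:.3f}")
```

With the real model, ranking several candidates this way gives a quick sense of how confident the model is at each masked position.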

Source Code

Training pipeline, configuration files, and data preparation scripts are available in the MolCrawl GitHub repository: https://github.com/mmai-framework-lab/MolCrawl

License

This model is released under the Apache-2.0 license.

Citation

If you use this model, please cite:

@misc{molcrawl_protein_sequence_bert_small,
  title={molcrawl-protein-sequence-bert-small},
  author={{RIKEN}},
  year={2026},
  publisher={{Hugging Face}},
  url={{https://huggingface.co/kojima-lab/molcrawl-protein-sequence-bert-small}}
}