---

# MolGen

MolGen was introduced in the paper ["Molecular Language Model as Multi-task Generator"](https://arxiv.org/pdf/2301.11259.pdf) and first released in [this repository](https://github.com/zjunlp/MolGen). It is a pre-trained molecular generative model built on SELFIES, a 100% robust molecular language representation.
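SELFIES is robust in the sense that any string assembled from its symbols decodes to a syntactically valid molecule. A minimal sketch of this round trip, assuming the open-source `selfies` Python package (which is independent of this model):

```python
# Minimal sketch of the SELFIES representation used by MolGen.
# Assumes the open-source `selfies` package (pip install selfies),
# which is not part of this repository.
import selfies as sf

smiles = "C1=CC=CC=C1"               # benzene as SMILES
selfies_str = sf.encoder(smiles)     # -> "[C][=C][C][=C][C][=C][Ring1][=Branch1]"
roundtrip = sf.decoder(selfies_str)  # decode back to a valid SMILES string

print(selfies_str)
print(roundtrip)
```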
## Model description

MolGen is the first pre-trained model that produces only chemically valid molecules.
With a training corpus of over 100 million molecules in SELFIES representation, MolGen learns the intrinsic structural patterns of molecules by mapping corrupted SELFIES back to their original forms.
Specifically, MolGen employs a bidirectional Transformer as its encoder and an autoregressive Transformer as its decoder.
Through its carefully designed multi-task molecular prefix tuning (MPT), MolGen can generate molecules with desired properties, making it a valuable tool for molecular optimization.
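As a rough illustration of that encoder-decoder denoising setup, the sketch below feeds a corrupted SELFIES string to the encoder and scores the decoder against the original. It is conceptual only: the checkpoint name `zjunlp/MolGen-large`, compatibility with the standard `transformers` seq2seq classes, and the corruption shown are assumptions; the actual corruption scheme is described in the paper.

```python
# Conceptual sketch of the denoising objective described above, NOT the
# official training script. Assumes the checkpoint "zjunlp/MolGen-large"
# loads as a standard Hugging Face seq2seq model; the corruption here is
# a hand-made example, not the scheme used in the paper.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("zjunlp/MolGen-large")
model = AutoModelForSeq2SeqLM.from_pretrained("zjunlp/MolGen-large")

original = "[C][=C][C][=C][C][=C][Ring1][=Branch1]"   # benzene in SELFIES
corrupted = "[C][=C][C][=C]"                          # artificially corrupted variant

batch = tokenizer(corrupted, text_target=original, return_tensors="pt")
loss = model(**batch).loss   # cross-entropy for reconstructing the original SELFIES
print(loss.item())
```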
## Intended uses & limitations
You can use the raw model for molecular generation or fine-tune it for a downstream task. See the [repository](https://github.com/zjunlp/MolGen) for fine-tuning details on the task that interests you.
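For raw molecular generation, usage can look roughly like the sketch below. The model ID `zjunlp/MolGen-large` and compatibility with the standard `transformers` seq2seq classes are assumptions; adjust them to the checkpoint you actually use.

```python
# Hedged sketch of molecular generation with the pre-trained model.
# Assumes the checkpoint id "zjunlp/MolGen-large" and compatibility with
# the standard Hugging Face seq2seq classes; adapt as needed.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("zjunlp/MolGen-large")
model = AutoModelForSeq2SeqLM.from_pretrained("zjunlp/MolGen-large")

# A SELFIES prompt (benzene); the model reads and writes SELFIES, not SMILES.
inputs = tokenizer("[C][=C][C][=C][C][=C][Ring1][=Branch1]", return_tensors="pt")

outputs = model.generate(
    **inputs,
    num_beams=5,
    max_length=55,
    num_return_sequences=5,
)
for ids in outputs:
    print(tokenizer.decode(ids, skip_special_tokens=True))
```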
### BibTeX entry and citation info
```bibtex