---
language: python
tags: vae
license: apache-2.0
datasets: Fraser/python-lines
---
# T5-VAE-Python (flax)
A Transformer-VAE built with flax.
It has been trained to interpolate on lines of Python code from the [python-lines dataset](https://huggingface.co/datasets/Fraser/python-lines).
Done as part of Huggingface community training ([see forum post](https://discuss.huggingface.co/t/train-a-vae-to-interpolate-on-english-sentences/7548)).
It builds on T5, using an autoencoder bottleneck to convert it into an MMD-VAE.
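An MMD-VAE regularises the latent space with a maximum mean discrepancy penalty between encoder latents and samples from the prior, rather than a KL term. Below is a minimal, illustrative sketch of such a penalty in JAX; the RBF kernel and its bandwidth here are assumptions for illustration, not the exact configuration used in this repo.

```python
import jax
import jax.numpy as jnp

def rbf_kernel(x, y, dim):
    # Pairwise RBF kernel, bandwidth scaled by latent dimensionality (an assumption)
    sq_dists = jnp.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return jnp.exp(-sq_dists / dim)

def mmd_loss(posterior_z, prior_z):
    # Biased MMD estimator between encoder latents and prior samples
    dim = posterior_z.shape[-1]
    k_pp = rbf_kernel(prior_z, prior_z, dim).mean()
    k_qq = rbf_kernel(posterior_z, posterior_z, dim).mean()
    k_pq = rbf_kernel(prior_z, posterior_z, dim).mean()
    return k_pp + k_qq - 2.0 * k_pq

# Example: compare stand-in encoder latents against a unit Gaussian prior
latents = jax.random.normal(jax.random.PRNGKey(0), (8, 32))
prior = jax.random.normal(jax.random.PRNGKey(1), (8, 32))
print(mmd_loss(latents, prior))
```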
## How to use from the 🤗/transformers library
Add the model code as a git submodule:
```bash
git submodule add https://github.com/Fraser-Greenlee/t5-vae-flax.git t5_vae_flax
```
```python
from transformers import AutoTokenizer
from t5_vae_flax.src.t5_vae import FlaxT5VaeForAutoencoding
tokenizer = AutoTokenizer.from_pretrained("t5-base")
model = FlaxT5VaeForAutoencoding.from_pretrained("flax-community/t5-vae-python")
```
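Once loaded, the model can interpolate between two lines of code by mixing their latent codes. The sketch below is hypothetical: `encode` and `decode` are placeholder method names, not confirmed parts of the `FlaxT5VaeForAutoencoding` API; check the `t5_vae_flax` source for the actual interface.

```python
# Hypothetical interpolation sketch — `model.encode`/`model.decode` are
# placeholder names, not confirmed methods of FlaxT5VaeForAutoencoding.
def interpolate(line_a, line_b, ratio=0.5):
    tokens_a = tokenizer(line_a, return_tensors="jax").input_ids
    tokens_b = tokenizer(line_b, return_tensors="jax").input_ids
    z_a = model.encode(tokens_a)  # hypothetical: token IDs -> latent vector
    z_b = model.encode(tokens_b)
    z_mix = (1 - ratio) * z_a + ratio * z_b  # linear blend of the two latents
    return tokenizer.decode(model.decode(z_mix)[0])  # hypothetical decode

print(interpolate("x = 1", "y = 2"))
```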
## Setup
Run `setup_tpu_vm_venv.sh` to set up a virtual environment on a TPU VM for training.