---
language: python
tags:
- vae
license: apache-2.0
datasets:
- Fraser/python-lines
---

# T5-VAE-Python (flax)

A Transformer-VAE made using flax.

It has been trained to interpolate on lines of Python code from the python-lines dataset.

Done as part of Hugging Face community training (see forum post).

Builds on T5, adding an autoencoder bottleneck to convert it into an MMD-VAE.
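An MMD-VAE replaces the usual KL-divergence term with a maximum mean discrepancy (MMD) penalty that pushes a batch of latent codes toward the Gaussian prior. The sketch below is illustrative NumPy, not this repo's actual training code; the RBF kernel and the bandwidth `sigma` are assumed hyperparameter choices:

```python
import numpy as np

def rbf_kernel(a, b, sigma=1.0):
    # Pairwise RBF kernel between rows of a and b:
    # k(u, v) = exp(-||u - v||^2 / (2 * sigma^2))
    sq_dists = (
        np.sum(a**2, axis=1)[:, None]
        + np.sum(b**2, axis=1)[None, :]
        - 2.0 * a @ b.T
    )
    return np.exp(-sq_dists / (2.0 * sigma**2))

def mmd(latents, prior_samples, sigma=1.0):
    # Biased MMD^2 estimate: small when the two sample sets
    # come from the same distribution, larger as they diverge.
    return (
        rbf_kernel(latents, latents, sigma).mean()
        + rbf_kernel(prior_samples, prior_samples, sigma).mean()
        - 2.0 * rbf_kernel(latents, prior_samples, sigma).mean()
    )
```

In training, `prior_samples` would be fresh draws from a standard normal, and the MMD term is added to the reconstruction loss in place of a KL term.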

## How to use from the 🤗/transformers library

Add model repo as a submodule:

```bash
git submodule add https://github.com/Fraser-Greenlee/t5-vae-flax.git t5_vae_flax
```
```python
from transformers import AutoTokenizer
from t5_vae_flax.src.t5_vae import FlaxT5VaeForAutoencoding

tokenizer = AutoTokenizer.from_pretrained("t5-base")

model = FlaxT5VaeForAutoencoding.from_pretrained("flax-community/t5-vae-python")
```

## Setup

Run `setup_tpu_vm_venv.sh` to set up a virtual environment on a TPU VM for training.