VAE latent after normalization not in N(0,1)

#2
by Eyalgut - opened

Usually, when training, the latent returned by vae.encode is first normalized to approximately N(0, 1):
latents = (latents - vae.config.shift_factor) * vae.config.scaling_factor
At least that is what happens with the SD3 VAE (and probably with SDXL as well).

Here, the latent after normalization is approximately N(0.014, 0.16^2), which is not even close to N(0, 1). Isn't the normalization meant to yield ~N(0, 1), as with the SD3 VAE?

*** Verified on ImageNet, resized to 128x128 and normalized to [-1, 1] as usual before vae.encode.
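As a minimal sketch of the expectation being discussed: if the raw latents have per-element mean equal to shift_factor and standard deviation equal to 1/scaling_factor, then the normalization formula above maps them to approximately N(0, 1). The concrete values below (scaling_factor = 1.5305, shift_factor = 0.0609) are taken as an assumption about the SD3 VAE config; synthetic latents stand in for real vae.encode outputs.

```python
import numpy as np

# Assumed SD3 VAE config values (in practice, read vae.config.shift_factor
# and vae.config.scaling_factor from the loaded model).
shift_factor = 0.0609
scaling_factor = 1.5305

rng = np.random.default_rng(0)

# Simulate raw latents whose statistics match the config: per-element mean
# equals shift_factor and std equals 1 / scaling_factor. Real latents from
# vae.encode(images).latent_dist.sample() would replace this.
raw_latents = rng.normal(
    loc=shift_factor, scale=1.0 / scaling_factor, size=(16, 4, 16, 16)
)

# The standard diffusers normalization applied before training the diffusion model.
latents = (raw_latents - shift_factor) * scaling_factor

# If the config factors match the latent statistics, this prints mean ~0, std ~1.
print(f"mean={latents.mean():.3f}, std={latents.std():.3f}")
```

The reported N(0.014, 0.16^2) would mean the raw latents here have a much smaller standard deviation than 1/scaling_factor implies, i.e. the config factors do not whiten this VAE's latents the way the SD3 factors do.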

