Run inference on multiple GPUs #47
by itsrocchi - opened
I duplicated the Space to run it on an RTX A6000. It works just fine if I select the 7B or 13B models, but 48 GB of VRAM is not enough for the 70B model. How can I edit the model.py script to run it on 2-3 or more GPUs? (I can add more A6000s to my machine.)
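A minimal sketch of one way to do this, assuming the Space's model.py loads the model with the transformers library: passing device_map="auto" (which requires the accelerate package) shards the model's layers across all visible GPUs instead of loading everything onto one card. The model id below is an assumption; replace it with whatever model.py actually loads.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical model id; substitute the one your model.py uses.
model_id = "meta-llama/Llama-2-70b-chat-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# device_map="auto" (via accelerate) splits the layers across every visible GPU,
# so e.g. three 48 GB A6000s can hold the ~140 GB of fp16 weights for a 70B model.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Quick smoke test: inputs go to the device holding the first layers.
inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the Space's script already calls from_pretrained, the change may be as small as adding the device_map argument; alternatively, quantization (e.g. 4-bit via bitsandbytes) can reduce the per-GPU memory needed, at some cost in output quality.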
itsrocchi changed discussion status to closed