Error in Triton implementation

#9 by narenzen - opened

Using the configuration below, I loaded the instruct model:

import torch
import transformers

# Load the MPT config and switch the attention implementation to the Triton kernel
config = transformers.AutoConfig.from_pretrained(
  'mosaicml/mpt-7b-instruct',
  trust_remote_code=True
)
config.attn_config['attn_impl'] = 'triton'

# Load the weights in bfloat16 and move the model to the GPU
model = transformers.AutoModelForCausalLM.from_pretrained(
  'mosaicml/mpt-7b-instruct',
  config=config,
  torch_dtype=torch.bfloat16,
  trust_remote_code=True
)
model.to(device='cuda:0')

But I got this error:
TypeError: dot() got an unexpected keyword argument 'trans_b'

You likely have an incompatible version of something. Newer Triton releases dropped the trans_b keyword argument from tl.dot, so this error usually means the installed Triton does not match the one the kernel was written against. Please try the versions pinned here: https://github.com/mosaicml/llm-foundry/blob/5fe01bcceb146d2a64d3b595c243d55fa7af9c70/setup.py#L74-L77
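
As a quick diagnostic, here is a minimal sketch for printing the locally installed versions so you can compare them against those pins; the package list is my assumption of the likely culprits, so adjust it to match the pinned deps:

# Sketch: print installed versions of the packages most likely to conflict
# here, for comparison against the pins in llm-foundry's setup.py.
# The package list is an assumption; edit it to match the pinned deps.
from importlib.metadata import version, PackageNotFoundError

for pkg in ('torch', 'triton', 'flash-attn', 'transformers'):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, 'is not installed')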

Closing as stale.
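
For anyone who lands here later: if matching the pinned Triton build is not an option, a minimal workaround sketch (assuming the slower pure-PyTorch attention path is acceptable for your use case) is to skip the Triton kernel entirely:

import transformers

config = transformers.AutoConfig.from_pretrained(
    'mosaicml/mpt-7b-instruct',
    trust_remote_code=True
)
# Fall back to the pure-PyTorch attention implementation instead of the
# Triton kernel: slower, but it sidesteps the Triton version mismatch.
config.attn_config['attn_impl'] = 'torch'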

We've added a requirements.txt file as of this PR: https://huggingface.co/mosaicml/mpt-7b-instruct/discussions/41
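
A hedged sketch of installing from that file programmatically, assuming it sits at the repo root on the default branch:

# Sketch: download the repo's requirements.txt and pip-install it.
# Assumes the file lives at the repo root on the default branch.
import subprocess
import sys
from huggingface_hub import hf_hub_download

req_path = hf_hub_download(repo_id='mosaicml/mpt-7b-instruct', filename='requirements.txt')
subprocess.check_call([sys.executable, '-m', 'pip', 'install', '-r', req_path])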

abhi-mosaic changed discussion status to closed
