KeyError in Inference Example

#1 by nvriese - opened

I'm hitting `KeyError: 'grounding-dino'` when running the inference example. It appears 'grounding-dino' is missing from `CONFIG_MAPPING`, so the config lookup performed by `AutoModelForZeroShotObjectDetection` fails and the error surfaces as a `ValueError`. For reference, I'm running the latest version of transformers (4.39.3) with Python 3.10.
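For context, here is a minimal version of the snippet that triggers the error. The checkpoint name below is an assumption for illustration; the failing cell in the traceback only shows the `from_pretrained` calls:

```
import requests
from PIL import Image
from transformers import AutoProcessor, AutoModelForZeroShotObjectDetection

model_id = "IDEA-Research/grounding-dino-tiny"  # assumed checkpoint; any grounding-dino checkpoint hits the same code path
device = "mps"

processor = AutoProcessor.from_pretrained(model_id)
# Fails here on 4.39.3: AutoConfig cannot resolve model_type "grounding-dino"
model = AutoModelForZeroShotObjectDetection.from_pretrained(model_id).to(device)

image_url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(image_url, stream=True).raw)
```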

Full Traceback below:
```

KeyError Traceback (most recent call last)
File ./python3.10/site-packages/transformers/models/auto/configuration_auto.py:1155, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
1154 try:
-> 1155 config_class = CONFIG_MAPPING[config_dict["model_type"]]
1156 except KeyError:

File ./python3.10/site-packages/transformers/models/auto/configuration_auto.py:852, in _LazyConfigMapping.__getitem__(self, key)
851 if key not in self._mapping:
--> 852 raise KeyError(key)
853 value = self._mapping[key]

KeyError: 'grounding-dino'

During handling of the above exception, another exception occurred:

ValueError Traceback (most recent call last)
Cell In[3], line 11
8 device = 'mps'
10 processor = AutoProcessor.from_pretrained(model_id)
---> 11 model = AutoModelForZeroShotObjectDetection.from_pretrained(model_id).to(device)
13 image_url = "http://images.cocodataset.org/val2017/000000039769.jpg"
14 image = Image.open(requests.get(image_url, stream=True).raw)

File ./python3.10/site-packages/transformers/models/auto/auto_factory.py:523, in _BaseAutoModelClass.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
520 if kwargs.get("quantization_config", None) is not None:
521 _ = kwargs.pop("quantization_config")
--> 523 config, kwargs = AutoConfig.from_pretrained(
524 pretrained_model_name_or_path,
525 return_unused_kwargs=True,
526 trust_remote_code=trust_remote_code,
527 code_revision=code_revision,
528 _commit_hash=commit_hash,
529 **hub_kwargs,
530 **kwargs,
531 )
533 # if torch_dtype=auto was passed here, ensure to pass it on
534 if kwargs_orig.get("torch_dtype", None) == "auto":

File ./python3.10/site-packages/transformers/models/auto/configuration_auto.py:1157, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
1155 config_class = CONFIG_MAPPING[config_dict["model_type"]]
1156 except KeyError:
-> 1157 raise ValueError(
1158 f"The checkpoint you are trying to load has model type {config_dict['model_type']} "
1159 "but Transformers does not recognize this architecture. This could be because of an "
1160 "issue with the checkpoint, or because your version of Transformers is out of date."
1161 )
1162 return config_class.from_dict(config_dict, **unused_kwargs)
1163 else:
1164 # Fallback: use pattern matching on the string.
1165 # We go from longer names to shorter names to catch roberta before bert (for instance)

ValueError: The checkpoint you are trying to load has model type grounding-dino but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
```
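The root cause is visible in the first frame of the traceback: `AutoConfig` looks the checkpoint's `model_type` up in `CONFIG_MAPPING`, and 4.39.3 ships no `grounding-dino` entry. A quick sanity check, as a minimal sketch using the same lazy registry shown in the traceback:

```
from transformers.models.auto.configuration_auto import CONFIG_MAPPING

# On transformers 4.39.3 this raises KeyError: 'grounding-dino';
# on a version that includes Grounding DINO it returns the config class.
print(CONFIG_MAPPING["grounding-dino"])
```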


@nvriese For now, you need to install the main branch of the transformers library to get access to GroundingDino. It will probably be available through a regular `pip install` in the next release of the library.
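For example, installing directly from the GitHub main branch:

```
pip install git+https://github.com/huggingface/transformers.git
```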

Excellent, thank you both for the update and your work on this. It works seamlessly!

nvriese changed discussion status to closed
