
Conversation

@bvantuan
Contributor

What does this PR do?

Fixes #39004
Fixes key mapping for VLMs when they are extended by a custom subclass.
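
For context, the failing pattern: the key-mapping check matched VLM names against the concrete class name only, so a user-defined subclass never triggered the checkpoint conversion mapping; checking every class in the `__mro__` fixes this. Below is a minimal sketch of the idea with illustrative names only (`VLM_NAMES`, `CustomVisionModel`, and the simplified classes are placeholders, not the actual transformers internals):

```python
# Illustrative sketch of the fix; all names here are placeholders.
VLM_NAMES = ["qwen2vl", "llava"]  # substrings used to recognize VLM classes

class Qwen2VLForConditionalGeneration:
    # stand-in for a library class carrying a checkpoint key mapping
    _checkpoint_conversion_mapping = {"^model.language_model": "language_model"}

class CustomVisionModel(Qwen2VLForConditionalGeneration):
    # a user subclass whose name no longer matches any entry in VLM_NAMES
    pass

def is_vlm_before(cls):
    # before the fix: only the concrete class name is inspected,
    # so the custom subclass slips through
    return any(name in cls.__name__.lower() for name in VLM_NAMES)

def is_vlm_after(cls):
    # after the fix: every class in the MRO (minus `object`) is inspected,
    # so any subclass of a recognized VLM base passes
    return any(
        name in klass.__name__.lower()
        for klass in cls.__mro__[:-1]
        for name in VLM_NAMES
    )

print(is_vlm_before(CustomVisionModel))  # False -> key mapping silently skipped
print(is_vlm_after(CustomVisionModel))   # True  -> key mapping applied
```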

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@zucchini-nlp

@zucchini-nlp
Member


Thanks, indeed it doesn't handle the case where users inherit from VLM classes. Can you also update the similar line in self.save_pretrained?
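
(Continuing the illustrative sketch from the PR description: the same MRO-based check works at save time, reached through the instance's class. Again, these names are placeholders, not the actual save_pretrained internals.)

```python
# Hedged sketch of the symmetric check at save time, reusing the
# placeholder names VLM_NAMES and CustomVisionModel defined above.
model = CustomVisionModel()

def should_map_keys_on_save(model):
    # an instance reaches the MRO through its class, so subclasses
    # are recognized at save time exactly as at load time
    return any(
        name in klass.__name__.lower()
        for klass in model.__class__.__mro__[:-1]
        for name in VLM_NAMES
    )

print(should_map_keys_on_save(model))  # True for the custom subclass too
```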

@bvantuan
Contributor Author

Hi @zucchini-nlp! Thank you for reviewing my PR. Yes, I've also updated the corresponding line in save_pretrained 👍.

@zucchini-nlp added the for patch label on Jul 1, 2025
@zucchini-nlp enabled auto-merge (squash) on July 1, 2025 06:56
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@zucchini-nlp disabled auto-merge on July 1, 2025 07:47
@zucchini-nlp merged commit d53518c into huggingface:main on Jul 1, 2025
21 checks passed
Cyrilvallez pushed a commit that referenced this pull request Jul 4, 2025
* fix key mapping for VLMs

* use __mro__ instead

* update key mapping in save_pretrained
@ManuelFay
Contributor

Insane PR, thanks a lot @bvantuan, I wasn't aware of mro[:-1] when I proposed the key mapping patch!
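
(For readers new to the idiom: `__mro__` is a class's method resolution order, a tuple running from the class itself up to `object`, and slicing with `[:-1]` drops `object`, which could never match a model-name check. A quick illustration:)

```python
class Base: ...
class Child(Base): ...

print(Child.__mro__)
# (<class '__main__.Child'>, <class '__main__.Base'>, <class 'object'>)
print(Child.__mro__[:-1])
# (<class '__main__.Child'>, <class '__main__.Base'>)  <- `object` dropped
```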

zaristei pushed a commit to zaristei/transformers that referenced this pull request Sep 9, 2025
* fix key mapping for VLMs

* use __mro__ instead

* update key mapping in save_pretrained

Labels

for patch: Tag issues/labels that should be included in the next patch


Development

Successfully merging this pull request may close these issues.

Warning when loading pretrained model for qwen2-VL-1.5B-Instruct.

4 participants