Conversation

@JustinTong0323 (Contributor) commented Mar 21, 2025

What does this PR do?

This PR adds support for the original_max_position_embeddings parameter in YARN rope scaling configurations, addressing compatibility issues with Qwen-32B series models.

The Qwen team requires this parameter for their YARN implementation in Qwen-32B models, ref: link.

Previously, transformers would raise warnings about unrecognized keys despite this being a valid configuration parameter:

"rope_scaling": {
    "factor": 4.0,
    "original_max_position_embeddings": 32768,  # Previously unrecognized
    "type": "yarn"
}
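
The fix itself is small: the parameter is added to the validator's set of accepted optional keys. Below is a minimal, runnable sketch of that kind of check; it is illustrative only, not the exact transformers implementation (only the _validate_yarn_parameters name and the config keys come from this PR):

# Illustrative sketch of the key validation this PR relaxes.
def _validate_yarn_parameters(rope_scaling: dict) -> None:
    required_keys = {"type", "factor"}
    # This PR adds "original_max_position_embeddings" to the accepted
    # optional keys, so Qwen-style YARN configs no longer warn.
    optional_keys = {"attention_factor", "beta_fast", "beta_slow",
                     "original_max_position_embeddings"}
    unrecognized = set(rope_scaling) - required_keys - optional_keys
    if unrecognized:
        print(f"Unrecognized keys in `rope_scaling` for 'type'='yarn': {unrecognized}")

# With the key accepted, the config above validates silently:
_validate_yarn_parameters({"factor": 4.0,
                           "original_max_position_embeddings": 32768,
                           "type": "yarn"})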

Impact of this PR:

  • Eliminates spurious warnings for Qwen-32B users
  • Enables proper configuration validation for YARN-based models (see the usage sketch below)
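
As a hedged usage sketch: Qwen2Config is a real transformers class, but whether its constructor validates rope_scaling at init time may vary by version; the values mirror the snippet above.

from transformers import Qwen2Config

# Mirrors the rope_scaling snippet above; with this PR, constructing the
# config no longer warns about original_max_position_embeddings.
config = Qwen2Config(
    rope_scaling={
        "type": "yarn",
        "factor": 4.0,
        "original_max_position_embeddings": 32768,
    }
)
print(config.rope_scaling)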

This PR may also resolve these downstream issues:
sgl-project/sglang#4145
vllm-project/vllm#10293

Clarifies #33783.

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@github-actions bot marked this pull request as draft March 21, 2025 09:18
@github-actions (bot) commented:

Hi 👋, thank you for opening this pull request! The pull request is converted to draft by default. When it is ready for review, please click the Ready for review button (at the bottom of the PR page).

@JustinTong0323 marked this pull request as ready for review March 21, 2025 09:18
@ArthurZucker (Collaborator) left a comment:


Sure! Thanks for finding this 🤗

@ArthurZucker merged commit e28be7a into huggingface:main Mar 24, 2025
19 of 21 checks passed
zucchini-nlp pushed a commit to zucchini-nlp/transformers that referenced this pull request May 14, 2025:

[fix] Update optional keys in _validate_yarn_parameters to include original_max_position_embeddings (huggingface#36877)
soghomon-b pushed a commit to soghomon-b/transformers that referenced this pull request Aug 24, 2025:

[fix] Update optional keys in _validate_yarn_parameters to include original_max_position_embeddings (huggingface#36877)