
Conversation

@qubvel (Contributor) commented Apr 9, 2025

What does this PR do?

Continue removing the return_dict attribute from the modeling code.
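
For context, the refactor replaces per-method `return_dict` plumbing with the `@can_return_tuple` decorator mentioned later in this thread. A minimal sketch of that pattern as I read it (the stand-in decorator below is illustrative, not the actual transformers implementation):

```python
from typing import Optional

def can_return_tuple(forward):
    # Simplified stand-in: the decorator owns return_dict handling, so the
    # decorated forward() always builds an Output object and no longer needs
    # an `if not return_dict: return (...)` branch in its body.
    def wrapper(self, *args, return_dict: Optional[bool] = None, **kwargs):
        output = forward(self, *args, **kwargs)
        if return_dict is None:
            return_dict = getattr(self.config, "use_return_dict", True)
        return output if return_dict else output.to_tuple()
    return wrapper
```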

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

Comment on lines -1456 to -1458

```python
output_attentions: Optional[bool] = None,
output_hidden_states: Optional[bool] = None,
return_dict: Optional[bool] = None,
```
@qubvel (Contributor, Author) commented Apr 18, 2025

Not strictly related to this PR, but I also removed redundant kwargs from get_text_features/get_image_features. They had no effect anyway, since these methods return a tensor (not an Output object). It might be worth adding 🚨🚨🚨 to the PR.
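
To illustrate, a toy runnable sketch (the model, names, and shapes are invented; only the dropped kwargs come from the diff above):

```python
import torch
from torch import nn

class ToyTextEncoder(nn.Module):
    def __init__(self, vocab_size: int = 100, dim: int = 16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.proj = nn.Linear(dim, dim)

    # After the cleanup, only arguments that influence the returned tensor
    # remain; output_attentions / output_hidden_states / return_dict were
    # accepted before but ignored, since a plain tensor is returned.
    def get_text_features(self, input_ids: torch.LongTensor) -> torch.FloatTensor:
        hidden = self.embed(input_ids).mean(dim=1)  # toy mean pooling
        return self.proj(hidden)

features = ToyTextEncoder().get_text_features(torch.tensor([[1, 2, 3]]))
print(features.shape)  # torch.Size([1, 16])
```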

Comment on lines +1456 to +1457

```python
output_attentions=False,
output_hidden_states=False,
```
@qubvel (Contributor, Author) commented

Passing False explicitly here instead (see the comment above).

```python
    if is_torch_available()
    else {}
)
fx_compatible = True
```
@qubvel (Contributor, Author) commented

FX compatibility is broken for BERT, ELECTRA, MobileBERT, and RoBERTa. I'm not sure how critical this is; as far as I understand, we previously dropped FX compatibility for text models by adding **kwargs.

It's a bit strange that FX compatibility is not broken for CLIP, even though it also has the @can_return_tuple decorator.
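
For anyone who wants to reproduce this locally, a quick check with transformers' FX tracer (the checkpoint name is illustrative; `symbolic_trace` is, as far as I know, the entry point the `fx_compatible` tests exercise):

```python
from transformers import AutoModel
from transformers.utils.fx import symbolic_trace

model = AutoModel.from_pretrained("bert-base-uncased")  # illustrative checkpoint
try:
    # symbolic_trace builds a torch.fx GraphModule from the model's forward
    traced = symbolic_trace(model, input_names=["input_ids", "attention_mask"])
    print("FX tracing OK:", type(traced).__name__)
except Exception as err:
    print("FX tracing failed:", err)
```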

@qubvel qubvel requested a review from molbap April 18, 2025 16:42
@qubvel qubvel marked this pull request as ready for review April 18, 2025 16:42
