
feat: add compatibility with transformers V5 #1563

Merged
hanhainebula merged 2 commits into FlagOpen:master from therealmichaelberna:transformers-v5-compat
Mar 10, 2026

Conversation

@therealmichaelberna commented Mar 5, 2026

Created an alternative shim method to determine whether torch is available, along with accompanying pytest unit tests. This allows FlagEmbedding to work with transformers version 5.

Resolves: #1561
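A minimal sketch of what such a compatibility shim could look like (illustrative only: the function name mirrors the removed transformers helper, but the fallback logic here is an assumption, not necessarily the PR's actual implementation):

```python
import importlib.util


def is_torch_fx_available() -> bool:
    """Illustrative drop-in for the helper removed in transformers v5.

    torch.fx has shipped with every torch release since 1.8, so checking
    that torch itself is importable is a reasonable proxy. Using
    importlib.util.find_spec avoids actually importing torch, which can
    be slow.
    """
    return importlib.util.find_spec("torch") is not None
```

Callers that previously did `from transformers.utils import is_torch_fx_available` could wrap that import in a `try`/`except ImportError` and fall back to a local shim like this one.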

@therealmichaelberna therealmichaelberna changed the title feat: add compatibility with transformers V5 Draft: feat: add compatibility with transformers V5 Mar 5, 2026
@therealmichaelberna therealmichaelberna changed the title Draft: feat: add compatibility with transformers V5 feat: add compatibility with transformers V5 Mar 9, 2026
@therealmichaelberna (Author)

@hanhainebula I would like to request review and approval. This has been tested by myself and tomasguillen.

@hanhainebula (Collaborator)

Hello, @therealmichaelberna. Thanks for your PR. I will merge it.

@hanhainebula hanhainebula merged commit dbc6005 into FlagOpen:master Mar 10, 2026
1 check passed


Development

Successfully merging this pull request may close these issues.

huggingface transformers has removed the is_torch_fx_available function
