Issues: huggingface/transformers
#37296 https://huggingface.co/hf-internal-testing need to be converted to safetensors (opened Apr 4, 2025 by sfc-gh-sbekman)
#37293 [WIP] [i18n-<Transformers>] Translating docs to <Tibetan> (opened Apr 4, 2025 by OSUer600)
#37262 [Feature request, Flax] Add support for higher jax and flax versions (opened Apr 3, 2025 by rxng8)
#37248 [bug] Incorrect word timestamps and word repetitions with Whisper-Large-v3-turbo model (opened Apr 3, 2025 by Asma-droid)
#37246 [bug, Flax] Inconsistent results between torch and jax versions of DINOv2 (opened Apr 3, 2025 by MasterXiong)
#37242 [bug] TypeError: 'NoneType' object cannot be interpreted as an integer (opened Apr 3, 2025 by 20040206dfhDFH)
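This TypeError is the generic CPython error for passing `None` where an integer is required, for example a size or index argument that was never set. A minimal stdlib illustration of the error class (not the transformers code path):

```python
# `range` requires an int; passing None reproduces the same TypeError text
# that appears in the issue title. This is only an illustration of the
# error class, not the code that triggered the report.
try:
    range(None)  # None where an int is required
    error_message = None
except TypeError as exc:
    error_message = str(exc)

print(error_message)  # -> 'NoneType' object cannot be interpreted as an integer
```

In practice the fix is to trace where the `None` came from (often an unset config field or a missing default) rather than to catch the exception.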
#37240 [bug] ed_video = input_tokens.index(video_token_id, st) ValueError: 151656 is not in list (opened Apr 3, 2025 by yaomingzhang)
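The traceback in this title is the standard `list.index` failure when the requested element is absent: the model expects a video token id in the input sequence and does not find one. A minimal stdlib illustration with made-up token ids (not the transformers code path):

```python
# list.index raises ValueError when the element is missing; the token ids
# here are hypothetical, chosen only to reproduce the error message shape.
input_tokens = [101, 2023, 102]   # sequence containing no video token
video_token_id = 151656

try:
    position = input_tokens.index(video_token_id)
except ValueError as exc:
    error_message = str(exc)

print(error_message)  # -> 151656 is not in list
```

Reports like this usually mean the preprocessing step that inserts the special token was skipped or the inputs were truncated before the token's position.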
#37239 [bug] opencv imshow stuck forever when importing transformers (opened Apr 3, 2025 by leemengwei)
#37235 transformers has no attribute TFFlorence2ForConditionalGeneration (opened Apr 3, 2025 by wuchaotao)
#37227 [bug] Why does transformers load FA2 when it's not asked to do so? (opened Apr 3, 2025 by sfc-gh-sbekman)
#37222 [bug] Qwen fails ungracefully when images are truncated (opened Apr 2, 2025 by gbarello-uipath)
#37219 [bug] RecurrentGemma crashes during inference for inputs longer than sliding window width (opened Apr 2, 2025 by assafbk)
#37200 [Feature request] Proposal for Optimizing transformers.BertTokenizerFast (opened Apr 2, 2025 by springkim)
#37199 [bug] torch.compile graph break when tuning llama with FA2 (opened Apr 2, 2025 by SilverSoldier)
#37195 [Feature request] Different limits for saving only model weights and saving full checkpoints (opened Apr 2, 2025 by Tim-Siu)
#37189 [bug] Bug when using StaticCache in Qwen2.5 Inference with custom inputs_embeds and attention_masks (opened Apr 2, 2025 by matthewdm0816)