| url stringlengths 62–66 | repository_url stringclasses 1 value | labels_url stringlengths 76–80 | comments_url stringlengths 71–75 | events_url stringlengths 69–73 | html_url stringlengths 50–56 | id int64 377M–2.15B | node_id stringlengths 18–32 | number int64 1–29.2k | title stringlengths 1–487 | user dict | labels list | state stringclasses 2 values | locked bool 2 classes | assignee dict | assignees list | comments list | created_at int64 1.54k–1.71k | updated_at int64 1.54k–1.71k | closed_at int64 1.54k–1.71k ⌀ | author_association stringclasses 4 values | active_lock_reason stringclasses 2 values | body stringlengths 0–234k ⌀ | reactions dict | timeline_url stringlengths 71–75 | state_reason stringclasses 3 values | draft bool 2 classes | pull_request dict |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/huggingface/transformers/issues/29061 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29061/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29061/comments | https://api.github.com/repos/huggingface/transformers/issues/29061/events | https://github.com/huggingface/transformers/pull/29061 | 2,138,735,693 | PR_kwDOCUB6oc5nGBVu | 29,061 | Fix trainer test wrt DeepSpeed + auto_find_bs | {
"login": "muellerzr",
"id": 7831895,
"node_id": "MDQ6VXNlcjc4MzE4OTU=",
"avatar_url": "https://avatars.githubusercontent.com/u/7831895?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/muellerzr",
"html_url": "https://github.com/muellerzr",
"followers_url": "https://api.github.com/users/mu... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29061). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"cc @amyeroberts good for review now that I verified tests pass :)"
] | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
A follow-up to https://github.com/huggingface/transformers/pull/29057; changes the test to ensure it raises a `NotImplementedError`.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contribu... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29061/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29061/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29061",
"html_url": "https://github.com/huggingface/transformers/pull/29061",
"diff_url": "https://github.com/huggingface/transformers/pull/29061.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29061.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29060 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29060/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29060/comments | https://api.github.com/repos/huggingface/transformers/issues/29060/events | https://github.com/huggingface/transformers/issues/29060 | 2,138,554,275 | I_kwDOCUB6oc5_d7-j | 29,060 | Request for Flash Attention 2.0 Support in GPNRoFormerForMaskedLM | {
"login": "YBoulaimen",
"id": 157366664,
"node_id": "U_kgDOCWE5iA",
"avatar_url": "https://avatars.githubusercontent.com/u/157366664?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/YBoulaimen",
"html_url": "https://github.com/YBoulaimen",
"followers_url": "https://api.github.com/users/YBo... | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
},
{
"id": 6202871275,
... | open | false | null | [] | [
"Hi @YBoulaimen, thanks for opening this request! \r\n\r\nThe model is defined and maintained under this repo: https://github.com/songlab-cal/gpn/blob/main/gpn/model.py\r\n\r\nI suggest opening a request there. "
] | 1,708 | 1,708 | null | NONE | null | Hello,
I trust this message finds you well. I am currently attempting to run the GPN-MSA model, which utilizes AutoModelForMaskedLM, and I am keen on parallelizing the computation across multiple GPUs. To optimize the model's performance, I would like to request the integration of Flash Attention 2.0 support into GP... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29060/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29060/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29059 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29059/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29059/comments | https://api.github.com/repos/huggingface/transformers/issues/29059/events | https://github.com/huggingface/transformers/issues/29059 | 2,138,513,299 | I_kwDOCUB6oc5_dx-T | 29,059 | Transformers trainer: All checkpoint restarts now FAILING | {
"login": "whr778",
"id": 5939523,
"node_id": "MDQ6VXNlcjU5Mzk1MjM=",
"avatar_url": "https://avatars.githubusercontent.com/u/5939523?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/whr778",
"html_url": "https://github.com/whr778",
"followers_url": "https://api.github.com/users/whr778/foll... | [] | closed | false | null | [] | [
"Forgot the version info... sorry about that guys\r\ntransformers 4.37.2",
"Hello,\r\n\r\n`trainer_state.json` is saved when calling `_save_checkpoint` at the given line below which is after the model, optimizer and schedulers are saved.\r\n\r\nhttps://github.com/huggingface/transformers/blob/b2628086... | 1,708 | 1,708 | 1,708 | NONE | null | ### System Info
@muellerzr and @pacman100
In the trainer.py
Trainer code now requires trainer_state.json for checkpoint restarts
trainer.py does NOT save trainer_state.json in def _save_optimizer_and_scheduler(self, output_dir):
Recommend either removing the trainer_state.json dependency for checkpoint restar... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29059/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29059/timeline | completed | null | null |
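The failure mode reported in issue 29059 above (a resume path that requires a `trainer_state.json` the checkpoint folder may not contain) can be illustrated with a dependency-free sketch. `can_resume` and the surrounding logic are hypothetical; only the file name comes from the issue.

```python
import os
import tempfile

# Hedged sketch: validate a checkpoint folder up front instead of failing
# with a FileNotFoundError deep inside the resume path. `can_resume` is a
# hypothetical helper; only the file name comes from the issue above.
STATE_FILE = "trainer_state.json"

def can_resume(ckpt_dir: str) -> bool:
    """True only if the folder exists and contains the trainer state file."""
    return os.path.isdir(ckpt_dir) and os.path.isfile(
        os.path.join(ckpt_dir, STATE_FILE)
    )

ckpt = tempfile.mkdtemp()
print(can_resume(ckpt))                           # -> False: state file missing
open(os.path.join(ckpt, STATE_FILE), "w").close()
print(can_resume(ckpt))                           # -> True: checkpoint complete
```

A check like this would let a trainer report "incomplete checkpoint" rather than surfacing a low-level file error.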
https://api.github.com/repos/huggingface/transformers/issues/29058 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29058/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29058/comments | https://api.github.com/repos/huggingface/transformers/issues/29058/events | https://github.com/huggingface/transformers/pull/29058 | 2,138,280,124 | PR_kwDOCUB6oc5nEc4B | 29,058 | `auto_find_batch_size` isn't yet supported with DeepSpeed/FSDP. Raise error accrodingly. | {
"login": "pacman100",
"id": 13534540,
"node_id": "MDQ6VXNlcjEzNTM0NTQw",
"avatar_url": "https://avatars.githubusercontent.com/u/13534540?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pacman100",
"html_url": "https://github.com/pacman100",
"followers_url": "https://api.github.com/users/... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29058). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
1. While examining whether the `auto_find_batch_size` issue with DeepSpeed was solved by Zach's previous PR, someone commented on that PR that the issue is still there: https://github.com/huggingface/transformers/pull/28088#issuecomment-1893093503
When I try https://github.com/pacman100/DHS-LLM-Workshop/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29058/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29058/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29058",
"html_url": "https://github.com/huggingface/transformers/pull/29058",
"diff_url": "https://github.com/huggingface/transformers/pull/29058.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29058.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29057 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29057/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29057/comments | https://api.github.com/repos/huggingface/transformers/issues/29057/events | https://github.com/huggingface/transformers/pull/29057 | 2,138,258,766 | PR_kwDOCUB6oc5nEYNq | 29,057 | fix failing trainer ds tests | {
"login": "pacman100",
"id": 13534540,
"node_id": "MDQ6VXNlcjEzNTM0NTQw",
"avatar_url": "https://avatars.githubusercontent.com/u/13534540?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pacman100",
"html_url": "https://github.com/pacman100",
"followers_url": "https://api.github.com/users/... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29057). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
1. After PR https://github.com/huggingface/transformers/pull/27568, when resuming from a ckpt, the trainer first loads the `trainer_state.json` file. As such, when a bogus ckpt folder is passed, it will throw a file-not-found error. Earlier, the code would throw a different invalid-ckpt error in the function cal... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29057/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29057/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29057",
"html_url": "https://github.com/huggingface/transformers/pull/29057",
"diff_url": "https://github.com/huggingface/transformers/pull/29057.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29057.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29056 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29056/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29056/comments | https://api.github.com/repos/huggingface/transformers/issues/29056/events | https://github.com/huggingface/transformers/pull/29056 | 2,138,242,495 | PR_kwDOCUB6oc5nEUrh | 29,056 | StoppingCriteria tracks elements separately in the batch | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/use... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29056). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | MEMBER | null | # What does this PR do?
As was pointed out in #28932, StoppingCriteria needs to stop generation per batch element and return a boolean tensor of shape `batch_size`. This PR adds the logic to track each row and, when StoppingCriteria is triggered, stop generating for that particular row only.
Note that when #28932 ... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29056/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29056/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29056",
"html_url": "https://github.com/huggingface/transformers/pull/29056",
"diff_url": "https://github.com/huggingface/transformers/pull/29056.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29056.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29055 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29055/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29055/comments | https://api.github.com/repos/huggingface/transformers/issues/29055/events | https://github.com/huggingface/transformers/pull/29055 | 2,138,013,993 | PR_kwDOCUB6oc5nDjBY | 29,055 | FIX [`PEFT` / `Trainer` ] Handle better peft + quantized compiled models | {
"login": "younesbelkada",
"id": 49240599,
"node_id": "MDQ6VXNlcjQ5MjQwNTk5",
"avatar_url": "https://avatars.githubusercontent.com/u/49240599?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/younesbelkada",
"html_url": "https://github.com/younesbelkada",
"followers_url": "https://api.githu... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29055). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"LGTM! Let me know when it's out of draft and you want a final review ",
"It woul... | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
Fixes: https://github.com/huggingface/transformers/issues/29033
Even though quantized models + compile + peft are not really stable (might not work OTB for all users), the current way we deal with compiled peft models leads to errors that are hard for users to interpret, such as the one descr... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29055/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29055/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29055",
"html_url": "https://github.com/huggingface/transformers/pull/29055",
"diff_url": "https://github.com/huggingface/transformers/pull/29055.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29055.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29054 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29054/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29054/comments | https://api.github.com/repos/huggingface/transformers/issues/29054/events | https://github.com/huggingface/transformers/pull/29054 | 2,137,965,022 | PR_kwDOCUB6oc5nDYUv | 29,054 | Fix missing translation in README_ru | {
"login": "Strikoder",
"id": 71812454,
"node_id": "MDQ6VXNlcjcxODEyNDU0",
"avatar_url": "https://avatars.githubusercontent.com/u/71812454?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Strikoder",
"html_url": "https://github.com/Strikoder",
"followers_url": "https://api.github.com/users/... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29054). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"> Thank you for spotting the missing paragraph, and adding translation. I left a c... | 1,708 | 1,708 | null | NONE | null | # What does this PR do?
This PR fixes the Russian translation of the README file by translating one line that had previously been left in English.
## Before submitting
- [x] This PR improves the docs.
## Fixes #26208
@stevhliu
@MKhalusova | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29054/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29054/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29054",
"html_url": "https://github.com/huggingface/transformers/pull/29054",
"diff_url": "https://github.com/huggingface/transformers/pull/29054.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29054.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29053 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29053/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29053/comments | https://api.github.com/repos/huggingface/transformers/issues/29053/events | https://github.com/huggingface/transformers/issues/29053 | 2,137,954,620 | I_kwDOCUB6oc5_bpk8 | 29,053 | model_max_length arg has no effect when creating bert tokenizer | {
"login": "galtay",
"id": 663051,
"node_id": "MDQ6VXNlcjY2MzA1MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/663051?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/galtay",
"html_url": "https://github.com/galtay",
"followers_url": "https://api.github.com/users/galtay/follow... | [] | open | false | null | [] | [
"Hi @galtay, thanks for raising this issue! \r\n\r\nIt looks related to #29050 \r\n\r\ncc @LysandreJik "
] | 1,708 | 1,708 | null | NONE | null | ### System Info
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utili... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29053/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29053/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29052 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29052/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29052/comments | https://api.github.com/repos/huggingface/transformers/issues/29052/events | https://github.com/huggingface/transformers/pull/29052 | 2,137,941,298 | PR_kwDOCUB6oc5nDTJf | 29,052 | Add Arabic translation for README | {
"login": "Strikoder",
"id": 71812454,
"node_id": "MDQ6VXNlcjcxODEyNDU0",
"avatar_url": "https://avatars.githubusercontent.com/u/71812454?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Strikoder",
"html_url": "https://github.com/Strikoder",
"followers_url": "https://api.github.com/users/... | [] | open | false | null | [] | [
"I've addressed the duplicates and verified the Markdown formatting; all links are functioning correctly. Screenshots documenting each issue you highlighted have been attached for reference. \r\n\r\nRegarding the Arabic langauge, I don't really know who speaks Arabic at Hugging Face, could you please help me with t... | 1,708 | 1,708 | null | NONE | null | # What does this PR do?
This PR introduces the Arabic translation of the README file.
## Before submitting
- [x] This PR improves the docs.
## Fixes #29045
@stevhliu
@MKhalusova | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29052/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29052/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29052",
"html_url": "https://github.com/huggingface/transformers/pull/29052",
"diff_url": "https://github.com/huggingface/transformers/pull/29052.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29052.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29051 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29051/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29051/comments | https://api.github.com/repos/huggingface/transformers/issues/29051/events | https://github.com/huggingface/transformers/pull/29051 | 2,137,680,528 | PR_kwDOCUB6oc5nCcDs | 29,051 | [`Do not Merge`] | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.c... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29051). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | COLLABORATOR | null | # What does this PR do?
UV | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29051/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29051/timeline | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29051",
"html_url": "https://github.com/huggingface/transformers/pull/29051",
"diff_url": "https://github.com/huggingface/transformers/pull/29051.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29051.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29050 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29050/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29050/comments | https://api.github.com/repos/huggingface/transformers/issues/29050/events | https://github.com/huggingface/transformers/issues/29050 | 2,137,665,880 | I_kwDOCUB6oc5_ajFY | 29,050 | Migrated pre-hub models' tokenizers don't configure the same as their pre-hub version | {
"login": "mlamera",
"id": 48600479,
"node_id": "MDQ6VXNlcjQ4NjAwNDc5",
"avatar_url": "https://avatars.githubusercontent.com/u/48600479?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mlamera",
"html_url": "https://github.com/mlamera",
"followers_url": "https://api.github.com/users/mlamer... | [] | open | false | null | [] | [
"Hey @mlamera! I believe this is due to the `transformers` library overriding some attributes of the config due to the explicit definition here:\r\n\r\nhttps://github.com/huggingface/transformers/blob/f497f564bb76697edab09184a252fc1b1a326d1e/src/transformers/models/gpt2/tokenization_gpt2.py#L53-L60\r\n\r\nThis shou... | 1,708 | 1,708 | null | NONE | null | ### System Info
transformers version: 4.38.0.dev0
python version: 3.10.12
### Who can help?
@younesbelkada
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or data... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29050/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29050/timeline | null | null | null |
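The override mechanism described in the comment on issue 29050 above (a class-level size table taking precedence during init) can be mimicked in a dependency-free sketch. The class and attribute names below only mirror the idea; they are not the actual `transformers` implementation.

```python
# Dependency-free sketch of a class-level default table shadowing a user
# kwarg during __init__ -- the failure mode described in the comment above.
# Names only mirror the idea, not the actual transformers code.
class SketchTokenizer:
    max_model_input_sizes = {"gpt2": 1024}  # consulted before the user kwarg

    def __init__(self, name, model_max_length=None):
        known = self.max_model_input_sizes.get(name)
        # Behavior being illustrated: the table wins even when the caller
        # passed an explicit model_max_length.
        self.model_max_length = known if known is not None else model_max_length

tok = SketchTokenizer("gpt2", model_max_length=512)
print(tok.model_max_length)  # -> 1024: the user's 512 was silently ignored
```

The fix direction implied by the comment would be to let an explicitly passed kwarg take precedence over the class-level table.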
https://api.github.com/repos/huggingface/transformers/issues/29049 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29049/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29049/comments | https://api.github.com/repos/huggingface/transformers/issues/29049/events | https://github.com/huggingface/transformers/issues/29049 | 2,137,354,132 | I_kwDOCUB6oc5_ZW-U | 29,049 | Getting Long text generation after fine tuning Mistral 7b Model | {
"login": "Rishita32",
"id": 56127736,
"node_id": "MDQ6VXNlcjU2MTI3NzM2",
"avatar_url": "https://avatars.githubusercontent.com/u/56127736?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rishita32",
"html_url": "https://github.com/Rishita32",
"followers_url": "https://api.github.com/users/... | [] | open | false | null | [] | [
"Hi, thanks for raising an issue! \r\n\r\nThis is a question best placed in our [forums](https://discuss.huggingface.co/). We try to reserve the github issues for feature requests and bug reports.\r\n\r\nGeneral comments: \r\n* Setting `add_eos_token` instructs the tokenizer to add an EOS token at the end of a sequ... | 1,708 | 1,708 | null | NONE | null | ### System Info
Hi,
I am fine-tuning the Mistral-7B model. I am getting long, runaway text generation from the fine-tuned model. I have kept eos_token=True. Can someone please tell me how to add a word limit to the responses?
This is the code for initializing the tokenizer:
base_model = "mistralai/Mistral-7B-v0.1... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29049/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29049/timeline | null | null | null |
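A length limit like the one requested in issue 29049 is typically enforced by capping the number of new tokens (analogous to a `max_new_tokens` setting) and stopping early at the end-of-sequence token. The sketch below is dependency-free; the EOS id of 2 is illustrative.

```python
# Hedged sketch: cap generated length with a hard token budget and an
# early stop at the EOS id. The id 2 is illustrative, not Mistral's.
EOS_ID = 2

def truncate_generation(token_ids, max_new_tokens):
    """Keep at most max_new_tokens tokens, cutting at the first EOS."""
    out = []
    for tok in token_ids[:max_new_tokens]:  # hard cap on new tokens
        if tok == EOS_ID:                   # model signalled end of sequence
            break
        out.append(tok)
    return out

print(truncate_generation([5, 9, 2, 9, 9], max_new_tokens=10))  # -> [5, 9]
print(truncate_generation([5, 9, 9, 9, 9], max_new_tokens=3))   # -> [5, 9, 9]
```

In practice the hard cap guards against the model never emitting EOS, while the EOS check keeps well-behaved generations short.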
https://api.github.com/repos/huggingface/transformers/issues/29048 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29048/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29048/comments | https://api.github.com/repos/huggingface/transformers/issues/29048/events | https://github.com/huggingface/transformers/pull/29048 | 2,137,352,371 | PR_kwDOCUB6oc5nBVAZ | 29,048 | Fix - don't return pixel mask for yolos | {
"login": "amyeroberts",
"id": 22614925,
"node_id": "MDQ6VXNlcjIyNjE0OTI1",
"avatar_url": "https://avatars.githubusercontent.com/u/22614925?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amyeroberts",
"html_url": "https://github.com/amyeroberts",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29048). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"Closing as change was added in #28312"
] | 1,708 | 1,708 | 1,708 | COLLABORATOR | null | # What does this PR do?
#28363 introduced a bug where the pixel mask was being returned for YOLOS. `pixel_mask` isn't a valid YOLOS input, and so this breaks it.
The PR fixes that.
Weirdly, this wasn't caught on the original PR, but was triggered in #28312
cc @ydshieh for reference - we can try and... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29048/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29048/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29048",
"html_url": "https://github.com/huggingface/transformers/pull/29048",
"diff_url": "https://github.com/huggingface/transformers/pull/29048.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29048.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29047 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29047/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29047/comments | https://api.github.com/repos/huggingface/transformers/issues/29047/events | https://github.com/huggingface/transformers/issues/29047 | 2,137,276,126 | I_kwDOCUB6oc5_ZD7e | 29,047 | [BUG] Unexpected GPU memory consumption when using transformers PEFT in DeepSpeed Zero3 | {
"login": "alekseymalakhov11",
"id": 131314005,
"node_id": "U_kgDOB9OxVQ",
"avatar_url": "https://avatars.githubusercontent.com/u/131314005?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alekseymalakhov11",
"html_url": "https://github.com/alekseymalakhov11",
"followers_url": "https://api... | [] | open | false | null | [] | [
"cc @younesbelkada too :) ",
"Hi @alekseymalakhov11 ! Thanks very much for the issue, just for us to understand better the issue, can you share the full command you are using for training? \r\nIt might be unrelated but just to be on the safe zone, could you try out on PEFT==0.8.2 & PEFT main to include some fixes... | 1,708 | 1,708 | null | NONE | null | ### System Info
transformers = "4.35.0"
peft = "0.7.1"
torch = ">=2.0.0"
accelerate = "^0.24.1"
deepspeed = "^0.9.5"
### Who can help?
@muellerzr @pacman100 @pacman100
### Information
- [ ] The official example scripts
- [X] My own modified scripts
### Tasks
- [ ] An officially supported task in the `example... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29047/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29047/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29046 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29046/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29046/comments | https://api.github.com/repos/huggingface/transformers/issues/29046/events | https://github.com/huggingface/transformers/pull/29046 | 2,137,186,717 | PR_kwDOCUB6oc5nAw77 | 29,046 | [CI] Quantization workflow | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMar... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29046). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"I see that `transformers-all-latest-gpu` docker image is not being updated for th... | 1,708 | 1,708 | null | MEMBER | null | # What does this PR do ?
This PR adds a workflow for quantization tests + the related Dockerfile. Since we merged the [HfQuantizer PR](https://github.com/huggingface/transformers/pull/26610), the community started integrating their own quantizers into transformers (e.g. [AQLM](https://github.com/huggingface/transformers/...
"url": "https://api.github.com/repos/huggingface/transformers/issues/29046/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29046/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29046",
"html_url": "https://github.com/huggingface/transformers/pull/29046",
"diff_url": "https://github.com/huggingface/transformers/pull/29046.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29046.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29045 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29045/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29045/comments | https://api.github.com/repos/huggingface/transformers/issues/29045/events | https://github.com/huggingface/transformers/issues/29045 | 2,137,164,681 | I_kwDOCUB6oc5_YouJ | 29,045 | [i18n-ar] Translating docs to Arabic | {
"login": "Strikoder",
"id": 71812454,
"node_id": "MDQ6VXNlcjcxODEyNDU0",
"avatar_url": "https://avatars.githubusercontent.com/u/71812454?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Strikoder",
"html_url": "https://github.com/Strikoder",
"followers_url": "https://api.github.com/users/... | [
{
"id": 2796628563,
"node_id": "MDU6TGFiZWwyNzk2NjI4NTYz",
"url": "https://api.github.com/repos/huggingface/transformers/labels/WIP",
"name": "WIP",
"color": "234C99",
"default": false,
"description": "Label your PR/Issue with WIP for some long outstanding Issues/PRs that are work in pro... | open | false | null | [] | [
"I will start translating the readme.md, then I will move to the tutorial section."
] | 1,708 | 1,708 | null | NONE | null | Hi!
Let's bring the documentation to all the Arabic-speaking community 🌐
Who would want to translate? Please follow the 🤗 [TRANSLATING guide](https://github.com/huggingface/transformers/blob/main/docs/TRANSLATING.md). Here is a list of the files ready for translation. Let us know in this issue if you'd like to... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29045/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29045/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29044 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29044/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29044/comments | https://api.github.com/repos/huggingface/transformers/issues/29044/events | https://github.com/huggingface/transformers/pull/29044 | 2,137,035,002 | PR_kwDOCUB6oc5nAQle | 29,044 | Fix a tiny typo in `generation/utils.py::GenerateEncoderDecoderOutput`'s docstring | {
"login": "sadra-barikbin",
"id": 22097587,
"node_id": "MDQ6VXNlcjIyMDk3NTg3",
"avatar_url": "https://avatars.githubusercontent.com/u/22097587?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sadra-barikbin",
"html_url": "https://github.com/sadra-barikbin",
"followers_url": "https://api.gi... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29044). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"@amyeroberts any idea why CI isn't running in this PR? 👀 \r\n\r\n(this PR fixes a... | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | Hi there!
To fix a tiny typo in `generation/utils.py::GenerateEncoderDecoderOutput`'s docstring
@gante
| {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29044/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29044/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29044",
"html_url": "https://github.com/huggingface/transformers/pull/29044",
"diff_url": "https://github.com/huggingface/transformers/pull/29044.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29044.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29043 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29043/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29043/comments | https://api.github.com/repos/huggingface/transformers/issues/29043/events | https://github.com/huggingface/transformers/pull/29043 | 2,137,011,581 | PR_kwDOCUB6oc5nALjk | 29,043 | Patch to skip failing `test_save_load_low_cpu_mem_usage` tests | {
"login": "amyeroberts",
"id": 22614925,
"node_id": "MDQ6VXNlcjIyNjE0OTI1",
"avatar_url": "https://avatars.githubusercontent.com/u/22614925?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amyeroberts",
"html_url": "https://github.com/amyeroberts",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29043). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"Thanks for catching these, and sorry I missed some of them (I didn't know how to r... | 1,708 | 1,708 | 1,708 | COLLABORATOR | null | # What does this PR do?
A handful of tests started failing after #28948 was merged in. The tests didn't fail on the PR or on the initial main commit, but are failing now. It looks like the relevant tests might not have been fetched for the runners.
This PR skips the tests for now.
cc @ylacombe As you might want to enable this feature for... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29043/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29043/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29043",
"html_url": "https://github.com/huggingface/transformers/pull/29043",
"diff_url": "https://github.com/huggingface/transformers/pull/29043.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29043.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29042 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29042/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29042/comments | https://api.github.com/repos/huggingface/transformers/issues/29042/events | https://github.com/huggingface/transformers/issues/29042 | 2,136,985,386 | I_kwDOCUB6oc5_X88q | 29,042 | Neuron Trainium --Gradient_Accumulation_Steps > 1 | {
"login": "mathephysicist",
"id": 25594384,
"node_id": "MDQ6VXNlcjI1NTk0Mzg0",
"avatar_url": "https://avatars.githubusercontent.com/u/25594384?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mathephysicist",
"html_url": "https://github.com/mathephysicist",
"followers_url": "https://api.gi... | [] | open | false | null | [] | [
"cc @muellerzr as it seems to cover trainer + TPU",
"Thanks for the flag @mathephysicist! Can you confirm this works (with no change in Trainer) if installing accelerate from main via `pip install git+https://github.com/huggingface/accelerate@grad-accum-tpu`?",
"Will try that out! ",
"That seems to uninstall/... | 1,708 | 1,708 | null | NONE | null | ### System Info
If I use Optimum Neuron on Trainium with --gradient_accumulation_steps > 1, training fails.
I then modified line https://github.com/huggingface/transformers/blob/6d1f545665ac66420af9f6702d891a30c5d070ea/src/transformers/trainer.py#L1966C21-L1966C23
to include
```
if is_torch_tpu_availab... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29042/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29042/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29041 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29041/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29041/comments | https://api.github.com/repos/huggingface/transformers/issues/29041/events | https://github.com/huggingface/transformers/pull/29041 | 2,136,759,631 | PR_kwDOCUB6oc5m_Ugm | 29,041 | Fix bug with passing capture_* args to neptune callback | {
"login": "AleksanderWWW",
"id": 58885668,
"node_id": "MDQ6VXNlcjU4ODg1NjY4",
"avatar_url": "https://avatars.githubusercontent.com/u/58885668?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/AleksanderWWW",
"html_url": "https://github.com/AleksanderWWW",
"followers_url": "https://api.githu... | [] | open | false | null | [] | [
"## What does this PR do\r\n\r\nThis PR aims to fix a bug that appears when using `NeptuneCallback` with `run=None` and at least one of the `capture_*` params set. \r\n\r\n### Cause of the problem\r\n\r\nThis is due to the fact, that in one of the methods those params have hardcoded values, but the kwargs passed to... | 1,708 | 1,708 | null | CONTRIBUTOR | null | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29041/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29041/timeline | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29041",
"html_url": "https://github.com/huggingface/transformers/pull/29041",
"diff_url": "https://github.com/huggingface/transformers/pull/29041.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29041.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29040 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29040/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29040/comments | https://api.github.com/repos/huggingface/transformers/issues/29040/events | https://github.com/huggingface/transformers/issues/29040 | 2,136,718,515 | I_kwDOCUB6oc5_W7yz | 29,040 | i am getting this error while trying to run the example script for finetuning t5 on squad for question answering | {
"login": "preethip02",
"id": 84133769,
"node_id": "MDQ6VXNlcjg0MTMzNzY5",
"avatar_url": "https://avatars.githubusercontent.com/u/84133769?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/preethip02",
"html_url": "https://github.com/preethip02",
"followers_url": "https://api.github.com/use... | [] | open | false | null | [] | [
"Hi @preethip02, thanks for raising this issue! \r\n\r\nCould you provide a minimal code example to reproduce this error? Specifically, how are you launching the script? ",
"I followed the steps as given in the README file at huggingface/examples\r\n\r\nThe code i executed was \r\n\r\ngit clone https://github.com... | 1,708 | 1,708 | null | NONE | null | ### System Info
System Information:
- transformers version: 4.38.0.dev0
- Platform: Linux-5.19.0-45-generic-x86_64-with-glibc2.31
- Python version: 3.9.16
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: 0.27.2
- Accelerate config: not found
- PyTorch version (GPU?)... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29040/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29040/timeline | reopened | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29039 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29039/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29039/comments | https://api.github.com/repos/huggingface/transformers/issues/29039/events | https://github.com/huggingface/transformers/pull/29039 | 2,136,595,226 | PR_kwDOCUB6oc5m-v26 | 29,039 | FIX: Fix error with `logger.warning` + inline with recent refactor | {
"login": "younesbelkada",
"id": 49240599,
"node_id": "MDQ6VXNlcjQ5MjQwNTk5",
"avatar_url": "https://avatars.githubusercontent.com/u/49240599?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/younesbelkada",
"html_url": "https://github.com/younesbelkada",
"followers_url": "https://api.githu... | [] | closed | false | null | [] | [
"Thanks ! That's on me as well :D ",
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29039). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
In fact, currently on transformers main, some legacy setups that call `model._is_quantized_training_enabled` throw an error:
```bash
Arguments: (<class 'FutureWarning'>,)
--- Logging error ---
Traceback (most recent call last):
File "/home/younes_huggingface_co/miniconda3/envs/fix-test/li... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29039/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29039/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29039",
"html_url": "https://github.com/huggingface/transformers/pull/29039",
"diff_url": "https://github.com/huggingface/transformers/pull/29039.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29039.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29038 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29038/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29038/comments | https://api.github.com/repos/huggingface/transformers/issues/29038/events | https://github.com/huggingface/transformers/pull/29038 | 2,136,581,159 | PR_kwDOCUB6oc5m-su2 | 29,038 | Remove timm in modeling files | {
"login": "amyeroberts",
"id": 22614925,
"node_id": "MDQ6VXNlcjIyNjE0OTI1",
"avatar_url": "https://avatars.githubusercontent.com/u/22614925?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amyeroberts",
"html_url": "https://github.com/amyeroberts",
"followers_url": "https://api.github.com/... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29038). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | null | COLLABORATOR | null | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29038/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29038/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29038",
"html_url": "https://github.com/huggingface/transformers/pull/29038",
"diff_url": "https://github.com/huggingface/transformers/pull/29038.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29038.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29037 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29037/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29037/comments | https://api.github.com/repos/huggingface/transformers/issues/29037/events | https://github.com/huggingface/transformers/pull/29037 | 2,136,545,870 | PR_kwDOCUB6oc5m-ksU | 29,037 | Fix copies between DETR and DETA | {
"login": "amyeroberts",
"id": 22614925,
"node_id": "MDQ6VXNlcjIyNjE0OTI1",
"avatar_url": "https://avatars.githubusercontent.com/u/22614925?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amyeroberts",
"html_url": "https://github.com/amyeroberts",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29037). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | COLLABORATOR | null | # What does this PR do?
Fixes failing quality checks on main:
https://app.circleci.com/pipelines/github/huggingface/transformers/84538/workflows/0d4691b0-4988-4040-a6bf-bd1ad90f523b/jobs/1093017?utm_campaign=vcs-integration-link&utm_medium=referral&utm_source=github-checks-link&utm_content=summary
| {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29037/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29037/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29037",
"html_url": "https://github.com/huggingface/transformers/pull/29037",
"diff_url": "https://github.com/huggingface/transformers/pull/29037.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29037.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29036 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29036/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29036/comments | https://api.github.com/repos/huggingface/transformers/issues/29036/events | https://github.com/huggingface/transformers/issues/29036 | 2,136,524,983 | I_kwDOCUB6oc5_WMi3 | 29,036 | `object of type 'NoneType' has no len()` when trying to use `WhisperNoSpeechDetection` | {
"login": "cifkao",
"id": 8046580,
"node_id": "MDQ6VXNlcjgwNDY1ODA=",
"avatar_url": "https://avatars.githubusercontent.com/u/8046580?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cifkao",
"html_url": "https://github.com/cifkao",
"followers_url": "https://api.github.com/users/cifkao/foll... | [] | open | false | null | [] | [
"cc @sanchit-gandhi @ylacombe ",
"Thanks for opening the issue! Opened a PR to fix this!"
] | 1,708 | 1,708 | null | NONE | null | ### System Info
- `transformers` version: 4.38.0.dev0 (5b6fa23 – after merging #28687)
- Platform: macOS-14.2.1-arm64-arm-64bit
- Python version: 3.10.13
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: not installed
- Accelerate config: not found
- PyTorch version (GPU?): ... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29036/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29036/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29035 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29035/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29035/comments | https://api.github.com/repos/huggingface/transformers/issues/29035/events | https://github.com/huggingface/transformers/pull/29035 | 2,136,293,976 | PR_kwDOCUB6oc5m9soV | 29,035 | Add a clone method for model configs | {
"login": "FremyCompany",
"id": 364405,
"node_id": "MDQ6VXNlcjM2NDQwNQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/364405?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/FremyCompany",
"html_url": "https://github.com/FremyCompany",
"followers_url": "https://api.github.com/u... | [] | open | false | null | [] | [
"Guidance on where to add the tests, and what changes to make to the docs would be welcome, too.",
"@FremyCompany so I can better understand the purpose behind this PR: `new_config = copy.deepcopy(config)` doesn't work in some settings, right? If so, in which situations?\r\n\r\nThe following works on my end:\r\n`... | 1,707 | 1,708 | null | CONTRIBUTOR | null | # What does this PR do?
Adding a convenience `clone()` method to `PretrainedConfig` that creates a deep copy of the current configuration. Useful to make changes to it without modifying the original.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that'... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29035/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29035/timeline | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29035",
"html_url": "https://github.com/huggingface/transformers/pull/29035",
"diff_url": "https://github.com/huggingface/transformers/pull/29035.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29035.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29034 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29034/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29034/comments | https://api.github.com/repos/huggingface/transformers/issues/29034/events | https://github.com/huggingface/transformers/pull/29034 | 2,136,181,353 | PR_kwDOCUB6oc5m9TdM | 29,034 | Removed obsolete attribute setting for AQLM quantization. | {
"login": "BlackSamorez",
"id": 16901341,
"node_id": "MDQ6VXNlcjE2OTAxMzQx",
"avatar_url": "https://avatars.githubusercontent.com/u/16901341?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BlackSamorez",
"html_url": "https://github.com/BlackSamorez",
"followers_url": "https://api.github.c... | [] | closed | false | null | [] | [
"In the meantime, I updated [the Colab notebook](https://colab.research.google.com/drive/1-xZmBRXT5Fm3Ghn4Mwa2KRypORXb855X?usp=sharing) to this branch.\r\nSeems to be working again.",
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29034). All of your documentation chang... | 1,707 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29034/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29034/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29034",
"html_url": "https://github.com/huggingface/transformers/pull/29034",
"diff_url": "https://github.com/huggingface/transformers/pull/29034.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29034.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29033 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29033/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29033/comments | https://api.github.com/repos/huggingface/transformers/issues/29033/events | https://github.com/huggingface/transformers/issues/29033 | 2,135,817,420 | I_kwDOCUB6oc5_TfzM | 29,033 | Trainer doesn't handle torch.compiled QLoRA models correctly | {
"login": "readwriteexec",
"id": 129907247,
"node_id": "U_kgDOB746Lw",
"avatar_url": "https://avatars.githubusercontent.com/u/129907247?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/readwriteexec",
"html_url": "https://github.com/readwriteexec",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | [
"Hi @readwriteexec !\r\nThanks for the issue ! \r\nCan you try out: https://github.com/huggingface/transformers/pull/29055 ? I will also try to run some trainign on my end usign QLoRA + compile but from what I have understood it is not really supported I think. But in any case we should not throw that error on the ... | 1,707 | 1,708 | 1,708 | NONE | null | ### System Info
- `transformers` version: 4.38.0.dev0
- Platform: Linux-6.1.58+-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: 0.28.0.dev0
- Accelerate config: not found
- PyTorch version (GPU?): 2.1.0+cu121 (True)
- Tensorf... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29033/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29033/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29032 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29032/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29032/comments | https://api.github.com/repos/huggingface/transformers/issues/29032/events | https://github.com/huggingface/transformers/pull/29032 | 2,135,816,940 | PR_kwDOCUB6oc5m8DQv | 29,032 | Feature: Option to set the tracking URI for MLflowCallback. | {
"login": "seanswyi",
"id": 20367759,
"node_id": "MDQ6VXNlcjIwMzY3NzU5",
"avatar_url": "https://avatars.githubusercontent.com/u/20367759?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/seanswyi",
"html_url": "https://github.com/seanswyi",
"followers_url": "https://api.github.com/users/sea... | [] | closed | false | null | [] | [
"Sounds good, thanks for the feedback @amyeroberts! I just made another minor change in the docstring and committed it: the default value of `MLFLOW_TRACKING_URI` should be an empty string rather than `None`.",
"Is there any way to rerun tests? The failed `tests_torch` seems to be a timeout-related issue and wasn... | 1,707 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
Previously, the MLflowCallback was only able to set MLflow experiments or runs. This PR adds the option to also set the tracking URI.
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the t... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29032/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29032/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29032",
"html_url": "https://github.com/huggingface/transformers/pull/29032",
"diff_url": "https://github.com/huggingface/transformers/pull/29032.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29032.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29031 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29031/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29031/comments | https://api.github.com/repos/huggingface/transformers/issues/29031/events | https://github.com/huggingface/transformers/issues/29031 | 2,135,744,898 | I_kwDOCUB6oc5_TOGC | 29,031 | [i18n-<languageCode>] Translating docs to <languageName> | {
"login": "goalend",
"id": 110501477,
"node_id": "U_kgDOBpYeZQ",
"avatar_url": "https://avatars.githubusercontent.com/u/110501477?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/goalend",
"html_url": "https://github.com/goalend",
"followers_url": "https://api.github.com/users/goalend/foll... | [
{
"id": 2796628563,
"node_id": "MDU6TGFiZWwyNzk2NjI4NTYz",
"url": "https://api.github.com/repos/huggingface/transformers/labels/WIP",
"name": "WIP",
"color": "234C99",
"default": false,
"description": "Label your PR/Issue with WIP for some long outstanding Issues/PRs that are work in pro... | closed | false | null | [] | [] | 1,707 | 1,707 | 1,707 | NONE | null | <!--
Note: Please search to see if an issue already exists for the language you are trying to translate.
-->
Hi!
Let's bring the documentation to all the <languageName>-speaking community 🌐 (currently 0 out of 267 complete)
Who would want to translate? Please follow the 🤗 [TRANSLATING guide](https://github.com/hug... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29031/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29031/timeline | not_planned | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29030 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29030/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29030/comments | https://api.github.com/repos/huggingface/transformers/issues/29030/events | https://github.com/huggingface/transformers/pull/29030 | 2,135,702,557 | PR_kwDOCUB6oc5m7qCo | 29,030 | FEAT [`Generation`]: Introduce a centralized API to switch between cache implementations | {
"login": "younesbelkada",
"id": 49240599,
"node_id": "MDQ6VXNlcjQ5MjQwNTk5",
"avatar_url": "https://avatars.githubusercontent.com/u/49240599?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/younesbelkada",
"html_url": "https://github.com/younesbelkada",
"followers_url": "https://api.githu... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29030). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,707 | 1,708 | null | CONTRIBUTOR | null | # What does this PR do?
I would like to introduce a new API before the release to centralize switching between cache implementations!
Right now, to load SinkCache one needs to do:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM, SinkCache
tokenizer = AutoTokenizer.from_pretrained("T... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29030/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29030/timeline | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29030",
"html_url": "https://github.com/huggingface/transformers/pull/29030",
"diff_url": "https://github.com/huggingface/transformers/pull/29030.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29030.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29029 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29029/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29029/comments | https://api.github.com/repos/huggingface/transformers/issues/29029/events | https://github.com/huggingface/transformers/issues/29029 | 2,135,678,463 | I_kwDOCUB6oc5_S93_ | 29,029 | Padding causes forward to produce different logits (Llama2-7b) | {
"login": "c3ianwu",
"id": 92783433,
"node_id": "U_kgDOBYfDSQ",
"avatar_url": "https://avatars.githubusercontent.com/u/92783433?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/c3ianwu",
"html_url": "https://github.com/c3ianwu",
"followers_url": "https://api.github.com/users/c3ianwu/follow... | [] | open | false | null | [] | [
"Cc @younesbelkada too :)",
"I believe this comment is relevant to this issue: https://github.com/huggingface/transformers/issues/25420#issuecomment-1775317535",
"On point @amyeroberts TLDR it's expected ",
"Thanks @amyeroberts @ArthurZucker \r\n\r\nI did a few more experiments based on the issue linked by @... | 1,707 | 1,708 | null | NONE | null | ### System Info
- `transformers` version: 4.36.2
- Platform: Linux-5.15.107+-x86_64-with-glibc2.31
- Python version: 3.10.13
- Huggingface_hub version: 0.20.1
- Safetensors version: 0.4.1
- Accelerate version: 0.22.0
- Accelerate config: not found
- PyTorch version (GPU?): 2.1.2+cu118 (True)
### Who can hel... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29029/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29029/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29028 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29028/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29028/comments | https://api.github.com/repos/huggingface/transformers/issues/29028/events | https://github.com/huggingface/transformers/issues/29028 | 2,135,441,284 | I_kwDOCUB6oc5_SD-E | 29,028 | Perplexity calculation in the official tutorial is not correct | {
"login": "balaabhijit",
"id": 132952260,
"node_id": "U_kgDOB-ywxA",
"avatar_url": "https://avatars.githubusercontent.com/u/132952260?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/balaabhijit",
"html_url": "https://github.com/balaabhijit",
"followers_url": "https://api.github.com/users/... | [] | open | false | null | [] | [] | 1,707 | 1,707 | null | NONE | null | ### System Info
```yaml
Pytorch: 2.1.0+cu121
datasets: 2.17.0
transformers: 4.35.2
```
### Who can help?
@ArthurZucker @stevhliu @younesbelkada
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29028/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29028/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29027 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29027/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29027/comments | https://api.github.com/repos/huggingface/transformers/issues/29027/events | https://github.com/huggingface/transformers/pull/29027 | 2,135,368,695 | PR_kwDOCUB6oc5m6iz2 | 29,027 | [`CLeanup`] Revert SDPA attention changes that got in the static kv cache PR | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.c... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29027). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,707 | 1,707 | 1,707 | COLLABORATOR | null | # What does this PR do?
cc @younesbelkada
#27931 removed `copied from` statements for Persimmon, Qwen2 and Mixtral / Mistral, which introduced unwanted changes for SDPA
Supersedes #29026
Closes: https://github.com/huggingface/transformers/pull/29026 | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29027/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29027/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29027",
"html_url": "https://github.com/huggingface/transformers/pull/29027",
"diff_url": "https://github.com/huggingface/transformers/pull/29027.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29027.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29026 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29026/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29026/comments | https://api.github.com/repos/huggingface/transformers/issues/29026/events | https://github.com/huggingface/transformers/pull/29026 | 2,135,353,774 | PR_kwDOCUB6oc5m6fl8 | 29,026 | [`CI` / `core`] Fix CI with GC + pytorch 2.2 | {
"login": "younesbelkada",
"id": 49240599,
"node_id": "MDQ6VXNlcjQ5MjQwNTk5",
"avatar_url": "https://avatars.githubusercontent.com/u/49240599?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/younesbelkada",
"html_url": "https://github.com/younesbelkada",
"followers_url": "https://api.githu... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29026). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"It appears the rootcause was slightlly different, see: https://github.com/huggingf... | 1,707 | 1,707 | 1,707 | CONTRIBUTOR | null | # What does this PR do?
Fixes the current failing CI for Mistral, Mixtral and Qwen2 for gradient checkpointing. For some reason, since pytorch 2.2, gradient checkpointing raises an error when going through in-place operations such as `tensor.mul_(xxx)`, which was not the case in earlier versions.
Simply replacing... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29026/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29026/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29026",
"html_url": "https://github.com/huggingface/transformers/pull/29026",
"diff_url": "https://github.com/huggingface/transformers/pull/29026.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29026.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29025 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29025/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29025/comments | https://api.github.com/repos/huggingface/transformers/issues/29025/events | https://github.com/huggingface/transformers/issues/29025 | 2,135,280,570 | I_kwDOCUB6oc5_Rcu6 | 29,025 | What optimizations are available for AutoModelForVision2Seq | {
"login": "FurkanGozukara",
"id": 19240467,
"node_id": "MDQ6VXNlcjE5MjQwNDY3",
"avatar_url": "https://avatars.githubusercontent.com/u/19240467?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/FurkanGozukara",
"html_url": "https://github.com/FurkanGozukara",
"followers_url": "https://api.gi... | [] | open | false | null | [] | [
"Hi @FurkanGozukara !\r\nThanks for the issue !\r\n\r\n`AutoModelForVision2Seq` inherits from HF transformers' `PreTrainedModel`, therefore it benefits from most of the optimizations that are available, for instance you can load it in fp16, bf16 as usual through the `torch_dtype` argument, CPU offloading through de... | 1,707 | 1,708 | null | NONE | null | Hello. I am using AutoModelForVision2Seq for Kosmos 2 model like below
```
model_source = "microsoft/kosmos-2-patch14-224"
model = AutoModelForVision2Seq.from_pretrained(model_source).to("cuda")
processor = AutoProcessor.from_pretrained(model_source)
```
I checked this link but couldn't find any... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29025/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29025/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29024 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29024/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29024/comments | https://api.github.com/repos/huggingface/transformers/issues/29024/events | https://github.com/huggingface/transformers/pull/29024 | 2,135,278,649 | PR_kwDOCUB6oc5m6PHt | 29,024 | Adding _tie_weights() to more models | {
"login": "hackyon",
"id": 1557853,
"node_id": "MDQ6VXNlcjE1NTc4NTM=",
"avatar_url": "https://avatars.githubusercontent.com/u/1557853?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hackyon",
"html_url": "https://github.com/hackyon",
"followers_url": "https://api.github.com/users/hackyon/... | [] | open | false | null | [] | [
"Still working through another 10 failing cases (where it's not so obvious), so only marking as draft for now.\r\n\r\ntests/models/flava/test_modeling_flava.py .....F \r\ntests/models/encodec/test_modeling_encodec.py F\r\ntests/models/fsmt/test_modeling_fsmt.py F \r\ntests/models/lxmert/test_modeling_lxmert.py F \... | 1,707 | 1,708 | null | CONTRIBUTOR | null | # What does this PR do?
This is a follow-up to ~#28947~ #28948. It turns out that the CI only runs a small subset of tests, so there are quite a few sneaky failures throughout the tests.
I had to explicitly run the following command (it will take quite a long time):
pytest -k "test_save_load_low_cpu_mem_usage... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29024/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29024/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29024",
"html_url": "https://github.com/huggingface/transformers/pull/29024",
"diff_url": "https://github.com/huggingface/transformers/pull/29024.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29024.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29023 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29023/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29023/comments | https://api.github.com/repos/huggingface/transformers/issues/29023/events | https://github.com/huggingface/transformers/pull/29023 | 2,135,246,093 | PR_kwDOCUB6oc5m6H-h | 29,023 | [Quantization] Quanto quantizer | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMar... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29023). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"cc @dacorvo "
] | 1,707 | 1,708 | null | MEMBER | null | # What does this PR do ?
This PR adds the quantization methods from the quanto library. We will support inference + model quantization if the user performs weights-only quantization, since we don't require a calibration dataset.
TODO:
- [ ] Couple of fix to do on quanto side (e.g safetensors saving)
- [ ] docs
- [ ... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29023/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29023/timeline | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29023",
"html_url": "https://github.com/huggingface/transformers/pull/29023",
"diff_url": "https://github.com/huggingface/transformers/pull/29023.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29023.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29022 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29022/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29022/comments | https://api.github.com/repos/huggingface/transformers/issues/29022/events | https://github.com/huggingface/transformers/issues/29022 | 2,134,772,595 | I_kwDOCUB6oc5_Pgtz | 29,022 | `get_torch_version()` doesn't return same result as `import torch; torch.__version__` | {
"login": "relh",
"id": 3629411,
"node_id": "MDQ6VXNlcjM2Mjk0MTE=",
"avatar_url": "https://avatars.githubusercontent.com/u/3629411?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/relh",
"html_url": "https://github.com/relh",
"followers_url": "https://api.github.com/users/relh/followers",
... | [] | open | false | null | [] | [
"Hi @relh, thanks for raising this issue! \r\n\r\nIndeed, despite the promised profit, I don't think this is something we should be handling within `generic.py`. Assuming there's one version installed is a fair one, and being able to parse the torch version is something we do throughout the library e.g. [here](http... | 1,707 | 1,707 | null | NONE | null | ### System Info
ubuntu linux with conda and pip
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [X] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give detai... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29022/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29022/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29021 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29021/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29021/comments | https://api.github.com/repos/huggingface/transformers/issues/29021/events | https://github.com/huggingface/transformers/pull/29021 | 2,134,769,927 | PR_kwDOCUB6oc5m4e5s | 29,021 | Flax: Flax examples without pytorch dependencies | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | closed | false | null | [] | [
"Nope, the speech recognition example relies on torch's Dataloader :(",
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29021). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,707 | 1,707 | 1,707 | MEMBER | null | # What does this PR do?
WIP | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29021/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29021/timeline | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29021",
"html_url": "https://github.com/huggingface/transformers/pull/29021",
"diff_url": "https://github.com/huggingface/transformers/pull/29021.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29021.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29020 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29020/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29020/comments | https://api.github.com/repos/huggingface/transformers/issues/29020/events | https://github.com/huggingface/transformers/issues/29020 | 2,134,475,811 | I_kwDOCUB6oc5_OYQj | 29,020 | NotImplementedError: A model class needs to define a `prepare_inputs_for_generation` method in order to use `.generate()`. | {
"login": "nikhilajoshy",
"id": 37141775,
"node_id": "MDQ6VXNlcjM3MTQxNzc1",
"avatar_url": "https://avatars.githubusercontent.com/u/37141775?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nikhilajoshy",
"html_url": "https://github.com/nikhilajoshy",
"followers_url": "https://api.github.c... | [] | closed | false | null | [] | [
"Hi @nikhilajoshy, thanks for raising an issue! \r\n\r\nPlease make sure to share the full error traceback in the issue information. \r\n\r\nCould you also make sure that the code is properly formatted in the example and that it can be run to fully reproduce the error? "
] | 1,707 | 1,708 | 1,708 | NONE | null | ### System Info
- `transformers` version: 4.37.2
- Platform: Windows-10-10.0.19045-SP0
- Python version: 3.11.0
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: not installed
- Accelerate config: not found
- PyTorch version (GPU?): not installed (NA)
- Tensorflow version (GP... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29020/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29020/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29019 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29019/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29019/comments | https://api.github.com/repos/huggingface/transformers/issues/29019/events | https://github.com/huggingface/transformers/pull/29019 | 2,134,369,847 | PR_kwDOCUB6oc5m3HFW | 29,019 | Update important model list | {
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29019). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"I hesitated but Mistral and Llama are very close in terms of architecture and supp... | 1,707 | 1,708 | 1,708 | MEMBER | null | Adds LLaMa to the important models to test on each commit | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29019/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29019/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29019",
"html_url": "https://github.com/huggingface/transformers/pull/29019",
"diff_url": "https://github.com/huggingface/transformers/pull/29019.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29019.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29018 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29018/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29018/comments | https://api.github.com/repos/huggingface/transformers/issues/29018/events | https://github.com/huggingface/transformers/pull/29018 | 2,134,309,104 | PR_kwDOCUB6oc5m25gs | 29,018 | Make LogitsProcessor compatible with torch.compile | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/use... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29018). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"@gante I went through the comments and fixed where possible. I am wondering if it ... | 1,707 | 1,708 | null | MEMBER | null | # What does this PR do?
Small part of the issue #28981 . This PR makes sure that Logits Processor and Stopping Criteria are compatible with `torch.compile` when `fullgraph=True`. The changes were tested with dummy inputs and logits and also with Llama. For now only the Processors used in `generate` were checked, tho... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29018/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29018/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29018",
"html_url": "https://github.com/huggingface/transformers/pull/29018",
"diff_url": "https://github.com/huggingface/transformers/pull/29018.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29018.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29017 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29017/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29017/comments | https://api.github.com/repos/huggingface/transformers/issues/29017/events | https://github.com/huggingface/transformers/issues/29017 | 2,134,307,588 | I_kwDOCUB6oc5_NvME | 29,017 | Error: The selected decoder is not prepared for the encoder hidden states to be passed. | {
"login": "nikhilajoshy",
"id": 37141775,
"node_id": "MDQ6VXNlcjM3MTQxNzc1",
"avatar_url": "https://avatars.githubusercontent.com/u/37141775?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nikhilajoshy",
"html_url": "https://github.com/nikhilajoshy",
"followers_url": "https://api.github.c... | [] | closed | false | null | [] | [
"Hi @nikhilajoshy, thanks for raising an issue! \r\n\r\nCould you please provide a full traceback of the error encountered? \r\n\r\nPlease note that not all encoder-decoder pairs are compatible - we don't guarantee all possible pairings can be loaded and run. \r\n\r\n> Not able to load any EncoderDecoderModel u... | 1,707 | 1,707 | 1,707 | NONE | null | ### System Info
- `transformers` version: 4.37.2
- Platform: Windows-10-10.0.19045-SP0
- Python version: 3.11.0
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: 0.26.1
- Accelerate config: not found
- PyTorch version (GPU?): 2.2.0+cpu (False)
### Who can help?
_No respon... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29017/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29017/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29016 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29016/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29016/comments | https://api.github.com/repos/huggingface/transformers/issues/29016/events | https://github.com/huggingface/transformers/issues/29016 | 2,133,884,761 | I_kwDOCUB6oc5_MH9Z | 29,016 | Trainer: Functions to inspect model and optimizer status | {
"login": "yqy2001",
"id": 55196500,
"node_id": "MDQ6VXNlcjU1MTk2NTAw",
"avatar_url": "https://avatars.githubusercontent.com/u/55196500?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yqy2001",
"html_url": "https://github.com/yqy2001",
"followers_url": "https://api.github.com/users/yqy200... | [] | open | false | null | [] | [
"cc @muellerzr @pacman100 "
] | 1,707 | 1,707 | null | CONTRIBUTOR | null | ### Feature request
In the Hugging Face Trainer, are there any functions to inspect model and optimizer status? Such as: how many parameters require grad, the learning rate of each parameter, which optimizer group each parameter belongs to...
I didn't find any related function in Trainer, and I know implementing it by myself ... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29016/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29016/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29015 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29015/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29015/comments | https://api.github.com/repos/huggingface/transformers/issues/29015/events | https://github.com/huggingface/transformers/pull/29015 | 2,133,811,967 | PR_kwDOCUB6oc5m1LxC | 29,015 | Support resuming of deepspeed + Lora + offloading | {
"login": "thepowerfuldeez",
"id": 11796343,
"node_id": "MDQ6VXNlcjExNzk2MzQz",
"avatar_url": "https://avatars.githubusercontent.com/u/11796343?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/thepowerfuldeez",
"html_url": "https://github.com/thepowerfuldeez",
"followers_url": "https://api... | [] | open | false | null | [] | [
"cc @pacman100 @younesbelkada ",
"Could you please provide any updates on this PR?",
"Sure @thepowerfuldeez ! \r\n@pacman100 is currently working on fixing issues with respect to deepspeed and providing working scripts that you can run out of the box: https://github.com/huggingface/peft/pull/1489 we'll review t... | 1,707 | 1,708 | null | NONE | null | This PR is an upstream version of @kazemf78's PR to support resuming of Lora training when using deepspeed.
Without setting `load_module_strict=False` as a default, checkpoint is not loaded due to Lora not containing all weights, throwing an error `deepspeed resume Error(s) in loading state_dict for PeftModelForCausalLM`... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29015/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29015/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29015",
"html_url": "https://github.com/huggingface/transformers/pull/29015",
"diff_url": "https://github.com/huggingface/transformers/pull/29015.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29015.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29013 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29013/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29013/comments | https://api.github.com/repos/huggingface/transformers/issues/29013/events | https://github.com/huggingface/transformers/pull/29013 | 2,133,666,705 | PR_kwDOCUB6oc5m0rzi | 29,013 | DeformableDetrModel support fp16 | {
"login": "DonggeunYu",
"id": 17740653,
"node_id": "MDQ6VXNlcjE3NzQwNjUz",
"avatar_url": "https://avatars.githubusercontent.com/u/17740653?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DonggeunYu",
"html_url": "https://github.com/DonggeunYu",
"followers_url": "https://api.github.com/use... | [] | closed | false | null | [] | [
"Actually, one thing we'll need to add is a test e.g. [like here for MT5](https://github.com/huggingface/transformers/blob/1ecf5f7c982d761b4daaa96719d162c324187c64/tests/models/mt5/test_modeling_mt5.py#L424). \r\n\r\nFor the quality checks, running `make fix-copies` and pushing the changes should resolve the issues... | 1,707 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
This PR makes DeformableDetrModel support fp16.
https://github.com/huggingface/transformers/issues/29011
## Who can review?
@amyeroberts
| {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29013/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29013/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29013",
"html_url": "https://github.com/huggingface/transformers/pull/29013",
"diff_url": "https://github.com/huggingface/transformers/pull/29013.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29013.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29012 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29012/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29012/comments | https://api.github.com/repos/huggingface/transformers/issues/29012/events | https://github.com/huggingface/transformers/pull/29012 | 2,133,655,466 | PR_kwDOCUB6oc5m0pT7 | 29,012 | Add LLaVa 1.6 | {
"login": "NielsRogge",
"id": 48327001,
"node_id": "MDQ6VXNlcjQ4MzI3MDAx",
"avatar_url": "https://avatars.githubusercontent.com/u/48327001?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NielsRogge",
"html_url": "https://github.com/NielsRogge",
"followers_url": "https://api.github.com/use... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29012). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,707 | 1,708 | null | CONTRIBUTOR | null | # What does this PR do?
This PR adds the new LLaVa 1.6 model.
To do:
- [x] not sure how batched generation works
- [x] make `image_sizes` a tensor instead of a list
- [x] make sure llava 1.5 still works | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29012/reactions",
"total_count": 4,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 4,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29012/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29012",
"html_url": "https://github.com/huggingface/transformers/pull/29012",
"diff_url": "https://github.com/huggingface/transformers/pull/29012.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29012.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29011 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29011/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29011/comments | https://api.github.com/repos/huggingface/transformers/issues/29011/events | https://github.com/huggingface/transformers/issues/29011 | 2,133,651,104 | I_kwDOCUB6oc5_LO6g | 29,011 | Need to DeformableDetrModel support fp16 | {
"login": "DonggeunYu",
"id": 17740653,
"node_id": "MDQ6VXNlcjE3NzQwNjUz",
"avatar_url": "https://avatars.githubusercontent.com/u/17740653?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DonggeunYu",
"html_url": "https://github.com/DonggeunYu",
"followers_url": "https://api.github.com/use... | [] | open | false | null | [] | [] | 1,707 | 1,707 | null | CONTRIBUTOR | null | ### Feature request
Need DeformableDetrModel to support fp16.
~~~
from transformers import AutoImageProcessor
from trf.models import DeformableDetrModel
from PIL import Image
import torch
import requests
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, strea... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29011/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29011/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29010 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29010/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29010/comments | https://api.github.com/repos/huggingface/transformers/issues/29010/events | https://github.com/huggingface/transformers/issues/29010 | 2,133,599,341 | I_kwDOCUB6oc5_LCRt | 29,010 | KV Cache Size Issue during Inference | {
"login": "gopikrishnajha",
"id": 96072995,
"node_id": "U_kgDOBbn1Iw",
"avatar_url": "https://avatars.githubusercontent.com/u/96072995?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gopikrishnajha",
"html_url": "https://github.com/gopikrishnajha",
"followers_url": "https://api.github.com... | [] | open | false | null | [] | [
"cc @ArthurZucker ",
"Hey! That's the normal behavior for auto regressive transformers. The key value states's shape decreases, while the actual cache increases in size. \nI don't know how you debugged, but the prefill (first forward) will be big, then each new forward will only add 1 to the sequence length dimen... | 1,707 | 1,708 | null | NONE | null | This is w.r.t inference using models like mistral7b or llama.
In my understanding, KV cache size should grow as we process more tokens, however I see in the code that it shrinks as more tokens are processed. For example, in transformers/src/transformers/models/mistral/modeling_mistral.py, see the following code.
... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29010/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29010/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29009 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29009/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29009/comments | https://api.github.com/repos/huggingface/transformers/issues/29009/events | https://github.com/huggingface/transformers/pull/29009 | 2,133,392,507 | PR_kwDOCUB6oc5mzw8O | 29,009 | FIX [`Trainer` / tags]: Fix trainer + tags when users do not pass `"tags"` to `trainer.push_to_hub()` | {
"login": "younesbelkada",
"id": 49240599,
"node_id": "MDQ6VXNlcjQ5MjQwNTk5",
"avatar_url": "https://avatars.githubusercontent.com/u/49240599?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/younesbelkada",
"html_url": "https://github.com/younesbelkada",
"followers_url": "https://api.githu... | [] | closed | false | null | [] | [] | 1,707 | 1,707 | 1,707 | CONTRIBUTOR | null | # What does this PR do?
As per title - and fixes: https://github.com/hiyouga/LLaMA-Factory/pull/2474#issuecomment-1941603142 raised by @hiyouga
Indeed, we should always push tags if there are any that are saved on the model. Currently the logic is wrong, as it pushes the tags only if `"tags"` is passed to `train... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29009/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29009/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29009",
"html_url": "https://github.com/huggingface/transformers/pull/29009",
"diff_url": "https://github.com/huggingface/transformers/pull/29009.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29009.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29008 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29008/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29008/comments | https://api.github.com/repos/huggingface/transformers/issues/29008/events | https://github.com/huggingface/transformers/pull/29008 | 2,133,272,790 | PR_kwDOCUB6oc5mzXfk | 29,008 | Add CrystalCoder Model | {
"login": "TianhuaTao",
"id": 9389466,
"node_id": "MDQ6VXNlcjkzODk0NjY=",
"avatar_url": "https://avatars.githubusercontent.com/u/9389466?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TianhuaTao",
"html_url": "https://github.com/TianhuaTao",
"followers_url": "https://api.github.com/users... | [] | open | false | null | [] | [
"Hi @TianhuaTao, thanks for opening this PR! \r\n\r\nThe easiest and recommended way to make a model available in `transformers` is to add the modeling code directly on the hub: https://huggingface.co/docs/transformers/custom_models. We have as much support there as we can (let us know if anything isn't working 🤗 ... | 1,707 | 1,708 | null | NONE | null | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29008/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29008/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29008",
"html_url": "https://github.com/huggingface/transformers/pull/29008",
"diff_url": "https://github.com/huggingface/transformers/pull/29008.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29008.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29007 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29007/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29007/comments | https://api.github.com/repos/huggingface/transformers/issues/29007/events | https://github.com/huggingface/transformers/issues/29007 | 2,133,162,466 | I_kwDOCUB6oc5_JXni | 29,007 | Many checkpoints are outdated (torch.save'd with torch < 1.6) and don't support mmap | {
"login": "thiagocrepaldi",
"id": 5469809,
"node_id": "MDQ6VXNlcjU0Njk4MDk=",
"avatar_url": "https://avatars.githubusercontent.com/u/5469809?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/thiagocrepaldi",
"html_url": "https://github.com/thiagocrepaldi",
"followers_url": "https://api.gith... | [] | open | false | null | [] | [
"Hi @thiagocrepaldi, thanks for raising this issue! \r\n\r\nUnfortunately, it's simply not possible for us to convert all checkpoints to be compatible. There are currently more than 800k models listed on the hub, as well as many private models and models which haven't been uploaded. Backwards compatibility is impor... | 1,707 | 1,708 | null | NONE | null | ### System Info
- `transformers` version: 4.36.0
- Platform: Linux-6.5.0-15-generic-x86_64-with-glibc2.31
- Python version: 3.11.5
- Huggingface_hub version: 0.19.4
- Safetensors version: 0.4.0
- Accelerate version: 0.24.1
- Accelerate config: not found
- PyTorch version (GPU?): 2.3.0a0+git78a84f1 (True)
- ... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29007/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29007/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29006 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29006/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29006/comments | https://api.github.com/repos/huggingface/transformers/issues/29006/events | https://github.com/huggingface/transformers/issues/29006 | 2,133,043,866 | I_kwDOCUB6oc5_I6qa | 29,006 | load_state_dict doesnt support torch._subclasses.fake_tensor.FakeTensorMode | {
"login": "thiagocrepaldi",
"id": 5469809,
"node_id": "MDQ6VXNlcjU0Njk4MDk=",
"avatar_url": "https://avatars.githubusercontent.com/u/5469809?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/thiagocrepaldi",
"html_url": "https://github.com/thiagocrepaldi",
"followers_url": "https://api.gith... | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | [
"Hi @thiagocrepaldi, thanks for raising this issue! \r\n\r\nI'm going to cc in @Narsil, the king of safetensors here. \r\n\r\nIf you want to be able to create an empty model, you can use [accelerate's `init_empty_weights` utility](https://huggingface.co/docs/accelerate/v0.11.0/en/big_modeling): \r\n\r\n```py\r\nfro... | 1,707 | 1,708 | null | NONE | null | ### System Info
- `transformers` version: 4.36.0
- Platform: Linux-6.5.0-15-generic-x86_64-with-glibc2.31
- Python version: 3.11.5
- Huggingface_hub version: 0.19.4
- Safetensors version: 0.4.0
- Accelerate version: 0.24.1
- Accelerate config: not found
- PyTorch version (GPU?): 2.3.0a0+git78a84f1 (True)
- ... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29006/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29006/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29005 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29005/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29005/comments | https://api.github.com/repos/huggingface/transformers/issues/29005/events | https://github.com/huggingface/transformers/pull/29005 | 2,132,933,259 | PR_kwDOCUB6oc5myOmB | 29,005 | Cache: standardize cache interface | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | open | false | null | [] | [
"Model correctness depends on #28937, rebasing with its contents",
"Closed in place of #29180 (merge conflicts 🤷 )"
] | 1,707 | 1,708 | null | MEMBER | null | # What does this PR do?
In #27931, where the static cache was introduced, we noticed it had the following hard requirements:
1. The model instance holds the cache, as opposed to being a tensor passed around;
2. Each layer has its own cache, as opposed to a single cache for all layers.
This contrasts with previo... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29005/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29005/timeline | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29005",
"html_url": "https://github.com/huggingface/transformers/pull/29005",
"diff_url": "https://github.com/huggingface/transformers/pull/29005.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29005.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29004 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29004/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29004/comments | https://api.github.com/repos/huggingface/transformers/issues/29004/events | https://github.com/huggingface/transformers/pull/29004 | 2,132,847,728 | PR_kwDOCUB6oc5mx8NS | 29,004 | fix for custom pipeline configuration | {
"login": "not-lain",
"id": 70411813,
"node_id": "MDQ6VXNlcjcwNDExODEz",
"avatar_url": "https://avatars.githubusercontent.com/u/70411813?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/not-lain",
"html_url": "https://github.com/not-lain",
"followers_url": "https://api.github.com/users/not... | [] | open | false | null | [] | [
"@Rocketknight1 could you review this one too ?\r\nfixed the tests (i forgor to update my branch :D ) ",
"Sure, I'll try to take a look at this one and the pipeline upload one!"
] | 1,707 | 1,708 | null | CONTRIBUTOR | null | # What does this PR do?
fixes configuration file not pointing at a remote repo with custom pipeline architecture
Fixes #28907
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://gi... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29004/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29004/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29004",
"html_url": "https://github.com/huggingface/transformers/pull/29004",
"diff_url": "https://github.com/huggingface/transformers/pull/29004.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29004.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29003 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29003/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29003/comments | https://api.github.com/repos/huggingface/transformers/issues/29003/events | https://github.com/huggingface/transformers/pull/29003 | 2,132,689,223 | PR_kwDOCUB6oc5mxZzC | 29,003 | [DO NOT MERGE] Remove big block of code in _from_pretrained() | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.githu... | [] | closed | false | null | [] | [
"> I'm going to see what the CI says.\r\n\r\n🙃 ",
"The whole block seems to be there just to throw a warning when you load a tokenizer with the wrong class! "
] | 1,707 | 1,707 | 1,707 | MEMBER | null | I can't figure out what this code is doing, and I suspect we don't need to run it all. I'm going to see what the CI says. | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29003/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29003/timeline | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29003",
"html_url": "https://github.com/huggingface/transformers/pull/29003",
"diff_url": "https://github.com/huggingface/transformers/pull/29003.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29003.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29002 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29002/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29002/comments | https://api.github.com/repos/huggingface/transformers/issues/29002/events | https://github.com/huggingface/transformers/pull/29002 | 2,132,623,369 | PR_kwDOCUB6oc5mxLWu | 29,002 | [`Doc`] Fix docbuilder - make `BackboneMixin` and `BackboneConfigMixin` importable from `utils`. | {
"login": "amyeroberts",
"id": 22614925,
"node_id": "MDQ6VXNlcjIyNjE0OTI1",
"avatar_url": "https://avatars.githubusercontent.com/u/22614925?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amyeroberts",
"html_url": "https://github.com/amyeroberts",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29002). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"cc @ydshieh ",
"I'm merging as this is quite urgent for resolving failing tests ... | 1,707 | 1,707 | 1,707 | COLLABORATOR | null | # What does this PR do?
Several CIs have started having the doc builds failing e.g.:
* https://github.com/huggingface/transformers/actions/runs/7881443714/job/21529678872
* https://github.com/huggingface/transformers/actions/runs/7884708184/job/21530274589
On one case rerunning lead to a successful build:
* F... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29002/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 2,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29002/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29002",
"html_url": "https://github.com/huggingface/transformers/pull/29002",
"diff_url": "https://github.com/huggingface/transformers/pull/29002.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29002.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29001 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29001/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29001/comments | https://api.github.com/repos/huggingface/transformers/issues/29001/events | https://github.com/huggingface/transformers/pull/29001 | 2,132,617,158 | PR_kwDOCUB6oc5mxJ97 | 29,001 | Update all references to canonical models | {
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29001). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"The failing tests are also failing on `main` and due to the static cache PR. The `... | 1,707 | 1,708 | 1,708 | MEMBER | null | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29001/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29001/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29001",
"html_url": "https://github.com/huggingface/transformers/pull/29001",
"diff_url": "https://github.com/huggingface/transformers/pull/29001.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29001.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29000 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29000/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29000/comments | https://api.github.com/repos/huggingface/transformers/issues/29000/events | https://github.com/huggingface/transformers/pull/29000 | 2,132,557,752 | PR_kwDOCUB6oc5mw9CS | 29,000 | Extend import utils to cover "editable" torch versions | {
"login": "bhack",
"id": 1710528,
"node_id": "MDQ6VXNlcjE3MTA1Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/1710528?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bhack",
"html_url": "https://github.com/bhack",
"followers_url": "https://api.github.com/users/bhack/follower... | [] | open | false | null | [] | [
"Why formatting error we have? it isn't clear from the CI log",
"The repository uses double quotes for string literals. You can format your code by running 'make style' (see [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request)).",
"Done, who can re... | 1,707 | 1,707 | null | NONE | null | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29000/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29000/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29000",
"html_url": "https://github.com/huggingface/transformers/pull/29000",
"diff_url": "https://github.com/huggingface/transformers/pull/29000.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29000.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/28999 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28999/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28999/comments | https://api.github.com/repos/huggingface/transformers/issues/28999/events | https://github.com/huggingface/transformers/issues/28999 | 2,132,306,791 | I_kwDOCUB6oc5_GGtn | 28,999 | Pytorch not detected in official "editable" nightly | {
"login": "bhack",
"id": 1710528,
"node_id": "MDQ6VXNlcjE3MTA1Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/1710528?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bhack",
"html_url": "https://github.com/bhack",
"followers_url": "https://api.github.com/users/bhack/follower... | [] | open | false | null | [] | [] | 1,707 | 1,707 | null | NONE | null | ### System Info
PyTorch is not detected in the official editable nightly conda env:
https://github.com/pytorch/pytorch/blob/main/CONTRIBUTING.md#nightly-checkout--pull
```python
File "/opt/conda/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1325, in __getattribute__
requires_backends... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28999/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28999/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28998 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28998/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28998/comments | https://api.github.com/repos/huggingface/transformers/issues/28998/events | https://github.com/huggingface/transformers/issues/28998 | 2,132,280,613 | I_kwDOCUB6oc5_GAUl | 28,998 | `dataset.map` for tokenization hangs at 60% | {
"login": "sarahpannn",
"id": 62582677,
"node_id": "MDQ6VXNlcjYyNTgyNjc3",
"avatar_url": "https://avatars.githubusercontent.com/u/62582677?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sarahpannn",
"html_url": "https://github.com/sarahpannn",
"followers_url": "https://api.github.com/use... | [] | open | false | null | [] | [
"Hey, I ran this without issues, not stuck but just super slow. Might be a lot of things from the tokenizer to the dataset to your CPUs"
] | 1,707 | 1,708 | null | NONE | null | ### System Info
- `transformers` version: 4.36.2
- Platform: Linux-5.15.0-92-generic-x86_64-with-glibc2.35
- Python version: 3.11.5
- Huggingface_hub version: 0.20.2
- Safetensors version: 0.4.1
- Accelerate version: 0.25.0
- Accelerate config: not found
- PyTorch version (GPU?): 2.1.2+cu121 (True)
- Tensor... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28998/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28998/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28997 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28997/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28997/comments | https://api.github.com/repos/huggingface/transformers/issues/28997/events | https://github.com/huggingface/transformers/issues/28997 | 2,132,254,332 | I_kwDOCUB6oc5_F558 | 28,997 | Automatically add tokens when using model.generate() | {
"login": "MikeDean2367",
"id": 65744560,
"node_id": "MDQ6VXNlcjY1NzQ0NTYw",
"avatar_url": "https://avatars.githubusercontent.com/u/65744560?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MikeDean2367",
"html_url": "https://github.com/MikeDean2367",
"followers_url": "https://api.github.c... | [] | closed | false | null | [] | [
"Hi @MikeDean2367, thanks for raising an issue! \r\n\r\nThis is a question best placed in our [forums](https://discuss.huggingface.co/). We try to reserve the github issues for feature requests and bug reports.",
"ok, i'll go to the forums. Thanks!"
] | 1,707 | 1,707 | 1,707 | NONE | null | ### Feature request
When calling `model.generate()`, I would like a new token to be automatically added after a certain token is generated.
For example, when the model generates "The founder of Apple is Steve," I would like it to add a token "Jobs" at the end automatically. After that, the input of the next s... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28997/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28997/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28996 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28996/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28996/comments | https://api.github.com/repos/huggingface/transformers/issues/28996/events | https://github.com/huggingface/transformers/issues/28996 | 2,132,190,989 | I_kwDOCUB6oc5_FqcN | 28,996 | Starcoder/GPTBigCode has broken beam search when converted to ONNX runtime model | {
"login": "lidingsnyk",
"id": 139234713,
"node_id": "U_kgDOCEyNmQ",
"avatar_url": "https://avatars.githubusercontent.com/u/139234713?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lidingsnyk",
"html_url": "https://github.com/lidingsnyk",
"followers_url": "https://api.github.com/users/lid... | [] | open | false | null | [] | [
"cc @gante ",
"@lidingsnyk GPTBigCode does have an unconventional cache, yes, which may be the underlying cause for the bug you're seeing. However, in terms of code, the error stems from `optimum` (or from the interface between `transformers` and `optimum`, also usually handled on the `optimum` side) 🤗 \r\n\r\nI... | 1,707 | 1,707 | null | NONE | null | Not sure if the root cause of the issue is in `huggingface/tranformers` or `huggingface/onnxruntime`, but posting it here in case people have more context. Sorry if this ended up being noise for this forum.
### System Info
```
transformers version: 4.37.2
optimum[onnxruntime-gpu]==1.16.2
onnxruntime-gpu==1.1... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28996/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28996/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28995 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28995/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28995/comments | https://api.github.com/repos/huggingface/transformers/issues/28995/events | https://github.com/huggingface/transformers/pull/28995 | 2,132,036,961 | PR_kwDOCUB6oc5mvLAb | 28,995 | fix(CLIP): make clip model exportable using torch.jit.trace | {
"login": "Bycob",
"id": 15674552,
"node_id": "MDQ6VXNlcjE1Njc0NTUy",
"avatar_url": "https://avatars.githubusercontent.com/u/15674552?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Bycob",
"html_url": "https://github.com/Bycob",
"followers_url": "https://api.github.com/users/Bycob/follow... | [] | open | false | null | [] | [
"Hi @Bycob, thanks for opening this PR! \r\n\r\nCould you share how you're tracing the model? \r\n\r\nOn `main` I'm able to trace CLIP without any issue:\r\n\r\n```py\r\nimport torch\r\nfrom PIL import Image\r\nimport requests\r\nfrom transformers import AutoProcessor, CLIPModel\r\n\r\nmodel = CLIPModel.from_pretra... | 1,707 | 1,707 | null | NONE | null | # What does this PR do?
I added `.long()` in two places to make the model exportable using `torch.jit.trace`. If they are omitted we get this error:
```
mllib internal error: Libtorch error:The following operation failed in the TorchScript interpreter.
Traceback of TorchScript, serialized code (most recent call... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28995/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28995/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/28995",
"html_url": "https://github.com/huggingface/transformers/pull/28995",
"diff_url": "https://github.com/huggingface/transformers/pull/28995.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/28995.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/28994 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28994/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28994/comments | https://api.github.com/repos/huggingface/transformers/issues/28994/events | https://github.com/huggingface/transformers/pull/28994 | 2,131,907,734 | PR_kwDOCUB6oc5muue_ | 28,994 | Fix max_length criteria when using inputs_embeds | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/use... | [] | closed | false | null | [] | [
"oh, i see, added a new fix and checked that creating an empty tensor does not break anything",
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_28994). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the la... | 1,707 | 1,708 | 1,708 | MEMBER | null | # What does this PR do?
Fixes #28953 . StoppingCriteria with max_length behaves differently when provided `input_ids` or `inputs_embeds`, this happens only on decoder-only models. The PR fixes it so that the criteria accounts for the length of `input_embeds` when generating
## Before submitting
- [ ] This PR f... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28994/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28994/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/28994",
"html_url": "https://github.com/huggingface/transformers/pull/28994",
"diff_url": "https://github.com/huggingface/transformers/pull/28994.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/28994.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/28993 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28993/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28993/comments | https://api.github.com/repos/huggingface/transformers/issues/28993/events | https://github.com/huggingface/transformers/issues/28993 | 2,131,905,783 | I_kwDOCUB6oc5_Ekz3 | 28,993 | Add Hiera model | {
"login": "p1atdev",
"id": 60182057,
"node_id": "MDQ6VXNlcjYwMTgyMDU3",
"avatar_url": "https://avatars.githubusercontent.com/u/60182057?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/p1atdev",
"html_url": "https://github.com/p1atdev",
"followers_url": "https://api.github.com/users/p1atde... | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
},
{
"id": 5769473378,
"node_id": "LA_kwDOCUB6oc8AAAABV-Mt... | open | false | null | [] | [
"Can I work on this ?",
"@Namangarg110 Certainly! Feel free to open a PR when you're ready and ping us for review 🤗. To avoid issues from becoming too stale, we will prioritise the first open PR when reviewing over the first comment on issues. ",
"Thanks @amyeroberts. This is my first open-source issue. Would ... | 1,707 | 1,708 | null | NONE | null | ### Model description
Hiera is a hierarchical vision transformer that is fast, powerful, and, above all, simple. It outperforms the state-of-the-art across a wide array of image and video tasks while being much faster.
### Open source status
- [X] The model implementation is available
- [X] The model weights are ava... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28993/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 2,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28993/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28992 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28992/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28992/comments | https://api.github.com/repos/huggingface/transformers/issues/28992/events | https://github.com/huggingface/transformers/issues/28992 | 2,131,814,303 | I_kwDOCUB6oc5_EOef | 28,992 | Where size 5 comes from in LlamaModelforCasualLM?? | {
"login": "daehuikim",
"id": 40377750,
"node_id": "MDQ6VXNlcjQwMzc3NzUw",
"avatar_url": "https://avatars.githubusercontent.com/u/40377750?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/daehuikim",
"html_url": "https://github.com/daehuikim",
"followers_url": "https://api.github.com/users/... | [] | closed | false | null | [] | [
"Hi @daehuikim, thanks for raising an issue! \r\n\r\nThis is a question best placed in our [forums](https://discuss.huggingface.co/). We try to reserve the github issues for feature requests and bug reports.\r\n\r\nThe value 5 comes from the input sequence length (the number of input tokens) i.e. `len(input_ids[0])... | 1,707 | 1,707 | 1,707 | NONE | null | ```
from transformers import (
AutoModelForCausalLM,
AutoTokenizer
)
model_name = "meta-llama/Llama-2-7b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
input_text = "How are you?"
input_ids = tokenizer.encode(input_text, retur... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28992/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28992/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28991 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28991/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28991/comments | https://api.github.com/repos/huggingface/transformers/issues/28991/events | https://github.com/huggingface/transformers/pull/28991 | 2,131,473,179 | PR_kwDOCUB6oc5mtPkS | 28,991 | ENH [`AutoQuantizer`]: enhance trainer + not supported quant methods | {
"login": "younesbelkada",
"id": 49240599,
"node_id": "MDQ6VXNlcjQ5MjQwNTk5",
"avatar_url": "https://avatars.githubusercontent.com/u/49240599?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/younesbelkada",
"html_url": "https://github.com/younesbelkada",
"followers_url": "https://api.githu... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_28991). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,707 | 1,707 | 1,707 | CONTRIBUTOR | null | # What does this PR do?
Currently, if a quantization method does not support PEFT fine-tuning, an old error about bitsandbytes is raised:
```bash
The model you want to train is loaded in 8-bit precision. if you want to fine-tune an 8-bit
model, please make sure that you have installed `bitsandbytes>=0.37.0`.
```... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28991/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28991/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/28991",
"html_url": "https://github.com/huggingface/transformers/pull/28991",
"diff_url": "https://github.com/huggingface/transformers/pull/28991.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/28991.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/28990 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28990/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28990/comments | https://api.github.com/repos/huggingface/transformers/issues/28990/events | https://github.com/huggingface/transformers/issues/28990 | 2,131,390,171 | I_kwDOCUB6oc5_Cm7b | 28,990 | MBartForConditionalGeneration to do mask filling task with mbart-large-50-many-to-many-mmt | {
"login": "Aureole-1210",
"id": 59786603,
"node_id": "MDQ6VXNlcjU5Nzg2NjAz",
"avatar_url": "https://avatars.githubusercontent.com/u/59786603?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Aureole-1210",
"html_url": "https://github.com/Aureole-1210",
"followers_url": "https://api.github.c... | [] | open | false | null | [] | [
"cc @ArthurZucker "
] | 1,707 | 1,708 | null | NONE | null | ### System Info
I also have problem with this.
I want to use 【facebook/mbart-large-50-many-to-many-mmt】 to do mask filling task. But the output is always strange.
I modify the input format as the Model Card from https://huggingface.co/facebook/mbart-large-50-many-to-many-mmt suggested.
My code is as follows:
#... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28990/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28990/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28989 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28989/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28989/comments | https://api.github.com/repos/huggingface/transformers/issues/28989/events | https://github.com/huggingface/transformers/pull/28989 | 2,131,359,266 | PR_kwDOCUB6oc5ms4F- | 28,989 | Add cuda_custom_kernel in DETA | {
"login": "SangbumChoi",
"id": 34004152,
"node_id": "MDQ6VXNlcjM0MDA0MTUy",
"avatar_url": "https://avatars.githubusercontent.com/u/34004152?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SangbumChoi",
"html_url": "https://github.com/SangbumChoi",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_28989). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,707 | 1,707 | 1,707 | CONTRIBUTOR | null | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28989/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28989/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/28989",
"html_url": "https://github.com/huggingface/transformers/pull/28989",
"diff_url": "https://github.com/huggingface/transformers/pull/28989.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/28989.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/28988 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28988/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28988/comments | https://api.github.com/repos/huggingface/transformers/issues/28988/events | https://github.com/huggingface/transformers/pull/28988 | 2,131,357,148 | PR_kwDOCUB6oc5ms3qX | 28,988 | ENH: Do not pass warning message in case `quantization_config` is in config but not passed as an arg | {
"login": "younesbelkada",
"id": 49240599,
"node_id": "MDQ6VXNlcjQ5MjQwNTk5",
"avatar_url": "https://avatars.githubusercontent.com/u/49240599?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/younesbelkada",
"html_url": "https://github.com/younesbelkada",
"followers_url": "https://api.githu... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_28988). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,707 | 1,707 | 1,707 | CONTRIBUTOR | null | # What does this PR do?
Currently, transformers always warns users with a wrong message when loading a model that has a quantization_config without even passing quantization_config to from_pretrained. Indeed we should warn users only when `quantization_config_from_args` instead of all the time
cc @amyeroberts
... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28988/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28988/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/28988",
"html_url": "https://github.com/huggingface/transformers/pull/28988",
"diff_url": "https://github.com/huggingface/transformers/pull/28988.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/28988.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/28987 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28987/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28987/comments | https://api.github.com/repos/huggingface/transformers/issues/28987/events | https://github.com/huggingface/transformers/pull/28987 | 2,131,305,610 | PR_kwDOCUB6oc5mss6V | 28,987 | [`Awq`] Add peft support for AWQ | {
"login": "younesbelkada",
"id": 49240599,
"node_id": "MDQ6VXNlcjQ5MjQwNTk5",
"avatar_url": "https://avatars.githubusercontent.com/u/49240599?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/younesbelkada",
"html_url": "https://github.com/younesbelkada",
"followers_url": "https://api.githu... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_28987). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"cc @amyeroberts - we just got a release from autoawq ! this is ready for review 🙏... | 1,707 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
As per title, adds Trainer + AWQ + PEFT support.
Needs to be merged at the same time as: https://github.com/huggingface/peft/pull/1399
The tests are directly added on https://github.com/huggingface/peft/pull/1399
cc @casper-hansen @pacman100 @amyeroberts | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28987/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28987/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/28987",
"html_url": "https://github.com/huggingface/transformers/pull/28987",
"diff_url": "https://github.com/huggingface/transformers/pull/28987.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/28987.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/28986 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28986/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28986/comments | https://api.github.com/repos/huggingface/transformers/issues/28986/events | https://github.com/huggingface/transformers/pull/28986 | 2,131,108,419 | PR_kwDOCUB6oc5msBoV | 28,986 | Fix a configuration key error in forward() of MusicgenForConditionalGeneration | {
"login": "IntelliNik",
"id": 37289946,
"node_id": "MDQ6VXNlcjM3Mjg5OTQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/37289946?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/IntelliNik",
"html_url": "https://github.com/IntelliNik",
"followers_url": "https://api.github.com/use... | [] | open | false | null | [] | [] | 1,707 | 1,707 | null | NONE | null | Hi everyone,
I think I noticed a bug in the forward function of MusicgenForConditionalGeneration. When calculating the loss with given labels, I get the error "'MusicgenConfig' object has no attribute 'vocab_size'" as only the decoder.config has a vocab_size entry.
I think this should be the correct way to imple... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28986/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28986/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/28986",
"html_url": "https://github.com/huggingface/transformers/pull/28986",
"diff_url": "https://github.com/huggingface/transformers/pull/28986.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/28986.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/28985 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28985/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28985/comments | https://api.github.com/repos/huggingface/transformers/issues/28985/events | https://github.com/huggingface/transformers/pull/28985 | 2,130,913,115 | PR_kwDOCUB6oc5mrW-2 | 28,985 | [`pipeline`] Add pool option to image feature extraction pipeline | {
"login": "amyeroberts",
"id": 22614925,
"node_id": "MDQ6VXNlcjIyNjE0OTI1",
"avatar_url": "https://avatars.githubusercontent.com/u/22614925?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amyeroberts",
"html_url": "https://github.com/amyeroberts",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_28985). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"@ArthurZucker Updated the error message and added tests for exact outputs in [b94d... | 1,707 | 1,708 | 1,708 | COLLABORATOR | null | # What does this PR do?
Adds the flag `pool` which will return the pooled output, rather than the raw hidden states.
Doesn't work for data2vecvision as the model doesn't [add the pooling layer by default](https://github.com/huggingface/transformers/blob/78ba9f4617370a41c436126bbbb6f8d75924837c/src/transformers/m... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28985/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28985/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/28985",
"html_url": "https://github.com/huggingface/transformers/pull/28985",
"diff_url": "https://github.com/huggingface/transformers/pull/28985.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/28985.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/28984 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28984/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28984/comments | https://api.github.com/repos/huggingface/transformers/issues/28984/events | https://github.com/huggingface/transformers/pull/28984 | 2,130,838,955 | PR_kwDOCUB6oc5mrGZh | 28,984 | [WIP] Word level timestamp for long-form generation | {
"login": "patrickvonplaten",
"id": 23423619,
"node_id": "MDQ6VXNlcjIzNDIzNjE5",
"avatar_url": "https://avatars.githubusercontent.com/u/23423619?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/patrickvonplaten",
"html_url": "https://github.com/patrickvonplaten",
"followers_url": "https://... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_28984). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"It's actually much harder to do this than I thought and I sadly won't have time to... | 1,707 | 1,708 | null | MEMBER | null | # What does this PR do?
Fixes: https://github.com/huggingface/transformers/issues/28977
We haven't added word level timestamp for long-form generation yet. It's definitely possible, but it'll require some more changes in `generate`. Happy to take a closer look here the next days.
With the PR in its current sta... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28984/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28984/timeline | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/28984",
"html_url": "https://github.com/huggingface/transformers/pull/28984",
"diff_url": "https://github.com/huggingface/transformers/pull/28984.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/28984.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/28983 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28983/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28983/comments | https://api.github.com/repos/huggingface/transformers/issues/28983/events | https://github.com/huggingface/transformers/issues/28983 | 2,130,708,973 | I_kwDOCUB6oc5_AAnt | 28,983 | Fix custom architectures | {
"login": "not-lain",
"id": 70411813,
"node_id": "MDQ6VXNlcjcwNDExODEz",
"avatar_url": "https://avatars.githubusercontent.com/u/70411813?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/not-lain",
"html_url": "https://github.com/not-lain",
"followers_url": "https://api.github.com/users/not... | [] | open | false | null | [] | [
"@Rocketknight1 this might help with keeping track to everything.\r\nI'm going to start working on the 2nd issue above hope I fix it soon. "
] | 1,707 | 1,707 | null | CONTRIBUTOR | null | opening this issue for better visibility and to keep track to what needs to be fixed, any contributions are welcome.
| name | issue | pull request | comment |
| ------------- | ------------- | ------------- | ------------- |
| dependency issue when working with a custom architecture in a repo that has a dot in it... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28983/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28983/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28982 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28982/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28982/comments | https://api.github.com/repos/huggingface/transformers/issues/28982/events | https://github.com/huggingface/transformers/pull/28982 | 2,130,602,975 | PR_kwDOCUB6oc5mqSZ_ | 28,982 | Correct zero division error in inverse sqrt scheduler | {
"login": "DavidAfonsoValente",
"id": 74915610,
"node_id": "MDQ6VXNlcjc0OTE1NjEw",
"avatar_url": "https://avatars.githubusercontent.com/u/74915610?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DavidAfonsoValente",
"html_url": "https://github.com/DavidAfonsoValente",
"followers_url": "ht... | [] | open | false | null | [] | [
"Hi @DavidAfonsoValente, thanks for opening this PR! \r\n\r\nCould you give some more context about the issue this resolves, ideally with a reproducible snippet? \r\n\r\nJust looking at the PR, it implies that `timescale` is 0, which I don't think should ever be the case. ",
"I corrected the description link to t... | 1,707 | 1,708 | null | NONE | null | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28982/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28982/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/28982",
"html_url": "https://github.com/huggingface/transformers/pull/28982",
"diff_url": "https://github.com/huggingface/transformers/pull/28982.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/28982.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/28981 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28981/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28981/comments | https://api.github.com/repos/huggingface/transformers/issues/28981/events | https://github.com/huggingface/transformers/issues/28981 | 2,130,600,607 | I_kwDOCUB6oc5-_mKf | 28,981 | tracker: `generate` compatibility with `torch.compile` | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | open | false | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.co... | [] | 1,707 | 1,707 | null | MEMBER | null | # `generate` 🤜 🤛 `torch.compile`
This issue is a tracker of the compatibility between `.generate` and `torch.compile` ([intro docs by pytorch](https://pytorch.org/tutorials/intermediate/torch_compile_tutorial.html)). The goal is to enable `fullgraph=True` compilation on the main `generate` use cases.
⚠️ Is *yo... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28981/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 2,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28981/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28980 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28980/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28980/comments | https://api.github.com/repos/huggingface/transformers/issues/28980/events | https://github.com/huggingface/transformers/issues/28980 | 2,130,570,113 | I_kwDOCUB6oc5-_euB | 28,980 | Add sliding window attention to sdpa in mistral | {
"login": "ehuaa",
"id": 5137359,
"node_id": "MDQ6VXNlcjUxMzczNTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/5137359?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ehuaa",
"html_url": "https://github.com/ehuaa",
"followers_url": "https://api.github.com/users/ehuaa/follower... | [
{
"id": 2392046359,
"node_id": "MDU6TGFiZWwyMzkyMDQ2MzU5",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20Second%20Issue",
"name": "Good Second Issue",
"color": "dd935a",
"default": false,
"description": "Issues that are more difficult to do than \"Good First... | open | false | null | [] | [
"cc @fxmarty ",
"Hi, thank you for the suggestion, SDPA support for mistral was added by @ArthurZucker in https://github.com/huggingface/transformers/pull/28133, maybe he has more insight.",
"I think it comes down to just adding `sliding_window` to the call for `_prepare_4d_causal_attention_mask_for_sdpa` yes. ... | 1,707 | 1,708 | null | NONE | null | ### Feature request
https://github.com/huggingface/transformers/blob/main/src/transformers/models/mistral/modeling_mistral.py#L1006-L1023

In the code listed above, the latest version of transformers cannot u... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28980/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28980/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28979 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28979/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28979/comments | https://api.github.com/repos/huggingface/transformers/issues/28979/events | https://github.com/huggingface/transformers/issues/28979 | 2,130,560,594 | I_kwDOCUB6oc5-_cZS | 28,979 | transformers/configuration_utils.py: TypeError: Object of type ResNetConfig is not JSON serializable ( AutoModelForObjectDetection.from_pretrained("microsoft/table-transformer-detection"..)) | {
"login": "dokondr",
"id": 1510880,
"node_id": "MDQ6VXNlcjE1MTA4ODA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1510880?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dokondr",
"html_url": "https://github.com/dokondr",
"followers_url": "https://api.github.com/users/dokondr/... | [] | closed | false | null | [] | [
"Hi @dokondr, thanks for raising an issue! \r\n\r\nI'm unable to replicate this issue locally. Could you try installing with pip and using the most recent version of transformers? \r\n\r\n```\r\npip install -U transformers\r\n```",
"> Hi @dokondr, thanks for raising an issue!\r\n> \r\n> I'm unable to replicate th... | 1,707 | 1,707 | 1,707 | NONE | null | ### System Info
When loading model:
AutoModelForObjectDetection.from_pretrained("microsoft/table-transformer-detection", revision="no_timm")
transformers/configuration_utils.py returns a TypeError:
Object of type ResNetConfig is not JSON serializable
This error happens when I run in virtual environment (Win... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28979/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28979/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28978 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28978/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28978/comments | https://api.github.com/repos/huggingface/transformers/issues/28978/events | https://github.com/huggingface/transformers/issues/28978 | 2,130,494,276 | I_kwDOCUB6oc5-_MNE | 28,978 | Whisper Sequential long-form decoding doesn't work when forcing task | {
"login": "antoinethl",
"id": 56915854,
"node_id": "MDQ6VXNlcjU2OTE1ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/56915854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/antoinethl",
"html_url": "https://github.com/antoinethl",
"followers_url": "https://api.github.com/use... | [] | closed | false | null | [] | [
"cc @sanchit-gandhi @ylacombe ",
"Hey @antoinethl,\r\n\r\nThanks for reporting the bug! Note that the bug is already solved on \"main\" with https://github.com/huggingface/transformers/pull/28687. Could you try to install transformers as follows:\r\n\r\n```\r\n!pip install git+https://github.com/huggingface/trans... | 1,707 | 1,707 | 1,707 | NONE | null | ### System Info
- `transformers` version: 4.37.2
- Platform: Linux-4.15.0-142-generic-x86_64-with-glibc2.23
- Python version: 3.10.11
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: not installed
- Accelerate config: not found
- PyTorch version (GPU?): 2.0.1 (True)
- Tensor... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28978/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28978/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28977 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28977/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28977/comments | https://api.github.com/repos/huggingface/transformers/issues/28977/events | https://github.com/huggingface/transformers/issues/28977 | 2,130,442,867 | I_kwDOCUB6oc5--_pz | 28,977 | Whisper Sequential long-form decoding doesn't work with timestamps per token | {
"login": "antoinethl",
"id": 56915854,
"node_id": "MDQ6VXNlcjU2OTE1ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/56915854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/antoinethl",
"html_url": "https://github.com/antoinethl",
"followers_url": "https://api.github.com/use... | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | [
"cc @sanchit-gandhi @ylacombe ",
"This is more of a feature request than a bug I'd say. Happy to have a look with https://github.com/huggingface/transformers/pull/28984"
] | 1,707 | 1,707 | null | NONE | null | ### System Info
- `transformers` version: 4.37.2
- Platform: Linux-4.15.0-142-generic-x86_64-with-glibc2.23
- Python version: 3.10.11
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: not installed
- Accelerate config: not found
- PyTorch version (GPU?): 2.0.1 (True)
- Tensor... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28977/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28977/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28976 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28976/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28976/comments | https://api.github.com/repos/huggingface/transformers/issues/28976/events | https://github.com/huggingface/transformers/issues/28976 | 2,130,252,754 | I_kwDOCUB6oc5--RPS | 28,976 | [`spam`] | {
"login": "goalend",
"id": 110501477,
"node_id": "U_kgDOBpYeZQ",
"avatar_url": "https://avatars.githubusercontent.com/u/110501477?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/goalend",
"html_url": "https://github.com/goalend",
"followers_url": "https://api.github.com/users/goalend/foll... | [] | open | true | null | [] | [] | 1,707 | 1,707 | null | NONE | spam | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28976/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28976/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28975 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28975/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28975/comments | https://api.github.com/repos/huggingface/transformers/issues/28975/events | https://github.com/huggingface/transformers/pull/28975 | 2,130,220,986 | PR_kwDOCUB6oc5mo-vh | 28,975 | Static Cache: load models with MQA or GQA | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_28975). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,707 | 1,707 | 1,707 | MEMBER | null | # What does this PR do?
Adds support for loading MQA or GQA models to the static cache (such as [this one](https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0)) | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28975/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28975/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/28975",
"html_url": "https://github.com/huggingface/transformers/pull/28975",
"diff_url": "https://github.com/huggingface/transformers/pull/28975.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/28975.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/28974 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28974/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28974/comments | https://api.github.com/repos/huggingface/transformers/issues/28974/events | https://github.com/huggingface/transformers/pull/28974 | 2,130,219,207 | PR_kwDOCUB6oc5mo-Wp | 28,974 | Updated requirements for image-classification samples: datasets>=2.14.0 | {
"login": "alekseyfa",
"id": 26468927,
"node_id": "MDQ6VXNlcjI2NDY4OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/26468927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alekseyfa",
"html_url": "https://github.com/alekseyfa",
"followers_url": "https://api.github.com/users/... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_28974). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,707 | 1,707 | 1,707 | CONTRIBUTOR | null | # What does this PR do?
This PR updates the dependency requirements for the image-classification samples.
The run_image_classification.py script in the current implementation uses the token parameter, which was introduced as of datasets version 2.14.0. Thus, using packages below the specified version may cause errors.
... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28974/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28974/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/28974",
"html_url": "https://github.com/huggingface/transformers/pull/28974",
"diff_url": "https://github.com/huggingface/transformers/pull/28974.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/28974.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/28973 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28973/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28973/comments | https://api.github.com/repos/huggingface/transformers/issues/28973/events | https://github.com/huggingface/transformers/pull/28973 | 2,130,194,809 | PR_kwDOCUB6oc5mo5AS | 28,973 | Image Feature Extraction docs | {
"login": "merveenoyan",
"id": 53175384,
"node_id": "MDQ6VXNlcjUzMTc1Mzg0",
"avatar_url": "https://avatars.githubusercontent.com/u/53175384?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/merveenoyan",
"html_url": "https://github.com/merveenoyan",
"followers_url": "https://api.github.com/... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_28973). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"@merveenoyan There's a PR to add the pool option - I'm just waiting for review #28... | 1,707 | 1,708 | null | CONTRIBUTOR | null | This PR adds task guide for image feature extraction.
Note that as of now the image feature extraction pipeline doesn't have a pooler output, so the result in the doc is the theoretical result :') This PR can be reviewed after that change is merged since this wouldn't work without `ViTPooler`. | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28973/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28973/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/28973",
"html_url": "https://github.com/huggingface/transformers/pull/28973",
"diff_url": "https://github.com/huggingface/transformers/pull/28973.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/28973.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/28972 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28972/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28972/comments | https://api.github.com/repos/huggingface/transformers/issues/28972/events | https://github.com/huggingface/transformers/issues/28972 | 2,130,170,435 | I_kwDOCUB6oc5-99JD | 28,972 | NotImplementedError: Cannot copy out of meta tensor; no data! when moving LLaVa from meta device to CUDA | {
"login": "NielsRogge",
"id": 48327001,
"node_id": "MDQ6VXNlcjQ4MzI3MDAx",
"avatar_url": "https://avatars.githubusercontent.com/u/48327001?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NielsRogge",
"html_url": "https://github.com/NielsRogge",
"followers_url": "https://api.github.com/use... | [] | open | false | null | [] | [
"Ok I'm seeing the same error with BERT:\r\n```python\r\nfrom transformers import BertConfig, BertModel\r\nimport torch\r\n\r\nconfig = BertConfig()\r\n\r\nwith torch.device(\"meta\"):\r\n model = BertModel(config)\r\n\r\npretrained_model = BertModel.from_pretrained(\"bert-base-uncased\")\r\nmodel.load_state_dic... | 1,707 | 1,707 | null | CONTRIBUTOR | null | ### System Info
Transformers 4.37.0.dev0
### Who can help?
@ArthurZucker @younesbelkada
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Reproduction
Getting this error:
```
Traceback (most recent call last):
File "src/transformers/models/llava/test_meta_d... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28972/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28972/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28971 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28971/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28971/comments | https://api.github.com/repos/huggingface/transformers/issues/28971/events | https://github.com/huggingface/transformers/pull/28971 | 2,130,115,339 | PR_kwDOCUB6oc5moncX | 28,971 | Allow setting dtype in rescaling in image_processing_donut.py | {
"login": "archit76",
"id": 15254541,
"node_id": "MDQ6VXNlcjE1MjU0NTQx",
"avatar_url": "https://avatars.githubusercontent.com/u/15254541?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/archit76",
"html_url": "https://github.com/archit76",
"followers_url": "https://api.github.com/users/arc... | [] | closed | false | null | [] | [
"Not required anymore. Works for me!"
] | 1,707 | 1,707 | 1,707 | NONE | null | Fixes #28969
With this fix, dtype can be changed by passing it as a kwargs argument:
if do_rescale:
images = [
self.rescale(image=image, scale=rescale_factor, input_data_format=input_data_format, **kwargs)
for image in images
] | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28971/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28971/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/28971",
"html_url": "https://github.com/huggingface/transformers/pull/28971",
"diff_url": "https://github.com/huggingface/transformers/pull/28971.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/28971.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/28970 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28970/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28970/comments | https://api.github.com/repos/huggingface/transformers/issues/28970/events | https://github.com/huggingface/transformers/issues/28970 | 2,129,935,531 | I_kwDOCUB6oc5-9Dyr | 28,970 | Question about the use of bias in the Graphormer Model | {
"login": "sarah-af",
"id": 74510900,
"node_id": "MDQ6VXNlcjc0NTEwOTAw",
"avatar_url": "https://avatars.githubusercontent.com/u/74510900?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sarah-af",
"html_url": "https://github.com/sarah-af",
"followers_url": "https://api.github.com/users/sar... | [] | open | false | null | [] | [
"cc @clefourrier "
] | 1,707 | 1,707 | null | NONE | null | Hi,
The documentation of the Graphormerconfig indicates that the parameter,
bias (bool, optional, defaults to True) — Uses bias in the attention module - unsupported at the moment.
I have 2 questions,
1. Is that the same attention bias introduced in the paper using the shortest path distance? Or where is it app... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28970/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28970/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28969 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28969/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28969/comments | https://api.github.com/repos/huggingface/transformers/issues/28969/events | https://github.com/huggingface/transformers/issues/28969 | 2,129,867,176 | I_kwDOCUB6oc5-8zGo | 28,969 | Changing dtype(For half precision) not possible in rescale in image_processing_donut | {
"login": "archit76",
"id": 15254541,
"node_id": "MDQ6VXNlcjE1MjU0NTQx",
"avatar_url": "https://avatars.githubusercontent.com/u/15254541?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/archit76",
"html_url": "https://github.com/archit76",
"followers_url": "https://api.github.com/users/arc... | [] | closed | false | null | [] | [
"@amyeroberts \r\nPassing kwargs should resolve this:\r\nhttps://github.com/huggingface/transformers/blob/main/src/transformers/models/donut/image_processing_donut.py#L441-L445\r\nif do_rescale:\r\nimages = [\r\nself.rescale(image=image, scale=rescale_factor, input_data_format=input_data_format, **kwargs)\r\nfor im... | 1,707 | 1,707 | 1,707 | NONE | null | ### System Info
- `transformers` version: 4.37.2
- Platform: Linux-6.1.0-1029-oem-x86_64-with-glibc2.35
- Python version: 3.9.18
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: 0.24.1
https://github.com/huggingface/transformers/blob/main/src/transformers/models/donut/image_... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28969/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28969/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28968 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28968/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28968/comments | https://api.github.com/repos/huggingface/transformers/issues/28968/events | https://github.com/huggingface/transformers/issues/28968 | 2,129,599,093 | I_kwDOCUB6oc5-7xp1 | 28,968 | The initialized weights of nn.Linear are very large within __init__ | {
"login": "zhjohnchan",
"id": 37367987,
"node_id": "MDQ6VXNlcjM3MzY3OTg3",
"avatar_url": "https://avatars.githubusercontent.com/u/37367987?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zhjohnchan",
"html_url": "https://github.com/zhjohnchan",
"followers_url": "https://api.github.com/use... | [] | open | false | null | [] | [
"Looks like a precision problem? The values are in the normal range in version 4.35.2.",
"cc @ydshieh this is an issue you looked into",
"Hi @zhjohnchan We do have similar issue, but I am surprised that v4.35 has normal values. I will have to take a look into this.\r\nBTW, could you share you torch versions, es... | 1,707 | 1,707 | null | NONE | null | ### System Info
Hi,
I wrote a classification wrapper for CLIPModel like this:
```
class CLIPVisionTransformerForImageClassification(CLIPPreTrainedModel):
config_class = CLIPConfig
_no_split_modules = ["CLIPEncoderLayer"]
def __init__(self, config: CLIPConfig):
super().__init__(config)
... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28968/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28968/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28967 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28967/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28967/comments | https://api.github.com/repos/huggingface/transformers/issues/28967/events | https://github.com/huggingface/transformers/issues/28967 | 2,129,173,660 | I_kwDOCUB6oc5-6Jyc | 28,967 | ### Summary | {
"login": "goalend",
"id": 110501477,
"node_id": "U_kgDOBpYeZQ",
"avatar_url": "https://avatars.githubusercontent.com/u/110501477?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/goalend",
"html_url": "https://github.com/goalend",
"followers_url": "https://api.github.com/users/goalend/foll... | [] | closed | false | null | [] | [
"✌️"
] | 1,707 | 1,707 | 1,707 | NONE | null | ### Summary
The importance of enforcing a set of quality standards to continuously deploy in a consistent and predictable way can’t be underestimated. Implementing these standards without the duplication of CI/CD configuration code is a challenge many organizations face today. How can workflows solve these problems an... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28967/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28967/timeline | not_planned | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28966 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28966/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28966/comments | https://api.github.com/repos/huggingface/transformers/issues/28966/events | https://github.com/huggingface/transformers/pull/28966 | 2,129,051,352 | PR_kwDOCUB6oc5mlFle | 28,966 | Implementation of SuperPoint and AutoModelForKeypointDetection | {
"login": "sbucaille",
"id": 24275548,
"node_id": "MDQ6VXNlcjI0Mjc1NTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/24275548?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sbucaille",
"html_url": "https://github.com/sbucaille",
"followers_url": "https://api.github.com/users/... | [] | open | false | null | [] | [
"Hi @amyeroberts @rafaelpadilla @ArthurZucker ,\r\n\r\nI couldn't see other solution than to create a branch from scratch and re-implementing everything that was discussed since the beginning of the original [PR](https://github.com/huggingface/transformers/pull/25786). Here, most of my problems (RegNet related erro... | 1,707 | 1,708 | null | NONE | null | # What does this PR do?
This PR implements SuperPoint, one of the few models that generate keypoints and descriptors given an image, as discussed in [this previous pull request](https://github.com/huggingface/transformers/pull/25697)
The goal is to implement this model and a new type of AutoModel : `AutoModelForKey... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28966/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28966/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/28966",
"html_url": "https://github.com/huggingface/transformers/pull/28966",
"diff_url": "https://github.com/huggingface/transformers/pull/28966.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/28966.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/28965 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28965/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28965/comments | https://api.github.com/repos/huggingface/transformers/issues/28965/events | https://github.com/huggingface/transformers/issues/28965 | 2,129,043,434 | I_kwDOCUB6oc5-5p_q | 28,965 | Problems with starting meta-llama/Llama-2-7b-hf model using transformers library. HFValidationError | {
"login": "Karoljv",
"id": 93725497,
"node_id": "U_kgDOBZYjOQ",
"avatar_url": "https://avatars.githubusercontent.com/u/93725497?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Karoljv",
"html_url": "https://github.com/Karoljv",
"followers_url": "https://api.github.com/users/Karoljv/follow... | [] | closed | false | null | [] | [] | 1,707 | 1,707 | 1,707 | NONE | null | ### System Info
I am using:
Windows 11
Python 3.10.4
Torch 2.2.0
Transformers 4.37.2
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own ta... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28965/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28965/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28964 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28964/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28964/comments | https://api.github.com/repos/huggingface/transformers/issues/28964/events | https://github.com/huggingface/transformers/issues/28964 | 2,129,025,459 | I_kwDOCUB6oc5-5lmz | 28,964 | RuntimeError: result type Float can't be cast to the desired output type Byte | {
"login": "KaifAhmad1",
"id": 98801504,
"node_id": "U_kgDOBeOXYA",
"avatar_url": "https://avatars.githubusercontent.com/u/98801504?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/KaifAhmad1",
"html_url": "https://github.com/KaifAhmad1",
"followers_url": "https://api.github.com/users/KaifA... | [] | closed | false | null | [] | [
"Hey ! You are using custom code, we probably won't have the bandwidth to debug it for you! Best recommendation is to put a debug breakpoint and see what is happening! 🤗 ",
"@KaifAhmad1 thanks for the issue! The issue seems to be related to the custom code that is on the Hub ! \r\nI recommend opening an issue in... | 1,707 | 1,708 | 1,708 | NONE | null | ### System Info
OS: Windows 11 x64
Cuda: 12.1
transformers: 4.37.2
sentence_transformers: 2.3.1
bitsandbytes: 0.42.0
pip: 23.1.2
Python 3.10.10
### Who can help?
Hey, @ArthurZucker , @younesbelkada Please guide for this issue.
### Information
- [ ] The official example scripts
- [X] My own modified scripts... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28964/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28964/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28963 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28963/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28963/comments | https://api.github.com/repos/huggingface/transformers/issues/28963/events | https://github.com/huggingface/transformers/issues/28963 | 2,129,019,295 | I_kwDOCUB6oc5-5kGf | 28,963 | 2 | {
"login": "goalend",
"id": 110501477,
"node_id": "U_kgDOBpYeZQ",
"avatar_url": "https://avatars.githubusercontent.com/u/110501477?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/goalend",
"html_url": "https://github.com/goalend",
"followers_url": "https://api.github.com/users/goalend/foll... | [] | closed | false | null | [] | [] | 1,707 | 1,707 | 1,707 | NONE | null | 2
_Originally posted by @goalend in https://github.com/anchore/grype/pull/1710#discussion_r1485608950_
_Originally posted by @goalend in https://github.com/huggingface/transformers/issues/28962_ | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28963/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28963/timeline | not_planned | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28962 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28962/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28962/comments | https://api.github.com/repos/huggingface/transformers/issues/28962/events | https://github.com/huggingface/transformers/issues/28962 | 2,129,019,145 | I_kwDOCUB6oc5-5kEJ | 28,962 | 2 | {
"login": "goalend",
"id": 110501477,
"node_id": "U_kgDOBpYeZQ",
"avatar_url": "https://avatars.githubusercontent.com/u/110501477?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/goalend",
"html_url": "https://github.com/goalend",
"followers_url": "https://api.github.com/users/goalend/foll... | [] | closed | false | null | [] | [] | 1,707 | 1,707 | 1,707 | NONE | null | 2
_Originally posted by @goalend in https://github.com/anchore/grype/pull/1710#discussion_r1485608950_ | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28962/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28962/timeline | not_planned | null | null |
https://api.github.com/repos/huggingface/transformers/issues/28961 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/28961/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/28961/comments | https://api.github.com/repos/huggingface/transformers/issues/28961/events | https://github.com/huggingface/transformers/issues/28961 | 2,128,952,481 | I_kwDOCUB6oc5-5Tyh | 28,961 | Option to set the tracking URI for MLflowCallback. | {
"login": "seanswyi",
"id": 20367759,
"node_id": "MDQ6VXNlcjIwMzY3NzU5",
"avatar_url": "https://avatars.githubusercontent.com/u/20367759?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/seanswyi",
"html_url": "https://github.com/seanswyi",
"followers_url": "https://api.github.com/users/sea... | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | closed | false | null | [] | [
"Hi @seanswyi, thanks for opening this feature request! \r\n\r\nThe integrations are maintained by third party contributors, rather than the transformers team. If you or anyone else in the community would like to open a PR to add this we'd be happy to review! ",
"Thanks for the heads up @amyeroberts! I'll submit ... | 1,707 | 1,708 | 1,708 | CONTRIBUTOR | null | ### Feature request
Option to set not only the experiment name but also the tracking URI for MLflow.
### Motivation
My company and I use our own MLflow URI and all of our code has `mlflow.set_tracking_uri($URI)` inside. I'm not seeing such an option for the MLflowCallback and am only seeing an option to set the expe... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/28961/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/28961/timeline | completed | null | null |