Conversation
- "tests/pipelines/test_pipelines_common.py"
- "tests/models/test_modeling_common.py"
- "examples/**/*.py"
- ".github/**.yml"
Temporary. For this PR.
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
logger.addHandler(stream_handler)
@unittest.skipIf(is_transformers_version(">=", "4.57.5"), "Size mismatch")
Internal discussion: https://huggingface.slack.com/archives/C014N4749J9/p1768474502541349
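For context on how such a version-gated skip works, here is a minimal self-contained sketch. The `compare_versions` helper below is hypothetical (it only mimics what a helper like `is_transformers_version` presumably does); the real diffusers utility may parse versions differently.

```python
import operator

def compare_versions(current: str, op: str, target: str) -> bool:
    # Compare two dotted version strings numerically, e.g. "4.57.5".
    # Hypothetical stand-in for a helper like is_transformers_version.
    ops = {"<": operator.lt, "<=": operator.le, "==": operator.eq,
           ">=": operator.ge, ">": operator.gt}
    parse = lambda v: tuple(int(part) for part in v.split("."))
    return ops[op](parse(current), parse(target))

# With, say, transformers 5.0.0 installed, the decorator condition is True
# and the test is skipped:
# @unittest.skipIf(compare_versions("5.0.0", ">=", "4.57.5"), "Size mismatch")
```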
torch.nn.ConvTranspose2d,
torch.nn.ConvTranspose3d,
torch.nn.Linear,
torch.nn.Embedding,
This happens because of the way weight loading is done in transformers v5.
model = AutoModel.from_pretrained(
    "hf-internal-testing/tiny-stable-diffusion-torch", subfolder="text_encoder", use_safetensors=False
)
Internal discussion: https://huggingface.slack.com/archives/C014N4749J9/p1768462040821759
input_ids = (
    input_ids["input_ids"] if not isinstance(input_ids, list) and "input_ids" in input_ids else input_ids
)
Internal discussion: https://huggingface.slack.com/archives/C014N4749J9/p1768537424692669
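The conditional above handles two call shapes: tokenizers return a dict-like `BatchEncoding` keyed by `"input_ids"`, while some call sites already pass a bare list of token ids. A minimal sketch of that normalization, with a hypothetical function name:

```python
def normalize_input_ids(input_ids):
    # Dict-like tokenizer output (e.g. BatchEncoding): unwrap "input_ids".
    # Plain list of token ids: pass through unchanged.
    if not isinstance(input_ids, list) and "input_ids" in input_ids:
        return input_ids["input_ids"]
    return input_ids
```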
inputs = {
    "prompt": "dance monkey",
-    "negative_prompt": "",
+    "negative_prompt": "bad",
Otherwise, the corresponding tokenizer outputs:

negative_prompt=[' ']
prompt=[' ']
text_input_ids=tensor([], size=(1, 0), dtype=torch.int64)

which leads to:

E       RuntimeError: cannot reshape tensor of 0 elements into shape [1, 0, -1, 8] because the unspecified dimension size -1 can be any value and is ambiguous
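The error is inherent to `-1` shape inference: with zero elements, `0 * d == 0` for every candidate `d`, so no unique value can be inferred. A small pure-Python sketch of that inference rule (illustrative only, not PyTorch's actual implementation):

```python
def infer_free_dim(numel: int, known_dims: list[int]) -> int:
    # Product of the explicitly specified dimensions (everything but the -1 slot).
    known = 1
    for d in known_dims:
        known *= d
    if known == 0:
        # 0 * d == 0 for every d, so any value would "fit" -> ambiguous,
        # mirroring the RuntimeError above.
        raise RuntimeError("unspecified dimension size -1 is ambiguous")
    if numel % known != 0:
        raise RuntimeError("shape is invalid for input of this size")
    return numel // known
```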
Hmm, https://github.com/huggingface/diffusers/actions/runs/21354964855/job/61460242386?pr=12976 fails on this PR but passes without any issues on https://github.com/huggingface/diffusers/actions/runs/21344761402/job/61450173169?pr=12996. So, I am not sure at this point what's happening, TBH. The other failures seem to be pre-existing and well known.
for component_name in model_components_pipe:
    pipe_component = model_components_pipe[component_name]
    pipe_loaded_component = model_components_pipe_loaded[component_name]
    for p1, p2 in zip(pipe_component.parameters(), pipe_loaded_component.parameters()):
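One caveat worth noting with the zip-based comparison in this hunk: `zip()` stops at the shorter iterator, so if the reloaded component somehow ended up with fewer parameters, the extra ones would silently go unchecked. A hedged sketch of a stricter variant using `itertools.zip_longest` (hypothetical helper name, plain values standing in for tensors):

```python
from itertools import zip_longest

_MISSING = object()  # sentinel distinguishable from any real parameter

def params_match(params_a, params_b) -> bool:
    for p1, p2 in zip_longest(params_a, params_b, fillvalue=_MISSING):
        if p1 is _MISSING or p2 is _MISSING:
            return False  # length mismatch that zip() would silently hide
        if p1 != p2:
            return False
    return True
```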
transformers v5
What does this PR do?
This PR is to assess if we can move to transformers main again for our CI. This will also help us migrate to transformers v5 successfully. There are a bunch of inline comments to make the reviewers aware of internal discussions.