Commit 61a5116
feat(recipes): add VLM knowledge distillation recipe with chunked KD loss (#2205)
* feat(loss): add chunked KD loss for memory-efficient distillation
Add a ``chunk_size`` knob to ``KDLoss`` that processes valid tokens in
chunks when computing forward KL. Only one ``[chunk_size, vocab_size]``
fp32 probability/log-prob tensor is materialized at a time, which keeps
peak memory bounded for large-vocab VLMs while remaining numerically
identical to the unchunked path (verified by new unit tests).
The TP path is unaffected; chunking is opt-in via ``chunk_size > 0``.
Signed-off-by: khazic <khazzz1c@gmail.com>
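The chunked forward-KL computation described above can be sketched as follows. This is a minimal illustration, not the recipe's actual implementation: the function name, argument layout, and masking convention are assumptions; only the core idea (iterating over valid tokens in ``chunk_size`` slices so that a single ``[chunk, vocab]`` fp32 tensor pair is live at a time) comes from the commit message.

```python
import torch
import torch.nn.functional as F

def chunked_forward_kl(student_logits, teacher_logits, mask, chunk_size=512):
    """Forward KL(teacher || student) averaged over valid tokens.

    Processes valid tokens in chunks so only one [chunk, vocab] fp32
    log-prob/prob tensor pair is materialized at a time; the result is
    numerically identical to the unchunked computation.
    """
    vocab = student_logits.size(-1)
    # Keep only unmasked token positions.
    valid = mask.reshape(-1).bool()
    s = student_logits.reshape(-1, vocab)[valid]
    t = teacher_logits.reshape(-1, vocab)[valid]

    total = s.new_zeros((), dtype=torch.float32)
    n = s.size(0)
    for start in range(0, n, chunk_size):
        # Upcast one chunk at a time to fp32 for a stable softmax.
        sc = s[start:start + chunk_size].float()
        tc = t[start:start + chunk_size].float()
        t_logprob = F.log_softmax(tc, dim=-1)
        s_logprob = F.log_softmax(sc, dim=-1)
        # KL(t || s) = sum_v p_t(v) * (log p_t(v) - log p_s(v))
        total = total + (t_logprob.exp() * (t_logprob - s_logprob)).sum()
    return total / max(n, 1)
```

Chunking trades a Python-level loop for bounded peak memory, which matters most when the vocabulary is large relative to the number of valid tokens per batch.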
* feat(recipes): add VLM knowledge distillation recipe
Add ``KnowledgeDistillationRecipeForVLM`` under ``recipes/vlm/kd.py``.
The recipe extends ``FinetuneRecipeForVLM`` with a frozen teacher
``NeMoAutoModelForImageTextToText``, a KD loss term, and a ``kd_ratio``
linear mix between CE and KD losses.
The training loop forwards multimodal inputs (pixel_values,
image_grid_thw, etc.) to both teacher and student, frees intermediate
activations eagerly to keep peak memory low, and reports CE/KD
sub-losses alongside the combined loss in validation metrics. Pipeline
parallelism is not supported.
Signed-off-by: khazic <khazzz1c@gmail.com>
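The ``kd_ratio`` linear mix between CE and KD losses can be sketched as below. The exact weighting formula used by the recipe is an assumption here; the commit only states that the two terms are linearly mixed.

```python
def mixed_kd_loss(ce_loss: float, kd_loss: float, kd_ratio: float) -> float:
    """Linear interpolation between cross-entropy and KD losses.

    kd_ratio=0.0 recovers plain finetuning; kd_ratio=1.0 trains purely
    against the teacher's distribution. (Weighting convention assumed.)
    """
    return (1.0 - kd_ratio) * ce_loss + kd_ratio * kd_loss
```

Reporting the CE and KD sub-losses separately, as the recipe does in validation metrics, makes it easy to see which term dominates as ``kd_ratio`` is tuned.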
* docs(examples): add Qwen3.5 VLM KD example config
Add an example YAML that distills Qwen3.5-9B (teacher) into Qwen3.5-4B
(student) on the public ``mmoukouba/MedPix-VQA`` medical-image VQA
dataset. The config exercises the chunked KD loss
(``kd_loss_fn.chunk_size: 512``), freezes the student's vision and
audio towers, and uses FSDP2.
Signed-off-by: khazic <khazzz1c@gmail.com>
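A config along these lines might look as follows. This is an illustrative sketch, not the actual file: every key except ``kd_loss_fn.chunk_size: 512`` (which the commit message states) is an assumed field name, and the Hugging Face model IDs are guesses from the teacher/student names.

```yaml
# Hypothetical sketch of the example KD config; schema and model IDs assumed.
model:
  pretrained_model_name_or_path: Qwen/Qwen3.5-4B   # student (ID assumed)
teacher_model:
  pretrained_model_name_or_path: Qwen/Qwen3.5-9B   # teacher (ID assumed)
kd_loss_fn:
  chunk_size: 512        # chunked KD loss, per the commit message
kd_ratio: 0.5            # CE/KD mix; value assumed
dataset:
  path: mmoukouba/MedPix-VQA
```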
---------
Signed-off-by: khazic <khazzz1c@gmail.com>

1 parent adc20e2 · commit 61a5116
4 files changed
Lines changed: 752 additions & 4 deletions
File tree
- examples/vlm_kd/qwen3_5
- nemo_automodel
- components/loss
- recipes/vlm
- tests/unit_tests/loss