Version
v26.1.0
Platform
Darwin Mac.lan 25.4.0 Darwin Kernel Version 25.4.0: Thu Mar 19 19:31:17 PDT 2026; root:xnu-12377.101.15~1/RELEASE_ARM64_T6020 arm64
Subsystem
vm
What steps will reproduce the bug?
Run the following script with node --experimental-vm-modules --expose-gc node-vm-leak.mjs
// node-vm-leak.mjs
import { writeHeapSnapshot } from "node:v8";
import vm from "node:vm";

const ITERATIONS = 10;

// ~80 MB array (10,000,000 elements * 8 bytes each)
const SOURCE = `export const big = new Array(10_000_000).fill(0);`;

async function runIteration(withDynamicImport) {
  const context = vm.createContext({});
  const mod = new vm.SourceTextModule(SOURCE, {
    context,
    ...(withDynamicImport
      ? {
          importModuleDynamically: () => {
            throw new Error("no dynamic imports");
          },
        }
      : {}),
  });
  await mod.link(() => {
    throw new Error("no imports");
  });
  await mod.evaluate();
  // context and mod go out of scope here; the caller drops all
  // JS-side references and then forces GC.
}

console.log("variant: without importModuleDynamically");
for (let i = 0; i < ITERATIONS; i++) {
  await runIteration(false);
  global.gc();
  const mb = (process.memoryUsage().heapUsed / 1024 / 1024).toFixed(1);
  console.log(` iteration ${i + 1}: ${mb} MB heap`);
}

// Write snapshot to disk (avoids the pre-snapshot GC that DevTools triggers).
const snapshotPath = writeHeapSnapshot();
console.log(`\n*** Heap snapshot written to: ${snapshotPath} ***`);

console.log("\nvariant: with importModuleDynamically closure");
for (let i = 0; i < ITERATIONS; i++) {
  await runIteration(true);
  global.gc();
  const mb = (process.memoryUsage().heapUsed / 1024 / 1024).toFixed(1);
  console.log(` iteration ${i + 1}: ${mb} MB heap`);
}
This reproduction is based on https://github.com/eberlitz/jest-leak; thanks to its author for the great minimal Jest reproduction, which was easy to port to plain Node APIs 👍
How often does it reproduce? Is there a required condition?
100% of the time
What is the expected behavior? Why is that the expected behavior?
Heap should stay flat after the first iteration. Once context and mod have gone out of scope, gc() has been called, and no other JS-side references to either object remain, both the vm.Context and the vm.SourceTextModule (and anything reachable from the module's namespace) should be eligible for collection.
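As a side note, "eligible for collection" can be made directly observable with a WeakRef. This is a sketch, not part of the repro: makeTracked is a hypothetical stand-in for one runIteration() call, and in the real script you would wrap context and mod instead. Under --expose-gc, a non-leaking Node should deref to undefined after global.gc().

```javascript
// Sketch: observing whether an object was actually collected.
// `makeTracked` stands in for one runIteration() call.
function makeTracked() {
  const obj = { big: new Array(1_000_000).fill(0) };
  return new WeakRef(obj);
}

const ref = makeTracked();
// Within the same synchronous job the target is guaranteed reachable
// (WeakRef semantics keep it alive until the end of the current job):
const aliveBeforeGc = ref.deref() !== undefined;
// Under --expose-gc, `global.gc(); ref.deref()` should then return
// undefined once the object is truly collectable.
```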
What do you see instead?
Heap grows ~80 MB per iteration and never shrinks:
iteration 1: 79.6 MB heap
iteration 2: 156.0 MB heap
iteration 3: 232.5 MB heap
iteration 4: 308.9 MB heap
iteration 5: 385.4 MB heap
iteration 6: 461.8 MB heap
iteration 7: 538.3 MB heap
iteration 8: 614.7 MB heap
iteration 9: 691.3 MB heap
iteration 10: 767.7 MB heap
A heap snapshot written via v8.writeHeapSnapshot() (to avoid the pre-snapshot GC that DevTools triggers) was loaded into DevTools and the .heapsnapshot file was shared with Claude Code for analysis.
Everything below this point is Claude Code's interpretation of the snapshot. I have not independently verified it and it may contain errors. I'm including it as a starting point for investigation.
Claude Code identified the following retainer chain anchored at SourceTextModule:
SourceTextModule @N
← <symbol> in Error @N (stack trace stored under a symbol)
← [0] in Array @N
← args in {<symbol kAsyncContextFrame>, callback, args} @N
← [1] in Array @N
← list in FixedCircularBuffer @N
← head in FixedQueue @N
← queue in system / Context @N (the vm context)
← context in queueMicrotask() @N
← queueMicrotask in global @N ← GC root
The GC root is the main process global. Its queueMicrotask function closes over a reference to the vm.Context. That context's microtask FixedQueue contains an AsyncContextFrame whose args holds an Error. The Error's stack trace (stored under a symbol) is an array of 7 CallSiteInfo entries. Only one of them — the module evaluation frame — retains significant memory: its slot 1 (the function slot) holds the SourceTextModule directly. In V8's CallSiteInfo, slot 1 is the function of that frame; for a module's top-level evaluation, the "function" is the module itself.
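The "function slot" behavior can be illustrated from plain JS via V8's structured stack trace API. This is a sketch of the mechanism, not the vm code path: Error.prepareStackTrace and the CallSite methods are V8-specific, and getFunction() returns undefined for strict-mode frames, but the engine-internal CallSiteInfo still references each frame's function, which is exactly the edge the snapshot shows.

```javascript
// Sketch: V8 materializes each stack frame as a CallSite whose
// underlying CallSiteInfo references the frame's function. As long
// as the Error (and thus its unformatted stack) is retained, those
// function references are retained too.
function captureFrames() {
  const original = Error.prepareStackTrace;
  // Ask V8 for the structured frames instead of a formatted string.
  Error.prepareStackTrace = (_err, callSites) => callSites;
  const frames = new Error().stack; // triggers lazy stack formatting
  Error.prepareStackTrace = original;
  return frames;
}

const frames = captureFrames();
const topFrameName = frames[0].getFunctionName(); // "captureFrames"
```

For a module's top-level evaluation frame, the analysis above suggests the object in that slot is the SourceTextModule itself rather than an ordinary function.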
The FixedCircularBuffer backing the queue has exactly 2 populated slots: index 0 (a tiny AsyncContextFrame retaining almost nothing) and index 1 (the frame described above, retaining ~80 MB). The remaining ~2046 slots are undefined. So each evaluate() call leaves exactly 2 unprocessed frames in the vm context's microtask queue — one of which pins the SourceTextModule.
After mod.evaluate() resolves and all JS-side variables go out of scope, the vm context and SourceTextModule remain live because:
- the vm context is captured inside the main global's queueMicrotask closure, and
- the vm context's microtask queue still holds an async context frame (created during module evaluation for async stack trace tracking) whose Error stack contains a CallSiteInfo with the SourceTextModule in its function slot.
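One cheap way to probe the stack-capture hypothesis (an untested diagnostic sketch, not a proposed fix): set Error.stackTraceLimit to 0 before the iteration loop. V8 then captures zero CallSiteInfo frames for new Errors, so a retained Error could not pin a module through its stack; if the leak disappears under that setting, it points at the Error-stack edge in the chain above.

```javascript
// Diagnostic sketch: with stackTraceLimit set to 0, V8 records no
// frames at Error construction time, so a retained Error cannot pin
// functions (or modules) through its stack.
const previousLimit = Error.stackTraceLimit;
Error.stackTraceLimit = 0;
const err = new Error("boom");
Error.stackTraceLimit = previousLimit;

// The frame count is fixed at construction, so even though the limit
// was restored, this Error's stack is just the header line.
const hasFrames = err.stack.includes("\n    at ");
```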
Additional information
This is also all Claude Code 😀 Hopefully it helps rather than adding noise 🙂
Full object cluster. A second snapshot view shows the internal V8 SourceTextModule (C++ side, system / SourceTextModule @23563) sitting inside ModuleWrap @23561, which is in turn referenced via <symbol kWrap> on the JS SourceTextModule @63249. The JS wrapper, the C++ ModuleWrap, and the internal V8 module are one cluster — all kept alive by the same microtask queue root. Fixing the JS-side retention fixes all three.
Multiple queueMicrotask retainer paths. The vm context's own queueMicrotask (visible as queueMicrotask in system / Context @36643 and queueMicrotask in {setupTaskQueue, queueMicrotask} @41565) is also a retainer, in addition to the main global's closure. Both paths lead back to the same FixedQueue on the vm context.
Relation to prior fixes. This leak was reported previously in #33439 (2020) and #41101 (2021) and was partially addressed by #48510 (released in v20.8.0) and #46785. Those fixes addressed vm.Script / vm.compileFunction leaks via symbol-based host-defined options. The current leak is distinct: it goes through async stack trace capture (AsyncContextFrame / CallSiteInfo) and the microtask queue infrastructure, not through importModuleDynamically bookkeeping. The issue reproduces identically with or without an importModuleDynamically callback.
Real-world impact. In Jest's ESM mode (--experimental-vm-modules), each test file runs inside its own vm.Context and creates multiple vm.SourceTextModule instances. With --runInBand, heap grows ~80 MB per test file (matching the module namespace size) with no upper bound. A 22-file suite in the linked reproduction climbs from ~100 MB to ~1.4 GB. Jest tears down its registries and nulls the context reference after each test file; there is nothing user-land can do to work around the leak.
Relation to the loader redesign. #62720 proposes a new vm/modules loader API. It's unclear whether that work touches this code path, but I'm flagging it here in case there is overlap.