
fix(core): wait for replay propagation before suspending #1961

Open
karthikscale3 wants to merge 14 commits into main from kk/schedule-when-idle-repro-test

Conversation


@karthikscale3 karthikscale3 commented May 7, 2026

Summary

Fixes a false-positive WorkflowRuntimeError: Unconsumed event in event log that can happen during high-concurrency replay when many parallel branches advance into follow-up sequential steps.

Problem

pendingDeliveries can briefly drop to 0 after a step result finishes hydrating, before the workflow VM has resumed across the VM boundary and registered callbacks for the next wave of useStep calls.

scheduleWhenIdle treated that transient 0 as truly idle and fired WorkflowSuspension immediately. In that case replay could abort before the next-wave step callback was registered, leaving an existing step_created event unclaimed. The deferred unconsumed-event check then failed the run with WorkflowRuntimeError even though the event log and step execution were valid.

Fix

Update scheduleWhenIdle so idle suspension waits for replay delivery propagation before firing:

  1. Wait for the current promiseQueue to drain.
  2. Yield once with setTimeout(0) so cross-VM promise continuations can run.
  3. Wait for the possibly-updated promiseQueue again.
  4. Re-check pendingDeliveries and loop if new deliveries appeared.
  5. If this idle cycle observed in-flight deliveries, wait the same propagation delay (aliased from EventsConsumer's DEFERRED_CHECK_DELAY_MS) before suspending, then re-check pendingDeliveries one more time.

The non-zero deferred delay only applies after scheduleWhenIdle has seen replay deliveries in flight. That gives follow-up useStep callbacks time to register after hydrated results cross the VM boundary, while preserving the fast suspension path for ordinary new-work scheduling where no replay delivery was active.
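The five steps above can be sketched as a small loop. This is a hedged approximation of the fix, not the `@workflow/core` implementation: the `IdleState` shape, `waitUntilTrulyIdle` name, and wiring are illustrative, and `REPLAY_PROPAGATION_DELAY_MS` is the alias of `DEFERRED_CHECK_DELAY_MS` mentioned in step 5.

```typescript
const REPLAY_PROPAGATION_DELAY_MS = 100; // aliased from DEFERRED_CHECK_DELAY_MS

interface IdleState {
  pendingDeliveries: number;      // in-flight replay deliveries
  promiseQueue: Promise<unknown>; // current promise batch for the run
}

const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function waitUntilTrulyIdle(state: IdleState): Promise<void> {
  let sawDeliveries = false;
  for (;;) {
    await state.promiseQueue;          // 1. drain the current promiseQueue
    await sleep(0);                    // 2. yield so cross-VM continuations run
    await state.promiseQueue;          // 3. drain the possibly-updated queue
    if (state.pendingDeliveries > 0) { // 4. new deliveries appeared: loop
      sawDeliveries = true;
      continue;
    }
    if (sawDeliveries) {               // 5. this cycle saw deliveries in flight:
      sawDeliveries = false;           //    wait out the propagation window
      await sleep(REPLAY_PROPAGATION_DELAY_MS);
      if (state.pendingDeliveries > 0) continue;
    }
    return; // genuinely idle — safe to fire WorkflowSuspension
  }
}
```

Note the fast path: when no deliveries were ever observed, `sawDeliveries` stays `false` and the function returns after a single zero-delay yield, which is why ordinary new-work suspensions pay no extra latency.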

Latency note

The added wait is bounded to "idle cycles that observed deliveries", which in practice is at most one per replay round per scheduling site, not per step. Cold-start single-step workflows and ordinary new-work suspensions are unaffected. Worth watching CI/prod data, but it should not be a meaningful added cost relative to the I/O involved in writing step_created events.

Why no cancellation hook

Unlike EventsConsumer's deferred unconsumed-event check, this propagation timer is intentionally not cancelled when a follow-up useStep/hook/sleep registers during the wait. If a callback arrives mid-wait and consumes the pending *_created event, the suspension still fires after the delay, but it is harmless: the matching invocation already has hasCreatedEvent=true, so the suspension handler does not re-create the step and the run simply continues replay from the persisted log.
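The harmlessness argument rests on the suspension handler being idempotent per invocation. A minimal sketch, with hypothetical names (`Invocation`, `handleSuspension`): invocations whose `*_created` event was already claimed are skipped, so a late, uncancelled suspension does not duplicate events.

```typescript
interface Invocation {
  stepId: string;
  hasCreatedEvent: boolean; // true once a step_created event is persisted
}

// Illustrative suspension handler: only invocations without a persisted
// step_created event cause a new event to be written.
function handleSuspension(invocations: Invocation[], eventLog: string[]): void {
  for (const inv of invocations) {
    if (inv.hasCreatedEvent) continue; // claimed mid-wait — no-op
    eventLog.push(`step_created:${inv.stepId}`);
    inv.hasCreatedEvent = true;
  }
}
```

Calling the handler a second time after the delay leaves the event log unchanged, which is the property that makes cancellation unnecessary.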

Tests

  • Added a focused unit test (packages/core/src/private.test.ts) for the scheduleWhenIdle state machine: covers (a) the fast path firing immediately when no deliveries are observed, (b) the deferred path firing after the propagation window when deliveries were observed, (c) re-looping when pendingDeliveries reappears mid-wait, and (d) continued polling while deliveries persist.
  • Added a new e2e regression workflow 96_many_steps.ts::concurrentMultiWaveWorkflow with concurrent branches and multiple sequential waves. The workflow lives in workbench/example/workflows/ (canonical) with symlinks across the rest of the matrix.
  • The stress e2e is intentionally limited to the Vercel-backed nextjs-turbopack lane (APP_NAME=nextjs-turbopack with WORKFLOW_VERCEL_ENV set), which is the representative adapter/environment for the production-shaped replay race. What we observed in CI when it ran more broadly: local dev/prod/postgres and Windows lanes timed out at the 600s test limit with the workflow still running, while some full-matrix Vercel adapter lanes failed inside the stress workflow with Cannot read properties of undefined (reading 'map'). Even on Vercel Turbopack, the original 45-item x 3-rep workload could run for ~40 minutes, so the regression now uses a smaller 12-item x 2-rep workload with 2-3s stragglers to preserve the multi-wave replay shape without making required CI depend on an oversized stress run.
  • Verified pnpm --filter @workflow/core build and pnpm --filter @workflow/core test pass locally.
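The getter-based `pendingDeliveries` stub used in the unit tests can be sketched as follows (names illustrative, not the actual test helper): each read pops the next scripted value, so a polling loop converges on a terminal value deterministically instead of re-arming 0ms timers until the fake-timer loop limit trips.

```typescript
// Returns an object whose pendingDeliveries getter walks through the
// scripted values, with the last value sticking once the script is spent.
function deliveryScript(values: number[]) {
  const remaining = [...values];
  return {
    get pendingDeliveries(): number {
      return remaining.length > 1 ? remaining.shift()! : remaining[0];
    },
  };
}
```

A loop polling such a stub with script `[2, 1, 0]` reads `2`, then `1`, then `0` forever, so an assertion like "re-loops while deliveries persist, then suspends" needs no wall-clock waiting.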

Adds an e2e regression test that exercises 50 concurrent items each running
two sequential steps (search → addResult). The pattern produces the timing
skew that triggers the scheduleWhenIdle premature-suspension race seen in
production runs (e.g. wrun_01KQ05J17ZJHGZFRYZ20QM1DBS, where 250 steps
completed cleanly server-side but the workflow still failed with
WorkflowRuntimeError "Unconsumed event in event log" due to scheduleWhenIdle
firing WorkflowSuspension before the addResult callback could be registered).

The race only manifests reliably when flow handlers run across separate
function invocations, so this test should be evaluated against a real Vercel
deployment (which CI does for the nextjs-turbopack matrix entry).

Co-authored-by: Cursor <cursoragent@cursor.com>

changeset-bot Bot commented May 7, 2026

🦋 Changeset detected

Latest commit: d48b6df

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 18 packages
Name Type
@workflow/core Patch
@workflow/builders Patch
@workflow/cli Patch
@workflow/next Patch
@workflow/nitro Patch
@workflow/vitest Patch
@workflow/web-shared Patch
@workflow/web Patch
workflow Patch
@workflow/world-testing Patch
tarballs Patch
@workflow/astro Patch
@workflow/nest Patch
@workflow/rollup Patch
@workflow/sveltekit Patch
@workflow/vite Patch
@workflow/nuxt Patch
@workflow/ai Patch



vercel Bot commented May 7, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project Deployment Actions Updated (UTC)
example-nextjs-workflow-turbopack Ready Ready Preview, Comment May 8, 2026 1:47am
example-nextjs-workflow-webpack Ready Ready Preview, Comment May 8, 2026 1:47am
example-workflow Ready Ready Preview, Comment May 8, 2026 1:47am
workbench-astro-workflow Ready Ready Preview, Comment May 8, 2026 1:47am
workbench-express-workflow Ready Ready Preview, Comment May 8, 2026 1:47am
workbench-fastify-workflow Ready Ready Preview, Comment May 8, 2026 1:47am
workbench-hono-workflow Ready Ready Preview, Comment May 8, 2026 1:47am
workbench-nitro-workflow Ready Ready Preview, Comment May 8, 2026 1:47am
workbench-nuxt-workflow Ready Ready Preview, Comment May 8, 2026 1:47am
workbench-sveltekit-workflow Ready Ready Preview, Comment May 8, 2026 1:47am
workbench-tanstack-start-workflow Ready Ready Preview, Comment May 8, 2026 1:47am
workbench-vite-workflow Ready Ready Preview, Comment May 8, 2026 1:47am
workflow-docs Ready Ready Preview, Comment, Open in v0 May 8, 2026 1:47am
workflow-swc-playground Ready Ready Preview, Comment May 8, 2026 1:47am
workflow-tarballs Ready Ready Preview, Comment May 8, 2026 1:47am
workflow-web Ready Ready Preview, Comment May 8, 2026 1:47am


github-actions Bot commented May 7, 2026

📊 Benchmark Results

📈 Comparing against baseline from main branch. Green 🟢 = faster, Red 🔺 = slower.

workflow with no steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Express 0.031s (-30.9% 🟢) 1.005s (~) 0.974s 10 1.00x
💻 Local Nitro 0.033s (-23.7% 🟢) 1.005s (~) 0.972s 10 1.08x
🐘 Postgres Express 0.045s (-23.1% 🟢) 1.012s (~) 0.968s 10 1.46x
🐘 Postgres Nitro 0.050s (-47.9% 🟢) 1.011s (-3.1%) 0.961s 10 1.62x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 0.325s (-20.6% 🟢) 2.412s (-3.9%) 2.086s 10 1.00x
▲ Vercel Next.js (Turbopack) 0.372s (+47.8% 🔺) 2.722s (+16.7% 🔺) 2.350s 10 1.14x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 1 step

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Express 1.068s (-5.1% 🟢) 2.007s (~) 0.939s 10 1.00x
💻 Local Nitro 1.070s (-5.4% 🟢) 2.006s (~) 0.935s 10 1.00x
🐘 Postgres Nitro 1.082s (-5.1% 🟢) 2.009s (~) 0.927s 10 1.01x
🐘 Postgres Express 1.088s (-5.1% 🟢) 2.009s (~) 0.921s 10 1.02x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 1.673s (-57.0% 🟢) 3.731s (-36.9% 🟢) 2.058s 10 1.00x
▲ Vercel Next.js (Turbopack) 3.249s (+59.6% 🔺) 4.694s (+22.5% 🔺) 1.445s 10 1.94x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 10 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Express 10.403s (-4.7%) 11.020s (~) 0.617s 3 1.00x
🐘 Postgres Express 10.406s (-5.1% 🟢) 11.014s (~) 0.608s 3 1.00x
💻 Local Nitro 10.437s (-4.7%) 11.023s (~) 0.586s 3 1.00x
🐘 Postgres Nitro 10.443s (-3.9%) 11.018s (~) 0.575s 3 1.00x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 14.066s (-40.7% 🟢) 16.052s (-36.1% 🟢) 1.986s 2 1.00x
▲ Vercel Next.js (Turbopack) 15.825s (-8.6% 🟢) 17.702s (-8.7% 🟢) 1.877s 2 1.13x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 25 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 13.464s (-7.7% 🟢) 14.019s (-6.7% 🟢) 0.555s 5 1.00x
🐘 Postgres Express 13.465s (-7.7% 🟢) 14.015s (-6.7% 🟢) 0.550s 5 1.00x
💻 Local Express 13.479s (-10.0% 🟢) 14.028s (-6.7% 🟢) 0.549s 5 1.00x
💻 Local Nitro 13.484s (-10.5% 🟢) 14.027s (-12.5% 🟢) 0.543s 5 1.00x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 22.242s (-65.5% 🟢) 24.128s (-63.8% 🟢) 1.886s 3 1.00x
▲ Vercel Next.js (Turbopack) 24.546s (-53.3% 🟢) 26.260s (-51.9% 🟢) 1.714s 3 1.10x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 50 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 11.876s (-15.2% 🟢) 12.015s (-17.7% 🟢) 0.140s 8 1.00x
💻 Local Express 11.923s (-28.2% 🟢) 12.146s (-28.7% 🟢) 0.222s 8 1.00x
🐘 Postgres Nitro 11.982s (-14.2% 🟢) 12.269s (-14.3% 🟢) 0.287s 8 1.01x
💻 Local Nitro 11.984s (-28.6% 🟢) 12.147s (-28.7% 🟢) 0.163s 8 1.01x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 36.446s (-91.4% 🟢) 39.409s (-90.7% 🟢) 2.963s 3 1.00x
▲ Vercel Next.js (Turbopack) 36.667s (-90.7% 🟢) 38.771s (-90.2% 🟢) 2.104s 3 1.01x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

Promise.all with 10 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 1.149s (-9.8% 🟢) 2.007s (~) 0.857s 15 1.00x
🐘 Postgres Express 1.150s (-8.8% 🟢) 2.007s (~) 0.858s 15 1.00x
💻 Local Nitro 1.160s (-28.9% 🟢) 2.006s (-3.3%) 0.846s 15 1.01x
💻 Local Express 1.175s (-21.1% 🟢) 2.005s (~) 0.830s 15 1.02x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 10.254s (+263.9% 🔺) 11.628s (+169.0% 🔺) 1.374s 3 1.00x
▲ Vercel Next.js (Turbopack) 10.858s (+219.6% 🔺) 12.453s (+152.5% 🔺) 1.595s 3 1.06x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

Promise.all with 25 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 1.223s (-48.0% 🟢) 2.007s (-33.3% 🟢) 0.784s 15 1.00x
🐘 Postgres Express 1.223s (-48.2% 🟢) 2.007s (-33.3% 🟢) 0.784s 15 1.00x
💻 Local Nitro 1.644s (-47.7% 🟢) 2.005s (-48.4% 🟢) 0.361s 15 1.35x
💻 Local Express 1.726s (-41.5% 🟢) 2.005s (-41.9% 🟢) 0.279s 15 1.41x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 13.376s (+230.1% 🔺) 15.372s (+159.6% 🔺) 1.996s 2 1.00x
▲ Vercel Next.js (Turbopack) 18.616s (+162.2% 🔺) 20.803s (+133.6% 🔺) 2.186s 2 1.39x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

Promise.all with 50 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 1.364s (-60.8% 🟢) 2.007s (-49.9% 🟢) 0.643s 15 1.00x
🐘 Postgres Express 1.382s (-60.4% 🟢) 2.007s (-49.9% 🟢) 0.625s 15 1.01x
💻 Local Nitro 4.381s (-47.5% 🟢) 5.012s (-44.4% 🟢) 0.630s 6 3.21x
💻 Local Express 4.764s (-42.9% 🟢) 5.349s (-40.7% 🟢) 0.585s 6 3.49x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 16.869s (+89.2% 🔺) 19.747s (+80.2% 🔺) 2.878s 2 1.00x
▲ Vercel Nitro 23.256s (+559.7% 🔺) 25.265s (+356.6% 🔺) 2.009s 2 1.38x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Next.js (Turbopack) | Nitro

Promise.race with 10 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 1.152s (-8.3% 🟢) 2.008s (~) 0.856s 15 1.00x
🐘 Postgres Nitro 1.152s (-8.3% 🟢) 2.006s (~) 0.854s 15 1.00x
💻 Local Nitro 1.263s (-32.3% 🟢) 2.006s (-14.3% 🟢) 0.743s 15 1.10x
💻 Local Express 1.324s (-30.1% 🟢) 2.006s (-15.2% 🟢) 0.682s 15 1.15x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 7.406s (+201.2% 🔺) 9.221s (+121.1% 🔺) 1.814s 5 1.00x
▲ Vercel Next.js (Turbopack) 11.291s (+285.1% 🔺) 13.153s (+183.3% 🔺) 1.862s 3 1.52x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

Promise.race with 25 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 1.224s (-47.7% 🟢) 2.008s (-33.3% 🟢) 0.783s 15 1.00x
🐘 Postgres Nitro 1.235s (-47.2% 🟢) 2.008s (-33.3% 🟢) 0.773s 15 1.01x
💻 Local Nitro 1.665s (-45.7% 🟢) 2.005s (-48.4% 🟢) 0.341s 15 1.36x
💻 Local Express 1.958s (-37.5% 🟢) 2.315s (-38.5% 🟢) 0.357s 13 1.60x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 12.517s (+298.3% 🔺) 14.452s (+219.6% 🔺) 1.935s 3 1.00x
▲ Vercel Nitro 17.695s (+447.3% 🔺) 19.413s (+282.4% 🔺) 1.719s 2 1.41x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Next.js (Turbopack) | Nitro

Promise.race with 50 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 1.357s (-61.0% 🟢) 2.007s (-49.9% 🟢) 0.649s 15 1.00x
🐘 Postgres Express 1.383s (-60.5% 🟢) 2.007s (-50.0% 🟢) 0.624s 15 1.02x
💻 Local Nitro 4.807s (-47.4% 🟢) 5.513s (-45.0% 🟢) 0.706s 6 3.54x
💻 Local Express 5.885s (-33.1% 🟢) 6.413s (-30.8% 🟢) 0.528s 5 4.34x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 24.697s (+384.9% 🔺) 27.129s (+298.0% 🔺) 2.432s 2 1.00x
▲ Vercel Next.js (Turbopack) 43.516s (+544.0% 🔺) 46.084s (+439.4% 🔺) 2.568s 1 1.76x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 10 sequential data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 0.450s (-46.4% 🟢) 1.006s (-1.7%) 0.557s 60 1.00x
🐘 Postgres Nitro 0.451s (-45.0% 🟢) 1.006s (~) 0.555s 60 1.00x
💻 Local Nitro 0.485s (-50.5% 🟢) 1.004s (-8.2% 🟢) 0.519s 60 1.08x
💻 Local Express 0.493s (-49.9% 🟢) 1.004s (-6.7% 🟢) 0.511s 60 1.10x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 5.353s (-75.7% 🟢) 7.089s (-70.5% 🟢) 1.737s 9 1.00x
▲ Vercel Next.js (Turbopack) 7.381s (-49.1% 🟢) 9.282s (-42.3% 🟢) 1.901s 7 1.38x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 25 sequential data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 1.045s (-47.1% 🟢) 1.532s (-32.1% 🟢) 0.487s 59 1.00x
🐘 Postgres Nitro 1.052s (-45.4% 🟢) 1.738s (-17.3% 🟢) 0.686s 52 1.01x
💻 Local Nitro 1.188s (-60.8% 🟢) 2.006s (-46.6% 🟢) 0.817s 45 1.14x
💻 Local Express 1.195s (-60.4% 🟢) 2.005s (-44.1% 🟢) 0.810s 45 1.14x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 13.585s (-65.6% 🟢) 15.611s (-62.2% 🟢) 2.025s 6 1.00x
▲ Vercel Next.js (Turbopack) 17.422s (-65.0% 🟢) 19.553s (-62.2% 🟢) 2.131s 5 1.28x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 50 sequential data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 2.060s (-48.4% 🟢) 2.477s (-43.3% 🟢) 0.417s 49 1.00x
🐘 Postgres Nitro 2.107s (-48.6% 🟢) 2.735s (-40.6% 🟢) 0.627s 44 1.02x
💻 Local Nitro 2.667s (-71.3% 🟢) 3.007s (-70.0% 🟢) 0.340s 40 1.29x
💻 Local Express 2.672s (-71.0% 🟢) 3.007s (-70.0% 🟢) 0.335s 40 1.30x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 42.025s (-56.6% 🟢) 43.780s (-55.5% 🟢) 1.755s 3 1.00x
▲ Vercel Next.js (Turbopack) 49.976s (-53.4% 🟢) 52.401s (-51.9% 🟢) 2.425s 3 1.19x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 10 concurrent data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 0.187s (-33.9% 🟢) 1.006s (~) 0.819s 60 1.00x
🐘 Postgres Nitro 0.187s (-34.0% 🟢) 1.006s (~) 0.819s 60 1.00x
💻 Local Nitro 0.424s (-30.0% 🟢) 1.004s (-1.7%) 0.580s 60 2.27x
💻 Local Express 0.509s (-9.1% 🟢) 1.095s (+9.0% 🔺) 0.586s 55 2.73x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 5.153s (+210.2% 🔺) 6.714s (+100.4% 🔺) 1.561s 9 1.00x
▲ Vercel Next.js (Turbopack) 8.184s (+304.6% 🔺) 9.790s (+158.1% 🔺) 1.607s 7 1.59x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 25 concurrent data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 0.307s (-39.9% 🟢) 1.006s (~) 0.700s 90 1.00x
🐘 Postgres Nitro 0.311s (-37.3% 🟢) 1.006s (~) 0.695s 90 1.02x
💻 Local Express 2.129s (-15.3% 🟢) 2.820s (-6.3% 🟢) 0.691s 32 6.95x
💻 Local Nitro 2.190s (-13.7% 🟢) 2.766s (-8.1% 🟢) 0.576s 33 7.14x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 10.957s (+239.7% 🔺) 12.908s (+167.7% 🔺) 1.951s 7 1.00x
▲ Vercel Next.js (Turbopack) 14.595s (+312.8% 🔺) 16.392s (+215.7% 🔺) 1.797s 6 1.33x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

workflow with 50 concurrent data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 0.631s (-22.9% 🟢) 1.006s (-1.1%) 0.375s 120 1.00x
🐘 Postgres Nitro 0.641s (-18.9% 🟢) 1.006s (~) 0.365s 120 1.01x
💻 Local Nitro 9.378s (-16.2% 🟢) 10.023s (-14.1% 🟢) 0.645s 12 14.86x
💻 Local Express 9.809s (-12.3% 🟢) 10.278s (-13.9% 🟢) 0.469s 12 15.54x
💻 Local Next.js (Turbopack) ⚠️ missing - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 31.258s (+202.7% 🔺) 33.058s (+169.1% 🔺) 1.800s 5 1.00x
▲ Vercel Nitro 33.272s (+330.8% 🔺) 35.668s (+279.4% 🔺) 2.395s 4 1.06x
▲ Vercel Express ⚠️ missing - - - -

🔍 Observability: Next.js (Turbopack) | Nitro

Stream Benchmarks (includes TTFB metrics)
workflow with stream

💻 Local Development

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Express 1.130s (+467.6% 🔺) 2.005s (+99.6% 🔺) 0.012s (-4.1%) 2.019s (+98.3% 🔺) 0.889s 10 1.00x
💻 Local Nitro 1.130s (+428.9% 🔺) 2.005s (+99.6% 🔺) 0.010s (-20.0% 🟢) 2.017s (+98.0% 🔺) 0.886s 10 1.00x
🐘 Postgres Nitro 1.130s (+451.4% 🔺) 2.000s (+100.1% 🔺) 0.001s (-6.7% 🟢) 2.009s (+98.7% 🔺) 0.879s 10 1.00x
🐘 Postgres Express 1.133s (+452.3% 🔺) 1.996s (+99.9% 🔺) 0.001s (-25.0% 🟢) 2.010s (+98.7% 🔺) 0.877s 10 1.00x
💻 Local Next.js (Turbopack) ⚠️ missing - - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - - -

▲ Production (Vercel)

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 4.407s (+15.0% 🔺) 5.676s (+7.6% 🔺) 1.595s (+114.9% 🔺) 7.764s (+19.8% 🔺) 3.357s 10 1.00x
▲ Vercel Next.js (Turbopack) 5.896s (-14.0% 🟢) 6.205s (-28.3% 🟢) 0.845s (+33.7% 🔺) 8.572s (-12.4% 🟢) 2.676s 10 1.34x
▲ Vercel Express ⚠️ missing - - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

stream pipeline with 5 transform steps (1MB)

💻 Local Development

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 1.492s (+136.8% 🔺) 2.005s (+99.2% 🔺) 0.004s (-1.7%) 2.026s (+98.0% 🔺) 0.534s 30 1.00x
🐘 Postgres Nitro 1.507s (+141.5% 🔺) 2.003s (+98.9% 🔺) 0.004s (-7.3% 🟢) 2.024s (+98.0% 🔺) 0.517s 30 1.01x
💻 Local Nitro 1.515s (+80.7% 🔺) 2.010s (+98.6% 🔺) 0.010s (+6.0% 🔺) 2.022s (+81.2% 🔺) 0.507s 30 1.02x
💻 Local Express 1.525s (+101.5% 🔺) 2.011s (+95.5% 🔺) 0.009s (-3.0%) 2.022s (+94.4% 🔺) 0.496s 30 1.02x
💻 Local Next.js (Turbopack) ⚠️ missing - - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - - -

▲ Production (Vercel)

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 9.822s (-66.6% 🟢) 11.275s (-63.4% 🟢) 0.221s (+97.5% 🔺) 12.050s (-62.1% 🟢) 2.229s 5 1.00x
▲ Vercel Next.js (Turbopack) 15.832s (-6.4% 🟢) 16.140s (-11.5% 🟢) 0.533s (+152.2% 🔺) 18.313s (-3.3%) 2.481s 4 1.61x
▲ Vercel Express ⚠️ missing - - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

10 parallel streams (1MB each)

💻 Local Development

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 0.649s (-32.5% 🟢) 1.030s (-19.4% 🟢) 0.000s (-60.3% 🟢) 1.050s (-19.6% 🟢) 0.402s 58 1.00x
🐘 Postgres Nitro 0.672s (-30.6% 🟢) 1.048s (-16.0% 🟢) 0.000s (-100.0% 🟢) 1.060s (-15.7% 🟢) 0.388s 57 1.04x
💻 Local Express 1.326s (+8.3% 🔺) 2.015s (~) 0.000s (-90.0% 🟢) 2.016s (~) 0.690s 30 2.04x
💻 Local Nitro 1.354s (+10.7% 🔺) 2.014s (~) 0.000s (+66.7% 🔺) 2.016s (~) 0.662s 30 2.09x
💻 Local Next.js (Turbopack) ⚠️ missing - - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - - -

▲ Production (Vercel)

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 7.187s (+135.6% 🔺) 8.367s (+90.5% 🔺) 0.000s (-100.0% 🟢) 8.864s (+84.3% 🔺) 1.677s 7 1.00x
▲ Vercel Next.js (Turbopack) 11.782s (+15.7% 🔺) 11.923s (+3.5%) 0.000s (NaN%) 13.749s (+14.1% 🔺) 1.968s 5 1.64x
▲ Vercel Express ⚠️ missing - - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

fan-out fan-in 10 streams (1MB each)

💻 Local Development

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 1.293s (-27.8% 🟢) 1.991s (-7.0% 🟢) 0.000s (-6.7% 🟢) 2.055s (-5.5% 🟢) 0.762s 30 1.00x
🐘 Postgres Express 1.320s (-25.5% 🟢) 2.065s (-5.2% 🟢) 0.000s (NaN%) 2.084s (-5.2% 🟢) 0.764s 29 1.02x
💻 Local Express 3.057s (-11.8% 🟢) 3.737s (-7.4% 🟢) 0.001s (-4.4%) 3.741s (-7.3% 🟢) 0.683s 17 2.36x
💻 Local Nitro 3.097s (-8.6% 🟢) 3.903s (-3.2%) 0.001s (+99.2% 🔺) 3.906s (-3.2%) 0.809s 16 2.40x
💻 Local Next.js (Turbopack) ⚠️ missing - - - - -
🐘 Postgres Next.js (Turbopack) ⚠️ missing - - - - -

▲ Production (Vercel)

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 11.293s (+175.9% 🔺) 12.869s (+139.5% 🔺) 0.000s (-100.0% 🟢) 13.362s (+130.6% 🔺) 2.069s 5 1.00x
▲ Vercel Next.js (Turbopack) 19.855s (+253.5% 🔺) 21.126s (+202.6% 🔺) 0.000s (-100.0% 🟢) 21.577s (+186.2% 🔺) 1.722s 4 1.76x
▲ Vercel Express ⚠️ missing - - - - -

🔍 Observability: Nitro | Next.js (Turbopack)

Summary

Fastest Framework by World

Winner determined by most benchmark wins

World 🥇 Fastest Framework Wins
💻 Local Nitro 12/21
🐘 Postgres Express 13/21
▲ Vercel Nitro 18/21
Fastest World by Framework

Winner determined by most benchmark wins

Framework 🥇 Fastest World Wins
Express 🐘 Postgres 17/21
Next.js (Turbopack) ▲ Vercel 21/21
Nitro 🐘 Postgres 17/21
Column Definitions
  • Workflow Time: Runtime reported by workflow (completedAt - createdAt) - primary metric
  • TTFB: Time to First Byte - time from workflow start until first stream byte received (stream benchmarks only)
  • Slurp: Time from first byte to complete stream consumption (stream benchmarks only)
  • Wall Time: Total testbench time (trigger workflow + poll for result)
  • Overhead: Testbench overhead (Wall Time - Workflow Time)
  • Samples: Number of benchmark iterations run
  • vs Fastest: How much slower compared to the fastest configuration for this benchmark

Worlds:

  • 💻 Local: In-memory filesystem world (local development)
  • 🐘 Postgres: PostgreSQL database world (local development)
  • ▲ Vercel: Vercel production/preview deployment
  • 🌐 Turso: Community world (local development)
  • 🌐 MongoDB: Community world (local development)
  • 🌐 Redis: Community world (local development)
  • 🌐 Jazz: Community world (local development)

📋 View full workflow run


Some benchmark jobs failed:

  • Local: success
  • Postgres: success
  • Vercel: failure

Check the workflow run for details.


github-actions Bot commented May 7, 2026

🧪 E2E Test Results

All tests passed

Summary

Passed Failed Skipped Total
✅ ▲ Vercel Production 1201 0 229 1430
✅ 💻 Local Development 1587 0 233 1820
✅ 📦 Local Production 1587 0 233 1820
✅ 🐘 Local Postgres 1587 0 233 1820
✅ 🪟 Windows 129 0 1 130
✅ 📋 Other 727 0 183 910
Total 6818 0 1112 7930

Details by Category

✅ ▲ Vercel Production
App Passed Failed Skipped
✅ astro 103 0 27
✅ example 103 0 27
✅ express 103 0 27
✅ fastify 103 0 27
✅ hono 103 0 27
✅ nextjs-turbopack 128 0 2
✅ nextjs-webpack 127 0 3
✅ nitro 103 0 27
✅ nuxt 103 0 27
✅ sveltekit 122 0 8
✅ vite 103 0 27
✅ 💻 Local Development
App Passed Failed Skipped
✅ astro-stable 104 0 26
✅ express-stable 104 0 26
✅ fastify-stable 104 0 26
✅ hono-stable 104 0 26
✅ nextjs-turbopack-canary 110 0 20
✅ nextjs-turbopack-stable-lazy-discovery-disabled 129 0 1
✅ nextjs-turbopack-stable-lazy-discovery-enabled 129 0 1
✅ nextjs-webpack-canary 110 0 20
✅ nextjs-webpack-stable-lazy-discovery-disabled 129 0 1
✅ nextjs-webpack-stable-lazy-discovery-enabled 129 0 1
✅ nitro-stable 104 0 26
✅ nuxt-stable 104 0 26
✅ sveltekit-stable 123 0 7
✅ vite-stable 104 0 26
✅ 📦 Local Production
App Passed Failed Skipped
✅ astro-stable 104 0 26
✅ express-stable 104 0 26
✅ fastify-stable 104 0 26
✅ hono-stable 104 0 26
✅ nextjs-turbopack-canary 110 0 20
✅ nextjs-turbopack-stable-lazy-discovery-disabled 129 0 1
✅ nextjs-turbopack-stable-lazy-discovery-enabled 129 0 1
✅ nextjs-webpack-canary 110 0 20
✅ nextjs-webpack-stable-lazy-discovery-disabled 129 0 1
✅ nextjs-webpack-stable-lazy-discovery-enabled 129 0 1
✅ nitro-stable 104 0 26
✅ nuxt-stable 104 0 26
✅ sveltekit-stable 123 0 7
✅ vite-stable 104 0 26
✅ 🐘 Local Postgres
App Passed Failed Skipped
✅ astro-stable 104 0 26
✅ express-stable 104 0 26
✅ fastify-stable 104 0 26
✅ hono-stable 104 0 26
✅ nextjs-turbopack-canary 110 0 20
✅ nextjs-turbopack-stable-lazy-discovery-disabled 129 0 1
✅ nextjs-turbopack-stable-lazy-discovery-enabled 129 0 1
✅ nextjs-webpack-canary 110 0 20
✅ nextjs-webpack-stable-lazy-discovery-disabled 129 0 1
✅ nextjs-webpack-stable-lazy-discovery-enabled 129 0 1
✅ nitro-stable 104 0 26
✅ nuxt-stable 104 0 26
✅ sveltekit-stable 123 0 7
✅ vite-stable 104 0 26
✅ 🪟 Windows
App Passed Failed Skipped
✅ nextjs-turbopack 129 0 1
✅ 📋 Other
App Passed Failed Skipped
✅ e2e-local-dev-nest-stable 104 0 26
✅ e2e-local-dev-tanstack-start- 104 0 26
✅ e2e-local-postgres-nest-stable 104 0 26
✅ e2e-local-postgres-tanstack-start- 104 0 26
✅ e2e-local-prod-nest-stable 104 0 26
✅ e2e-local-prod-tanstack-start- 104 0 26
✅ e2e-vercel-prod-tanstack-start 103 0 27

📋 View full workflow run

Replaces the simple 50-item × 2-step pattern with the production failure
pattern: 80 concurrent items each running 5 nested waves of steps (parallel
search reps → sequential addResult → sequential getProjectResults → parallel
exa-source loop → sequential getToday + parallel fetchStatus). A few items
per wave 1 are stragglers whose searchStep lags 10-15s behind the rest of
the batch, mirroring the T97/T9T/T9V pattern from production run
wrun_01KQ05J17ZJHGZFRYZ20QM1DBS.

This timing skew is what triggers scheduleWhenIdle to fire WorkflowSuspension
in the gap between fast hydrations completing (pendingDeliveries → 0) and
the next useStep callback registering, leaving the next-wave step's
step_created event unclaimed → WorkflowRuntimeError.

Co-authored-by: Cursor <cursoragent@cursor.com>
Co-authored-by: Cursor <cursoragent@cursor.com>
karthikscale3 and others added 9 commits May 7, 2026 13:52
Co-authored-by: Cursor <cursoragent@cursor.com>
Co-authored-by: Cursor <cursoragent@cursor.com>
Co-authored-by: Cursor <cursoragent@cursor.com>
Adds focused unit tests for scheduleWhenIdle's fast and deferred paths,
plus a local REPLAY_PROPAGATION_DELAY_MS alias and a doc note explaining
why the deferred timer does not need cancellation when a follow-up
useStep registers mid-wait.

Co-authored-by: Cursor <cursoragent@cursor.com>
Move 96_many_steps.ts to the canonical workbench/example/workflows
location and add the matching symlinks in every workbench whose
workflows directory is a real directory (nextjs-turbopack,
nextjs-webpack, nitro-v3, sveltekit). Adapters whose workflows
directory is itself a symlink chain (express/fastify/hono/nitro/
nitro-v2/nuxt/tanstack-start/vite via nitro-v3, astro via sveltekit,
nest via example) pick the file up automatically through the chain.

Drop the `test.skipIf(APP_NAME !== 'nextjs-turbopack')` guard so
every adapter in the e2e matrix exercises the regression workflow.

Co-authored-by: Cursor <cursoragent@cursor.com>
Switches private.test.ts to vi.useFakeTimers + advanceTimersByTimeAsync
so the state-machine assertions are deterministic and instant rather
than wall-clock dependent. Uses a getter-based pendingDeliveries stub
to avoid sinon-fake-timers' loopLimit when polling iterations would
otherwise persist across many 0ms re-polls.

Also adds a TODO next to REPLAY_PROPAGATION_DELAY_MS pointing at a
future deterministic VM-resumption-in-flight counter so the 100ms
heuristic isn't load-bearing forever.

Co-authored-by: Cursor <cursoragent@cursor.com>
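The getter-with-sequence idea reads roughly like this (illustrative names and shape; the real stub lives in private.test.ts). Each read of pendingDeliveries consumes the next value, and the final value repeats, so an arbitrary number of 0ms re-polls never exhausts the sequence or trips a fake-timer loop limit.

```typescript
// Install a getter-backed pendingDeliveries on a context object. Successive
// reads walk the sequence; the last entry repeats forever.
function stubPendingDeliveries(ctx: object, sequence: number[]): void {
  let i = 0;
  Object.defineProperty(ctx, 'pendingDeliveries', {
    configurable: true,
    get() {
      const value = sequence[Math.min(i, sequence.length - 1)];
      i += 1;
      return value;
    },
  });
}
```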
Keep the high-concurrency replay stress case on the representative Next.js Turbopack lane after CI showed the full adapter matrix timing out or failing under the added load.

Co-authored-by: Cursor <cursoragent@cursor.com>
Limit the high-concurrency replay regression to the Vercel-backed Next.js Turbopack lane so local and Windows matrices do not time out under the stress workload.

Co-authored-by: Cursor <cursoragent@cursor.com>
Reduce the regression workload to keep the Vercel Turbopack lane mergeable while preserving the same multi-wave replay shape.

Co-authored-by: Cursor <cursoragent@cursor.com>

@TooTallNate TooTallNate left a comment


Comment-only review

Looks good overall — the fix logic is sound and the test setup is well-instrumented. A few small observations, none blocking.

What I verified

Diagnosis matches the fix. The race-window argument in the PR description is exactly right: between when host-side delivery decrements pendingDeliveries and when the workflow VM re-enters and registers the next-wave useStep callback, there's a synchronous window where pendingDeliveries === 0 but the workflow is not idle. The fix mirrors the existing pattern in EventsConsumer.deferredCheck (packages/core/src/events-consumer.ts:130-150) and reuses the same DEFERRED_CHECK_DELAY_MS = 100 budget — keeping the two propagation guards in sync via a single source of truth, with a TODO for replacing the wall-clock heuristic with a deterministic counter later. Sensible.

State-machine traversal: walked through each combination (fast path / observed-then-drained / drained-then-reappeared mid-wait) and the logic is correct. runWhenStillIdle does the queue → setTimeout(0) → queue dance to flush cross-VM microtasks before declaring idle, and fireWhenReady adds the propagation delay only when sawPendingDeliveries is set, preserving the fast suspension path for ordinary new-work scheduling.

Tests pass locally: 944 / 944 in @workflow/core, 3 / 3 in the new private.test.ts. The fake-timer harness is well-thought-out — the stubPendingDeliveries getter-with-sequence pattern sidesteps the loopLimit issue with infinite 0ms polling, and the DRAIN_MS = floor(DEFERRED_CHECK_DELAY_MS / 2) constant is the right approach to drive the microtask chain past internal hops without crossing the propagation boundary.

E2E regression is appropriately scoped: test.skipIf(APP_NAME !== 'nextjs-turbopack' || !WORKFLOW_VERCEL_ENV) keeps a known-flaky stress workflow off the local matrix while still exercising the production-shaped replay race on the representative adapter. The PR description's transparency about prior matrix-wide failures (timeouts on local/Windows lanes, Cannot read properties of undefined (reading 'map') on some adapters) is appreciated.

CI

107 / 107 of the test-matrix checks pass. The one red check, Benchmark Vercel (express), fails on consumeAndVerifyStreams from 97_bench.ts with Stream 5 correctness failure: expected 1048576 bytes, got 917504 — a stream truncation in express's bench workflow, unrelated to scheduleWhenIdle. Benchmark Vercel (nitro-v3) and Benchmark Vercel (nextjs-turbopack) pass on the same run, so this is an express-specific bench flake, not a regression from this PR.

Inline comments

Three small non-blocking observations — see inline.

Suggestion for landing

14 commits is a bit much for a "fix one bug + add one test" PR (most of the churn is iterating on e2e gating + scaling down the stress workflow). Worth squashing on merge so the eventual main history shows just fix(core): wait for replay propagation before suspending plus the test commit, rather than the dev-time exploration. GitHub's "Squash and merge" defaults to using the PR title, which already reads cleanly.

  } else {
    fn();
  }
}, REPLAY_PROPAGATION_DELAY_MS);

Minor: this sawPendingDeliveries = true is unreachable. fireWhenReady early-returns when sawPendingDeliveries is false, so by the time this setTimeout callback fires, the flag is already true. Can drop the assignment for clarity.

Suggested change
}, REPLAY_PROPAGATION_DELAY_MS);
setTimeout(() => {
  if (ctx.pendingDeliveries > 0) {
    ctx.promiseQueue.then(() => {
      setTimeout(check, 0);
    });
  } else {
    fn();
  }
}, REPLAY_PROPAGATION_DELAY_MS);

Non-blocking — same observation applies regardless of how the assignment ordering shakes out.

    await vi.advanceTimersByTimeAsync(DEFERRED_CHECK_DELAY_MS * 2);
    expect(fn).toHaveBeenCalledTimes(1);
  });
});

Heads up — the PR description says these tests cover four cases:

(a) the fast path firing immediately when no deliveries are observed, (b) the deferred path firing after the propagation window when deliveries were observed, (c) re-looping when pendingDeliveries reappears mid-wait, and (d) continued polling while deliveries persist.

…but only (a), (b), and (c) are actually here. (d) — the multi-iteration polling case where pendingDeliveries stays > 0 across several check rounds before finally draining — isn't tested. Either:

  • Add a test that drives [2, 1, 1, 0, 0] or similar and asserts fn doesn't fire until the late 0s
  • Or drop case (d) from the PR description

Non-blocking; just keeps the description accurate.
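A framework-free sketch of what a case-(d) test could drive, per the first option above. The poller and ctx factory here are illustrative stand-ins for the real scheduleWhenIdle harness, reduced to the one behavior under test: re-polling while deliveries persist and firing only on the late 0.

```typescript
// Context whose pendingDeliveries getter walks a sequence, repeating the
// last value, and counts how many reads have happened.
function makeCtx(sequence: number[]) {
  let i = 0;
  return {
    reads: () => i,
    get pendingDeliveries() {
      const value = sequence[Math.min(i, sequence.length - 1)];
      i += 1;
      return value;
    },
  };
}

// Minimal stand-in for the polling half of scheduleWhenIdle: 0ms re-polls
// while deliveries persist, callback once they drain.
async function pollUntilIdle(
  ctx: { pendingDeliveries: number },
  fn: () => void
): Promise<void> {
  while (ctx.pendingDeliveries > 0) {
    await new Promise((r) => setTimeout(r, 0));
  }
  fn();
}
```

Driving [2, 1, 1, 0] through this asserts fn fires exactly once, only after the fourth read observes 0.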

}
}

export async function concurrentMultiWaveWorkflow() {

Tiny observation about the symlink set: this file gets symlinked into nextjs-turbopack, nextjs-webpack, nitro-v3, and sveltekit, but the concurrentMultiWaveWorkflow e2e test below in e2e.test.ts skips unless APP_NAME === 'nextjs-turbopack' && WORKFLOW_VERCEL_ENV is set. So in the gated CI matrix today, only the nextjs-turbopack symlink is reachable — the other three are dead weight.

Not really a problem (harmless, and keeps the door open if anyone wants to widen the gate or run it manually from those workbenches). Just flagging in case the symlink set was intended to match the test matrix.


Labels

backport-stable Cherry-pick this PR to the stable branch when merged
