test(samples/js): add Responses web service sample and integration coverage #671
Open
Conversation
…ses API client

- Add InputImageContent and InputFileContent content part types
- Expand ContentPart union to include new types
- Add 7 new reasoning/annotation streaming event interfaces and expand StreamingEvent union
- Add ListResponsesResult type
- Add list() method to ResponsesClient (GET /v1/responses)
- Default store to true in ResponsesClientSettings._serialize()
- Add vision.ts with createImageContentFromFile and createImageContentFromUrl helpers
- Export vision helpers from index.ts
- Add unit tests: vision helpers, list(), reasoning event types, store default
- Add integration tests for list() and vision (skipped when addon unavailable)
- Add IS_NATIVE_ADDON_AVAILABLE guard to testUtils to skip integration tests gracefully
- Update responses.ts example with list() and vision examples

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Pull request overview
This PR extends the JS/TS SDK’s Responses API support by adding multimodal (vision) input helpers, a list() endpoint wrapper, and additional missing streaming event/type definitions to better align with the service contract.
Changes:
- Added new content-part and streaming event types (image/file inputs + reasoning/annotation events) and a `ListResponsesResult` type.
- Added `ResponsesClient.list()` (GET /v1/responses) and changed `ResponsesClientSettings._serialize()` so `store` defaults to `true`.
- Introduced `createImageContentFromFile()` / `createImageContentFromUrl()` helpers, exported via `src/index.ts`, and expanded tests + examples accordingly.
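As a rough illustration of the new helper surface (the helper and type names come from this PR, but the signatures below are assumptions; the real definitions live in `sdk/js/src/openai/vision.ts` and `sdk/js/src/types.ts`):

```typescript
// Hypothetical sketch of the URL-based vision helper added in this PR;
// the actual signature in sdk/js/src/openai/vision.ts may differ.
interface InputImageContent {
  type: "input_image";
  image_url: string;
  detail?: "low" | "high" | "auto";
}

function createImageContentFromUrl(
  url: string,
  detail?: "low" | "high" | "auto",
): InputImageContent {
  // Per the later review round, media_type is omitted for URLs:
  // the server infers it from the URL itself.
  return { type: "input_image", image_url: url, ...(detail ? { detail } : {}) };
}

const part = createImageContentFromUrl("https://example.com/cat.png", "low");
```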
Reviewed changes
Copilot reviewed 7 out of 7 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| sdk/js/test/testUtils.ts | Adds IS_NATIVE_ADDON_AVAILABLE detection to help skip integration tests when the native addon isn’t present. |
| sdk/js/test/openai/responsesClient.test.ts | Adds unit/integration tests for vision helpers, list(), new streaming event types, and updates store-default expectations. |
| sdk/js/src/types.ts | Adds image/file content parts, reasoning/annotation streaming event interfaces, extends unions, and defines ListResponsesResult. |
| sdk/js/src/openai/vision.ts | New helper module to build InputImageContent parts from files or URLs. |
| sdk/js/src/openai/responsesClient.ts | Adds list() and changes serialization default for store. |
| sdk/js/src/index.ts | Exports the new vision helper functions from the public SDK surface. |
| sdk/js/examples/responses.ts | Adds examples demonstrating listing stored responses and sending vision input. |
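To make the `list()` wrapper concrete, here is a hedged sketch of what a GET /v1/responses call could look like. The real implementation is in `sdk/js/src/openai/responsesClient.ts`, and the exact fields of `ListResponsesResult` are not shown in this thread, so the shape below is an assumption:

```typescript
// Assumed result shape; see sdk/js/src/types.ts for the real ListResponsesResult.
interface ListResponsesResult {
  data: unknown[];
  has_more?: boolean;
}

// Build the list URL from a base endpoint (trailing slashes tolerated).
function listUrl(baseUrl: string): string {
  return `${baseUrl.replace(/\/+$/, "")}/v1/responses`;
}

// Sketch of the client call; the PR's unit tests exercise this path by
// mocking globalThis.fetch rather than reaching a real server.
async function listResponses(baseUrl: string): Promise<ListResponsesResult> {
  const res = await fetch(listUrl(baseUrl));
  if (!res.ok) throw new Error(`list() failed with status ${res.status}`);
  return (await res.json()) as ListResponsesResult;
}
```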
apsonawane reviewed Apr 23, 2026
- Make media_type optional in InputImageContent (server can infer)
- Add file existence check in createImageContentFromFile
- Add bmp support to MEDIA_TYPE_MAP
- Add optional maxDimension resize via soft-peer sharp dependency
- Omit media_type in createImageContentFromUrl (server infers from URL)
- Add JSDoc on ResponsesClientSettings.store documenting default=true
- Replace FoundryLocalManager.create() in checkNativeAddonAvailable with file-existence checks (avoids side effects)
- Replace unreachable-server list() test with globalThis.fetch mock
- Use fs.mkdtempSync for unique temp dirs in file-based tests

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
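A minimal sketch of the file-based helper after this round of feedback. The map entries and error message are illustrative, and the helper body is an assumption; the real code lives in `sdk/js/src/openai/vision.ts`:

```typescript
import { existsSync, readFileSync } from "node:fs";
import { extname } from "node:path";

// Illustrative extension-to-media-type map; this PR adds bmp alongside
// the common web image formats.
const MEDIA_TYPE_MAP: Record<string, string> = {
  ".png": "image/png",
  ".jpg": "image/jpeg",
  ".jpeg": "image/jpeg",
  ".gif": "image/gif",
  ".webp": "image/webp",
  ".bmp": "image/bmp",
};

// Hypothetical file-based helper: fail fast when the path is missing,
// then inline the bytes as a base64 data URL.
function createImageContentFromFile(filePath: string) {
  if (!existsSync(filePath)) {
    throw new Error(`Image file not found: ${filePath}`);
  }
  const mediaType = MEDIA_TYPE_MAP[extname(filePath).toLowerCase()];
  const b64 = readFileSync(filePath).toString("base64");
  return {
    type: "input_image" as const,
    image_url: `data:${mediaType ?? "application/octet-stream"};base64,${b64}`,
  };
}
```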
Pull request overview
Copilot reviewed 7 out of 7 changed files in this pull request and generated 4 comments.
apsonawane reviewed Apr 24, 2026
- vision.ts: use fs.promises.readFile (async) instead of sync existsSync/readFileSync
- vision.ts: validate maxDimension is a finite positive integer before use
- vision.ts: update JSDoc to reflect overloaded options param (object or detail string)
- vision.ts: resizeImage returns { buffer, mediaType } so media_type is explicit post-resize
- vision.ts: pass fallbackMediaType into resizeImage to avoid hardcoded 'image/png'
- testUtils.ts: check all 4 candidate addon paths (sdk/js/prebuilds, sdk/js/native,
sdk/js/dist/prebuilds, sdk/js/dist/native) matching CoreInterop.loadAddon() logic
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
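The maxDimension guard described above can be sketched as follows (the helper name is hypothetical; the real check lives in `sdk/js/src/openai/vision.ts`):

```typescript
// Reject anything that is not a finite positive integer before handing
// the value to an image resizer such as the optional sharp dependency.
function validateMaxDimension(maxDimension: unknown): number {
  if (
    typeof maxDimension !== "number" ||
    !Number.isFinite(maxDimension) ||
    !Number.isInteger(maxDimension) ||
    maxDimension <= 0
  ) {
    throw new RangeError(
      `maxDimension must be a finite positive integer, got ${String(maxDimension)}`,
    );
  }
  return maxDimension;
}
```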
…es-api-sdk-vision-support
- Add list pagination options and response metadata fields
- Align streaming event types with server DTOs for reasoning, annotations, refusals, content parts, and function calls
- Make responses example self-contained by generating a temporary PNG
- Document Responses API store default, vision formats including BMP, and Foundry Local image_data contract
- Remove native-addon availability auto-skip from Responses integration tests

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
baijumeswani previously approved these changes Apr 27, 2026
Use the existing chat_completions native command for Responses create and streaming calls when a CoreInterop transport is available, with HTTP fallback for server-backed operations. Keep FFI-created stored responses available through the same client instance and cover the behavior with unit tests.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Revert the SDK API surface changes from this branch and add a JavaScript web-service Responses sample plus integration coverage for direct /v1/responses calls, streaming, and tool calling.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Make the JavaScript Responses web-service sample and integration tests call an already-running OpenAI-compatible Foundry Local web server without using the JS native addon.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
…sponses calls

Match the C# web-server sample pattern: FoundryLocalManager handles SDK init, EP download, model download/load, and startWebService. The OpenAI JS SDK (openai npm package) makes all Responses API calls against the running endpoint.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
MaanavD pushed a commit that referenced this pull request May 3, 2026
… locally

Cross-references the JS Responses sample/tests (PR #671) to keep the C# pattern consistent.

Sample (samples/cs/responses-foundry-local-web-server):
- Added README.md mirroring the JS sample (prereqs, run, expected output, troubleshooting)
- Tool now uses an empty-params schema (matches JS PR), which the small qwen2.5-0.5b reliably calls
- Single ResponseTool reused on the follow-up call; deterministic options (Temperature=0, MaxOutputTokenCount=64)
- Cleanup wrapped in try/finally so StopWebService/Unload run even on exceptions

Integration tests (sdk/cs/test/FoundryLocal.Tests/ResponsesIntegrationTests.cs):
- Mirrors the JS suite responsesWebService.test.ts (NonStreaming, Streaming, FunctionCalling)
- Skips when Utils.IsRunningInCI() is true and when qwen2.5-0.5b is not pre-cached
- Streaming asserts response.created, response.output_text.delta, and response.completed events (parity with JS)
- Tool-calling test reuses the same get_weather empty-params definition
- Streaming options include StreamingEnabled = true so the official ResponsesClient allows the call

Pre-existing fix (test infra only):
- Utils.GetRepoRoot() previously failed in git worktrees because .git is a file, not a directory; it now accepts either form. This unblocked test execution in worktree checkouts.

Validation:
- dotnet build samples/cs/responses-foundry-local-web-server -c Release: 0 warnings, 0 errors
- dotnet build sdk/cs/test/FoundryLocal.Tests -c Release: 0 errors
- dotnet test --filter ResponsesIntegration: all 3 Responses tests pass end-to-end against a real local model
- The 10 remaining failures across the project are pre-existing EmbeddingClientTests infra (a different model not cached), unrelated to this PR

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Summary
Reworks this PR to avoid changing the JavaScript SDK API surface. The PR now focuses on using an existing Foundry Local OpenAI-compatible web service for Responses API scenarios, similar to the existing chat-completions web-server sample.
Changes
- `samples/js/web-server-responses`, a JavaScript sample that uses the OpenAI JS SDK against a running Foundry Local `/v1` web service and demonstrates `openai.responses.create()` non-streaming calls, streaming, and a `function_call_output` follow-up.
- `sdk/js/test/openai/responsesWebService.test.ts`, integration coverage that calls `/v1/responses` directly with `fetch`. The tests do not use `FoundryLocalManager` or the JS native addon. They require an already-running service and are enabled with:
  - `FOUNDRY_LOCAL_RESPONSES_ENDPOINT` or `FOUNDRY_LOCAL_ENDPOINT`
  - `FOUNDRY_LOCAL_RESPONSES_MODEL` or `FOUNDRY_LOCAL_MODEL`

Local validation
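The env-gated fetch pattern the integration tests use might look roughly like this. The payload shape follows the OpenAI Responses API, and whether the endpoint variable already includes `/v1` is an assumption:

```typescript
// Integration tests are enabled only when these variables point at an
// already-running OpenAI-compatible Foundry Local web service.
const endpoint =
  process.env.FOUNDRY_LOCAL_RESPONSES_ENDPOINT ?? process.env.FOUNDRY_LOCAL_ENDPOINT;
const model =
  process.env.FOUNDRY_LOCAL_RESPONSES_MODEL ?? process.env.FOUNDRY_LOCAL_MODEL;

// Build a minimal /v1/responses request body (shape assumed from the
// OpenAI Responses API; stream toggles SSE streaming).
function buildResponsesBody(model: string, input: string, stream = false): string {
  return JSON.stringify({ model, input, stream });
}

async function createResponse(input: string): Promise<unknown | undefined> {
  if (!endpoint || !model) {
    return undefined; // mirrors the tests skipping when not configured
  }
  const res = await fetch(`${endpoint}/responses`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: buildResponsesBody(model, input),
  });
  return res.json();
}
```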