
Change OpenAI API endpoints from Completions to Responses for TypeScript#277

Open
fedtti wants to merge 1 commit into microsoft:main from fedtti:feature/responses-api

Conversation

@fedtti

@fedtti fedtti commented Oct 21, 2025

This PR simply moves from the Completions API to the newer Responses API endpoint (cf. https://platform.openai.com/docs/guides/migrate-to-responses). The two should be fully backward-compatible given the structure of the messages (cf. https://platform.openai.com/docs/guides/migrate-to-responses?update-item-definitions=responses&update-multiturn=responses#migrating-from-chat-completions).


@darshjme-codes darshjme-codes left a comment


Important migration, but needs careful validation! ⚠️

Migration Correctness

The PR changes the endpoint from /chat/completions to /responses. According to OpenAI's migration guide, these APIs are mostly compatible, but there are important differences:

Request Format Changes:

  1. messages field: Responses API uses a more structured format
  2. store parameter: New field in Responses API (defaults to true)
  3. metadata support: Enhanced in Responses API
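
To make the request-format differences concrete, here is one hedged sketch of what a mapping between the two bodies might involve. The field names (`input`, `store`) follow OpenAI's migration guide, but `toResponsesRequest` and the interfaces are hypothetical helpers, not part of TypeChat; verify the shapes against real API payloads.

```typescript
// Hypothetical sketch: map a Chat Completions request body to a Responses
// API body. Field names follow the migration guide; validate against the
// live API before relying on this.
interface ChatMessage {
    role: string;
    content: string;
}

interface ChatCompletionsRequest {
    model: string;
    messages: ChatMessage[];
    temperature?: number;
}

interface ResponsesRequest {
    model: string;
    input: ChatMessage[]; // Responses accepts a string or structured input items
    temperature?: number;
    store?: boolean;      // new in the Responses API; defaults to true server-side
}

function toResponsesRequest(req: ChatCompletionsRequest): ResponsesRequest {
    return {
        model: req.model,
        input: req.messages,
        temperature: req.temperature,
        store: false, // opt out of server-side storage unless explicitly wanted
    };
}
```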

Response Format Changes:

  1. Response IDs: Format may differ (chatcmpl-* vs resp-*)
  2. Usage tracking: Field names might differ slightly
  3. Error codes: Responses API has updated error handling

Critical Question: Does TypeChat's createOpenAILanguageModel function parse the response correctly for both APIs? The code should be tested against actual Responses API output.

Backward Compatibility Concern

Changing the default endpoint is a breaking change for users who:

  • Have a custom OPENAI_ENDPOINT pointing to /chat/completions
  • Use Azure OpenAI (which may not support /responses yet)
  • Run a proxy or middleware that intercepts /chat/completions

Recommendation: Consider a phased migration:

// Option 1: Environment variable flag
const defaultEndpoint = env.OPENAI_USE_RESPONSES_API === 'true' 
  ? "https://api.openai.com/v1/responses"
  : "https://api.openai.com/v1/chat/completions";

// Option 2: Versioned function
export function createOpenAILanguageModelV2(...) // Uses Responses API
export function createOpenAILanguageModel(...)   // Still uses Completions (deprecated)

This allows users to opt-in, then deprecate the old endpoint in a future release.

Azure OpenAI Compatibility

The PR updates the Azure endpoint documentation:

// Old: .../chat/completions?api-version=...
// New: .../responses?api-version=...

Verify: Does Azure OpenAI support the /responses endpoint? Azure often lags behind OpenAI's API updates, so check Azure's API reference before merging.

If Azure doesn't support /responses yet, this PR will break all Azure users. Consider keeping Azure on /chat/completions until confirmed.
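
One conservative sketch, if Azure support is uncertain, is to choose the path based on the host. The `resolveEndpointPath` helper and the `.openai.azure.com` hostname heuristic are assumptions for illustration, not an official detection mechanism:

```typescript
// Hypothetical sketch: keep Azure endpoints on /chat/completions until the
// /responses path is confirmed there. The hostname check is a heuristic.
function resolveEndpointPath(endpoint: string): string {
    const url = new URL(endpoint);
    const isAzure = url.hostname.endsWith(".openai.azure.com");
    return isAzure ? "/chat/completions" : "/responses";
}
```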

Testing Requirements

MUST TEST before merging:

  1. Basic completion:
test('Responses API returns valid completion', async () => {
  const model = createOpenAILanguageModel(API_KEY, 'gpt-4', 'https://api.openai.com/v1/responses');
  const result = await model.complete('Say hello');
  expect(result).toContain('hello');
});
  2. Error handling:
test('Responses API error format is handled', async () => {
  const model = createOpenAILanguageModel('invalid-key', 'gpt-4', 'https://api.openai.com/v1/responses');
  await expect(model.complete('test')).rejects.toThrow();
});
  3. Azure compatibility:
test('Azure OpenAI still works', async () => {
  const endpoint = 'https://{resource}.openai.azure.com/.../responses?api-version=2024-02-15-preview';
  // Verify Azure supports this endpoint
});
  4. Streaming (if supported):
    Verify streaming responses work identically.

Response Parsing

Inspect the actual response format difference:

Chat Completions:

{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "choices": [{"message": {"role": "assistant", "content": "..."}}]
}

Responses API:

{
  "id": "resp-123",
  "object": "chat.response", // Note: might differ
  "choices": [{"message": {"role": "assistant", "content": "..."}}]
}

Verify TypeChat's response parser doesn't hardcode object: "chat.completion" checks.
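
A defensive parser sketch that tolerates both shapes might look like the following. The `extractContent` helper is hypothetical, and the Responses-side field names are assumptions based on the examples above; check them against real /responses payloads:

```typescript
// Hypothetical sketch: extract the assistant text without hardcoding the
// `object` discriminator, so both "chat.completion" and a Responses-style
// object value are accepted. Field names are assumptions to validate.
interface ApiResponse {
    id: string;
    object: string;
    choices?: { message: { role: string; content: string } }[];
}

function extractContent(response: ApiResponse): string {
    const choice = response.choices?.[0];
    if (!choice) {
        throw new Error(`Unexpected response shape (object=${response.object})`);
    }
    return choice.message.content;
}
```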

Documentation Updates Needed

  1. CHANGELOG: Add breaking change notice
  2. Migration guide: Help users update their code
  3. Azure note: Clarify Azure OpenAI compatibility status
  4. Environment variable docs: Document how to override endpoint if needed

Overall Assessment

This migration is directionally correct (OpenAI recommends moving to Responses API), but the PR needs:

  1. Confirmation Azure supports /responses (or keep Azure on old endpoint)
  2. Live testing against real OpenAI Responses API
  3. Backward compatibility strategy (flag or deprecation period)
  4. Updated tests covering new endpoint

Recommendation: Request changes. Don't merge until Azure compatibility and response format are verified. The risk of breaking existing users is high.

Alternative Approach: Add Responses API support alongside Completions, make Completions default for now, add deprecation warning, then flip default in v2.0.
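
The alternative approach above could be sketched as a default selector with a one-time deprecation warning (hypothetical code, not TypeChat's actual API; the flag name is an assumption):

```typescript
// Hypothetical sketch: keep Completions as the default, warn once when it
// is used, and let callers opt in to Responses ahead of a v2.0 default flip.
let warned = false;

function selectDefaultEndpoint(useResponsesApi: boolean): string {
    if (!useResponsesApi && !warned) {
        warned = true;
        console.warn(
            "The /chat/completions default is deprecated and is expected to " +
            "change to /responses in a future major release."
        );
    }
    return useResponsesApi
        ? "https://api.openai.com/v1/responses"
        : "https://api.openai.com/v1/chat/completions";
}
```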

Great initiative, but needs validation! 🔍

