chore(docs): use client instead of package name in Node examples (ope…
stainless-app[bot] authored Aug 10, 2024
1 parent 0d22c5e commit 1081fb7
Showing 1 changed file with 8 additions and 8 deletions.
16 changes: 8 additions & 8 deletions README.md
@@ -32,7 +32,7 @@ The full API of this library can be found in [api.md file](api.md) along with ma
```js
import OpenAI from 'openai';

-const openai = new OpenAI({
+const client = new OpenAI({
apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted
});

@@ -53,7 +53,7 @@ We provide support for streaming responses using Server Sent Events (SSE).
```ts
import OpenAI from 'openai';

-const openai = new OpenAI();
+const client = new OpenAI();

async function main() {
const stream = await openai.chat.completions.create({
@@ -80,7 +80,7 @@ This library includes TypeScript definitions for all request params and response
```ts
import OpenAI from 'openai';

-const openai = new OpenAI({
+const client = new OpenAI({
apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted
});

@@ -301,7 +301,7 @@ import fs from 'fs';
import fetch from 'node-fetch';
import OpenAI, { toFile } from 'openai';

-const openai = new OpenAI();
+const client = new OpenAI();

// If you have access to Node `fs` we recommend using `fs.createReadStream()`:
await openai.files.create({ file: fs.createReadStream('input.jsonl'), purpose: 'fine-tune' });
@@ -399,7 +399,7 @@ You can use the `maxRetries` option to configure or disable this:
<!-- prettier-ignore -->
```js
// Configure the default for all requests:
-const openai = new OpenAI({
+const client = new OpenAI({
maxRetries: 0, // default is 2
});

@@ -416,7 +416,7 @@ Requests time out after 10 minutes by default. You can configure this with a `ti
<!-- prettier-ignore -->
```ts
// Configure the default for all requests:
-const openai = new OpenAI({
+const client = new OpenAI({
timeout: 20 * 1000, // 20 seconds (default is 10 minutes)
});

@@ -471,7 +471,7 @@ You can also use the `.withResponse()` method to get the raw `Response` along wi

<!-- prettier-ignore -->
```ts
-const openai = new OpenAI();
+const client = new OpenAI();

const response = await openai.chat.completions
.create({ messages: [{ role: 'user', content: 'Say this is a test' }], model: 'gpt-3.5-turbo' })
@@ -582,7 +582,7 @@ import http from 'http';
import { HttpsProxyAgent } from 'https-proxy-agent';

// Configure the default for all requests:
-const openai = new OpenAI({
+const client = new OpenAI({
httpAgent: new HttpsProxyAgent(process.env.PROXY_URL),
});

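For context, after this rename the README examples construct the SDK as `const client = new OpenAI(...)`. The following is a minimal sketch of the resulting pattern; it combines the constructor options shown in the separate hunks above (`apiKey`, `maxRetries`, `timeout`) into one snippet and fills in a streaming call body from the SDK's documented `chat.completions.create` API, assuming the call sites in the surrounding README text are also updated from `openai` to `client` to match the new variable name.

```ts
import OpenAI from 'openai';

// Client construction with the renamed variable, mirroring the added lines above.
// The three options are each shown in a different hunk; combining them here is illustrative.
const client = new OpenAI({
  apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted
  maxRetries: 0, // default is 2
  timeout: 20 * 1000, // 20 seconds (default is 10 minutes)
});

async function main() {
  // Streaming call sketched from the SDK's documented chat.completions.create API;
  // the hunks above only show the first line of this example.
  const stream = await client.chat.completions.create({
    model: 'gpt-3.5-turbo',
    messages: [{ role: 'user', content: 'Say this is a test' }],
    stream: true,
  });
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
  }
}

main();
```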
