Commit: update docs website

EPMatt committed May 19, 2024
1 parent f3a62b8 commit 013cb7d
Showing 6 changed files with 473 additions and 52 deletions.
63 changes: 54 additions & 9 deletions docs/docs/plugins/anthropic.md
Install the plugin in your project with your favorite package manager:

## Usage

### Initialize

```typescript
import 'dotenv/config';

import { configureGenkit } from '@genkit-ai/core';
import { defineFlow, startFlowsServer } from '@genkit-ai/flow';
import { anthropic } from 'genkitx-anthropicai';

configureGenkit({
plugins: [
// Anthropic API key is required and defaults to the ANTHROPIC_API_KEY environment variable
anthropic({ apiKey: process.env.ANTHROPIC_API_KEY }),
],
logLevel: 'debug',
enableTracingAndMetrics: true,
});
```

### Basic examples

The simplest way to call the text generation model is by using the helper function `generate`:

```typescript
// ...configure Genkit (as shown above)...

const response = await generate({
  model: claude3Haiku, // model imported from genkitx-anthropicai
  prompt: 'Tell me a joke.',
});

console.log(await response.text());
```

### Multi-modal prompt

```typescript
// ...configure Genkit (as shown above)...

const response = await generate({
  model: claude3Haiku,
  prompt: [
    { text: 'What animal is in the photo?' },
    { media: { url: imageUrl } },
  ],
  config: {
    // control the level of visual detail when processing image embeddings;
    // a low detail level also decreases token usage
    visualDetailLevel: 'low',
  },
});

console.log(await response.text());
```
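The `media.url` field can also carry an inline data URL, as Genkit media parts generally accept. As a sketch, a local image file could be encoded with a small helper; note that `toDataUrl` is a hypothetical utility for illustration, not part of the plugin:

```typescript
import { readFileSync } from 'node:fs';

// Hypothetical helper (not part of genkitx-anthropicai): encode a local
// image file as a data URL so it can be passed as media.url in a prompt.
function toDataUrl(path: string, mimeType: string = 'image/png'): string {
  const base64 = readFileSync(path).toString('base64');
  return `data:${mimeType};base64,${base64}`;
}
```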

### Within a flow

```typescript
// ...configure Genkit (as shown above)...
import * as z from 'zod';

export const myFlow = defineFlow(
  {
    name: 'menuSuggestionFlow',
    inputSchema: z.string(),
    outputSchema: z.string(),
  },
  async (subject) => {
    const llmResponse = await generate({
      prompt: `Suggest an item for the menu of a ${subject} themed restaurant`,
      model: claude3Opus,
    });

    return llmResponse.text();
  }
);

startFlowsServer();
```

## Contributing

Want to contribute to the project? That's awesome! Head over to our [Contribution Guidelines](https://github.com/TheFireCo/genkit-plugins/blob/main/CONTRIBUTING.md).
151 changes: 151 additions & 0 deletions docs/docs/plugins/azure-openai.md
---
id: genkitx-azure-openai
title: 🚧 genkitx-azure-openai
---

:::warning

This plugin is currently a work in progress.
:::

<h1 align="center">
Firebase Genkit - Azure OpenAI Plugin
</h1>

<h4 align="center">Azure OpenAI Community Plugin for Google Firebase Genkit</h4>

<div align="center">
<img alt="Github lerna version" src="https://img.shields.io/github/lerna-json/v/TheFireCo/genkit-plugins?label=version"/>
<img alt="NPM Downloads" src="https://img.shields.io/npm/dw/genkitx-azure-openai"/>
<img alt="GitHub Org's stars" src="https://img.shields.io/github/stars/TheFireCo?style=social"/>
<img alt="GitHub License" src="https://img.shields.io/github/license/TheFireCo/genkit-plugins"/>
<img alt="Static Badge" src="https://img.shields.io/badge/yes-a?label=maintained"/>
</div>

<div align="center">
<img alt="GitHub Issues or Pull Requests" src="https://img.shields.io/github/issues/TheFireCo/genkit-plugins?color=blue"/>
<img alt="GitHub Issues or Pull Requests" src="https://img.shields.io/github/issues-pr/TheFireCo/genkit-plugins?color=blue"/>
<img alt="GitHub commit activity" src="https://img.shields.io/github/commit-activity/m/TheFireCo/genkit-plugins"/>
</div>

**`genkitx-azure-openai`** is a community plugin for using Azure OpenAI APIs with
[Firebase GenKit](https://github.com/firebase/genkit). Built by [**The Fire Company**](https://github.com/TheFireCo). 🔥

## Installation

Install the plugin in your project with your favorite package manager:

- `npm install genkitx-azure-openai`
- `yarn add genkitx-azure-openai`
- `pnpm add genkitx-azure-openai`

## Usage

> The interface to the models of this plugin is the same as for the [OpenAI plugin](../openai/).

### Initialize

You'll need an Azure OpenAI instance deployed. You can deploy one via the Azure Portal following [this guide](https://learn.microsoft.com/azure/ai-services/openai/how-to/create-resource?pivots=web-portal).

Once you have your instance running, make sure you have the endpoint and key. You can find them in the Azure Portal, under the "Keys and Endpoint" section of your instance.

You can then define the following environment variables to use the service:

```
AZURE_OPENAI_API_ENDPOINT=<YOUR_ENDPOINT>
AZURE_OPENAI_API_KEY=<YOUR_KEY>
AZURE_OPENAI_API_EMBEDDING_DEPLOYMENT_NAME=<YOUR_EMBEDDING_DEPLOYMENT_NAME>
```
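A missing endpoint or key usually surfaces only as an opaque runtime error, so it can help to fail fast before configuring the plugin. The `requireEnv` helper below is a hypothetical convenience, not something provided by `genkitx-azure-openai`:

```typescript
// Hypothetical helper (not provided by genkitx-azure-openai): read a
// required environment variable and fail fast with a clear message.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example: validate the variables above before initializing the plugin
// const endpoint = requireEnv('AZURE_OPENAI_API_ENDPOINT');
```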

Alternatively, you can pass the values directly to the `azureOpenAI` constructor:

```typescript
import { configureGenkit } from '@genkit-ai/core';
import { azureOpenAI } from 'genkitx-azure-openai';

export default configureGenkit({
  plugins: [
    azureOpenAI({
      apiKey: '<your_key>',
      azureOpenAIEndpoint: '<your_endpoint>',
      azureOpenAIApiDeploymentName: '<your_embedding_deployment_name>',
    }),
    // other plugins
  ],
});
```

If you're using Azure Managed Identity, you can also pass the credentials directly to the constructor:

```typescript
import { configureGenkit } from '@genkit-ai/core';
import { azureOpenAI } from 'genkitx-azure-openai';
import { DefaultAzureCredential } from '@azure/identity';

const credential = new DefaultAzureCredential();

export default configureGenkit({
  plugins: [
    azureOpenAI({
      credential,
      azureOpenAIEndpoint: '<your_endpoint>',
      azureOpenAIApiDeploymentName: '<your_embedding_deployment_name>',
    }),
    // other plugins
  ],
});
```

### Basic examples

The simplest way to call the text generation model is by using the helper function `generate`:

```typescript
// Basic usage of an LLM
const response = await generate({
  model: gpt35Turbo,
  prompt: 'Tell me a joke.',
});

console.log(await response.text());
```

Using the same interface, you can prompt a multimodal model:

```typescript
const response = await generate({
  model: gpt4o,
  prompt: [
    { text: 'What animal is in the photo?' },
    { media: { url: imageUrl } },
  ],
  config: {
    // control the level of visual detail when processing image embeddings;
    // a low detail level also decreases token usage
    visualDetailLevel: 'low',
  },
});

console.log(await response.text());
```

For more detailed examples and explanations of other functionality, refer to the examples in the [official GitHub repo of the plugin](https://github.com/TheFireCo/genkit-plugins/blob/main/examples/README.md) or in the [official Genkit documentation](https://firebase.google.com/docs/genkit/get-started).

## Contributing

Want to contribute to the project? That's awesome! Head over to our [Contribution Guidelines](https://github.com/TheFireCo/genkit-plugins/blob/main/CONTRIBUTING.md).

## Need support?

:::info
This repository depends on Google's Firebase Genkit. For issues and questions related to Genkit, please refer to the instructions available in [Genkit's repository](https://github.com/firebase/genkit).
:::

Reach out by opening a discussion on [Github Discussions](https://github.com/TheFireCo/genkit-plugins/discussions).

## Credits

This plugin is proudly maintained by the team at [**The Fire Company**](https://github.com/TheFireCo). 🔥

## License

This project is licensed under the [Apache 2.0 License](https://github.com/TheFireCo/genkit-plugins/blob/main/LICENSE).

[![License: Apache 2.0](https://img.shields.io/badge/License-Apache%202%2E0-lightgrey.svg)](https://github.com/TheFireCo/genkit-plugins/blob/main/LICENSE)
87 changes: 73 additions & 14 deletions docs/docs/plugins/cohere.md
Install the plugin in your project with your favorite package manager:

## Usage

### Initialize

```typescript
import 'dotenv/config';

import { configureGenkit } from '@genkit-ai/core';
import { defineFlow, startFlowsServer } from '@genkit-ai/flow';
import { cohere } from 'genkitx-cohere';

configureGenkit({
plugins: [
// Cohere API key is required and defaults to the COHERE_API_KEY environment variable
cohere({ apiKey: process.env.COHERE_API_KEY }),
],
logLevel: 'debug',
enableTracingAndMetrics: true,
});
```

### Basic examples

The simplest way to call the text generation model is by using the helper function `generate`:

```typescript
// ...configure Genkit (as shown above)...

const response = await generate({
  model: commandRPlus, // model imported from genkitx-cohere
  prompt: 'Tell me a joke.',
});

console.log(await response.text());
```

### Within a flow

```typescript
// ...configure Genkit (as shown above)...
import * as z from 'zod';

export const myFlow = defineFlow(
  {
    name: 'menuSuggestionFlow',
    inputSchema: z.string(),
    outputSchema: z.string(),
  },
  async (subject) => {
    const llmResponse = await generate({
      prompt: `Suggest an item for the menu of a ${subject} themed restaurant`,
      model: commandRPlus,
    });

    return llmResponse.text();
  }
);

startFlowsServer();
```

### Tool use

```typescript
// ...configure Genkit (as shown above)...
import * as z from 'zod';

const createReminder = defineTool(
  {
    name: 'createReminder',
    description: 'Use this to create reminders for things in the future',
    inputSchema: z.object({
      time: z
        .string()
        .describe('ISO timestamp string, e.g. 2024-04-03T12:23:00Z'),
      reminder: z.string().describe('the content of the reminder'),
    }),
    outputSchema: z.number().describe('the ID of the created reminder'),
  },
  (reminder) => Promise.resolve(3)
);

const result = await generate({
  model: commandRPlus,
  tools: [createReminder],
  prompt: `
  You are a reminder assistant.
  If you create a reminder, describe in text the reminder you created as a response.
  Query: I have a meeting with Anna at 3 for dinner - can you set a reminder for the time?
  `,
});

console.log(result.text());
```
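Since `defineTool` wraps a plain function, the handler's logic can be sketched and exercised on its own, independently of Genkit. The `ReminderInput` type and ID scheme below are hypothetical, shown only to illustrate what a real handler might do in place of `Promise.resolve(3)`:

```typescript
// Hypothetical standalone sketch of a createReminder handler, decoupled
// from Genkit and zod: validate the timestamp, then mint a numeric ID.
interface ReminderInput {
  time: string; // ISO timestamp string, e.g. 2024-04-03T12:23:00Z
  reminder: string; // the content of the reminder
}

let nextReminderId = 1;

async function createReminderHandler(input: ReminderInput): Promise<number> {
  if (Number.isNaN(Date.parse(input.time))) {
    throw new Error(`Invalid ISO timestamp: ${input.time}`);
  }
  // A real implementation would persist the reminder; this sketch only
  // returns the next ID, matching the tool's numeric outputSchema.
  return nextReminderId++;
}
```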

## Contributing