fix: update basic-gemini app for compatibility and documentation
Gustolandia committed Nov 12, 2024
1 parent 757a088 commit 46c49b3
Showing 3 changed files with 156 additions and 18 deletions.
100 changes: 100 additions & 0 deletions js/testapps/basic-gemini/README.md
@@ -0,0 +1,100 @@

# Basic Gemini Project

This project demonstrates how Genkit integrates with the GoogleAI and VertexAI plugins to generate content through flows and tools. The app itself is simple: it generates a joke about a user-provided subject.

## Features

- Uses `genkit` to define and run flows.
- Integrates `GoogleAI` and `VertexAI` plugins for enhanced AI capabilities.
- Demonstrates `jokeSubjectGenerator` as a tool and `jokeFlow` as a flow.

## Setup

### Prerequisites

- **Node.js** v20 or later
- **pnpm** package manager
- **Google Cloud SDK** installed and initialized

### Environment Variables

Ensure the following environment variables are set:

- **`GOOGLE_APPLICATION_CREDENTIALS`**: Path to the service account key JSON file.
- **`GOOGLE_PROJECT_ID`**: Your Google Cloud project ID.
- **`GOOGLE_API_KEY`**: Your API key for GoogleAI.

To set these variables, run:

```bash
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json"
export GOOGLE_PROJECT_ID="your-google-project-id"
export GOOGLE_API_KEY="your-google-api-key"
```

### Google Cloud Initialization

1. Authenticate and set up your project:
```bash
gcloud auth login
gcloud config set project [YOUR_PROJECT_ID]
```
2. Ensure the necessary roles are assigned to your service account (example `gcloud` commands follow this list):
- Vertex AI User
- Storage Object Viewer
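
For example, the two roles above can be granted with `gcloud` (a sketch only; substitute your own project ID and service-account email):

```bash
# Grant the Vertex AI User role (project ID and service-account email are placeholders).
gcloud projects add-iam-policy-binding your-google-project-id \
  --member="serviceAccount:your-sa@your-google-project-id.iam.gserviceaccount.com" \
  --role="roles/aiplatform.user"

# Grant the Storage Object Viewer role.
gcloud projects add-iam-policy-binding your-google-project-id \
  --member="serviceAccount:your-sa@your-google-project-id.iam.gserviceaccount.com" \
  --role="roles/storage.objectViewer"
```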

### Install Dependencies

Run the following command in the project root:

```bash
pnpm install
```

## Running the Project

Running the project requires two terminals:

1. **Terminal 1**: Start the development server with hot-reloading:
```bash
pnpm genkit:dev
```

2. **Terminal 2**: You can either:
- Run the flow directly:
```bash
genkit flow:run jokeFlow
```
- Start the Genkit UI for visual execution:
```bash
genkit ui:start
```
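
To run the flow with a specific subject from the CLI, the input can be passed as JSON (a sketch; when no input is given, the flow falls back to `"banana"`):

```bash
genkit flow:run jokeFlow '{"jokeSubject": "cat"}'
```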

## Logic and Purpose

This app demonstrates Genkit's ability to define tools and flows backed by AI models. The `jokeFlow` accepts a subject from the user, passes it through the `jokeSubjectGenerator` tool, and asks `gemini15Flash` to produce a joke about it.

### Code Overview

#### `jokeSubjectGenerator`

- **Purpose**: Returns a joke subject, either the one provided by the user or the default `"banana"`.
- **Input**: `string` (e.g., "apple").
- **Output**: `string` (same as the input).

#### `jokeFlow`

- **Purpose**: Combines `jokeSubjectGenerator` and the model to generate a joke.
- **Input**: A subject provided by the user through the UI (defaults to `"banana"`).
- **Output**: A joke about the subject.

### Example

- **Input**: "cat"
- **Output**: "Why did the cat sit on the computer? Because it wanted to keep an eye on the mouse!"

## Notes

- Adjust the `temperature` in the `config` to control how creative the output is (see the sketch below).
- Ensure all environment variables are correctly set for smooth execution.
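
For instance, a lower temperature gives more predictable jokes. A sketch of the relevant `generate` options (the app currently uses `temperature: 2`):

```ts
const llmResponse = await ai.generate({
  model: gemini15Flash,
  config: {
    temperature: 0.7, // lower values produce more predictable output
  },
  prompt: 'Generate a joke about the following subject: banana',
});
```
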
9 changes: 6 additions & 3 deletions js/testapps/basic-gemini/package.json
@@ -7,20 +7,23 @@
"test": "echo \"Error: no test specified\" && exit 1",
"start": "node lib/index.js",
"build": "tsc",
"build:watch": "tsc --watch"
"build:watch": "tsc --watch",
"genkit:dev": "cross-env GENKIT_ENV=dev pnpm dev",
"dev": "tsx --watch src/index.ts"
},
"keywords": [],
"author": "",
"license": "ISC",
"dependencies": {
"genkit": "workspace:*",
"@genkit-ai/firebase": "workspace:*",
"@genkit-ai/google-cloud": "workspace:*",
"@genkit-ai/googleai": "workspace:*",
"@genkit-ai/vertexai": "workspace:*",
"express": "^4.20.0"
"express": "^4.20.0",
"genkit": "workspace:*"
},
"devDependencies": {
"cross-env": "^7.0.3",
"typescript": "^5.6.2"
}
}
65 changes: 50 additions & 15 deletions js/testapps/basic-gemini/src/index.ts
@@ -14,42 +14,77 @@
* limitations under the License.
*/

import { gemini15Flash, googleAI } from '@genkit-ai/googleai';
import { vertexAI } from '@genkit-ai/vertexai';
import { googleAI } from '@genkit-ai/googleai';
import { gemini15Flash, vertexAI } from '@genkit-ai/vertexai';
import { genkit, z } from 'genkit';

// Initialize Genkit with GoogleAI and VertexAI plugins
const ai = genkit({
plugins: [googleAI(), vertexAI()],
});
console.log('Genkit initialized:', ai);

/**
* @tool jokeSubjectGenerator
* A tool to generate a subject for a joke based on input or return a default value.
*
* @param {string} subject - The initial subject or fallback.
* @returns {string} The joke subject.
*
* @example
* Input: "apple"
* Output: "apple"
*/
const jokeSubjectGenerator = ai.defineTool(
{
name: 'jokeSubjectGenerator',
description: 'Can be called to generate a subject for a joke',
description: 'Generates a subject for a joke based on input',
},
async () => {
return 'banana';
async (subject: string = 'banana') => {
return subject;
}
);

/**
* @flow jokeFlow
* A flow to generate a joke subject using a Large Language Model.
*
* @flowDescription
* This flow leverages the `jokeSubjectGenerator` tool to provide a subject
* and constructs a joke using the `gemini15Flash` model from VertexAI.
*
* @returns {Promise<string>} Generated joke subject as a string.
*
* @example
* Input: { jokeSubject: "apple" }
* Output: "Generated joke about: apple"
*/
export const jokeFlow = ai.defineFlow(
{
name: 'jokeFlow',
inputSchema: z.void(),
outputSchema: z.any(),
inputSchema: z.object({
jokeSubject: z.string().optional().default('banana'), // Accepts input from the UI
}),
outputSchema: z.string(), // Outputs a string
},
async () => {
async (input) => {
console.log('Flow execution started');

// Use the provided joke subject or fallback to default
const jokeSubject = await jokeSubjectGenerator(input.jokeSubject);
console.log('Generated joke subject:', jokeSubject);

// Generate response using the LLM model
const llmResponse = await ai.generate({
model: gemini15Flash,
config: {
temperature: 2,
},
output: {
schema: z.object({ jokeSubject: z.string() }),
temperature: 2, // Adjust model temperature to control creativity
},
tools: [jokeSubjectGenerator],
prompt: `come up with a subject to joke about (using the function provided)`,
tools: [jokeSubjectGenerator], // Tool used in the flow
prompt: `Generate a joke about the following subject: ${jokeSubject}`,
});
return llmResponse.output;

console.log('LLM Response:', llmResponse.text); // Logs the response text from the model
return llmResponse.text;
}
);
