diff --git a/docs-go/cloud-run.md b/docs-go/cloud-run.md
index a619b9477..db841c58b 100644
--- a/docs-go/cloud-run.md
+++ b/docs-go/cloud-run.md
@@ -1,8 +1,6 @@
 # Firebase Genkit with Cloud Run
 
-You can deploy Firebase Genkit flows as web services using Cloud Run. This page,
-as an example, walks you through the process of deploying the default sample
-flow.
+You can deploy Firebase Genkit flows as web services using Cloud Run.
 
 1. Install the [Google Cloud CLI](https://cloud.google.com/sdk/docs/install) if
    you haven't already.
@@ -36,23 +34,26 @@ flow.
    go mod init example/cloudrun
    ```
 
-1. Initialize Genkit in your project:
+1. Install the Genkit package and the model plugin you want to use:
 
    ```posix-terminal
-   genkit init
+   go get "github.com/firebase/genkit/go"
   ```
 
-   Select the model provider you want to use.
+   Then install one of the following model plugins:
 
-   Accept the defaults for the remaining prompts. The `genkit` tool will create
-   a sample source file to get you started developing your own AI flows.
-   For the rest of this tutorial, however, you'll just deploy the sample flow.
+   ```posix-terminal
+   go get "github.com/firebase/genkit/go/plugins/googleai"
+   ```
+
+   ```posix-terminal
+   go get "github.com/firebase/genkit/go/plugins/vertexai"
+   ```
 
-1. Edit the sample file (`main.go` or `genkit.go`) to explicitly specify the
-   port the flow server should listen on:
+1. Create a file (`main.go`) for your flows:
 
    ```golang
-   {% includecode github_path="firebase/genkit/go/internal/doc-snippets/flows.go" region_tag="init" adjust_indentation="auto" %}
+   {% includecode github_path="firebase/genkit/go/internal/doc-snippets/deploy/main.go" region_tag="main" adjust_indentation="auto" %}
   ```
 
 1. Make API credentials available to your deployed function. Do one of the
@@ -118,7 +119,7 @@
 1. Start the UI:
 
    ```posix-terminal
-   genkit start
+   genkit start -- go run .
   ```
 
 1. In the developer UI (http://localhost:4000/), run the flow:
@@ -139,14 +140,14 @@
   - {Gemini (Google AI)}
 
     ```posix-terminal
-    gcloud run deploy --port 3400 \
+    gcloud run deploy \
      --update-secrets=GOOGLE_GENAI_API_KEY=:latest
    ```
 
   - {Gemini (Vertex AI)}
 
     ```posix-terminal
-    gcloud run deploy --port 3400 \
+    gcloud run deploy \
      --set-env-vars GCLOUD_PROJECT= \
      --set-env-vars GCLOUD_LOCATION=us-central1
    ```
diff --git a/docs-go/deploy.md b/docs-go/deploy.md
index 41ee60aec..368bda9e1 100644
--- a/docs-go/deploy.md
+++ b/docs-go/deploy.md
@@ -3,7 +3,7 @@
 You can deploy Firebase Genkit flows as web services using any service that can
 host a Go binary.
 
 This page, as an example, walks you through the general process of deploying the
-default sample flow, and points out where you must take provider-specific
+sample flow, and points out where you must take provider-specific
 actions.
 
 1. Create a directory for the Genkit sample project:
@@ -22,28 +22,28 @@ actions.
   go mod init example/cloudrun
   ```
 
-1. Initialize Genkit in your project:
+1. Install the Genkit package and the model plugin you want to use:
 
   ```posix-terminal
-   genkit init
+   go get "github.com/firebase/genkit/go"
   ```
 
-   Select the model provider you want to use.
+   Then install one of the following model plugins:
 
-   Accept the defaults for the remaining prompts. The `genkit` tool will create
-   a sample source file to get you started developing your own AI flows.
-   For the rest of this tutorial, however, you'll just deploy the sample flow.
+   ```posix-terminal
+   go get "github.com/firebase/genkit/go/plugins/googleai"
+   ```
+
+   ```posix-terminal
+   go get "github.com/firebase/genkit/go/plugins/vertexai"
+   ```
 
-1. Edit the sample file (`main.go` or `genkit.go`) to explicitly specify the
-   port the flow server should listen on:
+1. Create a file (`main.go`) for your flows:
 
   ```golang
-   {% includecode github_path="firebase/genkit/go/internal/doc-snippets/flows.go" region_tag="init" adjust_indentation="auto" %}
+   {% includecode github_path="firebase/genkit/go/internal/doc-snippets/deploy/main.go" region_tag="main" adjust_indentation="auto" %}
   ```
 
-   If your provider requires you to listen on a specific port, be sure to
-   configure Genkit accordingly.
-
 1. Implement some form of authentication and authorization to gate access to
    the flows you plan to deploy.
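+
+   For example, if you serve your flows from your own `net/http` server, a
+   simple bearer-token check might look like the following sketch. The
+   `FLOW_API_KEY` variable and the handler wiring are placeholders; substitute
+   whatever authentication scheme your deployment actually uses.
+
+   ```golang
+   // requireAPIKey rejects requests that don't carry the expected bearer
+   // token. It can wrap any http.Handler that serves your flows and uses
+   // only the standard library "net/http" and "os" packages.
+   func requireAPIKey(next http.Handler) http.Handler {
+       return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+           key := os.Getenv("FLOW_API_KEY")
+           if key == "" || r.Header.Get("Authorization") != "Bearer "+key {
+               http.Error(w, "unauthorized", http.StatusUnauthorized)
+               return
+           }
+           next.ServeHTTP(w, r)
+       })
+   }
+   ```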
@@ -120,7 +120,7 @@ actions.
 1. Start the UI:
 
   ```posix-terminal
-   genkit start
+   genkit start -- go run .
   ```
 
 1. In the developer UI (http://localhost:4000/), run the flow:
diff --git a/docs-go/get-started-go.md b/docs-go/get-started-go.md
index 0326d247d..994098b36 100644
--- a/docs-go/get-started-go.md
+++ b/docs-go/get-started-go.md
@@ -8,22 +8,71 @@
 exploration. If you discover issues with the libraries or this documentation
 please report them in our
 [GitHub repository](https://github.com/firebase/genkit/).
 
-To get started with Genkit, install the Genkit CLI and run
-`genkit init` in a Go project. The rest of this page shows you how.
+This page shows you how to get started with Genkit in a Go project.
 
 ## Requirements
 
-- Go 1.22 or later. See [Download and install](https://go.dev/doc/install) in
-  the official Go docs.
+Go 1.22 or later. See [Download and install](https://go.dev/doc/install) in
+the official Go docs.
 
-- Node.js 20 or later (for the Genkit CLI and UI). See the next section for a
-  brief guide on installing Node.
+## Install Genkit dependencies {:#install}
 
-## Install Genkit {:#install}
+1. If you don't already have a Go project that you want to add AI features to,
+   create a new module in an empty directory:
 
-1. If you don't already have Node 20 or newer on your system, install it now.
+   ```posix-terminal
+   go mod init example/genkit-getting-started
+   ```
+
+1. Install the Genkit package and the `googleai` model plugin:
+
+   ```posix-terminal
+   go get "github.com/firebase/genkit/go"
+
+   go get "github.com/firebase/genkit/go/plugins/googleai"
+   ```
+
+## Configure your model API key
+
+For this guide, we’ll show you how to use the Gemini API, which provides a
+generous free tier and does not require a credit card to get started. To use the
+Gemini API, you'll need an API key. If you don't already have one, create a key
+in Google AI Studio.
+
+[Get an API key from Google AI Studio](https://makersuite.google.com/app/apikey)
+
+After you’ve created an API key, set the `GOOGLE_GENAI_API_KEY` environment
+variable to your key with the following command:
+
+```posix-terminal
+export GOOGLE_GENAI_API_KEY=
+```
+
+Note: While this tutorial uses the Gemini API from AI Studio, Genkit supports a
+wide variety of model providers including
+[Gemini from Vertex AI](/docs/genkit/plugins/vertex-ai#generative_ai_models),
+Anthropic’s Claude 3 models and Llama 3.1 through the
+[Vertex AI Model Garden](/docs/genkit/plugins/vertex-ai#anthropic_claude_3_on_vertex_ai_model_garden),
+open source models through
+[Ollama](/docs/genkit/plugins/ollama), and several other
+[community-supported providers](/docs/genkit/models#models-supported) like
+OpenAI and Cohere.
+
+## Make your first request
+
+Get started with Genkit in just a few lines of simple code.
-    - Recommendation: The [`nvm`](https://github.com/nvm-sh/nvm) and
+
+```go
+{% includecode github_path="firebase/genkit/go/internal/doc-snippets/init/main.go" region_tag="main" adjust_indentation="auto" %}
+```
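+
+To try it out, save this code as `main.go` at the root of the module you
+created earlier, then run it:
+
+```posix-terminal
+go run .
+```
+
+The model's menu suggestion is printed to your terminal.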
+
+## Optional: Install the Genkit CLI
+
+Genkit has a CLI and developer UI that help you locally test and debug your
+app. To install these tools:
+
+1. If you don't already have Node 20 or newer on your system, install it now.
+
+   The [`nvm`](https://github.com/nvm-sh/nvm) and
   [`nvm-windows`](https://github.com/coreybutler/nvm-windows) tools are a
   convenient way to install specific versions of Node if it's not already
   installed on your system. These tools install Node on a per-user basis, so you
@@ -53,116 +102,24 @@ To get started with Genkit, install the Genkit CLI and run
 1. Install the Genkit CLI by running the following command:
 
   ```posix-terminal
-   npm i -g genkit
+   npm i -g genkit-cli
   ```
 
   This command installs the Genkit CLI into your Node installation directory
   so that it can be used outside of a Node project.
 
-## Create and explore a sample project {:#explore}
-
-1. Create a new project directory:
-
-   ```posix-terminal
-   mkdir genkit-intro && cd genkit-intro
-   ```
-
-1. Initialize a Genkit project:
-
-   ```posix-terminal
-   genkit init
-   ```
-
-   1. Select `Go` as the runtime environment.
-
-   1. Select your model:
-
-      - {Gemini (Google AI)}
-
-        The simplest way to get started is with Google AI Gemini API. Make sure
-        it's
-        [available in your region](https://ai.google.dev/available_regions).
-
-        [Generate an API key](https://aistudio.google.com/app/apikey) for the
-        Gemini API using Google AI Studio. Then, set the `GOOGLE_GENAI_API_KEY`
-        environment variable to your key:
-
-        ```posix-terminal
-        export GOOGLE_GENAI_API_KEY=
-        ```
-
-      - {Gemini (Vertex AI)}
-
-        If the Google AI Gemini API is not available in your region, consider
-        using the Vertex AI API which also offers Gemini and other models. You
-        will need to have a billing-enabled Google Cloud project, enable AI
-        Platform API, and set some additional environment variables:
-
-        ```posix-terminal
-        gcloud services enable aiplatform.googleapis.com
-
-        export GCLOUD_PROJECT=
-
-        export GCLOUD_LOCATION=us-central1
-        ```
-
-        See [Vertex AI pricing](https://cloud.google.com/vertex-ai/generative-ai/pricing).
-
-   1. Specify anything for the module name. For example: `example/genkit-intro`
-
-   1. Choose default answers to the rest of the questions, which will
-      initialize your project folder with some sample code.
-
-   The `genkit init` command creates a sample Go module and installs the
-   required dependencies. The file `main.go` contains a single flow,
-   `menuSuggestionFlow`, that prompts an LLM to suggest an item for a
-   restaurant with a given theme.
-
-   This file looks something like the following (the plugin configuration steps
-   might look different if you selected Vertex AI):
-
-   ```golang
-   {% includecode github_path="firebase/genkit/go/internal/doc-snippets/init/main.go" region_tag="main" adjust_indentation="auto" %}
-   ```
-
-   As you build out your app's AI features with Genkit, you will likely create
-   flows with multiple steps such as input preprocessing, more sophisticated
-   prompt construction, integrating external information sources for
-   retrieval-augmented generation (RAG), and more.
-
-1. Now you can run and explore Genkit features and the sample project locally
-   on your machine. Download and start the Genkit Developer UI:
-
-   ```posix-terminal
-   genkit start
-   ```
-
-   Welcome to
-    Genkit Developer UI - - The Genkit Developer UI is now running on your machine. When you run models - or flows in the next step, your machine will perform the orchestration tasks - needed to get the steps of your flow working together; calls to external - services such as the Gemini API will continue to be made against live - servers. - - Also, because you are in a dev environment, Genkit will store traces and - flow state in local files. - -1. The Genkit Developer UI downloads and opens automatically when you run the - `genkit start` command. - - The Developer UI lets you see which flows you have defined and models you - configured, run them, and examine traces of previous runs. Try out some of - these features: - - - On the **Run** tab, you will see a list of all of the flows that you have - defined and any models that have been configured by plugins. - - Click **menuSuggestionFlow** and try running it with some input text (for - example, `"cat"`). If all goes well, you'll be rewarded with a menu - suggestion for a cat themed restaurant. - - - On the **Inspect** tab, you'll see a history of flow executions. For each - flow, you can see the parameters that were passed to the flow and a trace - of each step as they ran. +## Next steps + +Now that you’re set up to make model requests with Genkit, learn how to use more +Genkit capabilities to build your AI-powered apps and workflows. To get started +with additional Genkit capabilities, see the following guides: + +* [Generating content](/docs/genkit-go/models): Learn how to use Genkit’s unified + generation API to generate text and structured data from any supported + model. +* [Creating flows](/docs/genkit-go/flows): Learn how to use special Genkit + functions, called flows, that provide end-to-end observability for workflows + and rich debugging from Genkit tooling. +* [Prompting models](/docs/genkit-go/prompts): Learn how Genkit lets you treat + prompt templates as functions, encapsulating model configurations and + input/output schema. diff --git a/go/internal/doc-snippets/deploy/main.go b/go/internal/doc-snippets/deploy/main.go new file mode 100644 index 000000000..3a32c9284 --- /dev/null +++ b/go/internal/doc-snippets/deploy/main.go @@ -0,0 +1,57 @@ +// Copyright 2024 Google LLC +// +// Licensed under the Apache License, Version 2.0 (the "License"); +// you may not use this file except in compliance with the License. +// You may obtain a copy of the License at +// +// http://www.apache.org/licenses/LICENSE-2.0 +// +// Unless required by applicable law or agreed to in writing, software +// distributed under the License is distributed on an "AS IS" BASIS, +// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +// See the License for the specific language governing permissions and +// limitations under the License. 
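+
+// This file is the sample referenced by the Cloud Run and deployment guides:
+// it configures the Google AI plugin, defines a single flow
+// (menuSuggestionFlow), and starts a flow server that serves it.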
+
+// [START main]
+package main
+
+import (
+	"context"
+	"errors"
+	"fmt"
+	"log"
+
+	"github.com/firebase/genkit/go/ai"
+	"github.com/firebase/genkit/go/genkit"
+	"github.com/firebase/genkit/go/plugins/googleai"
+)
+
+func main() {
+	ctx := context.Background()
+
+	if err := googleai.Init(ctx, nil); err != nil {
+		log.Fatal(err)
+	}
+
+	// Define a flow that prompts an LLM to suggest a menu item for a
+	// restaurant with the given theme.
+	genkit.DefineFlow("menuSuggestionFlow", func(ctx context.Context, input string) (string, error) {
+		m := googleai.Model("gemini-1.5-flash")
+		if m == nil {
+			return "", errors.New("menuSuggestionFlow: failed to find model")
+		}
+
+		resp, err := ai.Generate(ctx, m,
+			ai.WithConfig(&ai.GenerationCommonConfig{Temperature: 1}),
+			ai.WithTextPrompt(fmt.Sprintf(`Suggest an item for the menu of a %s themed restaurant`, input)))
+		if err != nil {
+			return "", err
+		}
+
+		text := resp.Text()
+		return text, nil
+	})
+
+	// Initialize Genkit and start a flow server. This call must come last,
+	// after all of your plugin configuration and flow definitions.
+	if err := genkit.Init(ctx, nil); err != nil {
+		log.Fatal(err)
+	}
+}
+// [END main]
\ No newline at end of file
diff --git a/go/internal/doc-snippets/init/main.go b/go/internal/doc-snippets/init/main.go
index a66237a77..79af07c0b 100644
--- a/go/internal/doc-snippets/init/main.go
+++ b/go/internal/doc-snippets/init/main.go
@@ -17,13 +17,10 @@ package main
 
 import (
 	"context"
-	"errors"
-	"fmt"
 	"log"
 
 	// Import Genkit and the Google AI plugin
 	"github.com/firebase/genkit/go/ai"
-	"github.com/firebase/genkit/go/genkit"
 	"github.com/firebase/genkit/go/plugins/googleai"
 )
 
@@ -38,38 +35,26 @@ func main() {
 		log.Fatal(err)
 	}
 
-	// Define a simple flow that prompts an LLM to generate menu suggestions.
-	genkit.DefineFlow("menuSuggestionFlow", func(ctx context.Context, input string) (string, error) {
-		// The Google AI API provides access to several generative models. Here,
-		// we specify gemini-1.5-flash.
-		m := googleai.Model("gemini-1.5-flash")
-		if m == nil {
-			return "", errors.New("menuSuggestionFlow: failed to find model")
-		}
-
-		// Construct a request and send it to the model API (Google AI).
-		resp, err := ai.Generate(ctx, m,
-			ai.WithConfig(&ai.GenerationCommonConfig{Temperature: 1}),
-			ai.WithTextPrompt(fmt.Sprintf(`Suggest an item for the menu of a %s themed restaurant`, input)))
-		if err != nil {
-			return "", err
-		}
-
-		// Handle the response from the model API. In this sample, we just
-		// convert it to a string. but more complicated flows might coerce the
-		// response into structured output or chain the response into another
-		// LLM call.
-		text := resp.Text()
-		return text, nil
-	})
+	// The Google AI API provides access to several generative models. Here,
+	// we specify gemini-1.5-flash.
+	m := googleai.Model("gemini-1.5-flash")
+	if m == nil {
+		log.Fatal("Failed to find model")
+	}
 
-	// Initialize Genkit and start a flow server. This call must come last,
-	// after all of your plug-in configuration and flow definitions. When you
-	// pass a nil configuration to Init, Genkit starts a local flow server,
-	// which you can interact with using the developer UI.
-	if err := genkit.Init(ctx, nil); err != nil {
-		log.Fatal(err)
+	// Construct a request and send it to the model API (Google AI).
+	resp, err := ai.Generate(ctx, m,
+		ai.WithConfig(&ai.GenerationCommonConfig{Temperature: 1}),
+		ai.WithTextPrompt(`Suggest an item for the menu of a pirate themed restaurant`))
+	if err != nil {
+		log.Fatal(err)
 	}
-}
+	// Handle the response from the model API. In this sample, we just
+	// convert it to a string, but more complicated flows might coerce the
+	// response into structured output or chain the response into another
+	// LLM call.
+ text := resp.Text() + println(text) +} // [END main]