zhengfeiwang/local-gateway

A local gateway service to trace your LLM application.
How to use the local gateway on Windows

Scenario: direct network requests

Although OpenAI provides a Python SDK, some developers still prefer to call the OpenAI API directly over HTTP. This section walks you through using the local gateway to trace such an LLM application.

In this scenario, your interaction with the OpenAI API may look like the sketch below.
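As a rough illustration (not the exact script shipped in this repository), a direct call to the Azure OpenAI chat completions endpoint with the requests library might look like this; the deployment name and API version are placeholders you would replace with your own:

import os
import requests

# Placeholders - adjust to your Azure OpenAI resource and deployment.
endpoint = os.getenv("AZURE_OPENAI_ENDPOINT")  # e.g. https://<resource>.openai.azure.com
deployment = "<your-deployment-name>"
api_version = "2023-12-01-preview"

url = f"{endpoint}/openai/deployments/{deployment}/chat/completions?api-version={api_version}"
headers = {
    "api-key": os.getenv("AZURE_OPENAI_API_KEY"),
    "Content-Type": "application/json",
}
payload = {"messages": [{"role": "user", "content": "Wakanda Forever"}]}

response = requests.post(url, headers=headers, json=payload)
print(response.json())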

1. Install OpenSSL and generate a self-signed certificate

Install OpenSSL on Windows first (pre-built Windows binaries are available from several third-party distributions), then run the following commands in a terminal to generate a self-signed certificate:

cd <path/to/repository>/src/local-gateway/
openssl req -x509 -nodes -newkey rsa:2048 -keyout key.pem -out cert.pem -days 365 -config openssl.cnf

2. Start the gateway service, set up interception and the system proxy

# terminal #0 - start the gateway service
cd <path/to/repository>/src/local-gateway/
python .\run.py
# another terminal #1 - set up interception
cd <path/to/repository>/src/local-gateway/
mitmproxy --mode regular -s redirect-script.py --ssl-insecure
# another terminal #2 - set up the system proxy
cd <path/to/repository>/scripts/
.\interception-win.ps1 start
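To double-check that Python will pick up the system proxy set by the script above, you can print the proxy settings it reads from the Windows registry; this is just a sanity check, not part of the repository's scripts:

import urllib.request

# On Windows this reads the Internet Settings registry keys;
# it should list the proxy configured by interception-win.ps1.
print(urllib.request.getproxies())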

3. Trust mitmproxy in certifi

Add the mitmproxy CA certificate to certifi so that Python requests trust mitmproxy and you do not need to change your code. You will find the mitmproxy CA certificate in the ~/.mitmproxy/ directory, under the file name mitmproxy-ca-cert.cer.

# locate `cacert.pem`
import certifi
print(certifi.where())
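One way to append the mitmproxy CA certificate to certifi's bundle is a few lines of Python; this is a sketch rather than a script shipped with the repository, and it assumes mitmproxy has already been started once so the certificate exists:

import certifi
from pathlib import Path

# Default mitmproxy location; the .cer file is PEM-encoded, so it can be
# appended to certifi's cacert.pem as-is.
ca_cert = Path.home() / ".mitmproxy" / "mitmproxy-ca-cert.cer"

# Append the certificate so libraries that use certifi trust mitmproxy.
with open(certifi.where(), "a") as bundle:
    bundle.write("\n" + ca_cert.read_text())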

4. Run your LLM application/script...

# there is one prepared script in the repository
cd <path/to/repository>/scripts/
python .\aoai-requests.py <prompt>  # default prompt is "Wakanda Forever"

5. Check traces

By default, traces are stored in <path/to/repository>/src/local-gateway/traces.json.

You can customize the location via environment variable GATEWAY_TRACE_DESTINATION, for example in PowerShell:

$env:GATEWAY_TRACE_DESTINATION='C:\Users\<your-alias>\traces.json'

then invoke the gateway service (terminal #0).
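If you want a quick look at the collected traces from Python, something like the following works; it assumes traces.json contains either a single JSON document or one JSON object per line (adjust to whatever the gateway actually writes):

import json

path = r"<path/to/repository>/src/local-gateway/traces.json"
with open(path) as f:
    text = f.read()

try:
    traces = json.loads(text)  # whole file is one JSON document
except json.JSONDecodeError:
    # fallback: treat the file as JSON Lines
    traces = [json.loads(line) for line in text.splitlines() if line.strip()]

print(f"{len(traces)} trace record(s) collected")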

6. Stop proxy

cd <path/to/repository>/scripts/
.\interception-win.ps1 stop

Scenario: use OpenAI SDK

The OpenAI Python SDK honors the environment variable OPENAI_BASE_URL when making requests (see the implementation in version 1.37.1), so if you are using the OpenAI Python SDK, it is recommended to leverage this feature - TBD.
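Until that section is written, a rough sketch of the idea: point OPENAI_BASE_URL at the local gateway before constructing the client. The address below is a placeholder; use whatever host and port run.py actually listens on:

import os
from openai import OpenAI

# Placeholder address - replace with the local gateway's actual host/port.
os.environ["OPENAI_BASE_URL"] = "http://localhost:<gateway-port>/v1"

client = OpenAI()  # picks up OPENAI_BASE_URL (and OPENAI_API_KEY) from the environment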

Alternatively, you can manually configure the client to skip certificate verification, and everything works as in the previous scenario:

import os

from openai import AzureOpenAI, DefaultHttpxClient

client = AzureOpenAI(
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    api_version="2023-12-01-preview",
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    http_client=DefaultHttpxClient(verify=False),  # this is the line you need to add
)
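With the client configured this way, the rest of your code stays the same; for example (the deployment name is a placeholder):

response = client.chat.completions.create(
    model="<your-deployment-name>",  # Azure OpenAI takes the deployment name as the model
    messages=[{"role": "user", "content": "Wakanda Forever"}],
)
print(response.choices[0].message.content)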
