Though OpenAI provides a Python SDK, some developers still prefer to call the OpenAI API directly over HTTP. This section guides you through using the local gateway to trace such an LLM application.
In this scenario, the way you interact with the OpenAI API may look like this snippet.
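A minimal sketch of such a direct call, assuming an Azure OpenAI resource (the deployment name `gpt-35-turbo` and the api-version are placeholders; substitute your own):

```python
import os

import requests  # direct HTTP call, no SDK


def build_chat_request(endpoint: str, deployment: str, api_version: str,
                       api_key: str, prompt: str) -> dict:
    """Assemble the pieces of a direct chat-completions request."""
    return {
        "url": f"{endpoint}/openai/deployments/{deployment}/chat/completions",
        "params": {"api-version": api_version},
        "headers": {"api-key": api_key},
        "json": {"messages": [{"role": "user", "content": prompt}]},
    }


# req = build_chat_request(os.environ["AZURE_OPENAI_ENDPOINT"], "gpt-35-turbo",
#                          "2023-12-01-preview",
#                          os.environ["AZURE_OPENAI_API_KEY"], "Wakanda Forever")
# response = requests.post(req["url"], params=req["params"],
#                          headers=req["headers"], json=req["json"])
# print(response.json()["choices"][0]["message"]["content"])
```

Because the request goes straight to the endpoint rather than through the SDK, there is no built-in hook for tracing, which is where the local gateway comes in.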
The linked page offers several helpful installation options; follow one to install OpenSSL on Windows, then execute the following commands in a terminal to generate a self-signed certificate:
cd <path/to/repository>/src/local-gateway/
openssl req -x509 -nodes -newkey rsa:2048 -keyout key.pem -out cert.pem -days 365 -config openssl.cnf
# terminal #0 - invoke gateway service
cd <path/to/repository>/src/local-gateway/
python .\run.py
# another terminal #1 - setup interception
cd <path/to/repository>/src/local-gateway/
mitmproxy --mode regular -s redirect-script.py --ssl-insecure
# another terminal #2 - setup system proxy
cd <path/to/repository>/scripts/
.\interception-win.ps1 start
Add the mitmproxy CA certificate to certifi, so that Python `requests` will trust mitmproxy and you don't need to update your code. You will find the mitmproxy CA certificate in the `~/.mitmproxy/` directory, with the file name `mitmproxy-ca-cert.cer`.
# locate `cacert.pem`
import certifi
print(certifi.where())
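One way to add the certificate, sketched below under the assumption that `mitmproxy-ca-cert.cer` is PEM-encoded (mitmproxy writes PEM contents under the `.cer` extension), is to append it to the bundle certifi points at:

```python
from pathlib import Path


def append_cert(bundle_path: str, cert_path: Path) -> None:
    """Append a PEM certificate to a CA bundle, skipping duplicates."""
    cert = cert_path.read_text()
    bundle = Path(bundle_path)
    if cert not in bundle.read_text():
        with bundle.open("a") as f:
            f.write("\n" + cert)


# Usage (after mitmproxy has generated its CA certificate):
#   import certifi
#   append_cert(certifi.where(),
#               Path.home() / ".mitmproxy" / "mitmproxy-ca-cert.cer")
```

Note that `cacert.pem` is overwritten whenever certifi is upgraded, so you may need to repeat this step after updating the package.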
# there is one prepared script in the repository
cd <path/to/repository>/scripts/
python .\aoai-requests.py <prompt> # default prompt is "Wakanda Forever"
Traces are stored by default in `<path/to/repository>/src/local-gateway/traces.json`. You can customize the location via the environment variable `GATEWAY_TRACE_DESTINATION`, for example in PowerShell:
$env:GATEWAY_TRACE_DESTINATION='C:\Users\<your-alias>\traces.json'
then invoke the gateway service (terminal #0).
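To inspect captured traces afterwards, a small loader like the sketch below can be handy. The file layout is an assumption here (a single JSON array with one entry per request); adjust the parsing to whatever the gateway actually writes:

```python
import json
import os


def load_traces(path: str) -> list:
    """Load captured traces, assuming the file holds a JSON array."""
    with open(path) as f:
        return json.load(f)


# path = os.environ.get("GATEWAY_TRACE_DESTINATION",
#                       "src/local-gateway/traces.json")
# for trace in load_traces(path):
#     print(trace)
```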
cd <path/to/repository>/scripts/
.\interception-win.ps1 stop
The OpenAI Python SDK honors the environment variable `OPENAI_BASE_URL` when issuing requests (see the implementation in version 1.37.1), so if you are using the OpenAI Python SDK, it is recommended to leverage this feature - TBD.
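For example, pointing the SDK at the gateway can be as simple as setting the variable before constructing a client. The address below is a placeholder; use whatever address your gateway actually listens on:

```python
import os

# Point the OpenAI SDK at the local gateway instead of api.openai.com.
# "http://localhost:8000/v1" is a placeholder address, not the gateway's
# documented default.
os.environ["OPENAI_BASE_URL"] = "http://localhost:8000/v1"

# Any client constructed after this point, e.g. OpenAI(), reads the
# variable and sends its requests through the gateway.
```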
Alternatively, you can manually configure the client to skip certificate verification, and everything will work as in the previous scenario:
import os

from openai import AzureOpenAI, DefaultHttpxClient

client = AzureOpenAI(
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    api_version="2023-12-01-preview",
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    http_client=DefaultHttpxClient(verify=False),  # this is the line you need to add
)