Replies: 1 comment
You could achieve that by using a
-
We want to use the Kong AI Gateway functionality through the ai-proxy plugin for our services. We have created the KongPlugin, a Kubernetes Service, and an Ingress resource. The plugin is attached to the Service using the konghq.com/plugin annotation.

✅ The AI proxy works correctly when sending requests to the ingress path in the OpenAI format.
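For reference, a minimal sketch of roughly what the KongPlugin and Service look like. The names, namespace, and model settings are placeholders rather than our exact values, and the ai-proxy config fields should be checked against the plugin version in use:

```yaml
# KongPlugin configuring the ai-proxy plugin (values are illustrative placeholders)
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
  name: ai-proxy
  namespace: demo
plugin: ai-proxy
config:
  route_type: llm/v1/chat
  auth:
    header_name: Authorization
    header_value: Bearer <OPENAI_API_KEY>   # secret handling omitted for brevity
  model:
    provider: openai
    name: gpt-4o
---
# Service fronting the workload; the annotation attaches the KongPlugin above.
# Note: the Kong Ingress Controller documents the annotation key as
# konghq.com/plugins (plural), taking a comma-separated list of KongPlugin names.
apiVersion: v1
kind: Service
metadata:
  name: service-name
  namespace: demo
  annotations:
    konghq.com/plugins: ai-proxy
spec:
  selector:
    app: demo-app
  ports:
    - port: 80
      targetPort: 8080
```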
🚫 It only started working after adding an Ingress resource. Without the Ingress, the kong-ingress-controller never detects or configures the service. Additionally, internal requests sent directly to service-name.namespace.svc.cluster.local are not proxied to the AI endpoint; instead, they attempt to reach the real upstream endpoint.

How can we expose the Kong AI Gateway to our internal services without making it accessible externally?
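For completeness, the Ingress we had to add looks roughly like the sketch below; the path, namespace, and ingress class name are placeholders rather than our real values:

```yaml
# Ingress that makes the kong-ingress-controller configure a route for the
# Service above. Traffic that enters through the Kong proxy and matches this
# path gets the ai-proxy plugin applied; traffic sent straight to the Service's
# cluster DNS name never passes through Kong.
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: ai-proxy-ingress
  namespace: demo
spec:
  ingressClassName: kong
  rules:
    - http:
        paths:
          - path: /openai
            pathType: Prefix
            backend:
              service:
                name: service-name
                port:
                  number: 80
```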