This guide walks you through setting up the AI Proxy Advanced plugin with Gemini.
For all providers, the Kong AI Proxy Advanced plugin attaches to route entities.
Prerequisites
- Kong Gateway is installed and running
- Create or retrieve an API key on the Google Cloud API Credentials Page to access Google’s AI services
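Optionally, you can confirm the key works before wiring it into Kong Gateway by calling the Gemini REST API directly. This is only a sanity check, assuming the public generativelanguage.googleapis.com generateContent endpoint and the gemini-1.5-flash model used later in this guide; replace <GEMINI_API_TOKEN> with your key:

# Optional sanity check: call the Gemini API directly with your key.
curl -s -X POST \
  "https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent?key=<GEMINI_API_TOKEN>" \
  -H "Content-Type: application/json" \
  --data '{ "contents": [ { "parts": [ { "text": "Say hello" } ] } ] }'

A successful response confirms the key is valid; an error here means the key or project setup needs fixing before you continue.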
Configure the AI Proxy Advanced plugin
- Create a service in Kong Gateway that will represent the Google Gemini API:
curl -i -X POST http://localhost:8001/services \
  --data "name=gemini-service" \
  --data "url=https://generativelanguage.googleapis.com"
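If you want to confirm the service was created before moving on, you can read it back from the Admin API (jq is used here only for readability, as it is later in this guide):

curl -s http://localhost:8001/services/gemini-service | jq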
- Create a route that maps to the service you defined:
curl -i -X POST http://localhost:8001/routes \
  --data "paths[]=/gemini" \
  --data "service.id=$(curl -s http://localhost:8001/services/gemini-service | jq -r '.id')"
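To double-check the mapping, you can list the routes attached to the service; the new route should appear with the /gemini path:

curl -s http://localhost:8001/services/gemini-service/routes | jq '.data'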
- Use the Kong Admin API to configure the AI Proxy Advanced plugin to route requests to Google Gemini:
curl -i -X POST http://localhost:8001/services/gemini-service/plugins \
  --header "accept: application/json" \
  --header "Content-Type: application/json" \
  --data '
  {
    "name": "ai-proxy-advanced",
    "config": {
      "targets": [
        {
          "route_type": "llm/v1/chat",
          "auth": {
            "param_name": "key",
            "param_value": "<GEMINI_API_TOKEN>",
            "param_location": "query"
          },
          "model": {
            "provider": "gemini",
            "name": "gemini-1.5-flash"
          }
        }
      ]
    }
  }
  '
Be sure to replace GEMINI_API_TOKEN with your API token.
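You can confirm the plugin is now enabled on the service by listing its plugins; the output should include an ai-proxy-advanced entry with the targets you configured:

curl -s http://localhost:8001/services/gemini-service/plugins | jq '.data[] | {name, enabled}'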
Test the configuration
Make an llm/v1/chat type request to test your new endpoint:
curl -X POST http://localhost:8000/gemini \
-H 'Content-Type: application/json' \
--data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'