A simple AI agent built with the GitHub Copilot SDK for Java, running as an Azure Function.
Looking for TypeScript, Python, or C#?
- Java 17+
- Maven 3.8+
- Azure Functions Core Tools
- Azure Developer CLI (azd) (only needed for deploying Microsoft Foundry resources)
- Access to an AI model via one of:
- GitHub Copilot subscription — models are available automatically
- Bring Your Own Key (BYOK) — use an API key from Microsoft Foundry (see BYOK docs)
Want to use your own models? See Deploy Microsoft Foundry Resources below to provision a Microsoft Foundry project instead of using GitHub Copilot models.
- Clone the repository.
- Build the project:

  ```bash
  mvn clean package -DskipTests
  ```

- Run the function locally:

  ```bash
  func start
  ```

- Test the agent (in a new terminal):

  ```bash
  # Interactive chat client
  mvn exec:java -Dexec.mainClass="com.function.Chat"

  # Or use curl directly
  curl -X POST http://localhost:7071/api/ask -d "what are the laws"
  ```
To chat with a deployed instance, grab the URL and function key from your azd environment:

```bash
export AGENT_URL=$(azd env get-value SERVICE_API_URI)
export FUNCTION_KEY=$(az functionapp keys list \
  -n $(azd env get-value AZURE_FUNCTION_APP_NAME) \
  -g $(azd env get-value RESOURCE_GROUP) \
  --query "functionKeys.default" -o tsv)

mvn exec:java -Dexec.mainClass="com.function.Chat"
```
The agent logic is in `src/main/java/com/function/Ask.java`. It creates a `CopilotClient`, configures a session with a system message (Asimov's Three Laws of Robotics), and exposes an HTTP endpoint (`/api/ask`) that accepts a prompt and returns the agent's response.

`src/main/java/com/function/Chat.java` is a lightweight console client that POSTs messages to the function in a loop, giving you an interactive chat experience. It defaults to `http://localhost:7071` but can be pointed at a deployed instance via the `AGENT_URL` environment variable.
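Because the console client is plain HTTP, the idea can be sketched in a few lines of standard Java. The class below is a hypothetical, single-shot variant of what `Chat.java` does; the class name `MiniChat` and the plain-text request body are assumptions for illustration, not the sample's actual code:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical single-shot variant of the sample's console client:
// send one prompt to the function's /api/ask endpoint and print the reply.
class MiniChat {
    // Resolve the base URL the way the sample describes: AGENT_URL if set,
    // otherwise the local Azure Functions host.
    static String baseUrl() {
        String url = System.getenv("AGENT_URL");
        return (url == null || url.isBlank()) ? "http://localhost:7071" : url;
    }

    // Build the POST request for a single prompt (plain-text body, as with curl).
    static HttpRequest buildRequest(String prompt) {
        return HttpRequest.newBuilder()
                .uri(URI.create(baseUrl() + "/api/ask"))
                .POST(HttpRequest.BodyPublishers.ofString(prompt))
                .build();
    }

    public static void main(String[] args) throws Exception {
        if (args.length == 0) {
            System.out.println("usage: java MiniChat \"<prompt>\"");
            return;
        }
        HttpClient client = HttpClient.newHttpClient();
        HttpResponse<String> resp = client.send(
                buildRequest(String.join(" ", args)),
                HttpResponse.BodyHandlers.ofString());
        System.out.println(resp.body());
    }
}
```

The real `Chat.java` wraps the same request in a read-print loop; a deployed instance would additionally need the function key sent with each request (for Azure Functions, typically the `x-functions-key` header).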
If you prefer to use your own models via BYOK and don't already have a Microsoft Foundry project with a model deployed:
```bash
azd auth login
azd up
```

This provisions all resources and configures local development automatically.
- Microsoft Foundry project with GPT-5-mini model
- Azure Functions app (Java 17, Flex Consumption plan)
- Storage, monitoring, and all necessary RBAC role assignments
- Optional: Search for vector store (disabled by default)
- Optional: Cosmos DB for agent thread storage (disabled by default)
By default the agent uses GitHub Copilot's models. To use your own model from Microsoft Foundry instead, set these environment variables:
```bash
export AZURE_OPENAI_ENDPOINT="https://<your-ai-services>.openai.azure.com/"
export AZURE_OPENAI_API_KEY="<your-api-key>"
export AZURE_OPENAI_MODEL="gpt-5-mini"  # optional, defaults to gpt-5-mini
```

Getting these values:
- If you ran `azd up`, the endpoint is already in your environment: run `azd env get-values | grep AZURE_OPENAI_ENDPOINT`
- For the API key, go to Azure Portal → your AI Services resource → Keys and Endpoint → select the Azure OpenAI tab
- Or find both in the Microsoft Foundry portal under your project settings
See the BYOK docs for details.
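To illustrate how the three variables above fit together, here is a hedged sketch of env-var handling in Java. Everything except the documented `gpt-5-mini` default is an assumption: the class name `ByokConfig`, the `fromEnv` helper, and the `byokEnabled` check are invented for illustration and are not the sample's actual code.

```java
import java.util.function.UnaryOperator;

// Hypothetical helper illustrating the BYOK environment variables above.
// Only the gpt-5-mini default is documented behavior; the class is a sketch.
final class ByokConfig {
    final String endpoint;
    final String apiKey;
    final String model;

    ByokConfig(String endpoint, String apiKey, String model) {
        this.endpoint = endpoint;
        this.apiKey = apiKey;
        this.model = model;
    }

    // Read the three variables, applying the documented model default.
    // Taking a lookup function instead of calling System.getenv directly
    // keeps the logic testable.
    static ByokConfig fromEnv(UnaryOperator<String> getenv) {
        String model = getenv.apply("AZURE_OPENAI_MODEL");
        return new ByokConfig(
                getenv.apply("AZURE_OPENAI_ENDPOINT"),
                getenv.apply("AZURE_OPENAI_API_KEY"),
                (model == null || model.isBlank()) ? "gpt-5-mini" : model);
    }

    // True when both endpoint and key are present, i.e. BYOK is configured
    // instead of relying on GitHub Copilot's models.
    boolean byokEnabled() {
        return endpoint != null && !endpoint.isBlank()
                && apiKey != null && !apiKey.isBlank();
    }
}
```

In production code this would feed whatever client configuration the SDK expects; see the BYOK docs for the real wiring.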