# ============================================================================
# Wardrobe Environment Configuration
# ============================================================================
# Copy this file to .env and update the values for your environment
# For production, generate secure secrets: openssl rand -hex 32
# ============================================================================
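#
# For example, to copy this file and fill in generated secrets (a sketch that
# assumes a POSIX shell with openssl and GNU sed available):
#   cp .env.example .env
#   sed -i "s|^SECRET_KEY=.*|SECRET_KEY=$(openssl rand -hex 32)|" .env
#   sed -i "s|^NEXTAUTH_SECRET=.*|NEXTAUTH_SECRET=$(openssl rand -hex 32)|" .env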

# Database Configuration
# ============================================================================
POSTGRES_USER=wardrobe
POSTGRES_PASSWORD=change-me-use-secure-password
POSTGRES_DB=wardrobe
POSTGRES_PORT=5432
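#
# To sanity-check these settings once the stack is up (a sketch assuming the
# compose service is named "postgres"; adjust names to your setup):
#   docker compose exec postgres psql -U wardrobe -d wardrobe -c "SELECT 1;"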

# Redis Configuration
# ============================================================================
REDIS_PORT=6379

# Backend Configuration
# ============================================================================
BACKEND_PORT=8000
# IMPORTANT: Generate a secure secret for production
SECRET_KEY=change-me-in-production-use-openssl-rand-hex-32
# Set DEBUG=true to enable dev credential login (no OIDC required)
# NOTE: Dev mode also requires SECRET_KEY=change-me-in-production (the docker-compose default).
# If you override SECRET_KEY, dev login will not activate.
# DEBUG=true

# Frontend Configuration
# ============================================================================
FRONTEND_PORT=3000
NEXTAUTH_URL=http://localhost:3000
# IMPORTANT: Generate a secure secret for production
NEXTAUTH_SECRET=change-me-in-production-use-openssl-rand-hex-32

# Authentication (OIDC Provider)
# ============================================================================
# Choose ONE of the following authentication methods, or leave all of them
# empty to use development credential-based login (the default):

# Option 1: OIDC
# OIDC_ISSUER_URL=https://auth.example.com
# OIDC_CLIENT_ID=wardrobe
# OIDC_CLIENT_SECRET=your-client-secret
# OIDC_END_SESSION_URL=https://auth.example.com/api/oidc/end-session
# Set to true if your OIDC provider uses a self-signed certificate:
# OIDC_SKIP_SSL_VERIFY=false

# Option 2: Forward Auth (TinyAuth, Authelia, etc.)
# AUTH_TRUST_PROXY=true

# DNS Configuration
# ============================================================================
# If your OIDC provider or other services use a local DNS hostname that
# Docker containers can't resolve, set this to your local DNS server IP:
# LOCAL_DNS=10.0.0.1
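#
# To check whether a hostname resolves from inside a container (hypothetical
# hostname and DNS server IP below; substitute your own):
#   docker run --rm --dns 10.0.0.1 alpine nslookup auth.example.com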

# AI Service Configuration
# ============================================================================
# REQUIRED: Choose ONE of the following AI providers
# The app needs a vision model (for analyzing clothing images) and a text model (for recommendations)
# ============================================================================

# Option 1: Ollama (Recommended - FREE, runs locally, no API key needed)
# ============================================================================
# 1. Install Ollama from https://ollama.ai
# 2. Pull required models:
# ollama pull llava:7b # Vision model for image analysis
# ollama pull gemma3 # Text model for recommendations
# (Alternative text models: llama3, qwen2.5, mistral)
# 3. Use this configuration:
AI_BASE_URL=http://host.docker.internal:11434/v1
AI_API_KEY=not-needed
AI_VISION_MODEL=llava:7b
AI_TEXT_MODEL=gemma3:latest
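#
# To verify that Ollama is reachable and the models are pulled (assumes
# Ollama is listening on its default port 11434 on the host):
#   curl http://localhost:11434/v1/models
#   ollama list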

# Option 2: OpenAI (Paid API - requires API key)
# ============================================================================
# Uncomment these lines and comment out the Ollama config above:
# AI_BASE_URL=https://api.openai.com/v1
# AI_API_KEY=sk-your-api-key-here
# AI_VISION_MODEL=gpt-4o
# AI_TEXT_MODEL=gpt-4o

# Option 3: LocalAI (Self-hosted OpenAI alternative)
# ============================================================================
# AI_BASE_URL=http://localai:8080/v1
# AI_API_KEY=not-needed
# AI_VISION_MODEL=gpt-4-vision-preview
# AI_TEXT_MODEL=gpt-3.5-turbo

# Option 4: Azure OpenAI
# ============================================================================
# AI_BASE_URL=https://your-resource.openai.azure.com/openai/deployments/your-deployment/
# AI_API_KEY=your-azure-api-key
# AI_VISION_MODEL=gpt-4o
# AI_TEXT_MODEL=gpt-4o

# Notification Providers (all optional)
# ============================================================================

# ntfy.sh push notifications
NTFY_SERVER=https://ntfy.sh
NTFY_TOPIC=
NTFY_TOKEN=
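#
# To send a test notification once NTFY_TOPIC is set (the Authorization
# header is only needed for access-protected topics):
#   curl -H "Authorization: Bearer $NTFY_TOKEN" -d "wardrobe test" "$NTFY_SERVER/$NTFY_TOPIC"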

# Mattermost webhook
MATTERMOST_WEBHOOK_URL=

# Email notifications
SMTP_HOST=
SMTP_PORT=587
SMTP_USER=
SMTP_PASSWORD=