Compare commits


81 Commits

Author SHA1 Message Date
9b554644bc fix: change PullImage option to false in deploy workflow
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 7s
2026-03-28 16:15:13 -03:00
f71d11400a feat: add a playful line to the dashboard version display
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 14s
2026-03-28 16:13:31 -03:00
a708b5a6f5 fix: comment out Docker tag command in build workflow
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 6s
2026-03-28 16:07:40 -03:00
a4f805a122 fix: correct syntax for Docker tag command in build workflow
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 6s
2026-03-28 16:04:52 -03:00
e63db79119 fix: correct syntax for Docker build and tag commands in build workflow
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 6s
2026-03-28 16:04:04 -03:00
165c141560 feat: remove Gitea registry login step from build workflow
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 6s
2026-03-28 16:01:00 -03:00
968fd07385 feat: simplify registry references in build and docker-compose files
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 15s
2026-03-28 15:59:47 -03:00
2c29a4bce7 feat: update image reference for condado-newsletter service in docker-compose
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 10s
2026-03-28 15:58:09 -03:00
58c3a54d4a feat: add generation source handling for task creation and updates
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 50s
2026-03-28 15:35:49 -03:00
ea54858165 feat: add required environment variable checks for stack deployment
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 7s
2026-03-28 15:01:07 -03:00
fda6fc77ee feat: implement orphan container cleanup for idempotent stack deployment
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 7s
2026-03-28 14:56:07 -03:00
ff93afd075 feat: add debug information and response headers to Portainer stack deployment
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 7s
2026-03-28 14:54:22 -03:00
af5def1e71 feat: enhance build and deploy workflows with debug information and environment variable handling
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 7s
2026-03-28 14:52:50 -03:00
0f9d35311f fix: remove image push steps from build workflow
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 7s
2026-03-28 14:49:07 -03:00
dc8f182c56 feat: add Gitea registry login and update Docker image build process
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 1m2s
2026-03-28 14:47:34 -03:00
b3f6d0ef17 fix: remove unnecessary env_file entries and correct OPENAI_API_KEY format
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 32s
2026-03-28 14:38:49 -03:00
6606c27323 fix: update OPENAI_API_KEY environment variable to include .env suffix
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 11s
2026-03-28 14:38:02 -03:00
2c5c299aaa fix: standardize PostgreSQL user and password environment variables in Docker Compose
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 11s
2026-03-28 14:36:25 -03:00
167cfdb742 fix: update PostgreSQL user and password environment variables for consistency
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 13s
2026-03-28 14:31:23 -03:00
b7274c4b6d fix: update healthcheck command for PostgreSQL service to use localhost
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 13s
2026-03-28 14:18:49 -03:00
5499a8a585 fix: update PostgreSQL healthcheck command and ensure default network is specified
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 15s
2026-03-28 14:10:37 -03:00
1c4a35eea8 fix: add 'default' network to condado-newsletter service in Docker Compose
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 8s
2026-03-28 14:04:02 -03:00
363bfcb135 fix: add external network configuration for condado-newsletter service in Docker Compose
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 17s
2026-03-28 14:01:30 -03:00
eaf588f7d5 fix: correct spelling of 'postgres' in Docker Compose and entrypoint scripts
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 15s
2026-03-28 13:54:14 -03:00
621bb1773c fix: update healthcheck command for PostgreSQL service in Docker Compose
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 31s
2026-03-28 13:46:34 -03:00
942da74778 fix: remove PostgreSQL initialization from entrypoint and update Docker configuration for external database
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 2m23s
2026-03-28 13:40:24 -03:00
52ea621145 fix: update Docker build command to use correct image name and remove unnecessary tagging steps
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 11s
2026-03-28 13:28:10 -03:00
f6d37bb1f2 fix: update Docker image source for condado-newsletter service
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 22s
2026-03-28 13:19:59 -03:00
2e2e75fe87 fix: update JwtService to handle default expiration and add tests for token generation
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 39s
2026-03-28 03:40:03 -03:00
8f508034d5 fix: update Docker configuration for image source and enhance logging in supervisord
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 14s
2026-03-28 03:32:08 -03:00
7108aff54d fix: add access and error log configuration for Nginx
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 39s
2026-03-28 03:26:50 -03:00
b0a4278699 fix: update stack deployment to use production Docker Compose file 2026-03-28 03:25:35 -03:00
73c51e514c fix: update Docker Compose configuration for service names and database connection
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 7s
2026-03-28 03:24:00 -03:00
596a17b252 fix: update supervisord configuration to log output to stdout
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 9s
2026-03-28 03:14:15 -03:00
5ff28fa3d4 fix: update homepage logo and href in Docker Compose configuration
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 12s
2026-03-28 03:09:35 -03:00
a672c9efed fix: correct stack name in Portainer deployment configuration
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 35s
2026-03-28 03:07:44 -03:00
bfe8965c06 fix: enhance Portainer API interaction with DNS fallback and improved error handling
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 11s
2026-03-28 03:06:37 -03:00
c72595d396 fix: improve Portainer deployment script with enhanced logging and error handling
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 11s
2026-03-28 03:05:05 -03:00
51b596c7a5 fix: update Portainer API URL and correct image reference in Docker Compose
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 12s
2026-03-28 03:03:33 -03:00
e4e2ae3479 fix: sanitize Portainer API stack response output for improved logging
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 12s
2026-03-28 02:58:37 -03:00
808c0d0a22 fix: update Portainer API URL to use the correct lab address
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 11s
2026-03-28 02:55:59 -03:00
e3938d2351 fix: add network info logging before Portainer deployment
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 11s
2026-03-28 02:53:05 -03:00
8a04363b11 fix: enhance Portainer API deployment with detailed error handling and logging
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 11s
2026-03-28 02:49:30 -03:00
1038f40721 fix: update Portainer API URL to include port number for deployment
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 7s
2026-03-28 02:30:01 -03:00
4fd90b2497 fix: streamline deployment process by removing Gitea registry login steps and enhancing Portainer API integration
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 16s
2026-03-28 02:21:36 -03:00
cb74fdef7b fix: remove Gitea container registry login and push steps from build workflow
All checks were successful
Build And Publish Production Image / Build And Publish Production Image (push) Successful in 7s
2026-03-28 01:34:08 -03:00
0ed6f3824a fix: update build workflow to combine tagging and pushing of registry images
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 17s
2026-03-28 01:27:09 -03:00
572dc49bc9 fix: update Docker image tag format in build workflow
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 1m37s
2026-03-28 00:49:03 -03:00
29627a0062 fix: correct syntax for Docker image tags in build workflow
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 16s
2026-03-28 00:42:16 -03:00
776941b323 fix: update Docker Hub login step to be optional and clean up registry login process
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 1m28s
2026-03-28 00:40:15 -03:00
5f8834d0d4 fix: update Docker registry configuration and login endpoint in build workflow
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 1m18s
2026-03-28 00:31:33 -03:00
854fabd874 fix: update logging of Docker registry credentials to use base64 encoding
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 22s
2026-03-27 22:28:45 -03:00
000bc0cc36 fix: update Docker registry credentials logging to use environment variables
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 12s
2026-03-27 22:27:51 -03:00
4d27a256d2 fix: update logging of Docker registry credentials to use secrets
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 21s
2026-03-27 22:26:12 -03:00
08bfced7ce fix: add logging for Docker registry login credentials in build workflow
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 7s
2026-03-27 22:25:18 -03:00
c266be0eba Remove CI workflow and instructions documentation files
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 23s
2026-03-27 22:18:02 -03:00
837214f41a fix: update Gitea registry login endpoint in build workflow
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 7s
2026-03-27 17:07:30 -03:00
fa4bf360ff fix: update registry URL in build workflow
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 12s
2026-03-27 17:05:51 -03:00
2072dd299d fix: enhance Gitea registry login step to handle empty secrets
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 27s
2026-03-27 16:50:50 -03:00
af391efa89 fix: update Gitea registry login step to use correct secret names
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 22s
2026-03-27 16:46:54 -03:00
8893e85d53 fix: move Docker Hub login step into build job
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 1m48s
2026-03-27 16:44:14 -03:00
14ecd2fa18 fix: add Docker Hub login step to build workflow
Some checks failed
Build And Publish Production Image / Log In To Docker Hub (push) Successful in 2s
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 12s
2026-03-27 16:42:57 -03:00
0fa3d28c1b Merge pull request 'develop' (#8) from develop into main
Some checks failed
Build And Publish Production Image / Build And Publish Production Image (push) Failing after 24s
Reviewed-on: #8
2026-03-27 16:35:58 -03:00
924d3eab35 Merge branch 'main' into develop 2026-03-27 16:35:54 -03:00
c6a3971c15 Merge pull request 'fix: ensure newline at end of file in build workflow' (#7) from feature/testing into develop
Reviewed-on: #7
2026-03-27 16:35:38 -03:00
18dba7f7a2 Merge branch 'feature/testing' of http://gitea.lab/sancho41/condado-newsletter into feature/testing
Some checks failed
CI / Backend Tests (pull_request) Has been cancelled
CI / Frontend Tests (pull_request) Has been cancelled
2026-03-27 16:34:54 -03:00
62306ea6a6 fix: update build trigger to use push on main branch instead of pull request review 2026-03-27 16:34:40 -03:00
90f63bc6ed fix: ensure newline at end of file in build workflow
Some checks failed
CI / Backend Tests (pull_request) Has been cancelled
CI / Frontend Tests (pull_request) Has been cancelled
2026-03-27 16:27:32 -03:00
ac6efceede fix: ensure newline at end of file in build workflow
Some checks failed
CI / Frontend Tests (pull_request) Has been cancelled
CI / Backend Tests (pull_request) Has been cancelled
2026-03-27 16:26:29 -03:00
440a7eade1 Merge pull request 'develop' (#6) from develop into main
Reviewed-on: #6
2026-03-27 16:24:58 -03:00
1581ddcaea Merge branch 'main' into develop 2026-03-27 16:24:50 -03:00
37a9ef22df Merge pull request 'feature/testing' (#5) from feature/testing into main
Reviewed-on: #5
2026-03-27 16:23:53 -03:00
81d04b63d1 develop (#4)
Reviewed-on: #4
Co-authored-by: Gabriel Sancho <gabriel.sancho13@gmail.com>
Co-committed-by: Gabriel Sancho <gabriel.sancho13@gmail.com>
2026-03-27 16:23:13 -03:00
6306073921 feature/testing (#3)
Reviewed-on: #3
Co-authored-by: Gabriel Sancho <gabriel.sancho13@gmail.com>
Co-committed-by: Gabriel Sancho <gabriel.sancho13@gmail.com>
2026-03-27 16:22:43 -03:00
5723c74e39 fix: add missing colon in Active Entities label on DashboardPage (#1) (#2)
Reviewed-on: #1
Co-authored-by: Gabriel Sancho <gabriel.sancho13@gmail.com>
Co-committed-by: Gabriel Sancho <gabriel.sancho13@gmail.com>
Reviewed-on: #2
2026-03-27 16:19:26 -03:00
46f78467bb fix: add missing colon in Active Entities label on DashboardPage (#1)
Reviewed-on: #1
Co-authored-by: Gabriel Sancho <gabriel.sancho13@gmail.com>
Co-committed-by: Gabriel Sancho <gabriel.sancho13@gmail.com>
2026-03-27 16:18:58 -03:00
d6de131a9b feat: update build workflow to create and publish all-in-one Docker image on approved PRs
Some checks failed
CI / Frontend Tests (pull_request) Has been cancelled
CI / Backend Tests (pull_request) Has been cancelled
2026-03-27 16:18:25 -03:00
6305a8e95e refactor: update build process to create a single all-in-one Docker image and adjust related configurations
Some checks failed
CI / Frontend Tests (pull_request) Has been cancelled
CI / Backend Tests (pull_request) Has been cancelled
2026-03-27 16:10:14 -03:00
3f0bb4be73 feat: update Docker configuration and CI/CD workflows for local image builds
Some checks failed
CI / Backend Tests (pull_request) Failing after 11m8s
CI / Frontend Tests (pull_request) Has been cancelled
2026-03-27 16:01:34 -03:00
06112330b6 fix(ci): add missing 'with' block for checkout step in backend and frontend jobs
Some checks failed
CI / Backend Tests (pull_request) Failing after 11m23s
CI / Frontend Tests (pull_request) Successful in 9m45s
2026-03-27 15:35:32 -03:00
46391948b3 fix: add missing colon in Active Entities label on DashboardPage
Some checks failed
CI / Backend Tests (pull_request) Failing after 2m11s
CI / Frontend Tests (pull_request) Failing after 1m23s
2026-03-27 15:28:12 -03:00
33 changed files with 813 additions and 1199 deletions


@@ -33,5 +33,5 @@ LLAMA_MODEL=gemma3:4b
 # ── Application ───────────────────────────────────────────────────────────────
 APP_RECIPIENTS=friend1@example.com,friend2@example.com
 
-# ── Frontend (Vite build-time) ────────────────────────────────────────────────
+# ── Frontend (Vite dev proxy) ────────────────────────────────────────────────
 VITE_API_BASE_URL=http://localhost


@@ -0,0 +1,62 @@
name: Build And Publish Production Image

on:
  push:
    branches:
      - main

jobs:
  build:
    name: Build And Publish Production Image
    runs-on: ubuntu-latest
    env:
      REGISTRY: gitea.lab
      IMAGE_NAME: sancho41/condado-newsletter
      REGISTRY_USERNAME: ${{ secrets.REGISTRY_USERNAME }}
      REGISTRY_PASSWORD: ${{ secrets.REGISTRY_PASSWORD }}
    steps:
      - uses: actions/checkout@v4
        with:
          github-server-url: http://gitea.lab
      - name: Build debug context
        run: |
          set -eu
          echo "Build debug"
          echo "Repository: ${GITEA_REPOSITORY:-unknown}"
          echo "Ref: ${GITEA_REF:-unknown}"
          echo "Sha: ${GITEA_SHA:-unknown}"
          echo "Runner OS: ${RUNNER_OS:-unknown}"
          echo "Registry: ${REGISTRY}"
          echo "Image: ${IMAGE_NAME}"
          echo "Image latest tag: ${REGISTRY}/${IMAGE_NAME}:latest"
          echo "Image sha tag: ${REGISTRY}/${IMAGE_NAME}:${GITEA_SHA:-unknown}"
          echo "HTTP_PROXY=${HTTP_PROXY:-<empty>}"
          echo "HTTPS_PROXY=${HTTPS_PROXY:-<empty>}"
          echo "NO_PROXY=${NO_PROXY:-<empty>}"
          if command -v ip >/dev/null 2>&1; then
            echo "Runner network info:"
            ip -4 addr show || true
            ip route || true
          else
            hostname -I || true
          fi
      - name: Verify Docker CLI
        run: docker version
      - name: Log in to Docker Hub (optional)
        if: ${{ secrets.DOCKERHUB_USERNAME != '' && secrets.DOCKERHUB_TOKEN != '' }}
        run: echo "${{ secrets.DOCKERHUB_TOKEN }}" | docker login docker.io -u "${{ secrets.DOCKERHUB_USERNAME }}" --password-stdin
      - name: Build all-in-one image
        run: |
          docker build -t "${IMAGE_NAME}:latest" -f Dockerfile.allinone .
          # docker tag "${IMAGE_NAME}:latest" "${IMAGE_NAME}:${{ gitea.sha }}"
      - name: Build result debug
        run: |
          set -eu
          echo "Listing produced image tags"
          docker image ls "${IMAGE_NAME}" --format 'table {{.Repository}}\t{{.Tag}}\t{{.ID}}\t{{.CreatedSince}}' || true


@@ -13,6 +13,8 @@ jobs:
         working-directory: backend
     steps:
       - uses: actions/checkout@v4
+        with:
+          github-server-url: http://gitea.lab
 
       - name: Set up JDK 21
         uses: actions/setup-java@v4
@@ -42,6 +44,8 @@ jobs:
         working-directory: frontend
     steps:
       - uses: actions/checkout@v4
+        with:
+          github-server-url: http://gitea.lab
 
       - name: Set up Node 20
         uses: actions/setup-node@v4

.gitea/workflows/deploy.yml

@@ -0,0 +1,330 @@
name: Deploy Production Stack

on:
  workflow_run:
    workflows: ["Build And Publish Production Image"]
    types: [completed]
  workflow_dispatch:

jobs:
  deploy:
    name: Deploy Stack Via Portainer
    if: ${{ gitea.event_name == 'workflow_dispatch' || gitea.event.workflow_run.conclusion == 'success' }}
    runs-on: ubuntu-latest
    env:
      STACK_NAME: condado-newsletter-stack
      PORTAINER_URL: ${{ secrets.PORTAINER_URL }}
      PORTAINER_API_KEY: ${{ secrets.PORTAINER_API_KEY }}
      PORTAINER_ENDPOINT_ID: ${{ secrets.PORTAINER_ENDPOINT_ID }}
      ENV_VARS: ${{ secrets.ENV_VARS }}
    steps:
      - uses: actions/checkout@v4
        with:
          github-server-url: http://gitea.lab
      - name: Validate ENV_VARS secret
        run: |
          set -eu
          if [ -z "${ENV_VARS}" ]; then
            echo "ENV_VARS secret is empty."
            exit 1
          fi
      - name: Deploy stack via Portainer API
        run: |
          set -u
          set +e
          if ! command -v curl >/dev/null 2>&1; then
            echo "curl is not available in this runner image"
            exit 1
          fi
          if ! command -v jq >/dev/null 2>&1; then
            echo "jq is not available in this runner image"
            exit 1
          fi
          PORTAINER_BASE_URL=$(printf '%s' "${PORTAINER_URL:-http://portainer.lab/}" | sed -E 's/[[:space:]]+$//; s#/*$##')
          echo "Portainer deploy debug"
          echo "PORTAINER_URL=${PORTAINER_URL:-http://portainer.lab/}"
          echo "PORTAINER_BASE_URL=${PORTAINER_BASE_URL}"
          echo "STACK_NAME=${STACK_NAME}"
          echo "PORTAINER_ENDPOINT_ID=${PORTAINER_ENDPOINT_ID}"
          echo "HTTP_PROXY=${HTTP_PROXY:-<empty>}"
          echo "HTTPS_PROXY=${HTTPS_PROXY:-<empty>}"
          echo "NO_PROXY=${NO_PROXY:-<empty>}"
          echo "Current runner network info:"
          if command -v ip >/dev/null 2>&1; then
            ip -4 addr show || true
            ip route || true
          else
            hostname -I || true
          fi
          ENV_JSON=$(printf '%s\n' "${ENV_VARS}" | jq -R -s '
            split("\n")
            | map(gsub("\r$"; ""))
            | map(select(length > 0))
            | map(select(startswith("#") | not))
            | map(select(test("^[A-Za-z_][A-Za-z0-9_]*=.*$")))
            | map(capture("^(?<name>[A-Za-z_][A-Za-z0-9_]*)=(?<value>.*)$"))
            | map({name: .name, value: .value})
          ')
          echo "Loaded $(printf '%s' "${ENV_JSON}" | jq 'length') env entries from ENV_VARS"
          echo "ENV names preview:"
          printf '%s' "${ENV_JSON}" | jq -r '.[0:10][]?.name' || true
          REQUIRED_ENV_KEYS=(
            APP_PASSWORD
            JWT_SECRET
            SPRING_DATASOURCE_USERNAME
            SPRING_DATASOURCE_PASSWORD
            APP_RECIPIENTS
          )
          MISSING_KEYS=()
          for REQUIRED_KEY in "${REQUIRED_ENV_KEYS[@]}"; do
            if ! printf '%s' "${ENV_JSON}" | jq -e --arg required_key "${REQUIRED_KEY}" 'map(.name) | index($required_key) != null' >/dev/null; then
              MISSING_KEYS+=("${REQUIRED_KEY}")
            fi
          done
          if [ "${#MISSING_KEYS[@]}" -gt 0 ]; then
            echo "ENV_VARS is missing required keys: ${MISSING_KEYS[*]}"
            exit 1
          fi
          echo "Portainer base URL: ${PORTAINER_BASE_URL}"
          echo "Target stack: ${STACK_NAME}"
          echo "Endpoint id set: $([ -n "${PORTAINER_ENDPOINT_ID}" ] && echo yes || echo no)"
          PORTAINER_HOST=$(printf '%s' "${PORTAINER_BASE_URL}" | sed -E 's#^[a-zA-Z]+://##; s#/.*$##; s/:.*$//')
          PORTAINER_IP=""
          ACTIVE_PORTAINER_BASE_URL="${PORTAINER_BASE_URL}"
          if command -v getent >/dev/null 2>&1; then
            PORTAINER_IP=$(getent hosts "${PORTAINER_HOST}" | awk 'NR==1{print $1}')
            if [ -n "${PORTAINER_IP}" ]; then
              PORTAINER_IP_BASE_URL="${PORTAINER_BASE_URL/${PORTAINER_HOST}/${PORTAINER_IP}}"
              echo "Portainer DNS resolved ${PORTAINER_HOST} -> ${PORTAINER_IP}"
              echo "IP fallback URL: ${PORTAINER_IP_BASE_URL}"
            else
              echo "DNS lookup returned no IP for ${PORTAINER_HOST}"
            fi
          else
            echo "getent not available; skipping DNS pre-check"
          fi
          STACKS_BODY=$(mktemp)
          STACKS_HEADERS=$(mktemp)
          STACKS_ERR=$(mktemp)
          STACKS_HTTP_CODE=$(curl -sS \
            --noproxy "*" \
            -D "${STACKS_HEADERS}" \
            -o "${STACKS_BODY}" \
            -w "%{http_code}" \
            "${ACTIVE_PORTAINER_BASE_URL}/api/stacks" \
            -H "X-API-Key: ${PORTAINER_API_KEY}" \
            2>"${STACKS_ERR}")
          STACKS_CURL_EXIT=$?
          echo "GET /api/stacks curl exit: ${STACKS_CURL_EXIT}"
          echo "GET /api/stacks http code: ${STACKS_HTTP_CODE}"
          echo "GET /api/stacks headers:"
          cat "${STACKS_HEADERS}" || true
          if [ "${STACKS_CURL_EXIT}" -eq 6 ] && [ -n "${PORTAINER_IP:-}" ]; then
            echo "Retrying stack list with IP fallback due to DNS failure"
            STACKS_HTTP_CODE=$(curl -sS \
              --noproxy "*" \
              -D "${STACKS_HEADERS}" \
              -o "${STACKS_BODY}" \
              -w "%{http_code}" \
              "${PORTAINER_IP_BASE_URL}/api/stacks" \
              -H "X-API-Key: ${PORTAINER_API_KEY}" \
              2>"${STACKS_ERR}")
            STACKS_CURL_EXIT=$?
            if [ "${STACKS_CURL_EXIT}" -eq 0 ]; then
              ACTIVE_PORTAINER_BASE_URL="${PORTAINER_IP_BASE_URL}"
            fi
            echo "Retry GET /api/stacks curl exit: ${STACKS_CURL_EXIT}"
            echo "Retry GET /api/stacks http code: ${STACKS_HTTP_CODE}"
          fi
          if [ "${STACKS_CURL_EXIT}" -ne 0 ]; then
            echo "GET /api/stacks stderr:"
            cat "${STACKS_ERR}" || true
            exit "${STACKS_CURL_EXIT}"
          fi
          if [ "${STACKS_HTTP_CODE}" -lt 200 ] || [ "${STACKS_HTTP_CODE}" -ge 300 ]; then
            echo "GET /api/stacks body:"
            cat "${STACKS_BODY}" || true
            exit 1
          fi
          STACK_ID=$(jq -r --arg stack_name "${STACK_NAME}" '.[] | select(.Name == $stack_name) | .Id' "${STACKS_BODY}" | head -n 1)
          APPLY_BODY=$(mktemp)
          APPLY_HEADERS=$(mktemp)
          APPLY_ERR=$(mktemp)
          # If the stack does not exist yet, remove orphan containers with names defined in compose.
          # This enables an idempotent create-or-recreate flow when old standalone containers exist.
          if [ -z "${STACK_ID}" ]; then
            echo "Stack not found in Portainer; checking for orphan containers with conflicting names"
            mapfile -t CONTAINER_NAMES < <(awk '/container_name:/{print $2}' docker-compose.prod.yml | tr -d '"' | sed '/^$/d')
            for CONTAINER_NAME in "${CONTAINER_NAMES[@]}"; do
              FILTERS=$(jq -cn --arg n "^/${CONTAINER_NAME}$" '{name: [$n]}')
              FILTERS_URLENC=$(printf '%s' "${FILTERS}" | jq -sRr @uri)
              LIST_URL="${ACTIVE_PORTAINER_BASE_URL}/api/endpoints/${PORTAINER_ENDPOINT_ID}/docker/containers/json?all=1&filters=${FILTERS_URLENC}"
              LIST_BODY=$(mktemp)
              LIST_ERR=$(mktemp)
              LIST_HTTP_CODE=$(curl -sS \
                --noproxy "*" \
                -o "${LIST_BODY}" \
                -w "%{http_code}" \
                "${LIST_URL}" \
                -H "X-API-Key: ${PORTAINER_API_KEY}" \
                2>"${LIST_ERR}")
              LIST_CURL_EXIT=$?
              echo "Container pre-check [${CONTAINER_NAME}] curl=${LIST_CURL_EXIT} http=${LIST_HTTP_CODE}"
              if [ "${LIST_CURL_EXIT}" -ne 0 ]; then
                echo "Container pre-check stderr for ${CONTAINER_NAME}:"
                cat "${LIST_ERR}" || true
                continue
              fi
              if [ "${LIST_HTTP_CODE}" -lt 200 ] || [ "${LIST_HTTP_CODE}" -ge 300 ]; then
                echo "Container pre-check non-success response for ${CONTAINER_NAME}:"
                cat "${LIST_BODY}" || true
                continue
              fi
              mapfile -t MATCHING_IDS < <(jq -r '.[].Id' "${LIST_BODY}")
              if [ "${#MATCHING_IDS[@]}" -eq 0 ]; then
                echo "No conflicting container found for ${CONTAINER_NAME}"
                continue
              fi
              for CONTAINER_ID in "${MATCHING_IDS[@]}"; do
                DELETE_URL="${ACTIVE_PORTAINER_BASE_URL}/api/endpoints/${PORTAINER_ENDPOINT_ID}/docker/containers/${CONTAINER_ID}?force=1"
                DELETE_BODY=$(mktemp)
                DELETE_ERR=$(mktemp)
                DELETE_HTTP_CODE=$(curl -sS -X DELETE \
                  --noproxy "*" \
                  -o "${DELETE_BODY}" \
                  -w "%{http_code}" \
                  "${DELETE_URL}" \
                  -H "X-API-Key: ${PORTAINER_API_KEY}" \
                  2>"${DELETE_ERR}")
                DELETE_CURL_EXIT=$?
                echo "Removed conflicting container ${CONTAINER_NAME} (${CONTAINER_ID}) curl=${DELETE_CURL_EXIT} http=${DELETE_HTTP_CODE}"
                if [ "${DELETE_CURL_EXIT}" -ne 0 ]; then
                  echo "Delete stderr:"
                  cat "${DELETE_ERR}" || true
                fi
                if [ "${DELETE_HTTP_CODE}" -lt 200 ] || [ "${DELETE_HTTP_CODE}" -ge 300 ]; then
                  echo "Delete response body:"
                  cat "${DELETE_BODY}" || true
                fi
              done
            done
          fi
          if [ -n "${STACK_ID}" ]; then
            echo "Updating existing stack id=${STACK_ID}"
            REQUEST_URL="${ACTIVE_PORTAINER_BASE_URL}/api/stacks/${STACK_ID}?endpointId=${PORTAINER_ENDPOINT_ID}"
            PAYLOAD=$(jq -n \
              --rawfile stack_file docker-compose.prod.yml \
              --argjson env_vars "${ENV_JSON}" \
              '{StackFileContent: $stack_file, Env: $env_vars, Prune: false, PullImage: false}')
            echo "Apply request URL: ${REQUEST_URL}"
            echo "Apply payload summary:"
            printf '%s' "${PAYLOAD}" | jq -r '{stackFileLength: (.StackFileContent | length), envCount: (.Env | length), prune: .Prune, pullImage: .PullImage}' || true
            APPLY_HTTP_CODE=$(curl -sS -X PUT \
              --noproxy "*" \
              -D "${APPLY_HEADERS}" \
              -o "${APPLY_BODY}" \
              -w "%{http_code}" \
              "${REQUEST_URL}" \
              -H "X-API-Key: ${PORTAINER_API_KEY}" \
              -H "Content-Type: application/json" \
              -d "${PAYLOAD}" \
              2>"${APPLY_ERR}")
            APPLY_CURL_EXIT=$?
          else
            echo "Creating new stack ${STACK_NAME}"
            REQUEST_URL="${ACTIVE_PORTAINER_BASE_URL}/api/stacks/create/standalone/string?endpointId=${PORTAINER_ENDPOINT_ID}"
            PAYLOAD=$(jq -n \
              --arg name "${STACK_NAME}" \
              --rawfile stack_file docker-compose.prod.yml \
              --argjson env_vars "${ENV_JSON}" \
              '{Name: $name, StackFileContent: $stack_file, Env: $env_vars, FromAppTemplate: false}')
            echo "Apply request URL: ${REQUEST_URL}"
            echo "Apply payload summary:"
            printf '%s' "${PAYLOAD}" | jq -r '{name: .Name, stackFileLength: (.StackFileContent | length), envCount: (.Env | length), fromAppTemplate: .FromAppTemplate}' || true
            APPLY_HTTP_CODE=$(curl -sS -X POST \
              --noproxy "*" \
              -D "${APPLY_HEADERS}" \
              -o "${APPLY_BODY}" \
              -w "%{http_code}" \
              "${REQUEST_URL}" \
              -H "X-API-Key: ${PORTAINER_API_KEY}" \
              -H "Content-Type: application/json" \
              -d "${PAYLOAD}" \
              2>"${APPLY_ERR}")
            APPLY_CURL_EXIT=$?
          fi
          echo "Apply curl exit: ${APPLY_CURL_EXIT}"
          echo "Apply http code: ${APPLY_HTTP_CODE}"
          echo "Apply response headers:"
          cat "${APPLY_HEADERS}" || true
          if [ "${APPLY_CURL_EXIT}" -ne 0 ]; then
            echo "Apply stderr:"
            cat "${APPLY_ERR}" || true
            exit "${APPLY_CURL_EXIT}"
          fi
          if [ "${APPLY_HTTP_CODE}" -lt 200 ] || [ "${APPLY_HTTP_CODE}" -ge 300 ]; then
            echo "Apply response body:"
            cat "${APPLY_BODY}" || true
            echo "Apply response parsed as JSON (if possible):"
            jq -r '.' "${APPLY_BODY}" 2>/dev/null || echo "<non-json or empty body>"
            if [ ! -s "${APPLY_BODY}" ]; then
              echo "Apply body is empty; retrying once with verbose curl for diagnostics"
              curl -v -X "$( [ -n "${STACK_ID}" ] && echo PUT || echo POST )" \
                --noproxy "*" \
                -o /tmp/portainer-debug-body.txt \
                "${REQUEST_URL}" \
                -H "X-API-Key: ${PORTAINER_API_KEY}" \
                -H "Content-Type: application/json" \
                -d "${PAYLOAD}" \
                2>/tmp/portainer-debug-stderr.txt || true
              echo "Verbose retry stderr:"
              cat /tmp/portainer-debug-stderr.txt || true
              echo "Verbose retry body:"
              cat /tmp/portainer-debug-body.txt || true
            fi
            exit 1
          fi
          echo "Portainer deploy completed successfully"
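The deploy step above turns the dotenv-style `ENV_VARS` secret into Portainer's `Env` array with a jq filter (skip blank lines, comments, and anything that is not `NAME=value`), then checks a required-key list. For local sanity-checking of a candidate `ENV_VARS` value, the same filtering and validation logic can be sketched in Python; `parse_env_vars` and `missing_keys` are hypothetical helper names, not part of the workflow:

```python
import re

# Mirrors the jq filter: keep non-blank, non-comment lines that match
# NAME=value (NAME starting with a letter or underscore), and emit
# Portainer-style {"name": ..., "value": ...} entries.
ENV_LINE = re.compile(r"^([A-Za-z_][A-Za-z0-9_]*)=(.*)$")

def parse_env_vars(text: str) -> list[dict]:
    entries = []
    for raw in text.split("\n"):
        line = raw.rstrip("\r")  # tolerate CRLF secrets, like gsub("\r$"; "")
        if not line or line.startswith("#"):
            continue
        match = ENV_LINE.match(line)
        if match:
            entries.append({"name": match.group(1), "value": match.group(2)})
    return entries

def missing_keys(entries: list[dict], required: list[str]) -> list[str]:
    """Mirror the REQUIRED_ENV_KEYS loop: report required names not present."""
    names = {entry["name"] for entry in entries}
    return [key for key in required if key not in names]

if __name__ == "__main__":
    sample = "# secrets\nAPP_PASSWORD=s3cret\n\nJWT_SECRET=abc=def\nnot a variable\n"
    env = parse_env_vars(sample)
    print(env)
    print(missing_keys(env, ["APP_PASSWORD", "JWT_SECRET", "APP_RECIPIENTS"]))
```

Note that the value capture is greedy past the first `=`, so values containing `=` (common in base64-encoded secrets) survive intact, matching the jq `capture` behaviour.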


@@ -1,6 +1,6 @@
 ---
 name: infra
-description: "Use when working on Docker configuration, Docker Compose files, Dockerfiles, Nginx config, Supervisor config, Gitea Actions workflows, CI/CD pipelines, environment variables, or overall project architecture in the condado-news-letter project. Trigger phrases: docker, dockerfile, compose, nginx, ci/cd, gitea actions, build fails, infra, architecture, environment variables, container, supervisor, allinone image."
+description: "Use when working on Docker configuration, Docker Compose files, Dockerfiles, Nginx config, Supervisor config, Gitea Actions workflows, CI/CD pipelines, deploy flows, environment variables, or overall project architecture in the condado-news-letter project. Trigger phrases: docker, dockerfile, compose, nginx, ci/cd, gitea actions, deploy, build fails, infra, architecture, environment variables, container, supervisor, allinone image."
 tools: [read, edit, search, execute, todo]
 argument-hint: "Describe the infrastructure change or Docker/CI task to implement."
 ---
@@ -15,13 +15,14 @@ You are a senior DevOps / infrastructure engineer and software architect for the
 | `backend/Dockerfile` | Backend-only multi-stage build image |
 | `frontend/Dockerfile` | Frontend build + Nginx image |
 | `docker-compose.yml` | Dev stack (postgres + backend + nginx + mailhog) |
-| `docker-compose.prod.yml` | Prod stack (postgres + backend + nginx, no mailhog) |
+| `docker-compose.prod.yml` | Prod stack (single all-in-one image) |
 | `nginx/nginx.conf` | Nginx config for multi-container compose flavours |
 | `nginx/nginx.allinone.conf` | Nginx config for the all-in-one image (localhost backend) |
 | `frontend/nginx.docker.conf` | Nginx config embedded in frontend image |
 | `docker/supervisord.conf` | Supervisor config (manages postgres + java + nginx inside allinone) |
 | `docker/entrypoint.sh` | Allinone container entrypoint (DB init, env wiring, supervisord start) |
 | `.gitea/workflows/ci.yml` | CI: backend tests + frontend tests on pull requests to `develop` |
+| `.gitea/workflows/build.yml` | Build: create and publish the all-in-one image on approved PRs to `main` |
 | `.env.example` | Template for all environment variables |
 
 ## System Topology
@@ -53,7 +54,7 @@ Docker volume → /var/lib/postgresql/data
 | Flavour | Command | Notes |
 |---|---|---|
 | Dev | `docker compose up --build` | Includes Mailhog on :1025/:8025 |
-| Prod (compose) | `docker compose -f docker-compose.prod.yml up --build` | External DB/SMTP |
+| Prod (compose) | `docker compose -f docker-compose.prod.yml up -d` | Prebuilt all-in-one image with internal PostgreSQL |
 | All-in-one | `docker run -p 80:80 -e APP_PASSWORD=... <image>` | Everything in one container |
 
 ## Key Environment Variables
@@ -73,15 +74,16 @@ All injected at runtime — never hardcoded in images.
 | `IMAP_HOST` / `IMAP_PORT` / `IMAP_INBOX_FOLDER` | Backend | IMAP server |
 | `OPENAI_API_KEY` / `OPENAI_MODEL` | Backend | OpenAI credentials |
 | `APP_RECIPIENTS` | Backend | Comma-separated recipient emails |
-| `VITE_API_BASE_URL` | Frontend (build-time ARG) | Backend API base URL |
+| `VITE_API_BASE_URL` | Frontend dev server | Backend API base URL for Vite proxy |
 
 ## CI/CD Pipeline
 
 | Workflow | Trigger | What it does |
 |---|---|---|
 | `ci.yml` | Pull request to `develop` | Backend `./gradlew test` + Frontend `npm run test` |
+| `build.yml` | Approved PR review to `main` | Builds `condado-newsletter` on the target Docker host, then pushes `latest` and `${github.sha}` tags to Gitea container registry |
 
-Legacy publish/version workflows were removed from in-repo automation.
+The runner shares the target Docker host, so this workflow builds the image locally, tags it for `gitea.lab/sancho41/condado-newsletter`, and pushes it to Gitea container registry. `docker-compose.prod.yml` must reference that published image and not local build directives.
 
 ## Implementation Rules


@@ -1,57 +0,0 @@
-name: CI
-
-on:
-  pull_request:
-    branches: ["develop"]
-
-jobs:
-  backend-test:
-    name: Backend Tests
-    runs-on: ubuntu-latest
-    defaults:
-      run:
-        working-directory: backend
-    steps:
-      - uses: actions/checkout@v4
-      - name: Set up JDK 21
-        uses: actions/setup-java@v4
-        with:
-          java-version: "21"
-          distribution: temurin
-          cache: gradle
-      - name: Make Gradle wrapper executable
-        run: chmod +x gradlew
-      - name: Run tests
-        run: ./gradlew test --no-daemon
-      - name: Upload test results
-        if: always()
-        uses: actions/upload-artifact@v4
-        with:
-          name: backend-test-results
-          path: backend/build/reports/tests/
-
-  frontend-test:
-    name: Frontend Tests
-    runs-on: ubuntu-latest
-    defaults:
-      run:
-        working-directory: frontend
-    steps:
-      - uses: actions/checkout@v4
-      - name: Set up Node 20
-        uses: actions/setup-node@v4
-        with:
-          node-version: "20"
-          cache: npm
-          cache-dependency-path: frontend/package-lock.json
-      - name: Install dependencies
-        run: npm ci
-      - name: Run tests
-        run: npm run test


@@ -83,8 +83,8 @@ The cycle for every step is:
 | Reverse Proxy | Nginx (serves frontend + proxies `/api` to backend) |
 | Dev Mail | Mailhog (SMTP trap + web UI) |
 | All-in-one image | Single Docker image: Nginx + Spring Boot + PostgreSQL + Supervisor |
-| Image registry | Not configured (legacy Docker Hub publish workflow removed) |
-| CI/CD | Gitea Actions — run backend/frontend tests on pull requests to `develop` |
+| Image registry | Gitea container registry (`gitea.lab/sancho41/condado-newsletter`) |
+| CI/CD | Gitea Actions — test PRs to `develop`, build and publish the production image on approved PRs targeting `main` |
 
 ## Deployment Flavours
 
@@ -93,7 +93,7 @@ There are **three ways to run the project**:
 | Flavour | Command | When to use |
 |---------------------|---------------------------------|------------------------------------------------|
 | **Dev** | `docker compose up` | Local development — includes Mailhog |
-| **Prod (compose)** | `docker compose -f docker-compose.prod.yml up` | Production with external DB/SMTP |
+| **Prod (compose)** | `docker compose -f docker-compose.prod.yml up -d` | Production with the prebuilt all-in-one image |
 | **All-in-one** | `docker run ...` | Simplest deploy — everything in one container |
 
 ### All-in-one Image
 
@@ -104,7 +104,7 @@ The all-in-one image (`Dockerfile.allinone`) bundles **everything** into a singl
 - **PostgreSQL** — embedded database
 - **Supervisor** — process manager that starts and supervises all three processes
 
-The all-in-one image is built locally or in external pipelines as needed (no default registry publish workflow in-repo).
+The all-in-one image is built on the runner host and then published to the Gitea container registry.
 
 **Minimal `docker run` command:**
 
 ```bash
@@ -121,7 +121,7 @@ docker run -d \
   -e IMAP_PORT=993 \
   -e APP_RECIPIENTS=friend1@example.com,friend2@example.com \
   -v condado-data:/var/lib/postgresql/data \
-  <registry-or-local-image>/condado-newsletter:latest
+  gitea.lab/sancho41/condado-newsletter:latest
 ```
 
 The app is then available at `http://localhost`.
 
@@ -213,7 +213,7 @@ condado-news-letter/ ← repo root
 ├── .env.example ← template for all env vars
 ├── .gitignore
 ├── docker-compose.yml ← dev stack (Nginx + Backend + PostgreSQL + Mailhog)
-├── docker-compose.prod.yml ← prod stack (Nginx + Backend + PostgreSQL)
+├── docker-compose.prod.yml ← prod stack (single all-in-one image)
 ├── Dockerfile.allinone ← all-in-one image (Nginx + Backend + PostgreSQL + Supervisor)
 ├── .github/
 
@@ -312,7 +312,7 @@ npm run test
 docker compose up --build
 
 # Prod
-docker compose -f docker-compose.prod.yml up --build
+docker compose -f docker-compose.prod.yml up -d
 
 # Stop
 docker compose down
 
@@ -456,7 +456,7 @@ Never hardcode any of these values.
 | `OPENAI_API_KEY` | Backend | OpenAI API key |
 | `OPENAI_MODEL` | Backend | OpenAI model (default: `gpt-4o`) |
 | `APP_RECIPIENTS` | Backend | Comma-separated list of recipient emails |
-| `VITE_API_BASE_URL` | Frontend | Backend API base URL (used by Vite at build time) |
+| `VITE_API_BASE_URL` | Frontend | Backend API base URL for the Vite dev server proxy |
 
 > ⚠️ Never hardcode credentials. Always use environment variables or a `.env` file (gitignored).
 
@@ -575,8 +575,9 @@ Good examples:
 | Workflow file | Trigger | What it does |
 |----------------------------|----------------------------|-----------------------------------------------------------|
 | `.gitea/workflows/ci.yml` | PR to `develop` | Backend tests (`./gradlew test`) + Frontend tests (`npm run test`) |
+| `.gitea/workflows/build.yml` | Approved PR review on `main` | Build `condado-newsletter`, then publish `latest` and `${github.sha}` tags to Gitea container registry |
 
-Current policy: old publish/version automation workflows were removed during the Gitea migration.
+Build policy: the runner shares the target Docker host, so the build workflow produces the image locally, tags it for `gitea.lab/sancho41/condado-newsletter`, and pushes it to Gitea container registry. `docker-compose.prod.yml` references that published image.
 
 ---
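The build policy in this section (build on the runner that shares the target Docker host, tag for `gitea.lab/sancho41/condado-newsletter`, push `latest` and the commit SHA) can be sketched as a minimal Gitea Actions workflow. This is an illustrative config fragment only — the job name, trigger condition, and step layout are assumptions based on the description above, not the repository's actual `build.yml`:

```yaml
name: Build And Publish Production Image
on:
  pull_request_review:
    types: [submitted]

jobs:
  build-and-publish:
    name: Build And Publish Production Image
    runs-on: ubuntu-latest
    # Only approved reviews on PRs targeting main (condition syntax is a sketch).
    if: github.event.review.state == 'approved' && github.event.pull_request.base.ref == 'main'
    steps:
      - uses: actions/checkout@v4
      - name: Build all-in-one image
        run: docker build -f Dockerfile.allinone -t gitea.lab/sancho41/condado-newsletter:latest .
      - name: Tag with commit SHA
        run: docker tag gitea.lab/sancho41/condado-newsletter:latest gitea.lab/sancho41/condado-newsletter:${{ github.sha }}
      - name: Push tags
        run: |
          docker push gitea.lab/sancho41/condado-newsletter:latest
          docker push gitea.lab/sancho41/condado-newsletter:${{ github.sha }}
```

Because the runner and the Docker host are the same machine, plain `docker` CLI calls suffice; no buildx or remote-context setup is needed.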


@@ -15,6 +15,7 @@ FROM gradle:8-jdk21-alpine AS backend-build
 WORKDIR /app/backend
 COPY backend/build.gradle.kts backend/settings.gradle.kts ./
+COPY backend/gradle.properties ./
 COPY backend/gradle ./gradle
 
 RUN gradle dependencies --no-daemon --quiet || true
 
@@ -28,14 +29,10 @@ ENV DEBIAN_FRONTEND=noninteractive
 RUN apt-get update && apt-get install -y \
     nginx \
-    postgresql \
     supervisor \
     openjdk-21-jre-headless \
     && rm -rf /var/lib/apt/lists/*
 
-# PostgreSQL data directory
-RUN mkdir -p /var/lib/postgresql/data && chown -R postgres:postgres /var/lib/postgresql
-
 # Copy frontend static files
 COPY --from=frontend-build /app/frontend/dist /usr/share/nginx/html

File diff suppressed because it is too large


@@ -1,6 +1,7 @@
 package com.condado.newsletter.dto
 
 import com.condado.newsletter.model.EntityTask
+import com.condado.newsletter.model.TaskGenerationSource
 import jakarta.validation.constraints.NotBlank
 import jakarta.validation.constraints.NotNull
 import java.time.LocalDateTime
@@ -11,7 +12,8 @@ data class EntityTaskCreateDto(
     @field:NotBlank val name: String,
     val prompt: String,
     @field:NotBlank val scheduleCron: String,
-    @field:NotBlank val emailLookback: String
+    @field:NotBlank val emailLookback: String,
+    val generationSource: TaskGenerationSource = TaskGenerationSource.LLAMA
 )
 
 data class EntityTaskUpdateDto(
@@ -19,7 +21,8 @@ data class EntityTaskUpdateDto(
     @field:NotBlank val name: String,
     @field:NotBlank val prompt: String,
     @field:NotBlank val scheduleCron: String,
-    @field:NotBlank val emailLookback: String
+    @field:NotBlank val emailLookback: String,
+    val generationSource: TaskGenerationSource? = null
 )
 
 data class EntityTaskResponseDto(
@@ -29,6 +32,7 @@ data class EntityTaskResponseDto(
     val prompt: String,
     val scheduleCron: String,
     val emailLookback: String,
+    val generationSource: TaskGenerationSource,
     val active: Boolean,
     val createdAt: LocalDateTime?
 ) {
@@ -41,6 +45,7 @@ data class EntityTaskResponseDto(
         prompt = task.prompt,
         scheduleCron = task.scheduleCron,
         emailLookback = task.emailLookback,
+        generationSource = task.generationSource,
         active = task.active,
         createdAt = task.createdAt
     )


@@ -3,6 +3,8 @@ package com.condado.newsletter.model
 import jakarta.persistence.CascadeType
 import jakarta.persistence.Column
 import jakarta.persistence.Entity
+import jakarta.persistence.EnumType
+import jakarta.persistence.Enumerated
 import jakarta.persistence.FetchType
 import jakarta.persistence.GeneratedValue
 import jakarta.persistence.GenerationType
@@ -37,6 +39,10 @@ class EntityTask(
     @Column(name = "email_lookback", nullable = false)
     val emailLookback: String,
 
+    @Enumerated(EnumType.STRING)
+    @Column(name = "generation_source", nullable = false)
+    val generationSource: TaskGenerationSource = TaskGenerationSource.LLAMA,
+
     @Column(nullable = false)
     val active: Boolean = true,


@@ -0,0 +1,19 @@
+package com.condado.newsletter.model
+
+import com.fasterxml.jackson.annotation.JsonCreator
+import com.fasterxml.jackson.annotation.JsonValue
+
+enum class TaskGenerationSource(
+    @get:JsonValue val value: String
+) {
+    OPENAI("openai"),
+    LLAMA("llama");
+
+    companion object {
+        @JvmStatic
+        @JsonCreator
+        fun from(value: String): TaskGenerationSource =
+            entries.firstOrNull { it.value.equals(value, ignoreCase = true) }
+                ?: throw IllegalArgumentException("Invalid generationSource: $value")
+    }
+}
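The `@JsonValue`/`@JsonCreator` pair above makes the enum serialize as its lowercase string value and parse case-insensitively. A standalone Kotlin sketch of that same pattern — `Source` is a hypothetical stand-in enum, not the project's class, and only plain jackson-databind is assumed:

```kotlin
import com.fasterxml.jackson.annotation.JsonCreator
import com.fasterxml.jackson.annotation.JsonValue
import com.fasterxml.jackson.databind.ObjectMapper

enum class Source(@get:JsonValue val value: String) {
    OPENAI("openai"), LLAMA("llama");

    companion object {
        @JvmStatic
        @JsonCreator
        fun from(value: String): Source =
            entries.firstOrNull { it.value.equals(value, ignoreCase = true) }
                ?: throw IllegalArgumentException("Invalid source: $value")
    }
}

fun main() {
    val mapper = ObjectMapper()
    // @get:JsonValue makes Jackson write the string value, not the constant name.
    println(mapper.writeValueAsString(Source.OPENAI))          // "openai"
    // The @JsonCreator factory parses case-insensitively.
    println(mapper.readValue("\"LLaMa\"", Source::class.java)) // LLAMA
}
```

The case-insensitive factory is what lets clients send `"openai"` or `"OPENAI"` interchangeably, while an unknown value fails fast with a clear error instead of deserializing to null.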


@@ -47,6 +47,7 @@ class EntityTaskService(
             prompt = dto.prompt,
             scheduleCron = dto.scheduleCron,
             emailLookback = dto.emailLookback,
+            generationSource = dto.generationSource,
             active = true
         )
 
@@ -66,6 +67,7 @@ class EntityTaskService(
             prompt = dto.prompt,
             scheduleCron = dto.scheduleCron,
             emailLookback = dto.emailLookback,
+            generationSource = dto.generationSource ?: existing.generationSource,
             active = existing.active,
             createdAt = existing.createdAt
         ).apply { id = existing.id }
@@ -83,6 +85,7 @@ class EntityTaskService(
             prompt = existing.prompt,
             scheduleCron = existing.scheduleCron,
             emailLookback = existing.emailLookback,
+            generationSource = existing.generationSource,
             active = false,
             createdAt = existing.createdAt
         ).apply { id = existing.id }
@@ -100,6 +103,7 @@ class EntityTaskService(
             prompt = existing.prompt,
             scheduleCron = existing.scheduleCron,
             emailLookback = existing.emailLookback,
+            generationSource = existing.generationSource,
             active = true,
             createdAt = existing.createdAt
         ).apply { id = existing.id }


@@ -14,8 +14,10 @@ import java.util.Date
 @Service
 class JwtService(
     @Value("\${app.jwt.secret}") val secret: String,
-    @Value("\${app.jwt.expiration-ms}") val expirationMs: Long
+    @Value("\${app.jwt.expiration-ms:86400000}") expirationMsRaw: String
 ) {
+    private val expirationMs: Long = expirationMsRaw.toLongOrNull() ?: 86400000L
+
     private val signingKey by lazy {
         Keys.hmacShaKeyFor(secret.toByteArray(Charsets.UTF_8))
     }


@@ -3,6 +3,7 @@ package com.condado.newsletter.service
 import com.condado.newsletter.dto.GeneratedMessageHistoryResponseDto
 import com.condado.newsletter.dto.TaskPreviewGenerateRequestDto
 import com.condado.newsletter.model.GeneratedMessageHistory
+import com.condado.newsletter.model.TaskGenerationSource
 import com.condado.newsletter.repository.EntityTaskRepository
 import com.condado.newsletter.repository.GeneratedMessageHistoryRepository
 import org.springframework.stereotype.Service
@@ -16,7 +17,8 @@ import java.util.UUID
 class TaskGeneratedMessageService(
     private val generatedMessageHistoryRepository: GeneratedMessageHistoryRepository,
     private val entityTaskRepository: EntityTaskRepository,
-    private val llamaPreviewService: LlamaPreviewService
+    private val llamaPreviewService: LlamaPreviewService,
+    private val aiService: AiService
 ) {
 
     /** Lists persisted generated messages for a task. */
@@ -25,15 +27,19 @@ class TaskGeneratedMessageService(
             .findAllByTask_IdOrderByCreatedAtDesc(taskId)
             .map { GeneratedMessageHistoryResponseDto.from(it) }
 
-    /**
-     * Generates a new message using local Llama, persists it, and returns it.
-     */
+    /** Generates a new message with the task-selected provider, persists it, and returns it. */
     @Transactional
     fun generateAndSave(taskId: UUID, request: TaskPreviewGenerateRequestDto): GeneratedMessageHistoryResponseDto {
         val task = entityTaskRepository.findById(taskId)
             .orElseThrow { IllegalArgumentException("Task not found: $taskId") }
 
         val prompt = buildPrompt(request)
-        val generatedContent = llamaPreviewService.generate(prompt)
+        val generatedContent = when (task.generationSource) {
+            TaskGenerationSource.LLAMA -> llamaPreviewService.generate(prompt)
+            TaskGenerationSource.OPENAI -> {
+                val parsed = aiService.generate(prompt)
+                "SUBJECT: ${parsed.subject}\nBODY:\n${parsed.body}"
+            }
+        }
         val nextLabel = "Message #${generatedMessageHistoryRepository.countByTask_Id(taskId) + 1}"
         val saved = generatedMessageHistoryRepository.save(


@@ -10,7 +10,7 @@ spring:
   jpa:
     hibernate:
-      ddl-auto: validate
+      ddl-auto: ${SPRING_JPA_HIBERNATE_DDL_AUTO:validate}
     show-sql: false
     properties:
       hibernate:


@@ -8,6 +8,7 @@ import com.condado.newsletter.scheduler.EntityScheduler
 import com.condado.newsletter.service.JwtService
 import com.ninjasquad.springmockk.MockkBean
 import jakarta.servlet.http.Cookie
+import org.assertj.core.api.Assertions.assertThat
 import org.junit.jupiter.api.AfterEach
 import org.junit.jupiter.api.Test
 import org.springframework.beans.factory.annotation.Autowired
@@ -15,6 +16,7 @@ import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMock
 import org.springframework.boot.test.context.SpringBootTest
 import org.springframework.http.MediaType
 import org.springframework.test.web.servlet.MockMvc
+import org.springframework.test.web.servlet.request.MockMvcRequestBuilders.put
 import org.springframework.test.web.servlet.request.MockMvcRequestBuilders.post
 import org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath
 import org.springframework.test.web.servlet.result.MockMvcResultMatchers.status
@@ -56,7 +58,8 @@ class EntityTaskControllerTest {
                 "name": "Morning Blast",
                 "prompt": "",
                 "scheduleCron": "0 8 * * 1-5",
-                "emailLookback": "last_week"
+                "emailLookback": "last_week",
+                "generationSource": "openai"
             }
         """.trimIndent()
@@ -70,5 +73,96 @@ class EntityTaskControllerTest {
             .andExpect(jsonPath("$.entityId").value(entity.id.toString()))
             .andExpect(jsonPath("$.name").value("Morning Blast"))
             .andExpect(jsonPath("$.prompt").value(""))
+            .andExpect(jsonPath("$.generationSource").value("openai"))
+
+        val persisted = entityTaskRepository.findAll().first()
+        assertThat(persisted.generationSource.value).isEqualTo("openai")
     }
+
+    @Test
+    fun should_updateTaskAndPersistGenerationSource_when_validRequestProvided() {
+        val entity = virtualEntityRepository.save(
+            VirtualEntity(
+                name = "Entity B",
+                email = "entity-b@condado.com",
+                jobTitle = "Ops"
+            )
+        )
+
+        val createdPayload = """
+            {
+                "entityId": "${entity.id}",
+                "name": "Task One",
+                "prompt": "Initial prompt",
+                "scheduleCron": "0 8 * * 1-5",
+                "emailLookback": "last_week",
+                "generationSource": "openai"
+            }
+        """.trimIndent()
+
+        val createdResult = mockMvc.perform(
+            post("/api/v1/tasks")
+                .cookie(authCookie())
+                .contentType(MediaType.APPLICATION_JSON)
+                .content(createdPayload)
+        )
+            .andExpect(status().isCreated)
+            .andReturn()
+
+        val taskId = com.jayway.jsonpath.JsonPath.read<String>(createdResult.response.contentAsString, "$.id")
+
+        val updatePayload = """
+            {
+                "entityId": "${entity.id}",
+                "name": "Task One Updated",
+                "prompt": "Updated prompt",
+                "scheduleCron": "0 10 * * 1-5",
+                "emailLookback": "last_day",
+                "generationSource": "llama"
+            }
+        """.trimIndent()
+
+        mockMvc.perform(
+            put("/api/v1/tasks/$taskId")
+                .cookie(authCookie())
+                .contentType(MediaType.APPLICATION_JSON)
+                .content(updatePayload)
+        )
+            .andExpect(status().isOk)
+            .andExpect(jsonPath("$.name").value("Task One Updated"))
+            .andExpect(jsonPath("$.generationSource").value("llama"))
+
+        val persisted = entityTaskRepository.findById(java.util.UUID.fromString(taskId)).orElseThrow()
+        assertThat(persisted.generationSource.value).isEqualTo("llama")
+    }
+
+    @Test
+    fun should_returnBadRequest_when_generationSourceIsInvalid() {
+        val entity = virtualEntityRepository.save(
+            VirtualEntity(
+                name = "Entity C",
+                email = "entity-c@condado.com",
+                jobTitle = "Ops"
+            )
+        )
+
+        val payload = """
+            {
+                "entityId": "${entity.id}",
+                "name": "Morning Blast",
+                "prompt": "Prompt",
+                "scheduleCron": "0 8 * * 1-5",
+                "emailLookback": "last_week",
+                "generationSource": "invalid-provider"
+            }
+        """.trimIndent()
+
+        mockMvc.perform(
+            post("/api/v1/tasks")
+                .cookie(authCookie())
+                .contentType(MediaType.APPLICATION_JSON)
+                .content(payload)
+        )
+            .andExpect(status().isBadRequest)
+    }
 }


@@ -40,7 +40,7 @@ class AuthServiceTest {
fun should_returnValidClaims_when_jwtTokenParsed() { fun should_returnValidClaims_when_jwtTokenParsed() {
val realJwtService = JwtService( val realJwtService = JwtService(
secret = "test-secret-key-for-testing-only-must-be-at-least-32-characters", secret = "test-secret-key-for-testing-only-must-be-at-least-32-characters",
expirationMs = 86400000L expirationMsRaw = "86400000"
) )
val token = realJwtService.generateToken() val token = realJwtService.generateToken()
@@ -51,7 +51,7 @@ class AuthServiceTest {
fun should_returnFalse_when_expiredTokenValidated() { fun should_returnFalse_when_expiredTokenValidated() {
val realJwtService = JwtService( val realJwtService = JwtService(
secret = "test-secret-key-for-testing-only-must-be-at-least-32-characters", secret = "test-secret-key-for-testing-only-must-be-at-least-32-characters",
expirationMs = 1L expirationMsRaw = "1"
) )
val token = realJwtService.generateToken() val token = realJwtService.generateToken()


@@ -0,0 +1,26 @@
+package com.condado.newsletter.service
+
+import io.jsonwebtoken.Jwts
+import io.jsonwebtoken.security.Keys
+import org.junit.jupiter.api.Assertions.assertTrue
+import org.junit.jupiter.api.Test
+
+class JwtServiceTest {
+
+    private val secret = "12345678901234567890123456789012"
+
+    @Test
+    fun should_generate_token_when_expiration_is_empty() {
+        val jwtService = JwtService(secret, "")
+
+        val token = jwtService.generateToken()
+
+        val claims = Jwts.parser()
+            .verifyWith(Keys.hmacShaKeyFor(secret.toByteArray(Charsets.UTF_8)))
+            .build()
+            .parseSignedClaims(token)
+            .payload
+
+        assertTrue(claims.expiration.after(claims.issuedAt))
+    }
+}


@@ -5,6 +5,8 @@ import com.condado.newsletter.dto.TaskPreviewGenerateRequestDto
 import com.condado.newsletter.dto.TaskPreviewTaskDto
 import com.condado.newsletter.model.EntityTask
 import com.condado.newsletter.model.GeneratedMessageHistory
+import com.condado.newsletter.model.ParsedAiResponse
+import com.condado.newsletter.model.TaskGenerationSource
 import com.condado.newsletter.model.VirtualEntity
 import com.condado.newsletter.repository.EntityTaskRepository
 import com.condado.newsletter.repository.GeneratedMessageHistoryRepository
@@ -21,15 +23,17 @@ class TaskGeneratedMessageServiceTest {
     private val generatedMessageHistoryRepository: GeneratedMessageHistoryRepository = mockk()
     private val entityTaskRepository: EntityTaskRepository = mockk()
     private val llamaPreviewService: LlamaPreviewService = mockk()
+    private val aiService: AiService = mockk()
 
     private val service = TaskGeneratedMessageService(
         generatedMessageHistoryRepository = generatedMessageHistoryRepository,
         entityTaskRepository = entityTaskRepository,
-        llamaPreviewService = llamaPreviewService
+        llamaPreviewService = llamaPreviewService,
+        aiService = aiService
     )
 
     @Test
-    fun should_generateAndPersistMessage_when_generateAndSaveCalled() {
+    fun should_useLlamaProvider_when_taskGenerationSourceIsLlama() {
         val taskId = UUID.randomUUID()
         val entity = VirtualEntity(name = "Entity", email = "e@x.com", jobTitle = "Ops").apply { id = UUID.randomUUID() }
         val task = EntityTask(
@@ -37,7 +41,8 @@ class TaskGeneratedMessageServiceTest {
             name = "Task",
             prompt = "Prompt",
             scheduleCron = "0 9 * * 1",
-            emailLookback = "last_week"
+            emailLookback = "last_week",
+            generationSource = TaskGenerationSource.LLAMA
         ).apply { id = taskId }
 
         val captured = slot<GeneratedMessageHistory>()
@@ -59,9 +64,40 @@ class TaskGeneratedMessageServiceTest {
         assertThat(captured.captured.task.id).isEqualTo(taskId)
         verify(exactly = 1) { llamaPreviewService.generate(any()) }
+        verify(exactly = 0) { aiService.generate(any()) }
         verify(exactly = 1) { generatedMessageHistoryRepository.save(any()) }
     }
 
+    @Test
+    fun should_useOpenAiProvider_when_taskGenerationSourceIsOpenai() {
+        val taskId = UUID.randomUUID()
+        val entity = VirtualEntity(name = "Entity", email = "e@x.com", jobTitle = "Ops").apply { id = UUID.randomUUID() }
+        val task = EntityTask(
+            virtualEntity = entity,
+            name = "Task",
+            prompt = "Prompt",
+            scheduleCron = "0 9 * * 1",
+            emailLookback = "last_week",
+            generationSource = TaskGenerationSource.OPENAI
+        ).apply { id = taskId }
+
+        val captured = slot<GeneratedMessageHistory>()
+        every { aiService.generate(any()) } returns ParsedAiResponse(subject = "Open Subject", body = "Open Body")
+        every { entityTaskRepository.findById(taskId) } returns java.util.Optional.of(task)
+        every { generatedMessageHistoryRepository.countByTask_Id(taskId) } returns 0
+        every { generatedMessageHistoryRepository.save(capture(captured)) } answers {
+            captured.captured.apply {
+                id = UUID.fromString("00000000-0000-0000-0000-000000000001")
+            }
+        }
+
+        val response = service.generateAndSave(taskId, sampleRequest())
+
+        assertThat(response.content).isEqualTo("SUBJECT: Open Subject\nBODY:\nOpen Body")
+        verify(exactly = 1) { aiService.generate(any()) }
+        verify(exactly = 0) { llamaPreviewService.generate(any()) }
+    }
+
     private fun sampleRequest() = TaskPreviewGenerateRequestDto(
         entity = TaskPreviewEntityDto(
             id = UUID.randomUUID().toString(),

View File

@@ -1,40 +1,42 @@
 services:
-  # ── PostgreSQL ───────────────────────────────────────────────────────────────
-  postgres:
-    image: postgres:16-alpine
-    restart: always
+  condado-newsletter-postgres:
+    image: postgres:16
+    container_name: condado-newsletter-postgres
+    restart: unless-stopped
     environment:
-      POSTGRES_DB: condado
-      POSTGRES_USER: ${SPRING_DATASOURCE_USERNAME}
-      POSTGRES_PASSWORD: ${SPRING_DATASOURCE_PASSWORD}
+      POSTGRES_DB: ${APP_DB_NAME:-condado}
+      POSTGRES_USER: ${POSTGRES_USER:-condado}
+      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-condado}
     volumes:
       - postgres-data:/var/lib/postgresql/data
     networks:
-      - condado-net
+      - default
     healthcheck:
-      test: ["CMD-SHELL", "pg_isready -U ${SPRING_DATASOURCE_USERNAME} -d condado"]
+      test: ["CMD-SHELL", "pg_isready -h localhost -U $${POSTGRES_USER:-postgres}"]
       interval: 10s
       timeout: 5s
-      retries: 5
+      retries: 10
+      start_period: 10s

-  # ── Backend (Spring Boot) ────────────────────────────────────────────────────
-  backend:
-    build:
-      context: ./backend
-      dockerfile: Dockerfile
-    restart: always
+  condado-newsletter:
+    image: sancho41/condado-newsletter:latest
+    container_name: condado-newsletter
+    restart: unless-stopped
     depends_on:
-      postgres:
+      condado-newsletter-postgres:
         condition: service_healthy
+    networks:
+      - external
+      - default
     environment:
       SPRING_PROFILES_ACTIVE: prod
-      SPRING_DATASOURCE_URL: ${SPRING_DATASOURCE_URL}
-      SPRING_DATASOURCE_USERNAME: ${SPRING_DATASOURCE_USERNAME}
-      SPRING_DATASOURCE_PASSWORD: ${SPRING_DATASOURCE_PASSWORD}
+      SPRING_JPA_HIBERNATE_DDL_AUTO: ${SPRING_JPA_HIBERNATE_DDL_AUTO:-update}
+      SPRING_DATASOURCE_URL: jdbc:postgresql://condado-newsletter-postgres:5432/${APP_DB_NAME:-condado}
+      SPRING_DATASOURCE_USERNAME: ${SPRING_DATASOURCE_USERNAME:-condado}
+      SPRING_DATASOURCE_PASSWORD: ${SPRING_DATASOURCE_PASSWORD:-condado}
       APP_PASSWORD: ${APP_PASSWORD}
       JWT_SECRET: ${JWT_SECRET}
-      JWT_EXPIRATION_MS: ${JWT_EXPIRATION_MS}
+      JWT_EXPIRATION_MS: ${JWT_EXPIRATION_MS:-86400000}
       MAIL_HOST: ${MAIL_HOST}
       MAIL_PORT: ${MAIL_PORT}
       MAIL_USERNAME: ${MAIL_USERNAME}
@@ -50,27 +52,24 @@ services:
     extra_hosts:
       - "celtinha.desktop:host-gateway"
       - "host.docker.internal:host-gateway"
-    networks:
-      - condado-net
-
-  # ── Frontend + Nginx ─────────────────────────────────────────────────────────
-  nginx:
-    build:
-      context: ./frontend
-      dockerfile: Dockerfile
-      args:
-        VITE_API_BASE_URL: ${VITE_API_BASE_URL}
-    restart: always
-    ports:
-      - "80:80"
-    depends_on:
-      - backend
-    networks:
-      - condado-net
+    labels:
+      - "traefik.enable=true"
+      - "traefik.http.routers.condado.rule=Host(`condado-newsletter.lab`)"
+      - "traefik.http.services.condado.loadbalancer.server.port=80"
+      - "traefik.docker.network=traefik"
+      - "homepage.group=Hyperlink"
+      - "homepage.name=Condado Newsletter"
+      - "homepage.description=Automated newsletter generator using AI"
+      - "homepage.logo=claude-ai.png"
+      - "homepage.href=http://condado-newsletter.lab"

 volumes:
   postgres-data:

 networks:
-  condado-net:
+  default:
     driver: bridge
+  external:
+    name: traefik
+    external: true
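A note on the new healthcheck: `$${POSTGRES_USER:-postgres}` uses Compose's `$$` escape so the variable is expanded by the shell inside the container rather than interpolated by Compose on the host. The sketch below is an illustrative model of that two-stage expansion (it is not Compose's actual parser, and it omits Compose's own host-side `${VAR:-default}` handling):

```typescript
// Model of Compose's host-side pass: substitute ${VAR} from the host
// environment, and collapse the "$$" escape to a literal "$" so the
// container's shell sees ${POSTGRES_USER:-postgres} untouched.
const composeLine = 'pg_isready -h localhost -U $${POSTGRES_USER:-postgres}';

function hostInterpolate(line: string, env: Record<string, string>): string {
  return line.replace(/\$\$|\$\{([^}]+)\}/g, (match, name) =>
    match === '$$' ? '$' : env[name] ?? ''
  );
}

const inContainer = hostInterpolate(composeLine, {});
console.log(inContainer); // pg_isready -h localhost -U ${POSTGRES_USER:-postgres}
```

Without the double `$`, Compose would interpolate the variable on the host, where `POSTGRES_USER` is typically unset, and the healthcheck would probe the wrong user.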

View File

@@ -4,14 +4,13 @@ services:
   postgres:
     image: postgres:16-alpine
     restart: unless-stopped
+    container_name: condado-newsletter-postgres
     environment:
       POSTGRES_DB: condado
       POSTGRES_USER: ${SPRING_DATASOURCE_USERNAME}
       POSTGRES_PASSWORD: ${SPRING_DATASOURCE_PASSWORD}
     volumes:
       - postgres-data:/var/lib/postgresql/data
-    networks:
-      - condado-net
     healthcheck:
       test: ["CMD-SHELL", "pg_isready -U ${SPRING_DATASOURCE_USERNAME} -d condado"]
       interval: 10s
@@ -20,6 +19,7 @@ services:
   # ── Backend (Spring Boot) ────────────────────────────────────────────────────
   backend:
+    container_name: condado-newsletter-backend
     build:
       context: ./backend
       dockerfile: Dockerfile
@@ -29,7 +29,7 @@ services:
         condition: service_healthy
     environment:
       SPRING_PROFILES_ACTIVE: dev
-      SPRING_DATASOURCE_URL: ${SPRING_DATASOURCE_URL}
+      SPRING_DATASOURCE_URL: jdbc:postgresql://postgres:5432/condado
       SPRING_DATASOURCE_USERNAME: ${SPRING_DATASOURCE_USERNAME}
       SPRING_DATASOURCE_PASSWORD: ${SPRING_DATASOURCE_PASSWORD}
       APP_PASSWORD: ${APP_PASSWORD}
@@ -50,36 +50,42 @@ services:
     extra_hosts:
       - "celtinha.desktop:host-gateway"
       - "host.docker.internal:host-gateway"
-    networks:
-      - condado-net

   # ── Frontend + Nginx ─────────────────────────────────────────────────────────
   nginx:
+    container_name: condado-newsletter-frontend
     build:
       context: ./frontend
       dockerfile: Dockerfile
       args:
         VITE_API_BASE_URL: ${VITE_API_BASE_URL}
     restart: unless-stopped
-    ports:
-      - "80:80"
     depends_on:
       - backend
     networks:
-      - condado-net
+      - traefik
+    labels:
+      - "traefik.enable=true"
+      - "traefik.http.routers.condado.rule=Host(`condado-newsletter.lab`)"
+      - "traefik.http.services.condado.loadbalancer.server.port=80"
+      - "homepage.group=Hyperlink"
+      - "homepage.name=Condado Newsletter"
+      - "homepage.description=Automated newsletter generator using AI"
+      - "homepage.logo=claude-dark.png"
+      - "homepage.href=http://condado-newsletter.lab"

   # ── Mailhog (DEV ONLY — SMTP trap) ───────────────────────────────────────────
   mailhog:
+    container_name: condado-newsletter-mailhog
     image: mailhog/mailhog:latest
     restart: unless-stopped
     ports:
       - "8025:8025"
-    networks:
-      - condado-net

 volumes:
   postgres-data:

 networks:
-  condado-net:
-    driver: bridge
+  traefik:
+    external: true
+    name: traefik

View File

@@ -1,28 +1,33 @@
 #!/bin/bash
 set -e

-# ── Initialise PostgreSQL data directory on first run ─────────────────────────
-if [ ! -f /var/lib/postgresql/data/PG_VERSION ]; then
-  echo "Initialising PostgreSQL data directory..."
-  su -c "/usr/lib/postgresql/16/bin/initdb -D /var/lib/postgresql/data --encoding=UTF8 --locale=C" postgres
-  # Start postgres temporarily to create the app database and user
-  su -c "/usr/lib/postgresql/16/bin/pg_ctl -D /var/lib/postgresql/data -w start" postgres
-  su -c "psql -c \"CREATE USER condado WITH PASSWORD 'condado';\"" postgres
-  su -c "psql -c \"CREATE DATABASE condado OWNER condado;\"" postgres
-  su -c "/usr/lib/postgresql/16/bin/pg_ctl -D /var/lib/postgresql/data -w stop" postgres
-  echo "PostgreSQL initialised."
-fi
+APP_DB_NAME=${APP_DB_NAME:-condado}
+APP_DB_USER=${SPRING_DATASOURCE_USERNAME:-condado}
+APP_DB_PASSWORD=${SPRING_DATASOURCE_PASSWORD:-condado}

 # ── Ensure supervisor log directory exists ────────────────────────────────────
 mkdir -p /var/log/supervisor

-# ── Defaults for all-in-one local PostgreSQL ─────────────────────────────────
-export SPRING_DATASOURCE_URL=${SPRING_DATASOURCE_URL:-jdbc:postgresql://localhost:5432/condado}
-export SPRING_DATASOURCE_USERNAME=${SPRING_DATASOURCE_USERNAME:-condado}
-export SPRING_DATASOURCE_PASSWORD=${SPRING_DATASOURCE_PASSWORD:-condado}
+# ── Defaults for external PostgreSQL service in production compose ───────────
+export SPRING_DATASOURCE_URL=${SPRING_DATASOURCE_URL:-jdbc:postgresql://condado-newsletter-postgres:5432/${APP_DB_NAME}}
+export SPRING_DATASOURCE_USERNAME=${SPRING_DATASOURCE_USERNAME:-${APP_DB_USER}}
+export SPRING_DATASOURCE_PASSWORD=${SPRING_DATASOURCE_PASSWORD:-${APP_DB_PASSWORD}}
+export JWT_EXPIRATION_MS=${JWT_EXPIRATION_MS:-86400000}

+# ── Log all Spring Boot environment variables for debugging ──────────────────
+echo "========================================"
+echo "Spring Boot Configuration:"
+echo "========================================"
+echo "SPRING_DATASOURCE_URL=${SPRING_DATASOURCE_URL}"
+echo "SPRING_DATASOURCE_USERNAME=${SPRING_DATASOURCE_USERNAME}"
+echo "SPRING_DATASOURCE_PASSWORD=${SPRING_DATASOURCE_PASSWORD}"
+echo "JWT_EXPIRATION_MS=${JWT_EXPIRATION_MS}"
+echo "JAVA_OPTS=${JAVA_OPTS:-not set}"
+echo "OPENAI_API_KEY=${OPENAI_API_KEY:-not set}"
+echo "========================================"

 # ── Start all services via supervisord ───────────────────────────────────────
+# Export unbuffered output for both Python and Java
+export PYTHONUNBUFFERED=1
+export JAVA_OPTS="${JAVA_OPTS} -Dfile.encoding=UTF-8 -Djava.awt.headless=true"
 exec /usr/bin/supervisord -c /etc/supervisor/conf.d/supervisord.conf
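The entrypoint layers its defaults: an explicit `SPRING_DATASOURCE_URL` from the environment wins, otherwise the URL is derived from `APP_DB_NAME`, which itself falls back to `condado`. A minimal sketch of that resolution logic (names follow the script above; note bash's `:-` also replaces empty strings, which `??` here does not):

```typescript
// Resolve the datasource URL the same way the entrypoint's
// ${VAR:-default} chain does: explicit URL > derived-from-APP_DB_NAME > hardcoded default.
function resolveDatasourceUrl(env: Record<string, string | undefined>): string {
  const appDbName = env.APP_DB_NAME ?? 'condado';
  return (
    env.SPRING_DATASOURCE_URL ??
    `jdbc:postgresql://condado-newsletter-postgres:5432/${appDbName}`
  );
}

console.log(resolveDatasourceUrl({}));
// jdbc:postgresql://condado-newsletter-postgres:5432/condado
console.log(resolveDatasourceUrl({ APP_DB_NAME: 'reports' }));
// jdbc:postgresql://condado-newsletter-postgres:5432/reports
```

One caveat worth flagging: the debug block above echoes `SPRING_DATASOURCE_PASSWORD` to stdout, so the password ends up in container logs.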

View File

@@ -1,27 +1,26 @@
 [supervisord]
 nodaemon=true
-logfile=/var/log/supervisor/supervisord.log
+silent=false
+logfile=/dev/stdout
+logfile_maxbytes=0
 pidfile=/var/run/supervisord.pid
-loglevel=info

-[program:postgres]
-command=/usr/lib/postgresql/16/bin/postgres -D /var/lib/postgresql/data
-user=postgres
-autostart=true
-autorestart=true
-stdout_logfile=/var/log/supervisor/postgres.log
-stderr_logfile=/var/log/supervisor/postgres.err.log

 [program:backend]
-command=java -jar /app/app.jar
+command=java -Dspring.output.ansi.enabled=always -Dlogging.level.root=DEBUG -jar /app/app.jar
 autostart=true
 autorestart=true
 startsecs=15
-stdout_logfile=/var/log/supervisor/backend.log
-stderr_logfile=/var/log/supervisor/backend.err.log
+stdout_logfile=/dev/stdout
+stdout_logfile_maxbytes=0
+stderr_logfile=/dev/stderr
+stderr_logfile_maxbytes=0

 [program:nginx]
 command=/usr/sbin/nginx -g "daemon off;"
 autostart=true
 autorestart=true
-stdout_logfile=/var/log/supervisor/nginx.log
-stderr_logfile=/var/log/supervisor/nginx.err.log
+stdout_logfile=/dev/stdout
+stdout_logfile_maxbytes=0
+stderr_logfile=/dev/stderr
+stderr_logfile_maxbytes=0

View File

@@ -39,6 +39,7 @@ const taskOne: EntityTaskResponse = {
   prompt: 'Summarize jokes',
   scheduleCron: '0 9 * * 1',
   emailLookback: 'last_week',
+  generationSource: 'openai',
   active: true,
   createdAt: '2026-03-26T10:00:00Z',
 }
@@ -50,6 +51,7 @@ const taskTwo: EntityTaskResponse = {
   prompt: 'Escalate sandwich policy',
   scheduleCron: '0 11 1 * *',
   emailLookback: 'last_month',
+  generationSource: 'llama',
   active: false,
   createdAt: '2026-03-26T11:00:00Z',
 }
@@ -72,6 +74,7 @@ const previewTask = {
   prompt: 'Draft an absurdly official update about disappearing crackers.',
   scheduleCron: '15 10 * * 2',
   emailLookback: 'last_week' as const,
+  generationSource: 'openai' as const,
 }

 describe('tasksApi', () => {
@@ -143,6 +146,7 @@ describe('tasksApi', () => {
   prompt: 'Ask about ceremonial coffee',
   scheduleCron: '0 8 * * 1-5',
   emailLookback: 'last_day',
+  generationSource: 'openai',
 })
 expect(createdTask).toEqual(
@@ -157,6 +161,7 @@ describe('tasksApi', () => {
   prompt: 'Ask about ceremonial coffee',
   scheduleCron: '0 8 * * 1-5',
   emailLookback: 'last_day',
+  generationSource: 'openai',
 })
 })
@@ -168,6 +173,7 @@ describe('tasksApi', () => {
   prompt: 'Ask about ceremonial coffee',
   scheduleCron: '0 8 * * 1-5',
   emailLookback: 'last_day',
+  generationSource: 'llama',
 },
 })
@@ -177,6 +183,7 @@ describe('tasksApi', () => {
   prompt: 'Ask about ceremonial coffee',
   scheduleCron: '0 8 * * 1-5',
   emailLookback: 'last_day',
+  generationSource: 'llama',
 })
 expect(updatedTask).toEqual({
@@ -185,6 +192,7 @@ describe('tasksApi', () => {
   prompt: 'Ask about ceremonial coffee',
   scheduleCron: '0 8 * * 1-5',
   emailLookback: 'last_day',
+  generationSource: 'llama',
 })
 expect(mockedApiClient.put).toHaveBeenCalledWith('/v1/tasks/task-1', {
   entityId: 'entity-1',
@@ -192,6 +200,7 @@ describe('tasksApi', () => {
   prompt: 'Ask about ceremonial coffee',
   scheduleCron: '0 8 * * 1-5',
   emailLookback: 'last_day',
+  generationSource: 'llama',
 })
 })

View File

@@ -50,6 +50,7 @@ describe('CreateTaskPage', () => {
 expect(screen.getByLabelText(/task name/i)).toBeInTheDocument()
 expect(screen.queryByLabelText(/task prompt/i)).not.toBeInTheDocument()
+expect(screen.getByLabelText(/^Generation Source$/i)).toHaveValue('openai')
 expect(screen.getByLabelText(/^Email Period$/i)).toBeInTheDocument()
 expect(screen.getByLabelText(/^Minute$/i)).toBeInTheDocument()
 expect(screen.getByLabelText(/^Hour$/i)).toBeInTheDocument()
@@ -68,6 +69,7 @@ describe('CreateTaskPage', () => {
   prompt: '',
   scheduleCron: '0 8 * * 1-5',
   emailLookback: 'last_week',
+  generationSource: 'openai',
   active: false,
   createdAt: '2026-03-26T10:00:00Z',
 })
@@ -78,6 +80,7 @@ describe('CreateTaskPage', () => {
   prompt: '',
   scheduleCron: '0 8 * * 1-5',
   emailLookback: 'last_week',
+  generationSource: 'openai',
   active: false,
   createdAt: '2026-03-26T10:00:00Z',
 })
@@ -118,6 +121,7 @@ describe('CreateTaskPage', () => {
   prompt: '',
   scheduleCron: '0 8 * * 1-5',
   emailLookback: 'last_week',
+  generationSource: 'openai',
 })
 )
 expect(tasksApi.inactivateTask).toHaveBeenCalledWith('task-2')

View File

@@ -57,6 +57,7 @@ const mockTask = {
   prompt: 'Summarize jokes',
   scheduleCron: '0 9 * * 1',
   emailLookback: 'last_week' as const,
+  generationSource: 'openai' as const,
   active: true,
   createdAt: '2026-03-26T10:00:00Z',
 }
@@ -77,7 +78,7 @@ describe('EditTaskPage', () => {
 vi.mocked(tasksApi.getTask).mockResolvedValue(mockTask)
 vi.mocked(tasksApi.buildTaskPreviewPrompt).mockImplementation(
   (entity, task) =>
-    `PROMPT FOR ${entity.name}: ${task.name} | ${task.prompt} | ${task.scheduleCron} | ${task.emailLookback}`
+    `PROMPT FOR ${entity.name}: ${task.name} | ${task.prompt} | ${task.scheduleCron} | ${task.emailLookback} | ${task.generationSource}`
 )
 vi.mocked(tasksApi.activateTask).mockResolvedValue({ ...mockTask, active: true })
 vi.mocked(tasksApi.inactivateTask).mockResolvedValue({ ...mockTask, active: false })
@@ -97,6 +98,7 @@ describe('EditTaskPage', () => {
 expect(screen.getByRole('heading', { name: /edit task/i })).toBeInTheDocument()
 expect(screen.getByLabelText(/task name/i)).toHaveValue('Weekly Check-in')
 expect(screen.getByLabelText(/task prompt/i)).toHaveValue('Summarize jokes')
+expect(screen.getByLabelText(/^Generation Source$/i)).toHaveValue('openai')
 expect(screen.getByLabelText(/^Email Period$/i)).toHaveValue('last_week')
 expect(screen.getByLabelText(/^Minute$/i)).toHaveValue('0')
 expect(screen.getByLabelText(/^Hour$/i)).toHaveValue('9')
@@ -124,6 +126,7 @@ describe('EditTaskPage', () => {
   prompt: 'Ask about ceremonial coffee',
   scheduleCron: '0 8 * * 1-5',
   emailLookback: 'last_day',
+  generationSource: 'llama',
 })

 const { queryClient } = renderPage()
@@ -137,13 +140,16 @@ describe('EditTaskPage', () => {
 fireEvent.change(screen.getByLabelText(/^Email Period$/i), {
   target: { value: 'last_day' },
 })
+fireEvent.change(screen.getByLabelText(/^Generation Source$/i), {
+  target: { value: 'llama' },
+})
 fireEvent.click(screen.getByRole('button', { name: /Weekdays/i }))
 fireEvent.change(screen.getByLabelText(/^Hour$/i), { target: { value: '8' } })

 expect(screen.getByText(/Final Prompt/i)).toBeInTheDocument()
 expect(
   screen.getByText(
-    'PROMPT FOR Entity A: Daily Check-in | Ask about ceremonial coffee | 0 8 * * 1-5 | last_day'
+    'PROMPT FOR Entity A: Daily Check-in | Ask about ceremonial coffee | 0 8 * * 1-5 | last_day | llama'
   )
 ).toBeInTheDocument()
@@ -159,6 +165,7 @@ describe('EditTaskPage', () => {
   prompt: 'Ask about ceremonial coffee',
   scheduleCron: '0 8 * * 1-5',
   emailLookback: 'last_day',
+  generationSource: 'llama',
 })
 )
 expect(vi.mocked(tasksApi.generateTaskPreview).mock.calls[0][0]).toEqual('task-1')
@@ -171,6 +178,7 @@ describe('EditTaskPage', () => {
   prompt: 'Ask about ceremonial coffee',
   scheduleCron: '0 8 * * 1-5',
   emailLookback: 'last_day',
+  generationSource: 'llama',
 }),
 })
 )
@@ -186,6 +194,7 @@ describe('EditTaskPage', () => {
   prompt: 'Ask about ceremonial coffee',
   scheduleCron: '0 8 * * 1-5',
   emailLookback: 'last_day',
+  generationSource: 'llama',
 })
 expect(invalidateQueriesSpy).toHaveBeenCalledWith({ queryKey: ['entity-tasks', 'entity-1'] })
 expect(invalidateQueriesSpy).toHaveBeenCalledWith({ queryKey: ['entity-task', 'task-1'] })

View File

@@ -120,6 +120,7 @@ describe('EntityDetailPage', () => {
   prompt: 'Summarize jokes',
   scheduleCron: '0 9 * * 1',
   emailLookback: 'last_week',
+  generationSource: 'openai',
   active: true,
   createdAt: '2026-03-26T10:00:00Z',
 },
@@ -165,6 +166,7 @@ describe('EntityDetailPage', () => {
   prompt: 'Archive the sandwich minutes',
   scheduleCron: '0 9 * * 1',
   emailLookback: 'last_week',
+  generationSource: 'llama',
   active: false,
   createdAt: '2026-03-26T10:00:00Z',
 },

View File

@@ -2,6 +2,7 @@ import type { VirtualEntityResponse } from './entitiesApi'
 import apiClient from './apiClient'

 export type EmailLookback = 'last_day' | 'last_week' | 'last_month'
+export type GenerationSource = 'openai' | 'llama'

 export interface EntityTaskResponse {
   id: string
@@ -10,6 +11,7 @@ export interface EntityTaskResponse {
   prompt: string
   scheduleCron: string
   emailLookback: EmailLookback
+  generationSource: GenerationSource
   active: boolean
   createdAt: string
 }
@@ -20,6 +22,7 @@ export interface EntityTaskCreateDto {
   prompt: string
   scheduleCron: string
   emailLookback: EmailLookback
+  generationSource: GenerationSource
 }

 export type EntityTaskUpdateDto = EntityTaskCreateDto
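The pages consuming this type cast raw `<select>` values with `as GenerationSource`. A narrowing type guard is a safer alternative sketch (the `GENERATION_SOURCES` array below is hypothetical; it just mirrors the union declared in tasksApi.ts):

```typescript
// Derive the union from a single source of truth, then narrow at runtime
// instead of asserting with "as".
const GENERATION_SOURCES = ['openai', 'llama'] as const;
type GenerationSource = (typeof GENERATION_SOURCES)[number];

function isGenerationSource(value: string): value is GenerationSource {
  return (GENERATION_SOURCES as readonly string[]).includes(value);
}

console.log(isGenerationSource('llama'));  // true
console.log(isGenerationSource('claude')); // false
```

With the guard, an unexpected option value can be rejected explicitly rather than silently coerced into the form state.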

View File

@@ -6,12 +6,14 @@ import {
   createTask,
   inactivateTask,
   type EmailLookback,
+  type GenerationSource,
 } from '../api/tasksApi'

 interface TaskFormState {
   name: string
   scheduleCron: string
   emailLookback: EmailLookback
+  generationSource: GenerationSource
 }

 interface CronParts {
@@ -72,6 +74,7 @@ const DEFAULT_TASK_FORM: TaskFormState = {
   name: '',
   scheduleCron: buildCron(DEFAULT_CRON_PARTS),
   emailLookback: 'last_week',
+  generationSource: 'openai',
 }

 export default function CreateTaskPage() {
@@ -151,6 +154,7 @@ export default function CreateTaskPage() {
   prompt: '',
   scheduleCron: taskForm.scheduleCron,
   emailLookback: taskForm.emailLookback,
+  generationSource: taskForm.generationSource,
 })
 }}
 >
@@ -167,6 +171,26 @@ export default function CreateTaskPage() {
   />
 </div>

+<div>
+  <label htmlFor="task-generation-source" className="text-sm font-medium text-slate-200">
+    Generation Source
+  </label>
+  <select
+    id="task-generation-source"
+    value={taskForm.generationSource}
+    onChange={(event) =>
+      setTaskForm((prev) => ({
+        ...prev,
+        generationSource: event.target.value as GenerationSource,
+      }))
+    }
+    className="mt-1 w-full rounded-md border border-slate-700 bg-slate-900 px-3 py-2 text-sm text-slate-100"
+  >
+    <option value="openai">OpenAI</option>
+    <option value="llama">Llama</option>
+  </select>
+</div>

 <div>
   <label htmlFor="task-lookback" className="text-sm font-medium text-slate-200">
     Email Period

View File

@@ -23,11 +23,12 @@ export default function DashboardPage() {
 <div>
   <h1 className="text-3xl font-bold text-slate-100">Dashboard</h1>
   <p className="mt-2 text-xs text-slate-400">Version {appVersion}</p>
+  <p className="mt-2 text-xs text-slate-400">Is this the real life?</p>
 </div>

 <div className="grid gap-4 md:grid-cols-2">
   <div className="rounded-xl border border-slate-800 bg-slate-900/70 p-5 shadow-sm">
-    <p className="text-sm text-slate-400">Active Entities</p>
+    <p className="text-sm text-slate-400">Active Entities:</p>
     <p className="mt-1 text-2xl font-bold">{activeCount} active {activeCount === 1 ? 'entity' : 'entities'}</p>
   </div>
   <div className="rounded-xl border border-slate-800 bg-slate-900/70 p-5 shadow-sm">

View File

@@ -13,6 +13,7 @@ import {
   inactivateTask,
   updateTask,
   type EmailLookback,
+  type GenerationSource,
 } from '../api/tasksApi'

 interface TaskFormState {
@@ -20,6 +21,7 @@ interface TaskFormState {
   prompt: string
   scheduleCron: string
   emailLookback: EmailLookback
+  generationSource: GenerationSource
 }

 interface CronParts {
@@ -97,6 +99,7 @@ const DEFAULT_TASK_FORM: TaskFormState = {
   prompt: '',
   scheduleCron: buildCron(DEFAULT_CRON_PARTS),
   emailLookback: 'last_week',
+  generationSource: 'openai',
 }

 async function invalidateTaskQueries(
@@ -164,6 +167,7 @@ export default function EditTaskPage() {
   prompt: task.prompt,
   scheduleCron: task.scheduleCron,
   emailLookback: task.emailLookback,
+  generationSource: task.generationSource,
 })
 }, [task])
@@ -175,6 +179,7 @@ export default function EditTaskPage() {
   prompt: data.prompt,
   scheduleCron: data.scheduleCron,
   emailLookback: data.emailLookback,
+  generationSource: data.generationSource,
 }),
 onSuccess: async () => {
   await invalidateTaskQueries(queryClient, entityId, taskId)
@@ -236,8 +241,16 @@ export default function EditTaskPage() {
   prompt: taskForm.prompt,
   scheduleCron: taskForm.scheduleCron,
   emailLookback: taskForm.emailLookback,
+  generationSource: taskForm.generationSource,
 }),
-[entityId, taskForm.emailLookback, taskForm.name, taskForm.prompt, taskForm.scheduleCron]
+[
+  entityId,
+  taskForm.emailLookback,
+  taskForm.generationSource,
+  taskForm.name,
+  taskForm.prompt,
+  taskForm.scheduleCron,
+]
 )

 const finalPrompt = useMemo(() => {
@@ -352,6 +365,26 @@ export default function EditTaskPage() {
   />
 </div>

+<div>
+  <label htmlFor="task-generation-source" className="text-sm font-medium text-slate-200">
+    Generation Source
+  </label>
+  <select
+    id="task-generation-source"
+    value={taskForm.generationSource}
+    onChange={(event) =>
+      setTaskForm((prev) => ({
+        ...prev,
+        generationSource: event.target.value as GenerationSource,
+      }))
+    }
+    className="mt-1 w-full rounded-md border border-slate-700 bg-slate-900 px-3 py-2 text-sm text-slate-100"
+  >
+    <option value="openai">OpenAI</option>
+    <option value="llama">Llama</option>
+  </select>
+</div>

 <div>
   <label htmlFor="task-lookback" className="text-sm font-medium text-slate-200">
     Email Period

View File

@@ -15,6 +15,9 @@ http {
   gzip_types text/plain text/css application/json application/javascript
              text/xml application/xml application/xml+rss text/javascript;

+  access_log /dev/stdout;
+  error_log /dev/stderr;

   server {
     listen 80;
     server_name _;