vigilar/docs/superpowers/specs/2026-04-02-daily-use-features-design.md
Aaron D. Lee 53ae925a70 Add daily use features design spec
Spec covers 5 feature areas for making Vigilar a system a household
relies on daily: multi-person presence detection, MobileNet person +
vehicle detection with driveway fencing, smart alert profiles with
presence/time awareness, recording timeline UI, and health monitoring
with auto-prune and daily digest.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-02 23:42:29 -04:00


# Vigilar — Daily Use Features Design Spec
**Date:** 2026-04-02
**Goal:** Make Vigilar a security system a household actually relies on daily, replacing Ring/Nest subscription features with local-first intelligence.
**User profile:** Single technical admin, 4-person household (2 adults, 2 kids). Cameras on exterior + common areas only. Wants automation-driven system that rarely needs manual interaction.
**Constraints:**
- No cloud dependencies
- No new Python dependencies (MobileNet runs via OpenCV DNN, already installed)
- Must run on RPi 5 or x86 mini-PC
- 22 Mbps upload (remote streaming already handled)
---
## 1. Multi-Person Presence Detection
### Overview
Track each family member's phone via WiFi ping to derive who's home. Household state drives auto-arm/disarm and alert filtering.
### Presence Monitor
New subprocess `vigilar/presence/monitor.py`:
- Pings each configured phone IP every `ping_interval_s` (default 30s) using ICMP or ARP check
- Per-member state: `HOME` or `AWAY`
- Departure requires `departure_delay_m` (default 10 min) of no response to prevent false triggers from WiFi sleep
- Arrival is immediate (first successful ping)
- Publishes per-member status to `vigilar/presence/{member_name}` and aggregate to `vigilar/presence/status`
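The arrival/departure logic above can be sketched as follows (class and method names are illustrative, not the final `PresenceMonitor` API; the ICMP check shells out to `ping`, consistent with the no-new-dependencies constraint):

```python
import subprocess
import time


def is_reachable(ip: str, timeout_s: int = 2) -> bool:
    """One ICMP echo request; True if the host replied."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", str(timeout_s), ip],
        capture_output=True,
    )
    return result.returncode == 0


class MemberTracker:
    """Tracks one member: arrival is immediate, departure is delayed."""

    def __init__(self, name: str, ip: str, departure_delay_s: float = 600):
        self.name = name
        self.ip = ip
        self.departure_delay_s = departure_delay_s
        self.state = "AWAY"
        self.last_seen = 0.0

    def update(self, reachable: bool, now: float) -> str:
        if reachable:
            self.last_seen = now
            self.state = "HOME"  # first successful ping flips to HOME immediately
        elif self.state == "HOME" and now - self.last_seen > self.departure_delay_s:
            self.state = "AWAY"  # only after departure_delay of silence
        return self.state
```

The `arping` method would swap `is_reachable` for an ARP-based check; the state machine is unchanged.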
### Household States
Derived from individual member states:
| State | Condition |
|---|---|
| `EMPTY` | No members home |
| `KIDS_HOME` | At least one child, zero adults |
| `ADULTS_HOME` | At least one adult home, but not all members (e.g., parent home, kids out) |
| `ALL_HOME` | Every configured member is home |
Precedence: `ALL_HOME` > `ADULTS_HOME` > `KIDS_HOME` > `EMPTY`. The first match wins. For alert profile purposes, `ADULTS_HOME` and `ALL_HOME` typically share the same behavior (both use "Home Daytime" / "Home Night" profiles).
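A minimal sketch of the derivation, assuming member records carry the `role` from config alongside the current ping state:

```python
def derive_household_state(members: dict) -> str:
    """Derive the household state from per-member states.

    `members` maps name -> {"role": "adult"|"child", "state": "HOME"|"AWAY"}.
    Precedence: ALL_HOME > ADULTS_HOME > KIDS_HOME > EMPTY; first match wins.
    """
    home = [m for m in members.values() if m["state"] == "HOME"]
    if home and len(home) == len(members):
        return "ALL_HOME"
    if any(m["role"] == "adult" for m in home):
        return "ADULTS_HOME"
    if any(m["role"] == "child" for m in home):
        return "KIDS_HOME"
    return "EMPTY"
```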
### Auto-Arm Actions
Each household state maps to an arm action (configurable):
| Household state | Default arm action |
|---|---|
| `EMPTY` | `ARMED_AWAY` |
| `ADULTS_HOME` | `DISARMED` |
| `KIDS_HOME` | `ARMED_HOME` |
| `ALL_HOME` | `DISARMED` |
### Config
```toml
[presence]
enabled = true
ping_interval_s = 30
departure_delay_m = 10
method = "icmp" # icmp | arping
[[presence.members]]
name = "Dad"
ip = "192.168.1.50"
role = "adult"
[[presence.members]]
name = "Mom"
ip = "192.168.1.51"
role = "adult"
[[presence.members]]
name = "Kid 1"
ip = "192.168.1.52"
role = "child"
[[presence.members]]
name = "Kid 2"
ip = "192.168.1.53"
role = "child"
[presence.actions]
EMPTY = "ARMED_AWAY"
ADULTS_HOME = "DISARMED"
KIDS_HOME = "ARMED_HOME"
ALL_HOME = "DISARMED"
```
### MQTT Topics
- `vigilar/presence/{member_name}` — `{"state": "HOME"|"AWAY", "ts": ...}`
- `vigilar/presence/status` — `{"household": "ADULTS_HOME", "members": {"Dad": "HOME", ...}, "ts": ...}`
### Files
- `vigilar/presence/__init__.py`
- `vigilar/presence/monitor.py` — `PresenceMonitor` class, subprocess target
- `vigilar/presence/models.py` — `MemberState`, `HouseholdState` dataclasses
- Update `vigilar/config.py` — `PresenceConfig`, `PresenceMember`
- Update `vigilar/main.py` — start PresenceMonitor subprocess
- Update `vigilar/constants.py` — `HouseholdState` enum, presence MQTT topics
---
## 2. Person + Vehicle Detection
### Overview
MobileNet-SSD v2 runs as a second-stage filter on frames where MOG2 already detected motion. Classifies detections as person, vehicle, or unidentified motion. Vehicle fingerprinting compares detected vehicles against known family cars.
### Detection Pipeline
```
MOG2 motion detected
  → crop frame to motion region
  → MobileNet-SSD forward pass (~50 ms x86, ~200 ms RPi 5)
  → detection class?
      person (class 1)      → PERSON_DETECTED event
      car/truck (class 3/8) → check vehicle fingerprint
          → known vehicle   → KNOWN_VEHICLE_ARRIVED (quiet)
          → unknown vehicle → UNKNOWN_VEHICLE_DETECTED event (alert)
      other/none            → MOTION_START (logged, low severity)
```
### Person Detector
New module `vigilar/detection/person.py`:
- Class `PersonDetector`
- Loads MobileNet-SSD v2 COCO model via `cv2.dnn.readNetFromTensorflow()`
- Model files: `mobilenet_ssd_v2.pb` + `mobilenet_ssd_v2.pbtxt` (~22MB total)
- Downloaded during install to `/var/vigilar/models/`
- `detect(frame) -> list[Detection]` where `Detection` has: class_name, confidence, bounding_box
- Configurable `confidence_threshold` (default 0.5)
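A sketch of the output-parsing half of `detect()`: OpenCV DNN SSD models return a `(1, 1, N, 7)` array of `[image_id, class_id, confidence, x1, y1, x2, y2]` rows with normalized coordinates. The `parse_ssd_output` helper and the class-ID subset below are illustrative:

```python
from dataclasses import dataclass

import numpy as np

# COCO class IDs we act on (subset per the detection pipeline above)
COCO_CLASSES = {1: "person", 3: "car", 8: "truck"}


@dataclass
class Detection:
    class_name: str
    confidence: float
    bounding_box: tuple  # (x, y, w, h) in pixels


def parse_ssd_output(raw, frame_w: int, frame_h: int,
                     confidence_threshold: float = 0.5):
    """Convert a (1, 1, N, 7) SSD forward-pass result into Detection objects."""
    detections = []
    for row in raw[0, 0]:
        class_id, conf = int(row[1]), float(row[2])
        if conf < confidence_threshold or class_id not in COCO_CLASSES:
            continue
        # Coordinates are normalized corners; scale to pixels, convert to x/y/w/h.
        x1, y1 = row[3] * frame_w, row[4] * frame_h
        x2, y2 = row[5] * frame_w, row[6] * frame_h
        detections.append(Detection(
            class_name=COCO_CLASSES[class_id],
            confidence=conf,
            bounding_box=(int(x1), int(y1), int(x2 - x1), int(y2 - y1)),
        ))
    return detections
```

Inside `PersonDetector.detect()` this would follow `net.setInput(cv2.dnn.blobFromImage(frame, ...))` and `raw = net.forward()`.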
### Vehicle Fingerprinting
New module `vigilar/detection/vehicle.py`:
- Class `VehicleFingerprint`
- When a vehicle detection occurs in a fenced zone:
1. Crop bounding box from frame
2. Convert to HSV, compute hue/saturation/value histograms
3. Classify dominant color (white, black, silver, red, blue, etc.)
4. Compute relative size class from bbox area vs. zone area (compact, midsize, large)
5. Compare against stored known vehicle profiles
- Match threshold: color matches AND size class matches → known vehicle
- `VehicleProfile` dataclass: name, color_profile, size_range, histogram (stored as numpy array)
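The dominant-color step (3) might classify an averaged HSV value like this; the bucket boundaries are illustrative starting points, not calibrated values (OpenCV's hue range is 0-179):

```python
def classify_color(h: float, s: float, v: float) -> str:
    """Map an average HSV pixel (OpenCV ranges: H 0-179, S/V 0-255)
    to a coarse color name. Thresholds are illustrative, not tuned."""
    if v < 50:
        return "black"          # too dark for hue to matter
    if s < 40:
        # Low saturation: achromatic; split on brightness.
        return "white" if v > 180 else "silver"
    # Chromatic colors: bucket by hue.
    if h < 10 or h >= 160:
        return "red"            # red wraps around the hue circle
    if h < 25:
        return "orange"
    if h < 35:
        return "yellow"
    if h < 85:
        return "green"
    if h < 130:
        return "blue"
    return "purple"
```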
### Camera Zone Fencing
Extend existing camera config with named zones:
```toml
[[cameras.zones]]
name = "driveway"
region = [100, 300, 500, 600] # x, y, w, h in detection resolution coords
watch_for = ["vehicle", "person"]
alert_unknown_vehicles = true
[[cameras.zones]]
name = "walkway"
region = [50, 200, 200, 400]
watch_for = ["person"]
```
- Zones are optional — no zones = entire frame monitored for everything
- `watch_for` filters which detection types trigger events in that zone
- `alert_unknown_vehicles` enables vehicle fingerprint matching
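Zone filtering could be a simple overlap test between the detection's bounding box and each zone region; the 30% minimum-overlap fraction below is an assumed tunable, not part of the config above:

```python
def overlaps(bbox, region, min_overlap=0.3):
    """True if at least `min_overlap` of the detection bbox area lies inside
    the zone region. Both are (x, y, w, h) in detection-resolution coords."""
    bx, by, bw, bh = bbox
    zx, zy, zw, zh = region
    ix = max(0, min(bx + bw, zx + zw) - max(bx, zx))  # intersection width
    iy = max(0, min(by + bh, zy + zh) - max(by, zy))  # intersection height
    return bw * bh > 0 and (ix * iy) / (bw * bh) >= min_overlap


def zones_for_detection(zones, detection_type, bbox):
    """Zones whose `watch_for` includes this detection type and whose
    region the bounding box sufficiently overlaps."""
    return [z for z in zones
            if detection_type in z["watch_for"] and overlaps(bbox, z["region"])]
```

Per the spec, no configured zones means the whole frame is monitored, so the caller would bypass this filter in that case (an assumption about the calling code).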
### Vehicle Calibration CLI
```bash
vigilar calibrate-vehicle --camera front_door --zone driveway --name "Mom's SUV"
```
- Captures 10 frames over 5 seconds
- Runs vehicle detection on each
- Averages the color histogram + size measurements
- Stores profile in config
### Known Vehicle Config
```toml
[[vehicles.known]]
name = "Mom's SUV"
color_profile = "white"
size_class = "midsize"
calibration_file = "/var/vigilar/models/vehicles/moms_suv.npz"
[[vehicles.known]]
name = "Dad's SUV"
color_profile = "white"
size_class = "midsize"
calibration_file = "/var/vigilar/models/vehicles/dads_suv.npz"
```
### Detection Config
```toml
[detection]
person_detection = true
model_path = "/var/vigilar/models/mobilenet_ssd_v2.pb"
model_config = "/var/vigilar/models/mobilenet_ssd_v2.pbtxt"
confidence_threshold = 0.5
# Run detection on all cameras (empty = all)
cameras = []
```
### New Event Types
Add to `constants.py`:
- `PERSON_DETECTED` — person identified in frame
- `VEHICLE_DETECTED` — any vehicle detected
- `KNOWN_VEHICLE_ARRIVED` — matched a known vehicle profile
- `UNKNOWN_VEHICLE_DETECTED` — vehicle doesn't match any profile
### Files
- `vigilar/detection/__init__.py`
- `vigilar/detection/person.py` — `PersonDetector` class
- `vigilar/detection/vehicle.py` — `VehicleFingerprint`, `VehicleProfile`
- `vigilar/detection/zones.py` — zone filtering logic
- `vigilar/cli/cmd_calibrate.py` — `vigilar calibrate-vehicle` command
- Update `vigilar/camera/worker.py` — add detection stage after motion detection
- Update `vigilar/config.py` — `DetectionConfig`, `CameraZone`, `VehicleConfig`
- Update `vigilar/constants.py` — new event types
- Update `vigilar/storage/schema.py` — add `detection_type` column to recordings table
- `scripts/download_model.sh` — download MobileNet-SSD model files
---
## 3. Smart Alert Profiles
### Overview
Alert behavior is driven by configurable profiles that activate automatically based on household presence state and time of day. Profiles define what gets notified, to whom, and how.
### Profile Structure
Each profile has:
- **Name** — human-readable identifier
- **Activation conditions** — presence state(s) + optional time window
- **Rules** — per-detection-type behavior (push, record, quiet log)
- **Recipients** — which roles receive push notifications
### Default Profiles
Installed on first run, fully editable in Settings UI:
**Away** (activates when `EMPTY`):
- Person detected (any camera): push all + record
- Unknown vehicle: push all + record
- Known vehicle: quiet log + record
- Unidentified motion: record only
**Kids Home** (activates when `KIDS_HOME`):
- Person detected (exterior): push adults + record
- Unknown vehicle: push adults + record
- Known vehicle: quiet log
- Person/motion (common area): record only
**Home Daytime** (activates when `ADULTS_HOME` or `ALL_HOME`, outside sleep hours):
- Person detected: record only
- Unknown vehicle: push all + record
- Known vehicle: quiet log
- Unidentified motion: record only
**Home Night** (activates when `ADULTS_HOME` or `ALL_HOME`, during sleep hours):
- Person detected (exterior): push all + record
- Unknown vehicle: push all + record
- Known vehicle: quiet log
- Person/motion (common area): record only
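Profile activation under these rules reduces to a small matcher. In the sketch below, the `"sleep"`/`"awake"` tokens are an assumed encoding of "during/outside sleep hours" (the config uses `time_window = ""` for all-day profiles), and profile order implements first-match-wins:

```python
from datetime import time


def in_window(now, start, end):
    """True if `now` is in [start, end), handling windows that wrap midnight."""
    if start <= end:
        return start <= now < end
    return now >= start or now < end


def select_profile(profiles, household_state, now, sleep_start, sleep_end):
    """Return the first profile matching the household state and time condition."""
    asleep = in_window(now, sleep_start, sleep_end)
    for p in profiles:
        if household_state not in p["presence_states"]:
            continue
        w = p.get("time_window", "")  # "": all day; "sleep"/"awake": assumed tokens
        if w == "" or (w == "sleep" and asleep) or (w == "awake" and not asleep):
            return p
    return None
```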
### Profile Config
```toml
[alerts.schedule]
sleep_start = "23:00"
sleep_end = "06:00"
[[alerts.profiles]]
name = "Away"
presence_states = ["EMPTY"]
time_window = "" # all day
[[alerts.profiles.rules]]
detection_type = "person"
camera_location = "any"
action = "push_and_record"
recipients = "all"
[[alerts.profiles.rules]]
detection_type = "unknown_vehicle"
camera_location = "any"
action = "push_and_record"
recipients = "all"
[[alerts.profiles.rules]]
detection_type = "known_vehicle"
camera_location = "any"
action = "quiet_log"
[[alerts.profiles.rules]]
detection_type = "motion"
camera_location = "any"
action = "record_only"
```
### Push Notification Content
Notifications include detection context:
- **Title:** "Person at Front Door" / "Unknown Vehicle in Driveway" / "Mom's SUV arrived"
- **Body:** Time + camera name
- **Thumbnail:** Snapshot of the detection frame with bounding box overlay
- **Action:** Tap opens the PWA to that camera's timeline at the event time
### Alert Actions
- `push_and_record` — send push notification + start motion recording
- `push_adults` — send push only to adult-role members + record
- `record_only` — record the clip, no push notification
- `quiet_log` — log the event to DB, no recording, no push
### Recipients
- `all` — push to all registered devices
- `adults` — push only to devices registered by adult-role presence members
- `none` — no push
### Settings UI
New section in Settings > Notifications:
- List of profiles with enable/disable toggle
- Click to expand: edit time window, presence triggers, per-detection rules
- Each rule: dropdown for detection type, camera scope, action, recipients
- "Reset to defaults" button
### Files
- `vigilar/alerts/profiles.py` — `AlertProfile`, profile matching engine
- Update `vigilar/alerts/dispatcher.py` — role-based recipient filtering
- Update `vigilar/config.py` — `AlertProfile`, `AlertProfileRule`
- Update `vigilar/web/templates/settings.html` — profile editor section
- Update `vigilar/web/static/js/settings.js` — profile save/load
---
## 4. Recording Timeline
### Overview
A visual 24-hour timeline bar per camera showing when activity occurred, color-coded by detection type. Click to play. Replaces the current table-based recordings page.
### Timeline Data
New API endpoint:
```
GET /recordings/api/timeline?camera=front_door&date=2026-04-02
```
Returns:
```json
[
  {"start": 1743573600, "end": 1743573720, "type": "person", "id": 42},
  {"start": 1743580800, "end": 1743580830, "type": "vehicle", "id": 43},
  {"start": 1743590000, "end": 1743590060, "type": "motion", "id": 44}
]
```
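The query behind this endpoint might look like the following sketch, assuming the recordings table carries `camera_id`, `start_ts`, `end_ts`, and the new `detection_type` column (column names are assumptions about the existing schema):

```python
import sqlite3


def get_timeline_data(conn, camera_id, day_start, day_end):
    """Recordings overlapping [day_start, day_end), shaped for the timeline API."""
    rows = conn.execute(
        """SELECT id, start_ts, end_ts, detection_type
           FROM recordings
           WHERE camera_id = ? AND start_ts < ? AND end_ts > ?
           ORDER BY start_ts""",
        (camera_id, day_end, day_start),
    )
    return [{"start": s, "end": e, "type": t, "id": i} for i, s, e, t in rows]
```

The overlap condition (`start_ts < day_end AND end_ts > day_start`) also catches clips that straddle midnight.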
### Visual Design
```
Front Door  |░░░░░░████░░░░░░░░▓▓░░░░████████░░░░░░░░|
             6am      10am      2pm      6pm    11pm

Red  ████ = person
Blue ▓▓▓▓ = vehicle
Gray ░░░░ = motion only
Dark      = no activity
```
### UI Features
- **Click segment** → loads that recording clip in an inline video player
- **Hover segment** → shows thumbnail preview + detection type + time
- **Filter buttons** — All / People / Vehicles / Motion
- **Date picker** — navigate to previous days
- **Zoom** — pinch/drag on mobile, scroll-wheel on desktop to zoom into a time range
- **Mini-timelines on dashboard** — compact version under each camera in the 2x2 grid
### Implementation
- `vigilar/web/static/js/timeline.js` — Canvas-based timeline renderer
- `vigilar/web/templates/recordings.html` — replace table with timeline view
- `vigilar/web/blueprints/recordings.py` — add `/api/timeline` endpoint
- Update `vigilar/storage/schema.py` — add `detection_type` to recordings table (also needed by detection feature)
- Update `vigilar/storage/queries.py` — `get_timeline_data(camera_id, date)` query
### Thumbnail Generation
When a recording segment completes, extract a representative frame:
- For person detections: the frame with highest confidence person detection, with bounding box drawn
- For vehicle detections: frame with vehicle visible
- For motion: first frame of motion
- Stored as JPEG at `/var/vigilar/recordings/{camera}/{date}/thumb_{recording_id}.jpg`
- Served via `/recordings/{id}/thumbnail`
---
## 5. Health Monitoring + Self-Healing
### Overview
Periodic checks on all subsystems with auto-remediation where possible, surfaced via a dashboard health indicator and optional daily digest.
### Health Checks
| Check | Frequency | Warn | Critical | Auto-heal |
|---|---|---|---|---|
| Camera connected | 30s | Offline > 2 min | Offline > 10 min | Reconnect (already built) |
| MQTT broker | 30s | — | Unreachable | Supervisor restarts |
| Disk usage | 5 min | > 85% | > 95% | Auto-prune oldest non-starred recordings |
| DB integrity | 1 hour | — | Integrity check fails | Alert, restore from backup |
| UPS status | 30s (built) | On battery | Low battery | Shutdown sequence (built) |
| Presence monitor | 60s | — | No results > 5 min | Restart subprocess |
| Model files exist | Startup | — | Missing | Re-download |
### Auto-Prune
When disk usage exceeds `disk_critical_pct` (default 95%):
1. Find oldest recordings without a `starred` flag
2. Delete until usage drops below `auto_prune_target_pct` (default 80%)
3. Log what was deleted
4. Push notification: "Auto-pruned 12 recordings (3.2 GB) to free disk space"
Starred recordings are never auto-pruned. Users can star recordings in the timeline UI (click star icon on a segment).
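The selection step can be sketched as pure logic over recording metadata (the `{"id", "ts", "bytes", "starred"}` record shape is illustrative):

```python
def select_prunable(recordings, disk_used_bytes, disk_total_bytes, target_pct=80):
    """Pick the oldest non-starred recordings to delete until projected
    usage drops below target_pct. Returns the IDs to delete, oldest first."""
    target_bytes = disk_total_bytes * target_pct / 100
    to_free = disk_used_bytes - target_bytes
    victims = []
    for rec in sorted(recordings, key=lambda r: r["ts"]):  # oldest first
        if to_free <= 0:
            break
        if rec["starred"]:
            continue  # starred recordings are never pruned
        victims.append(rec["id"])
        to_free -= rec["bytes"]
    return victims
```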
### Dashboard Health Widget
Traffic light indicator in the navbar (already has arm-state and UPS badges):
- **Green** — all systems healthy
- **Yellow** — degraded (camera offline, disk warning, etc.)
- **Red** — critical
Click opens a dropdown panel showing each subsystem with status + last check time.
### Daily Digest
Optional push notification at configurable time (default 8:00 AM):
```
Vigilar Daily Summary
━━━━━━━━━━━━━━━━━━━━
Overnight: 2 person detections, 0 unknown vehicles
Cameras: 4/4 online
Storage: 142 GB used (71%)
UPS: Online, 100% charge
```
Only sent if `daily_digest = true` in config. Skipped if no overnight activity.
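A sketch of the digest builder, returning `None` when there is nothing worth sending (parameter names are illustrative):

```python
def build_digest(person_count, unknown_vehicle_count, cameras_online,
                 cameras_total, storage_used_gb, storage_pct, ups_status):
    """Render the daily digest body; None means skip (no overnight activity)."""
    if person_count == 0 and unknown_vehicle_count == 0:
        return None
    return "\n".join([
        "Vigilar Daily Summary",
        "━" * 20,
        f"Overnight: {person_count} person detections, "
        f"{unknown_vehicle_count} unknown vehicles",
        f"Cameras: {cameras_online}/{cameras_total} online",
        f"Storage: {storage_used_gb} GB used ({storage_pct}%)",
        f"UPS: {ups_status}",
    ])
```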
### Config
```toml
[health]
enabled = true
disk_warn_pct = 85
disk_critical_pct = 95
auto_prune = true
auto_prune_target_pct = 80
daily_digest = true
daily_digest_time = "08:00"
```
### Files
- `vigilar/health/__init__.py`
- `vigilar/health/monitor.py` — `HealthMonitor` class, runs checks on schedule
- `vigilar/health/pruner.py` — auto-prune logic
- `vigilar/health/digest.py` — daily digest notification builder
- Update `vigilar/web/templates/base.html` — health indicator in navbar
- Update `vigilar/web/static/js/app.js` — health dropdown panel
- Update `vigilar/config.py` — `HealthConfig`
- Update `vigilar/storage/schema.py` — add `starred` column to recordings table
---
## New Module Summary
| Module | Files | Purpose |
|---|---|---|
| `vigilar/presence/` | 3 | Phone ping, household state derivation |
| `vigilar/detection/` | 4 | MobileNet person detection, vehicle fingerprinting, zone fencing |
| `vigilar/health/` | 4 | Health checks, auto-prune, daily digest |
| `vigilar/alerts/profiles.py` | 1 | Profile-based alert routing |
| `vigilar/cli/cmd_calibrate.py` | 1 | Vehicle calibration CLI |
| `vigilar/web/static/js/timeline.js` | 1 | Timeline renderer |
| `scripts/download_model.sh` | 1 | Model downloader |
| Config/schema/constant updates | — | Spread across existing files |
**Estimated new code:** ~2,000-2,500 lines across 15 new files + updates to ~10 existing files.
**No new pip dependencies.** MobileNet via OpenCV DNN. Ping via subprocess. Everything else builds on existing infrastructure.