Compare commits
57 Commits: 09a87b79d2...main

| SHA1 |
|---|
| 6c9e06f33b |
| c1c3e5d71b |
| c64dd736f2 |
| cad0aa7e59 |
| 0ae39ab94b |
| 822d9d8e01 |
| 1db905eaae |
| 3d6ef5c7b4 |
| 78a4ce009c |
| 7ccab6fbc4 |
| 827eb97203 |
| 3cca0cffc5 |
| d36828bde2 |
| ed0048c795 |
| b316edbaf9 |
| c1b0c41ef2 |
| 3bb75d49de |
| 3d77cb448a |
| 49383c0003 |
| 7d821b9c1c |
| 9b7e387ea6 |
| b4f0d1891e |
| 0da30b6d6b |
| 6cbb728d9a |
| ff92451a76 |
| 60485bc06a |
| f6f299c3e5 |
| 66485f5c59 |
| 5f9ff9bcc9 |
| 35730b36f0 |
| d516833cc3 |
| 220be64dec |
| b433477c64 |
| 43b7047c57 |
| 167417d1ec |
| fb8141b320 |
| 96712dda88 |
| f5a7b42e7c |
| 1b1e9d727e |
| 668d29b786 |
| e5f42e099e |
| a9edda38ef |
| edec5ff460 |
| 264eb7296f |
| fbd4295302 |
| 7bdb324ebc |
| 28b19b5219 |
| 75ddd559c9 |
| 5a1067263a |
| e67de6215a |
| 7179b6531e |
| fd618d7714 |
| d1ffb857c8 |
| f8eba0ee7e |
| e6b5bf2cf1 |
| fbae75b957 |
| 93476655fc |
45
.cursor/rules/led-driver.mdc
Normal file
@@ -0,0 +1,45 @@
---
description: led-driver — MicroPython ESP32: mpremote, imports, layout, I/O, no pycache in src
globs: led-driver/**
alwaysApply: false
---

# led-driver (MicroPython / ESP32)

## Device and tests

1. Validate **MicroPython behaviour** under **`led-driver/`** with **`mpremote connect <PORT> …`** on the chip. Host **`python3`** does **not** prove the firmware build.

2. **Execution target is fixed:** treat **`led-driver/`** code as firmware that runs **only on MicroPython ESP32 devices**. Do **not** run `led-driver/src/main.py` (or other firmware modules) with host CPython as a normal execution path.

3. **Flow:** `mpremote connect <PORT> cp <local> :<on-flash>`, then `run <script>.py`. Inline commands only — no **`.sh`** wrappers unless the user asks. Default serial placeholder: **`/dev/ttyACM0`**.

4. Checks that **import and run** code from **`led-driver/src/`** belong in **`led-driver/tests/`** and run with **`mpremote run …`**. **Do not** add **`pytest`** under **`led-controller/tests/`** that **`sys.path`**-loads **`led-driver/src`** and runs those modules on CPython.
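
As a concrete instance of the copy-then-run flow above (the port is the document's own placeholder; the file names are illustrative, not real paths from this repo):

```shell
# Copy a module to flash, then run a test script on the device.
# main.py / test_blink.py are illustrative names.
mpremote connect /dev/ttyACM0 cp led-driver/src/main.py :main.py
mpremote connect /dev/ttyACM0 run led-driver/tests/test_blink.py
```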

## Import layout

5. **No** **`sys.path.insert`**, **`__file__`** path stitching, or other import-path hacks under **`led-driver/`**. Use the device flash search path, or a host **`PYTHONPATH`** / layout you control.

6. **No** “import fixer” code — fix copy order, flash paths, or env instead.

## Imports (fail loudly)

7. If a dependency does not load, **crash** and fix deployment or the filesystem. **Do not** catch **`ImportError`** / **`ModuleNotFoundError`** around **`import`** / **`from … import`** for app/firmware modules (`settings`, `utils`, `network`, `machine`, …).

8. **Allowed — stdlib name pairs only** (MicroPython vs CPython): one **`except ImportError`**, then **one** fallback import, **no** extra logic in the **`except`**:
   - `uos` → `os`
   - `ubinascii` → `binascii`
   - `utime` → `time`

   Not for “maybe the file exists on flash” — only different **stdlib** names.

9. **No** large inline reimplementations after **`except ImportError`** — deploy the real module.

## I/O

10. Non-blocking **recv** / **accept**: use plain **`except OSError:`** (or **break** on empty). **No** errno / EAGAIN / EWOULDBLOCK tables or **`getattr(errno, …)`** unless fixing a **documented** target bug.

11. Minimal **`try` / `except OSError`** around optional socket options (e.g. **`SO_REUSEADDR`**) is fine.

## Host Python and `src/`

12. **Do not** leave **`__pycache__/`** or **`.pyc`** under **`led-driver/src/`** from host runs. Remove them if created; **`.gitignore`** already ignores them. Prefer **`PYTHONDONTWRITEBYTECODE=1`** or **`-B`** when host Python must touch **`led-driver/src/`**.
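
A minimal sketch of the two exception patterns this rule file allows — the stdlib-name fallback and plain `except OSError` around non-blocking I/O. It runs on host CPython, where the MicroPython `u`-prefixed names do not exist:

```python
# Sketch of the allowed exception patterns described above.

try:
    import utime as time  # MicroPython stdlib name
except ImportError:
    import time  # CPython fallback: one import, no extra logic

import socket


def poll_accept(server):
    """Non-blocking accept: plain except OSError, no errno tables."""
    try:
        conn, _addr = server.accept()
        return conn
    except OSError:
        return None  # nothing pending yet


server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    # Optional socket option: a minimal try/except OSError is allowed.
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
except OSError:
    pass
server.bind(("127.0.0.1", 0))
server.listen(1)
server.setblocking(False)

print(poll_accept(server))  # no client has connected -> None
server.close()
```

Note that the `except` bodies stay empty of logic: one fallback import, or one `None`/`pass` — anything more belongs in deployment, per the rules above.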
14
.cursor/rules/pattern-workflow.mdc
Normal file
@@ -0,0 +1,14 @@
---
description: Require test pattern, pattern metadata, and test preset for new patterns
alwaysApply: true
---

# Pattern workflow requirements

1. When creating a new pattern under `led-driver/src/patterns/`, also add or update a corresponding test file in `led-driver/tests/patterns/`.

2. When adding a new pattern, ensure led-controller has `db/pattern.json`; if it does not exist, create it. Add the new pattern's metadata and parameter mappings there. Optionally set **`supports_manual`** to `false` when the pattern is a poor fit for manual mode or audio beat triggers (smooth/blended animations); omit it or set it to `true` otherwise.

3. When adding a new pattern, add at least one test preset entry in `db/preset.json` in led-controller that uses the new pattern.

4. For any pattern that supports both auto and manual modes, keep behaviour parity unless explicitly requested otherwise: background colour handling, colour-cycling order, and parameter timing semantics (e.g. the meaning of `n2`/`n3`) must match between the auto and manual paths.
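
As a sketch of the metadata step 2 asks for, here is a hypothetical `db/pattern.json` entry — the field names follow the existing entries in that file, but the pattern name and labels are invented:

```json
"my_pattern": {
  "n1": "Step Rate",
  "min_delay": 10,
  "max_delay": 10000,
  "max_colors": 10,
  "has_background": true,
  "supports_manual": false
}
```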
18
.cursor/rules/scoped-fixes.mdc
Normal file
@@ -0,0 +1,18 @@
---
description: Fix only the issue or task the user gave; no refactors unless requested
alwaysApply: true
---

# Scoped fixes (no overscoping)

1. **Change only what is needed** to satisfy the user’s *current* request (bug, error, feature, or explicit follow-up). Prefer the smallest diff that fixes it.

2. **Refactors:** Do **not** refactor (restructure, rename, extract functions, change abstractions, or “make it nicer”) **unless the user explicitly asked for a refactor**. A bug fix may touch nearby lines only as much as required to correct the bug.

3. **Do not** rename, reformat, or “clean up” unrelated code; do not add extra error handling, logging, or features you were not asked for.

4. **Related issues:** If you spot other problems (missing functions, wrong types elsewhere, style), you may **mention them in prose** — do **not** fix them unless the user explicitly asks.

5. **Tests and docs:** Add or change tests or documentation **only** when the user asked for them or they are strictly required to verify the requested fix.

6. **Multiple distinct fixes:** If the user reported one error (e.g. a single `TypeError`), fix **that** cause first. Offer to tackle follow-ups separately rather than bundling.
16
.cursor/rules/strict-user-scope.mdc
Normal file
@@ -0,0 +1,16 @@
---
description: Enforce strict user-scoped changes only
alwaysApply: true
---

# Strict User Scope

1. Only implement exactly what the user asked for in the current message.

2. Do not add extra refactors, cleanups, renames, architecture changes, or behavioural changes unless the user explicitly asked for them.

3. If you notice a potential improvement, mention it briefly and ask before changing code.

4. For revert/undo requests, perform the narrowest possible revert and do not modify anything else.

5. Keep edits minimal and local to the requested area.
18
.cursor/rules/submodules-led-driver-tool.mdc
Normal file
@@ -0,0 +1,18 @@
---
description: Keep led-driver and led-tool git submodules in sync when updating led-controller
alwaysApply: true
---

# Submodule pointers (`led-driver`, `led-tool`)

This repo tracks **`led-driver`** and **`led-tool`** as git submodules (see `.gitmodules`).

When led-controller work should ship with matching firmware or CLI behaviour, or when you finish changes **inside** those submodule directories, **record the new submodule commits in the parent repo**:

1. In each submodule, commit and push to its remote if there are local commits (or ensure the checkout is the intended revision).
2. From the **led-controller** root, run `git add led-driver led-tool` after their HEADs point at the right commits.
3. Include the parent-repo commit that bumps the gitlinks (so CI and clones get consistent trees).

**Do not** leave submodule directories dirty or forgotten while presenting the parent repo as “done”: either commit the submodule pointer update in led-controller, or leave an explicit note if the user must push the submodule remotes first.

If the user only asked for a submodule bump with no code edits, a single `chore(submodules): bump led-driver and led-tool` style commit is appropriate (see the commit rule).
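
The three steps above can be sketched as commands, assuming the submodule branches are already checked out and the push target is each submodule's `origin` (a simplification; adjust remotes/branches to the actual setup):

```shell
# 1. Publish any local submodule commits.
(cd led-driver && git push origin HEAD)
(cd led-tool && git push origin HEAD)

# 2. From the led-controller root: stage the new gitlinks.
git add led-driver led-tool

# 3. Commit the pointer bump so CI and clones get consistent trees.
git commit -m "chore(submodules): bump led-driver and led-tool"
```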
16
.gitignore
vendored
@@ -1,5 +1,7 @@
# Python
__pycache__/
# led-driver/src is MicroPython source — never keep host __pycache__ there (see .cursor/rules/led-driver.mdc)
led-driver/src/__pycache__/
*.py[cod]
*$py.class
*.so
@@ -23,8 +25,22 @@ ENV/
Thumbs.db

# Project specific
scripts/.led-controller-venv
docs/.help-print.html
settings.json
# Track shared JSON + preset binaries; ignore other db/*.json (e.g. device, zone) locally
db/*
!db/group.json
!db/palette.json
!db/pattern.json
!db/preset.json
!db/profile.json
!db/scene.json
!db/sequence.json
!db/presets/
!db/presets/*.bin
*.log
*.db
*.sqlite
.pytest_cache/
.ropeproject/
3
.gitmodules
vendored
@@ -4,3 +4,6 @@
[submodule "led-tool"]
	path = led-tool
	url = git@git.technical.kiwi:technicalkiwi/led-tool.git
[submodule "led-simulator"]
	path = led-simulator
	url = git@git.technical.kiwi:technicalkiwi/led-simulator.git
16
Pipfile
@@ -13,17 +13,21 @@ requests = "*"
selenium = "*"
adafruit-ampy = "*"
microdot = "*"
websockets = "*"
numpy = "*"
sounddevice = "*"

[dev-packages]
pytest = "*"

[requires]
python_version = "3.12"
python_version = "3.11"

[scripts]
web = "python /home/pi/led-controller/tests/web.py"
watch = "python -m watchfiles 'python tests/web.py' src tests"
install = "pipenv install"
web = "python tests/web.py"
watch = "python -m watchfiles \"python tests/web.py\" src tests"
run = "sh -c 'cd src && python main.py'"
dev = "watchfiles \"sh -c 'cd src && python main.py'\" src"
help-pdf = "sh scripts/build_help_pdf.sh"
dev = "python -m watchfiles \"sh -c 'cd src && LED_CONTROLLER_LIVE_RELOAD=1 python main.py'\" src"
test = "python -m pytest"
test-browser = "sh -c 'python tests/web.py > /tmp/led-controller-web.log 2>&1 & pid=$!; trap \"kill $pid\" EXIT; sleep 2; LED_CONTROLLER_RUN_BROWSER_TESTS=1 LED_CONTROLLER_DEVICE_IP=http://127.0.0.1:5000 python -m pytest tests/test_browser.py'"
test-browser-device = "sh -c 'LED_CONTROLLER_RUN_BROWSER_TESTS=1 python -m pytest tests/test_browser.py'"
846
Pipfile.lock
generated
File diff suppressed because it is too large
24
README.md
@@ -1,27 +1,30 @@
# led-controller

LED controller web app for managing profiles, tabs, presets, and colour palettes, and sending commands to LED devices over the serial -> ESP-NOW bridge.
LED controller web app for managing profiles, **zones**, presets, and colour palettes, and sending commands to LED devices. Outbound paths include:

- **Serial → ESP-NOW bridge**: JSON lines over UART to an ESP32 that forwards ESP-NOW frames (configure `serial_port` and baud in `settings.json` / Settings model).
- **Wi-Fi LED drivers**: TCP JSON lines (default port **8765** on the Pi; drivers discover the controller via **UDP 8766** broadcast).
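
The Wi-Fi path's "TCP JSON lines" framing can be sketched as follows. This is a hypothetical illustration: the port number comes from the bullet above, but the message fields are invented, not the real command schema.

```python
# Hypothetical sketch of the "TCP JSON lines" transport described above.
import json
import socket


def send_json_line(host, port, message):
    """Send one newline-terminated JSON object, as a JSON-lines peer would."""
    payload = json.dumps(message).encode("utf-8") + b"\n"
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(payload)


# Example with assumed fields (not the real schema):
# send_json_line("192.168.1.50", 8765, {"pattern": "rainbow", "brightness": 100})
```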
## Run

- One-time setup for port 80 without root: `sudo scripts/setup-port80.sh`
- Start app: `pipenv run run`
- Start app: `pipenv run run` (override listen port with the **`PORT`** environment variable)
- Dev watcher (auto-restart on `src/` changes): `pipenv run dev`
- Regenerate **`docs/help.pdf`** from **`docs/help.md`**: `pipenv run help-pdf` (requires **pandoc** and **chromium** on the host)

## UI modes

- **Run mode**: focused control view. Select tabs/presets and apply profiles. Editing actions are hidden.
- **Edit mode**: management view. Shows Tabs, Presets, Patterns, Colour Palette, and Send Presets controls, plus per-tile preset edit/remove and drag-reorder.
- **Run mode**: focused control view. Select zones/presets and apply profiles. Editing actions are hidden.
- **Edit mode**: management view. Shows **Zones**, Presets, Patterns, Colour Palette, and Send Presets controls, plus per-tile preset edit/remove and drag-reorder.

## Profiles

- Applying a profile updates session scope and refreshes the active tab content.
- In **Run mode**, Profiles supports apply-only behavior (no create/clone/delete).
- Applying a profile updates session scope and refreshes the active zone content.
- In **Run mode**, Profiles supports apply-only behaviour (no create/clone/delete).
- In **Edit mode**, Profiles supports create/clone/delete.
- Creating a profile always creates a populated `default` tab (starter presets).
- Optional **DJ tab** seeding creates:
  - `dj` tab bound to device name `dj`
- Creating a profile always creates a populated `default` zone (starter presets).
- Optional **DJ zone** seeding creates:
  - `dj` zone bound to device name `dj`
  - starter DJ presets (rainbow, single colour, transition)

## Preset colours and palette linking

@@ -35,3 +38,6 @@ LED controller web app for managing profiles, tabs, presets, and colour palettes

- Main API reference: `docs/API.md`

## Driver pattern modules

Pattern **`.py`** sources live under **`led-driver/src/patterns`**. The Pi app resolves that path via `util.driver_patterns.driver_patterns_dir()`. If you deploy without that tree next to the app, set **`LED_CONTROLLER_PATTERNS_DIR`** to the directory that contains those files.
@@ -1 +0,0 @@
{}
@@ -1 +1 @@
{"1": {"name": "Main Group", "devices": ["1", "2", "3"]}, "2": {"name": "Accent Group", "devices": ["4", "5"]}}
{"1": {"name": "group1", "devices": ["e8f60a16fb00", "e8f60a170794"], "wifi_driver_display_name": "desk", "wifi_driver_num_leds": 59, "wifi_color_order": "rgb", "wifi_startup_mode": "default", "pattern": "on", "colors": ["000000", "FF0000"], "brightness": 100, "delay": 100, "step_offset": 0, "step_increment": 1, "n1": 0, "n2": 0, "n3": 0, "n4": 0, "n5": 0, "n6": 0, "n7": 0, "n8": 0, "output_brightness": 255}, "2": {"name": "group2", "devices": ["188b0e1560a8"], "wifi_driver_display_name": null, "wifi_driver_num_leds": null, "wifi_color_order": "rgb", "wifi_startup_mode": "default", "output_brightness": 255, "pattern": "on", "colors": ["000000", "FF0000"], "brightness": 100, "delay": 100, "step_offset": 0, "step_increment": 1, "n1": 0, "n2": 0, "n3": 0, "n4": 0, "n5": 0, "n6": 0, "n7": 0, "n8": 0}}
343
db/pattern.json
@@ -1,54 +1,291 @@
{
  "on": {"min_delay": 10, "max_delay": 10000, "max_colors": 1},
  "off": {"min_delay": 10, "max_delay": 10000, "max_colors": 0},
  "rainbow": {"n1": "Step Rate", "min_delay": 10, "max_delay": 10000, "max_colors": 0},
  "transition": {"min_delay": 10, "max_delay": 10000, "max_colors": 10},
  "chase": {"n1": "Colour 1 Length", "n2": "Colour 2 Length", "n3": "Step 1", "n4": "Step 2", "min_delay": 10, "max_delay": 10000, "max_colors": 2},
  "pulse": {"n1": "Attack", "n2": "Hold", "n3": "Decay", "min_delay": 10, "max_delay": 10000, "max_colors": 10},
  "circle": {"n1": "Head Rate", "n2": "Max Length", "n3": "Tail Rate", "n4": "Min Length", "min_delay": 10, "max_delay": 10000, "max_colors": 2},
  "blink": {"min_delay": 10, "max_delay": 10000, "max_colors": 10}
}
{
  "on": {"min_delay": 10, "max_delay": 10000, "max_colors": 1, "supports_manual": true},
  "off": {"min_delay": 10, "max_delay": 10000, "max_colors": 0, "supports_manual": true},
  "rainbow": {"n1": "Step Rate", "min_delay": 10, "max_delay": 10000, "max_colors": 0, "supports_manual": true},
  "colour_cycle": {"n1": "Step Rate", "min_delay": 10, "max_delay": 10000, "max_colors": 10, "supports_manual": true},
  "transition": {"min_delay": 10, "max_delay": 10000, "max_colors": 10, "supports_manual": false},
  "chase": {"n1": "Colour 1 Length", "n2": "Colour 2 Length", "n3": "Step 1", "n4": "Step 2", "min_delay": 10, "max_delay": 10000, "max_colors": 2, "has_background": true, "supports_manual": true},
  "pulse": {"n1": "Attack", "n2": "Hold", "n3": "Decay", "min_delay": 10, "max_delay": 10000, "max_colors": 10, "has_background": true, "supports_manual": true},
  "circle": {"n1": "Head Rate", "n2": "Max Length", "n3": "Tail Rate", "n4": "Min Length", "min_delay": 10, "max_delay": 10000, "max_colors": 2, "has_background": true, "supports_manual": false},
  "blink": {"min_delay": 10, "max_delay": 10000, "max_colors": 10, "has_background": true, "supports_manual": false},
  "flicker": {"n1": "Min brightness", "min_delay": 10, "max_delay": 10000, "max_colors": 10, "supports_manual": true},
  "flame": {"n1": "Min brightness", "n2": "Breath period (ms)", "n3": "Spark gap min (ms, 0=default 10–30 s, -1=off)", "n4": "Spark gap max (ms)", "min_delay": 10, "max_delay": 10000, "max_colors": 10, "supports_manual": false},
  "twinkle": {"n1": "Twinkle activity (1–255, higher = more changes)", "n2": "Density (0–255, higher = more of the strip lit)", "n3": "Min adjacent LEDs per twinkle (same as max for fixed length)", "n4": "Max adjacent LEDs per twinkle (same as min for fixed length)", "min_delay": 10, "max_delay": 10000, "max_colors": 10, "has_background": true, "supports_manual": true},
  "radiate": {"n1": "Node spacing (LEDs)", "n2": "Out time (ms)", "n3": "In time (ms)", "min_delay": 10, "max_delay": 10000, "max_colors": 10, "has_background": true, "supports_manual": true},
  "meteor_rain": {"n1": "Tail length", "n2": "Speed (LEDs per frame)", "n3": "Fade amount (1-255)", "min_delay": 10, "max_delay": 10000, "max_colors": 10, "supports_manual": true},
  "scanner": {"n1": "Eye width", "n2": "End pause (frames)", "min_delay": 10, "max_delay": 10000, "max_colors": 10, "has_background": true, "supports_manual": true},
  "gradient_scroll": {"n1": "Scroll step rate", "min_delay": 10, "max_delay": 10000, "max_colors": 10, "supports_manual": true},
  "comet_dual": {"n1": "Tail length", "n2": "Speed", "n3": "Gap", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "has_background": true, "supports_manual": true},
  "sparkle_trail": {"n1": "Spark density", "n2": "Decay", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "supports_manual": true},
  "wave": {"n1": "Wavelength", "n2": "Amplitude", "n3": "Drift speed", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "supports_manual": false},
  "plasma": {"n1": "Scale", "n2": "Speed", "n3": "Contrast", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "supports_manual": false},
  "segment_chase": {"n1": "Segment size", "n2": "Phase step", "n3": "Segment phase offset", "n4": "Gap per segment", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "has_background": true, "supports_manual": true},
  "bar_graph": {"n1": "Level percent", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "has_background": true, "supports_manual": false},
  "breathing_dual": {"n1": "Phase offset", "n2": "Ease", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "supports_manual": false},
  "strobe_burst": {"n1": "Burst count", "n2": "Burst gap", "n3": "Cooldown", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "has_background": true, "supports_manual": true},
  "rain_drops": {"n1": "Drop rate", "n2": "Ripple width", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "has_background": true, "supports_manual": true},
  "fireflies": {"n1": "Count", "n2": "Twinkle speed", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "has_background": true, "supports_manual": true},
  "clock_sweep": {"n1": "Hand width", "n2": "Marker interval", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "has_background": true, "supports_manual": true},
  "marquee": {"n1": "On length", "n2": "Off length", "n3": "Step", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "has_background": true, "supports_manual": true},
  "aurora": {"n1": "Band count", "n2": "Shimmer", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "supports_manual": false},
  "snowfall": {"n1": "Flake density", "n2": "Fall speed", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "has_background": true, "supports_manual": true},
  "heartbeat": {"n1": "Pulse 1 ms", "n2": "Pulse 2 ms", "n3": "Pause ms", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "has_background": true, "supports_manual": true},
  "orbit": {"n1": "Orbit count", "n2": "Base speed", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "has_background": true, "supports_manual": true},
  "palette_morph": {"n1": "Morph ms", "n2": "Warp rate", "n3": "Turbulence", "max_colors": 10, "min_delay": 10, "max_delay": 10000, "supports_manual": false}
}
File diff suppressed because one or more lines are too long
BIN
db/presets/1.bin
Normal file
BIN
db/presets/1.bin
Normal file
Binary file not shown.
3
db/presets/10.bin
Normal file
3
db/presets/10.bin
Normal file
@@ -0,0 +1,3 @@
|
||||
PRST1xœ%ÎÁ
|
||||
Â0Ð_‘ñšCSµJîæ'D$¶«
|
||||
ÄÝ’¦ˆˆÿntOovæ²opxz‘´zޱ¦P
|
||||
2
db/presets/11.bin
Normal file
2
db/presets/11.bin
Normal file
@@ -0,0 +1,2 @@
|
||||
PRST1xњ%ОAВ …б»<·,J5\Е4
|
||||
К $84SX4Ж»‹eхеНlюШЅ B
|
||||
1
db/presets/12.bin
Normal file
1
db/presets/12.bin
Normal file
@@ -0,0 +1 @@
|
||||
PRST1xœ%ÎA л|·, ŠÐK˜ÆP;*
|
||||
2
db/presets/13.bin
Normal file
2
db/presets/13.bin
Normal file
@@ -0,0 +1,2 @@
|
||||
PRST1xœEÎÁ
|
||||
Â0Ð_‘9ç`«Qɯˆ”Ô®ˆ»e“RDüwsðô˜™Ë¼ÁñIИx”uS²¬p˜c¤ü¬»J-ç‹Ã¨éþ¨LÅrï½ÃD9¾:¿uˆK„ª9pg¥Ñ#ØÂ»Æ¾á‡Æ±qú1«ÜR¦!Mö¡Ãç<0B><>1
|
||||
2
db/presets/14.bin
Normal file
2
db/presets/14.bin
Normal file
@@ -0,0 +1,2 @@
|
||||
PRST1xœ=ÎÝ
|
||||
!†á[‰¯StK[¼€½‰ˆ°v*ÁTü!"º÷Ü¤Žžá<C5BE>9˜¼¹4bu™VÙ…¢)…’ÿåVÎÁ…”¡÷XO“RœãÀpJöz+žr[R2ÌäÌzäœÁÔ KªÄàE;àKõ´èÓæß¶Ð²£:»Îø%¦p±ŽŽvn? ¼?<3F>¨2ú
|
||||
BIN
db/presets/15.bin
Normal file
BIN
db/presets/15.bin
Normal file
Binary file not shown.
BIN
db/presets/2.bin
Normal file
BIN
db/presets/2.bin
Normal file
Binary file not shown.
2
db/presets/3.bin
Normal file
2
db/presets/3.bin
Normal file
@@ -0,0 +1,2 @@
|
||||
PRST1xœUÎÁ
|
||||
Â0ЙsM5Uò+"²µ«â¦lSDÄwiNž³3‡ý@èɈPJ2–fª•Uþn×’‹.ˆ§³Ã¨éþ¨Â‹å>‡‰3½}×9ÐZbÕ•ÄÛÀè‘]cß<08>¡qh7f-·”ù’&ûÁãûF9/.
|
||||
2
db/presets/30.bin
Normal file
2
db/presets/30.bin
Normal file
@@ -0,0 +1,2 @@
|
||||
PRST1xœEÎÁ
|
||||
Â0Ð_‘9ç`«Qɯˆ”Ô®ˆ»e“RDüwsðô˜™Ë¼ÁñIИx”uS²¬p˜c¤ü¬»J-ç‹Ã¨éþ¨LÅrï½ÃD9¾:¿uˆK„ª9pg¥Ñ#ØÂ»Æ¾á‡Æ±qú1«ÜR¦!Mö¡Çç<0B>“1
|
||||
BIN
db/presets/31.bin
Normal file
BIN
db/presets/31.bin
Normal file
Binary file not shown.
2
db/presets/32.bin
Normal file
2
db/presets/32.bin
Normal file
@@ -0,0 +1,2 @@
|
||||
PRST1xœ%ͽÂ0àW©Ž5C~•&VÆ
|
||||
¡@<40>)uª4K…xwR<}ç»Á° —ks<DjÎ)¦…É•B™ë–¸ž¯µža;l¼×Ú{Üž9ïÂ4×ÁÐStl«kævÅ[a'ì…ƒpN¦œ|ˆô}ýmðý‡-‰
|
||||
1
db/presets/33.bin
Normal file
1
db/presets/33.bin
Normal file
@@ -0,0 +1 @@
|
||||
PRST1xœMÎ1!†á¿b¾[=5ÌNÎnÆô@I°\€Åÿ»Å.²<oÚ¼Aîéa±?,ŽÅQ<C385>-f‚ÂìZó…xÓþÇ·œr©°'!h~<´î-Õg…k‰÷G#_ùØ0ùä^Ü#7-a;FX ka6ÂVØý˜K1ùKœø_Ÿ/ÐM4y
|
||||
BIN
db/presets/34.bin
Normal file
BIN
db/presets/34.bin
Normal file
Binary file not shown.
2
db/presets/35.bin
Normal file
2
db/presets/35.bin
Normal file
@@ -0,0 +1,2 @@
|
||||
PRST1xœ%ͽÂ0àW©Ž5C~•&VÆ
|
||||
¡@<40>)uª4K…xwR<}ç»Á° —ks<DjÎ)¦…É•B™ë–¸ž¯µža;l¼×Ú{Üž9ïÂ4×ÁÐStl«kævÅ[a'ì…ƒpN¦œ|ˆô}ýmðý‡-‰
|
||||
1
db/presets/36.bin
Normal file
1
db/presets/36.bin
Normal file
@@ -0,0 +1 @@
|
||||
PRST1xœMÎ1!†á¿b¾[=5ÌNÎnÆô@I°\€Åÿ»Å.²<oÚ¼Aîéa±?,ŽÅQ<C385>-f‚ÂìZó…xÓþÇ·œr©°'!h~<´î-Õg…k‰÷G#_ùØ0ùä^Ü#7-a;FX ka6ÂVØý˜K1ùKœø_Ÿ/ÐM4y
|
||||
BIN
db/presets/37.bin
Normal file
BIN
db/presets/37.bin
Normal file
Binary file not shown.
BIN
db/presets/38.bin
Normal file
BIN
db/presets/38.bin
Normal file
Binary file not shown.
3
db/presets/39.bin
Normal file
3
db/presets/39.bin
Normal file
@@ -0,0 +1,3 @@
|
||||
PRST1xœUÎÁ‚0„áw¯=¤jú*†<>
|
||||
[m\[²”ƒ1¾»…ž<}ÉÌåÿ ºÁÂsŸ$P˜]Î$ño'Y`¯88ÒÚ{ô
|
||||
7 ÷GŽ´”£5Fa"voX£Üšl–•bÛè2ÆvãXé*¦rªœ+—<>Y’LC˜JM³·1•ºAÈo5qeî¿?ªð9±
|
||||
BIN
db/presets/4.bin
Normal file
BIN
db/presets/4.bin
Normal file
Binary file not shown.
4
db/presets/40.bin
Normal file
4
db/presets/40.bin
Normal file
@@ -0,0 +1,4 @@
|
||||
PRST1xśMÎÁ‚0„áwŻ=$ű*†<>
|
||||
[%Y[RÚ1ľ»…^<}ÉĚĺ˙Ŕ™7<E284A2>`ĺPa51rpËäŇ
|
||||
tÇĹÚ©×<1A>Â#,ĎWtĽĺŁŞ{…™Ĺě V+<2B>=(†Ä
|
||||
®5m¶ŐťÎŻk@×B[č
|
||||
2
db/presets/41.bin
Normal file
2
db/presets/41.bin
Normal file
@@ -0,0 +1,2 @@
|
||||
PRST1xśmŹÁ‚0†ßĄ\wČ`ŮMQ^Â2ĄčâÜČ1ĆřînĚ‹‰—~í—?Mű#ďüC™›F 0IďŃ™w¶ÚşÄ˛š7Ľm<C4BD>ËĺMęveýuUąo<v[şć:'§.Wop
|
||||
ƨĺDN)ąx» <09><H¤)B2r"˘Śá@–Ć*ˇNŕ+&gGĄ±WC8<_ßĐéŽńpłhMţ”îýŹ!I°
|
||||
2
db/presets/42.bin
Normal file
2
db/presets/42.bin
Normal file
@@ -0,0 +1,2 @@
|
||||
PRST1xњUЋ;В0птТєp>°WAQґђ5X2Nд8BЬ;©hv¤·SМЃ_BдЙq(,њ’Др·Эg?ЗtEЕЅЦЦжТZіf
|
||||
·иПdНJcЊВ$ћЯ “ЮТJq…PѓЪј…t)ПР‚є]ЁАињњw,q¶ОЛи¦\Wп^rнЕ–є°yЇКѕ?Эh>Ў
|
||||
BIN
db/presets/43.bin
Normal file
BIN
db/presets/43.bin
Normal file
Binary file not shown.
2
db/presets/44.bin
Normal file
2
db/presets/44.bin
Normal file
@@ -0,0 +1,2 @@
|
||||
PRST1xœEÎM
|
||||
Â0à«Ès›Eÿ¢’ôE$¶£â¤$Ó…ˆww0góÁ{o1o°„ŠìÊì™)Ã`õ"”Y‹6§˜r<CB9C>›°ÇFgƒk÷‡0-:k
|
||||
3
db/presets/45.bin
Normal file
3
db/presets/45.bin
Normal file
@@ -0,0 +1,3 @@
|
||||
PRST1xœ=ŽA‚0E¯B>Û.
|
||||
*š€KC*ŒØ¤¶¤Æxw<1B>Í{™7‹y!ØÁ€)s5';9
|
||||
\å1Eï¡°XfJA~mø·1ú˜2ÌußkÙÕZo^ls\®ÉÍw”å¸mµÂDÞ>a:Q»r„á´’Bh¤Z)aW°/8tÇ‚ÓKŠ7çip“üÙàý)<¡
|
||||
3
db/presets/46.bin
Normal file
3
db/presets/46.bin
Normal file
@@ -0,0 +1,3 @@
|
||||
PRST1xś-ÎÁ‚0Đ_!õ‡Šdo˝ô'Ś!Ş’”–”ĺ`Ś˙î<˝ÍĚö<>čfű•‹!Íž‹qs
|
||||
‹cö9J·Çý?RHy]QZkŚÖ’•Zc-n
|
||||
÷<=_ý*“Zk…Ń÷µrşŤ<13>óćbę„T
|
||||
2
db/presets/47.bin
Normal file
2
db/presets/47.bin
Normal file
@@ -0,0 +1,2 @@
|
||||
PRST1x<EFBFBD>5־A‚0…ב«<D791>ַ¶@Dׂ- —0ֶT©<54>X[2ֶxwG׳ש&»˜‚yXh°M\₪<>׀<EFBFBD><D780>‚ֹ8…<>0[
|
||||
’ור/חט#%ט=ֺ¾†q”·r\…¹כ<C2B9>ƒMע¥©*…ֹzף„מd5Gh¦ֵ*„Zz+6b-1l ¿´™m¦ֻל2ֺLסגה"7ֹy5<79>־ד:G
|
||||
2
db/presets/48.bin
Normal file
2
db/presets/48.bin
Normal file
@@ -0,0 +1,2 @@
|
||||
PRST1xœ-ÎÁ Ð_1ã•ÔZŽúÆ´«’ 4°Õã¿»Š§7;sÙ¢»,˜
|
||||
/îNP˜3å(í¿8¥<38>r<EFBFBD>Ýa©õ¶ìŽÙ_®©ÈÐh0RpOØN¢›9ÁržI!XÓˆ<C393>ØËW„ö{+]eSéL9<4C>} ƒåƒ÷ªù0¿
|
||||
2
db/presets/49.bin
Normal file
2
db/presets/49.bin
Normal file
@@ -0,0 +1,2 @@
|
||||
PRST1x<EFBFBD>=ЮA
|
||||
Т0аЋШw<D0A8>EZ5JаK<14>б<EFBFBD>ZH<5A><48>L"онС<D0BD>Ћ7ќџѓFЄ<46>с!\e<>е<>`<60>I<EFBFBD>KдќнRHЅТ<D085>и<0E>ЕЮсlp-ѓу)<29>ЋНЕzС;=i<>/ee<65>иiІє:Sv<53>=МютЁсЧЦщG.щ>ОЬ<D09E>Овсѓ,<2C>1И
|
||||
BIN
db/presets/5.bin
Normal file
BIN
db/presets/5.bin
Normal file
Binary file not shown.
2
db/presets/50.bin
Normal file
2
db/presets/50.bin
Normal file
@@ -0,0 +1,2 @@
|
||||
PRST1xœ5ÎA‚0Ы<C390>϶‹‚ˆ¦è%Œ!F <20>–´ÃÂïîhu6o2ÿ/æ ïV‚Sâ"Ѹ’碟\"(lŽ™¢—ø—tÿ¤Kˆ æ‚ÒZ-#·ò£µ¸*Üâ<Nì)I¥ÖZa Å=`ZYÝΆãN
|
||||
¾‚i„¦0RðMæ˜i3§ÌùËÃ}^¨›ùÂë
|
||||
BIN
db/presets/51.bin
Normal file
BIN
db/presets/51.bin
Normal file
Binary file not shown.
BIN
db/presets/52.bin
Normal file
BIN
db/presets/52.bin
Normal file
Binary file not shown.
2
db/presets/53.bin
Normal file
2
db/presets/53.bin
Normal file
@@ -0,0 +1,2 @@
|
||||
PRST1xœ5Î=Â0†á«Tk†þQ<C3BE>À%*T%Ô@¥’TŽ; ÄÝIáå±ôzðÞ¾å¨ET Ž·JT,V•ŧšÃð·0‰ ‡Ë>¸8™OõS¨ËÒ`äÙ¾A]Zíª¤²²<C2B2>¯@M¢ÎÉ7 v;÷-hã˜é2§Ìyg‘pŸf¦1ýTáû^
|
||||
7˜
|
||||
3
db/presets/54.bin
Normal file
3
db/presets/54.bin
Normal file
@@ -0,0 +1,3 @@
|
||||
PRST1x<EFBFBD>5ΞΝ
|
||||
Β0ΰW)γ5‡ώhΉϊ"%ΪU5)›νAΔww5xϊ–™9μΑ=BI
|
||||
v>Η%Α`q"ΔA»o<ώγK<CEB3>#'Ψ#6‡²ο†'ƒ3ϋΫ]%-κ²4<C2B2>hvOΨVO·J„^Ι T°MΦ<C2AD><CEA6>ΐκ"l3»L›ΩgΊΗ«<CE97>iτ“ώSαύ<01><>5%
|
||||
4
db/presets/55.bin
Normal file
4
db/presets/55.bin
Normal file
@@ -0,0 +1,4 @@
|
||||
PRST1xœMαÂ0Ð_A×5CZ ´™Q~!¨‘BR%î€ÿŽE¦gÝÝà7¢{˜
|
||||
ofŸiž
|
||||
ÇL9JõŸÞRH¹ÀœÐX{Ô½–¬µµ£ÆYášýýÁ‘ŠL:&
|
||||
îÓËéVN0œWRˆdB3[Ä]e_é+‡ÊðcÉiö<69>.~’¿Z|¾¡ 61
|
||||
1
db/presets/56.bin
Normal file
1
db/presets/56.bin
Normal file
@@ -0,0 +1 @@
|
||||
PRST1xœ5ŽAƒ E¯b¾[¨U+WiŒ¡2¶¦`š¦éÝ’nxÌ›Y¼Œ|ùPÌÚÎ<C39A>¿ˆ60l2r&.?ýýlµuâ‚Rõ|àCt%Wuß5®n½Ýƒ!OjÎiùN¹ÜN¦‚¨¢35DÑ@¤é”Ñft}ÆùÀæì²jšVÓª#TSL<53>-)ËìZ³ôŒßQ•AÓ
|
||||
1
db/presets/57.bin
Normal file
1
db/presets/57.bin
Normal file
@@ -0,0 +1 @@
|
||||
PRST1xњEО1В0Р« ПљЎiЎ ЂK „5)MЪФвоXНЂ—gщяБD72В‹lF—зВѓЙ‰pЋьoчR^@glOлаbpЛющ’И‹mУЬФлкЉ$ђдВС‚:ҐХљТЃ¬Іi/о+}еP9®L9=|а«ф‹пжg2д
|
||||
2
db/presets/58.bin
Normal file
2
db/presets/58.bin
Normal file
@@ -0,0 +1,2 @@
|
||||
PRST1xœ=ÎÍ
|
||||
Â0àW‘é5‡ô?ìM"} ‰vÕBMJ’D|wSž¾afû†5O!rˆ;³zç
|
||||
3
db/presets/59.bin
Normal file
Binary file not shown.
2
db/presets/6.bin
Normal file
Binary file not shown.
4
db/presets/60.bin
Normal file
Binary file not shown.
BIN
db/presets/61.bin
Normal file
Binary file not shown.
3
db/presets/62.bin
Normal file
Binary file not shown.
3
db/presets/7.bin
Normal file
Binary file not shown.
BIN
db/presets/8.bin
Normal file
Binary file not shown.
2
db/presets/9.bin
Normal file
Binary file not shown.
@@ -1 +1 @@
{"1": {"name": "default", "type": "tabs", "tabs": ["1", "8"], "scenes": [], "palette_id": "1"}, "2": {"name": "test", "type": "tabs", "tabs": ["6", "7"], "scenes": [], "palette_id": "12"}}
{"1": {"name": "default", "type": "zones", "zones": ["1", "8"], "scenes": [], "palette_id": "1"}, "2": {"name": "test", "type": "zones", "zones": ["6", "7"], "scenes": [], "palette_id": "12"}}
@@ -1 +1 @@
{"1": {"group_name": "Main Group", "presets": ["1", "2"], "sequence_duration": 3000, "sequence_transition": 500, "sequence_loop": true, "sequence_repeat_count": 0, "sequence_active": false, "sequence_index": 0, "sequence_start_time": 0}, "2": {"group_name": "Accent Group", "presets": ["2", "3"], "sequence_duration": 2000, "sequence_transition": 300, "sequence_loop": true, "sequence_repeat_count": 0, "sequence_active": false, "sequence_index": 0, "sequence_start_time": 0}}
{"1": {"group_name": "Main Group", "presets": ["1", "2"], "sequence_duration": 3000, "sequence_transition": 500, "sequence_loop": true, "sequence_repeat_count": 0, "sequence_active": false, "sequence_index": 0, "sequence_start_time": 0, "steps": [], "step_duration_ms": 3000, "loop": true, "name": "Main Group", "profile_id": "1", "lanes": [[{"preset_id": "42", "beats": 6}, {"preset_id": "5", "beats": 2}], [{"preset_id": "6", "beats": 1}]], "group_ids": ["1"], "advance_mode": "beats", "lanes_group_ids": [["1"], ["2"]]}, "2": {"group_name": "Accent Group", "presets": ["2", "3"], "sequence_duration": 2000, "sequence_transition": 300, "sequence_loop": true, "sequence_repeat_count": 0, "sequence_active": false, "sequence_index": 0, "sequence_start_time": 0, "steps": [{"preset_id": "2", "group_ids": [], "beats": 1}, {"preset_id": "3", "group_ids": [], "beats": 1}], "step_duration_ms": 2000, "loop": true, "name": "Accent Group", "profile_id": "1", "lanes": [[{"preset_id": "2", "group_ids": [], "beats": 1}, {"preset_id": "3", "group_ids": [], "beats": 1}]], "group_ids": [], "advance_mode": "time", "lanes_group_ids": [[]]}}
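The new group record above pairs each lane step's `preset_id` with a `beats` count and adds an `advance_mode`. A minimal sketch of how a beats-based advance could walk one lane; the helper names and the wrap-around behaviour are illustrative, not taken from the repository:

```python
def lane_total_beats(lane):
    """Sum the beats across all steps of one lane."""
    return sum(step.get("beats", 1) for step in lane)

def step_at_beat(lane, beat):
    """Return the step active at a given beat, wrapping around the lane."""
    pos = beat % lane_total_beats(lane)
    for step in lane:
        pos -= step.get("beats", 1)
        if pos < 0:
            return step

# First lane from the "Main Group" record above.
lane = [{"preset_id": "42", "beats": 6}, {"preset_id": "5", "beats": 2}]
```

With this lane, beats 0 through 5 select preset `42`, beats 6 and 7 select preset `5`, and beat 8 wraps back to `42`.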
@@ -1 +0,0 @@
{"1": {"name": "default", "names": ["1", "2", "3", "4", "5", "6", "7", "8", "0", "a"], "presets": [["4", "2", "7"], ["15", "3", "14"], ["5", "8", "10"], ["11", "9", "12"], ["1", "13", "37"]], "presets_flat": ["4", "2", "7", "15", "3", "14", "5", "8", "10", "11", "9", "12", "1", "13", "37"], "default_preset": "15"}, "2": {"name": "default", "names": ["1", "2", "3", "4", "5", "6", "7", "8", "0", "a"], "presets": [["16", "17", "18"], ["19", "20", "21"], ["22", "23", "24"], ["25", "26", "27"], ["28", "29", "30"]], "presets_flat": ["16", "17", "18", "19", "20", "21", "22", "23", "24", "25", "26", "27", "28", "29", "30"]}, "3": {"name": "default", "names": ["1"], "presets": [], "default_preset": null}, "4": {"name": "default", "names": ["1"], "presets": [], "default_preset": null}, "5": {"name": "dj", "names": ["dj"], "presets": [["31", "32", "33"]], "default_preset": "31", "presets_flat": ["31", "32", "33"]}, "6": {"name": "default", "names": ["1"], "presets": [], "default_preset": null}, "7": {"name": "dj", "names": ["dj"], "presets": [["34", "35", "36"]], "default_preset": "34", "presets_flat": ["34", "35", "36"]}, "8": {"name": "test", "names": ["11"], "presets": [["1", "2", "3"], ["4", "5"]], "default_preset": "1", "presets_flat": ["1", "2", "3", "4", "5"]}}
53
dev.py
@@ -1,53 +0,0 @@
#!/usr/bin/env python3

import subprocess
import serial
import sys

print(sys.argv)

# Extract port (first arg if it's not a command)
commands = ["src", "lib", "ls", "reset", "follow", "db"]
port = None
if len(sys.argv) > 1 and sys.argv[1] not in commands:
    port = sys.argv[1]


for cmd in sys.argv[1:]:
    print(cmd)
    match cmd:
        case "src":
            if port:
                subprocess.call(["mpremote", "connect", port, "fs", "cp", "-r", ".", ":"], cwd="src")
            else:
                print("Error: Port required for 'src' command")
        case "lib":
            if port:
                subprocess.call(["mpremote", "connect", port, "fs", "cp", "-r", "lib", ":"])
            else:
                print("Error: Port required for 'lib' command")
        case "ls":
            if port:
                subprocess.call(["mpremote", "connect", port, "fs", "ls", ":"])
            else:
                print("Error: Port required for 'ls' command")
        case "reset":
            if port:
                with serial.Serial(port, baudrate=115200) as ser:
                    ser.write(b'\x03\x03\x04')
            else:
                print("Error: Port required for 'reset' command")
        case "follow":
            if port:
                with serial.Serial(port, baudrate=115200) as ser:
                    while True:
                        if ser.in_waiting > 0:  # Check if there is data in the buffer
                            data = ser.readline().decode('utf-8').strip()  # Read and decode the data
                            print(data)
            else:
                print("Error: Port required for 'follow' command")
        case "db":
            if port:
                subprocess.call(["mpremote", "connect", port, "fs", "cp", "-r", "db", ":"])
            else:
                print("Error: Port required for 'db' command")
88
docs/API.md
@@ -2,10 +2,12 @@

This document covers:

1. **HTTP and WebSocket** exposed by the Raspberry Pi app (`src/main.py`) — profiles, presets, transport send, and related resources.
2. **LED driver JSON** — the compact message format sent over the serial→ESP-NOW bridge to devices (same logical API as ESP-NOW payloads).
1. **HTTP and WebSocket** exposed by the Raspberry Pi app (`src/main.py`) — profiles, zones, presets, transport send, pattern OTA helpers, and related resources.
2. **LED driver JSON** — the compact **v1** message format. It is sent over the **serial → ESP-NOW bridge** to ESP32 peers and as **single JSON text messages** over the **outbound WebSocket** to **Wi-Fi** drivers (same logical fields).

Default listen address: `0.0.0.0`. Port defaults to **80**; override with the `PORT` environment variable (see `pipenv run run`).
Default HTTP listen address: `0.0.0.0`. Port defaults to **80**; override with the **`PORT`** environment variable (see `pipenv run run`).

**Serial:** UART path and baud come from settings (defaults include `serial_port` such as `/dev/ttyS0` and `serial_baudrate`). **Wi-Fi drivers:** **UDP** on port **8766** is the **discovery** channel: each driver’s JSON hello (**`device_name`**, **MAC**, optional **`type`**) **creates or updates** that device in **`db/device.json`** (keyed by MAC); the Pi echoes the datagram. After a valid hello with **`v`:** **`"1"`**, the Pi also opens an **outbound WebSocket** to that IP (**`wifi_driver_ws_port`**, default **80**; **`wifi_driver_ws_path`**, default **`/ws`**) for v1 commands; presets are not pushed automatically on connect (use **Send Presets** / profile apply). The Pi may send periodic UDP **hello** nudges to known Wi‑Fi device IPs when the WebSocket is down (**`wifi_driver_hello_interval_s`** in settings).

All JSON APIs use `Content-Type: application/json` for bodies and responses unless noted.

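The discovery flow added in this hunk can be sketched as follows. The field names `device_name` and `v` follow the document; the datagram key `mac`, the record field `wants_ws`, and the overall record layout are assumptions for illustration only:

```python
import json

def handle_hello(datagram: bytes, devices: dict) -> dict:
    """Create or update the record for one UDP discovery hello, keyed by MAC."""
    hello = json.loads(datagram)
    mac = hello["mac"].lower()
    record = devices.setdefault(mac, {})
    record["device_name"] = hello["device_name"]
    if "type" in hello:
        record["type"] = hello["type"]
    # A hello with "v": "1" signals that the Pi may open an outbound
    # WebSocket back to the sender (default port 80, path /ws).
    record["wants_ws"] = hello.get("v") == "1"
    return record

devices = {}
handle_hello(b'{"device_name": "strip-1", "mac": "AABBCCDDEEFF", "v": "1"}', devices)
```

In the real app the Pi would also echo the datagram back and persist the registry to `db/device.json`.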
@@ -15,8 +17,8 @@ All JSON APIs use `Content-Type: application/json` for bodies and responses unle

The main UI has two modes controlled by the mode toggle:

- **Run mode**: optimized for operation (tab/preset selection and profile apply).
- **Edit mode**: shows editing/management controls (tabs, presets, patterns, colour palette, send presets, and profile management actions).
- **Run mode**: optimized for operation (zone/preset selection and profile apply).
- **Edit mode**: shows editing/management controls (zones, presets, patterns, colour palette, send presets, profile management actions, **Devices** registry for LED driver names/MACs, and related tools).

Profiles are available in both modes, but behavior differs:

@@ -40,7 +42,7 @@ Profiles are selected with **`POST /profiles/<id>/apply`**, which sets `current_

| Method | Path | Description |
|--------|------|-------------|
| GET | `/` | Main UI (`templates/index.html`) |
| GET | `/settings` | Settings page (`templates/settings.html`) |
| GET | `/settings/page` | Standalone settings page (`templates/settings.html`) |
| GET | `/favicon.ico` | Empty response (204) |
| GET | `/static/<path>` | Static files under `src/static/` |

@@ -50,10 +52,12 @@

Connect to **`ws://<host>:<port>/ws`**.

- Send **JSON**: the object is forwarded to the transport (serial bridge → ESP-NOW) as JSON. Optional key **`to`**: 12-character hex MAC address; if present it is removed from the object and the payload is sent to that peer; otherwise the default destination is used.
- Send **JSON**: the object is forwarded through the **serial sender** (6-byte MAC prefix + payload to the ESP-NOW bridge). Optional key **`to`**: 12-character hex MAC address; if present it is removed from the object and the payload is sent to that peer; otherwise the default destination from settings is used.
- Send **non-JSON text**: forwarded as raw bytes with the default address.
- On send failure, the server may reply with `{"error": "Send failed"}`.

Wi-Fi devices are not targeted by `/ws` directly; use **`POST /presets/send`**, device routes, or **`POST /patterns/<name>/send`** as appropriate.

---

## HTTP API by resource

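The `to` handling and the 6-byte MAC prefix described above can be sketched like this; the function name and the all-`ff` fallback address are illustrative, since the real default comes from settings:

```python
import json

DEFAULT_MAC = "ffffffffffff"  # placeholder; the real default comes from settings

def frame_ws_message(text: str, default_mac: str = DEFAULT_MAC) -> bytes:
    """Build the serial frame: 6-byte destination MAC + payload bytes."""
    try:
        obj = json.loads(text)
    except ValueError:
        obj = None
    if not isinstance(obj, dict):
        # Non-JSON text is forwarded as raw bytes to the default address.
        return bytes.fromhex(default_mac) + text.encode()
    to = obj.pop("to", None)  # optional 12-character hex MAC, removed before send
    mac = to if isinstance(to, str) and len(to) == 12 else default_mac
    return bytes.fromhex(mac) + json.dumps(obj).encode()

frame = frame_ws_message('{"to": "aabbccddeeff", "select": "3"}')
```

Note how the `to` key never reaches the peer: it only selects the destination MAC in the frame header.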
@@ -68,7 +72,30 @@ Below, `<id>` values are string identifiers used by the JSON stores (numeric str

| PUT | `/settings/settings` | Merge keys into settings and save. Returns `{"message": "Settings updated successfully"}`. |
| GET | `/settings/wifi/ap` | Saved Wi‑Fi AP fields: `saved_ssid`, `saved_password`, `saved_channel`, `active` (Pi: `active` is always false). |
| POST | `/settings/wifi/ap` | Body: `ssid` (required), `password`, `channel` (1–11). Persists AP-related settings. |
| GET | `/settings/page` | Serves `templates/settings.html` (same page as `GET /settings` from the root app, for convenience). |
| GET | `/settings/page` | Serves `templates/settings.html`. |

### Devices — `/devices`

Registry in `db/device.json`: storage key **`<id>`** (string, e.g. `"1"`) maps to an object that always includes:

| Field | Description |
|-------|-------------|
| **`id`** | Same as the storage key (stable handle for URLs). |
| **`name`** | Shown in the UI and used in `select` keys. |
| **`type`** | `led` (only value today; extensible). |
| **`transport`** | `espnow` or `wifi`. |
| **`address`** | For **`espnow`**: optional 12-character lowercase hex MAC. For **`wifi`**: optional IP or hostname string. |
| **`default_pattern`**, **`zones`** | Optional. Legacy **`tabs`** may still appear in old files and is migrated away on load. |

Existing records without `type` / `transport` / `id` are backfilled on load (`led`, `espnow`, and `id` = key).

| Method | Path | Description |
|--------|------|-------------|
| GET | `/devices` | Map of device id → device object. |
| GET | `/devices/<id>` | One device, 404 if missing. |
| POST | `/devices` | Create. Body: **`name`** (required), **`type`** (default `led`), **`transport`** (default `espnow`), optional **`address`**, **`default_pattern`**, **`zones`**. Returns `{ "<id>": { ... } }`, 201. |
| PUT | `/devices/<id>` | Partial update. **`name`** cannot be cleared. **`id`** in the body is ignored. **`type`** / **`transport`** validated; **`address`** normalised for the resulting transport. |
| DELETE | `/devices/<id>` | Remove device. |

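The backfill rule stated above (`led`, `espnow`, and `id` = key) is simple enough to sketch directly; the helper name is illustrative:

```python
def backfill_devices(devices: dict) -> dict:
    """Fill fields missing from records written before they existed."""
    for key, record in devices.items():
        record.setdefault("type", "led")
        record.setdefault("transport", "espnow")
        record.setdefault("id", key)  # id mirrors the storage key
    return devices

devices = backfill_devices({"1": {"name": "strip-1"}})
```

Using `setdefault` keeps already-migrated records untouched while upgrading legacy ones in place.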
### Profiles — `/profiles`

@@ -77,9 +104,9 @@ Below, `<id>` values are string identifiers used by the JSON stores (numeric str

| GET | `/profiles` | `{"profiles": {...}, "current_profile_id": "<id>"}`. Ensures a default current profile when possible. |
| GET | `/profiles/current` | `{"id": "...", "profile": {...}}` |
| GET | `/profiles/<id>` | Single profile. If `<id>` is `current`, same as `/profiles/current`. |
| POST | `/profiles` | Create profile. Body may include `name` and other fields. Optional `seed_dj_tab` (request-only) seeds a DJ tab + presets. New profiles always get a populated `default` tab. Returns `{ "<id>": { ... } }` with status 201. |
| POST | `/profiles` | Create profile. Body may include `name` and other fields. Optional `seed_dj_zone` (request-only) seeds a DJ zone + presets. New profiles always get a populated `default` zone. Returns `{ "<id>": { ... } }` with status 201. |
| POST | `/profiles/<id>/apply` | Sets session current profile to `<id>`. |
| POST | `/profiles/<id>/clone` | Clone profile (tabs, palettes, presets). Body may include `name`. |
| POST | `/profiles/<id>/clone` | Clone profile (zones, palettes, presets). Body may include `name`. |
| PUT | `/profiles/current` | Update the current profile (from session). |
| PUT | `/profiles/<id>` | Update profile by id. |
| DELETE | `/profiles/<id>` | Delete profile. |

@@ -120,18 +147,18 @@ Stored preset records can include:

- `colors`: resolved hex colours for editor/display.
- `palette_refs`: optional array of palette indexes parallel to `colors`. If a slot contains an integer index, the colour is linked to the current profile palette at that index.

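Resolving `palette_refs` against the profile palette can be sketched as follows; the helper name and data shapes are assumptions for illustration:

```python
def resolve_colors(colors, palette_refs, palette):
    """Replace linked slots with the current palette entry at that index."""
    out = []
    for i, color in enumerate(colors):
        ref = palette_refs[i] if palette_refs and i < len(palette_refs) else None
        if isinstance(ref, int) and 0 <= ref < len(palette):
            out.append(palette[ref])  # linked: follows the profile palette
        else:
            out.append(color)         # unlinked: keeps its stored colour
    return out

palette = ["#ff0000", "#00ff00", "#0000ff"]
resolved = resolve_colors(["#111111", "#222222"], [None, 2], palette)
```

Editing the palette therefore changes every preset slot whose `palette_refs` entry points at it, while unlinked slots are unaffected.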
### Tabs — `/tabs`
### Zones — `/zones`

| Method | Path | Description |
|--------|------|-------------|
| GET | `/tabs` | `tabs`, `tab_order`, `current_tab_id`, `profile_id` for the session-backed profile. |
| GET | `/tabs/current` | Current tab from cookie/session. |
| POST | `/tabs` | Create tab; optional JSON `name`, `names`, `presets`; can append to current profile’s tab list. |
| GET | `/tabs/<id>` | Tab JSON. |
| PUT | `/tabs/<id>` | Update tab. |
| DELETE | `/tabs/<id>` | Delete tab; can delete `current` to remove the active tab; updates profile tab list. |
| POST | `/tabs/<id>/set-current` | Sets `current_tab` cookie. |
| POST | `/tabs/<id>/clone` | Clone tab into current profile. |
| GET | `/zones` | `zones` (map of zone id → zone object), `zone_order`, `current_zone_id`, `profile_id` for the session-backed profile. |
| GET | `/zones/current` | Current zone from cookie/session. |
| POST | `/zones` | Create zone; optional JSON `name`, `names`, `presets`; can append to current profile’s zone list. |
| GET | `/zones/<id>` | Zone JSON. |
| PUT | `/zones/<id>` | Update zone. |
| DELETE | `/zones/<id>` | Delete zone; can delete `current` to remove the active zone; updates profile zone list. |
| POST | `/zones/<id>/set-current` | Sets `current_zone` cookie. |
| POST | `/zones/<id>/clone` | Clone zone into current profile. |

### Palettes — `/palettes`

@@ -175,20 +202,33 @@

### Patterns — `/patterns`

Pattern metadata lives in **`db/pattern.json`**; driver source files live under **`led-driver/src/patterns/`**. Several routes expose a **runtime map** (metadata merged with on-disk `.py` names so new files appear in menus).

| Method | Path | Description |
|--------|------|-------------|
| GET | `/patterns/definitions` | Contents of `pattern.json` (pattern metadata for the UI). |
| GET | `/patterns` | All pattern records. |
| GET | `/patterns/<id>` | One pattern. |
| GET | `/patterns` | Runtime pattern map (object keyed by pattern id). |
| GET | `/patterns/definitions` | Same runtime map (intended for UI “definitions” clients). |
| GET | `/patterns/ota/manifest` | JSON `{"files":[{"name":"blink.py","url":"http://<Host>/patterns/ota/file/blink.py"},...]}` for OTA pulls. Requires **`Host`** header. |
| GET | `/patterns/ota/file/<name>` | Raw **`.py`** source for one driver pattern (`name` must be a safe filename, e.g. `rainbow.py`). |
| POST | `/patterns/<name>/send` | Push a **manifest** JSON line to **Wi-Fi** devices so they pull one pattern file over HTTP. Body may include **`device_id`** to target one device; otherwise all Wi-Fi devices with an **`address`** are tried. **`<name>`** may be with or without `.py`. |
| POST | `/patterns/upload` | Body JSON: **`name`**, **`code`**, optional **`overwrite`** (default true). Writes **`led-driver/src/patterns/<name>.py`**. |
| POST | `/patterns/driver` | Body JSON: **`name`** (identifier), **`code`**, optional metadata (`min_delay`, `max_delay`, `max_colors`, `n1`…`n8`, **`overwrite`**). Creates/updates both the **`.py`** file and **`db/pattern.json`** via the Pattern model. |
| GET | `/patterns/<id>` | One pattern record from the Pattern model (metadata only). |
| POST | `/patterns` | Create (`name`, optional `data`). |
| PUT | `/patterns/<id>` | Update. |
| DELETE | `/patterns/<id>` | Delete. |

**Devices — pattern OTA push**

| Method | Path | Description |
|--------|------|-------------|
| POST | `/devices/<id>/patterns/push` | Wi-Fi only. Asks the driver at **`address`** to pull pattern files from this server. Optional body **`manifest`**: either a **URL string** pointing at a manifest JSON document, or a **manifest object** (same shape as in driver messages). If omitted, a default manifest is built from the request **`Host`** header. |

---

## LED driver message format (transport / ESP-NOW)
## LED driver message format (transport / ESP-NOW / Wi-Fi)

Messages are JSON objects. The Pi **`build_message()`** helper (`src/util/espnow_message.py`) produces the same shape sent over serial and forwarded by the ESP32 bridge.
Messages are JSON objects. The Pi **`build_message()`** helper (`src/util/espnow_message.py`) produces the same shape sent over serial and forwarded by the ESP32 bridge, and the same logical object can be sent as a **single JSON text message** to a Wi-Fi driver over the **WebSocket**.

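The key point of the paragraph above is that one dict serialises identically for both transports. This sketch is not the real `build_message()` signature from `src/util/espnow_message.py`; it only illustrates that idea, with `None`-valued keys dropped to keep the payload compact:

```python
import json

def build_driver_message(**fields):
    """Compose one driver message; None-valued keys are omitted so the
    payload stays small for the ESP-NOW bridge and the Wi-Fi WebSocket."""
    return {k: v for k, v in fields.items() if v is not None}

msg = build_driver_message(select="3", b=128, save=None)
wire = json.dumps(msg)  # one JSON text message per command, either transport
```

The serial path would prepend the 6-byte destination MAC to `wire`; the Wi-Fi path sends `wire` as a single WebSocket text frame.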
### Top-level fields

@@ -350,10 +350,10 @@ Manage connected devices and create/manage device groups.

#### Layout
- **Header:** Title with "Add Device" button
- **Tabs:** Devices and Groups tabs
- **Content Area:** Tab-specific content
- **Zones:** Devices and Groups zones (zone buttons / zone strip)
- **Content Area:** Zone-specific content

#### Devices Tab
#### Devices Zone

**Device List**
- **Display:** List of all known devices

@@ -375,7 +375,7 @@ Manage connected devices and create/manage device groups.
- **Actions:** Cancel, Save
- **Note:** Only one master device per system. Adding a new master will demote existing master to slave.

#### Groups Tab
#### Groups Zone

**Group List**
- **Display:** List of all device groups

@@ -397,7 +397,7 @@ Manage connected devices and create/manage device groups.
- **Actions:** Cancel, Create

#### Design Specifications
- **Tab Style:** Active tab has purple background, white text
- **Zone Style:** Active zone has purple background, white text
- **List Items:** Bordered cards with hover effects
- **Modal:** Centered overlay with white card, shadow
- **Status Badges:** Colored pills (green for online, red for offline)

@@ -1495,7 +1495,7 @@ peak_mem = usqlite.mem_peak()

### Flow 2: Create Device Group

1. User navigates to Device Management → Groups tab
1. User navigates to Device Management → Groups zone
2. User clicks "Create Group", enters name, selects pattern/settings
3. User selects devices to add (can include master), clicks "Create"
4. Group appears in list

@@ -1774,7 +1774,7 @@ peak_mem = usqlite.mem_peak()
- Buttons respond to clicks
- Sliders update values
- Modals open/close
- Tabs switch correctly
- Zone buttons switch correctly
- Preset selector works
- Preset creation form validates input
- Preset cards display correctly

54
docs/help.md
@@ -1,6 +1,6 @@
# LED controller — user guide

This page describes the **main web UI** served from the Raspberry Pi app: profiles, tabs, presets, colour palettes, and sending commands to LED devices over the serial → ESP-NOW bridge.
This page describes the **main web UI** served from the Raspberry Pi app: profiles, **zones**, presets, colour palettes, and sending commands to LED devices. Traffic may go over the **serial → ESP-NOW bridge** or **Wi-Fi** (TCP to drivers on the LAN), depending on each device’s transport.

For HTTP routes and the wire format the driver expects, see **[API.md](API.md)**. For running the app locally, see the project **README**.

@@ -12,38 +12,38 @@ Figures below are **schematic** (layout and ideas), not pixel-perfect screenshot

The header has a mode toggle (desktop and mobile menu). The **label on the button is the mode you switch to** when you press it.

![Run and Edit mode headers](images/modes.png)
![Run and Edit mode headers](images/modes.png)

*The active tab is highlighted. Extra management buttons appear only in Edit mode.*
*The active zone is highlighted. Extra management buttons appear only in Edit mode.*

| Mode | Purpose |
|------|--------|
| **Run mode** | Day-to-day control: choose a tab, tap presets, apply profiles. Management buttons are hidden. |
| **Edit mode** | Full setup: tabs, presets, patterns, colour palette, **Send Presets**, profile create/clone/delete, preset reordering, and per-tile **Edit** on the strip. |
| **Run mode** | Day-to-day control: choose a zone, tap presets, apply profiles. Management buttons are hidden. |
| **Edit mode** | Full setup: zones, presets, patterns, colour palette, **Send Presets**, profile create/clone/delete, preset reordering, and per-tile **Edit** on the strip. |

**Profiles** is available in both modes: in Run mode you can only **apply** a profile; in Edit mode you can also **create**, **clone**, and **delete** profiles.

---

## Tabs
## Zones

- **Select a tab**: click its button in the top bar. The main area shows that tab’s preset strip and controls.
- **Edit mode — open tab settings**: **right-click** a tab button to change its name, **device IDs** (comma-separated), and which presets appear on the tab. Device identifiers are matched to each device’s **name** when the app builds `select` messages for the driver.
- **Tabs modal** (Edit mode): create new tabs from the header **Tabs** button. New tabs need a name and device ID list (defaults to `1` if you leave a simple placeholder).
- **Brightness slider** (per tab): adjusts **global** brightness sent to devices (`b` in the driver message), with a short debounce so small drags do not flood the link.
- **Select a zone**: click its button in the top bar. The main area shows that zone’s preset strip and controls.
- **Edit mode — open zone settings**: **right-click** a zone button to change its name, **device IDs** (comma-separated), and which presets appear on the zone. Device identifiers are matched to each device’s **name** when the app builds `select` messages for the driver.
- **Zones modal** (Edit mode): create new zones from the header **Zones** button. New zones need a name and device ID list (defaults to `1` if you leave a simple placeholder).
- **Brightness slider** (per zone): adjusts **global** brightness sent to devices (`b` in the driver message), with a short debounce so small drags do not flood the link.

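The brightness debounce mentioned above can be sketched as a simple leading-edge throttle. The class name, the interval, and the throttle style are illustrative; the real UI may instead send the final value after a quiet period:

```python
import time

class SliderDebounce:
    """Leading-edge throttle: drop updates inside `interval` seconds."""
    def __init__(self, send, interval=0.15):
        self.send = send
        self.interval = interval
        self.last = 0.0

    def update(self, value):
        now = time.monotonic()
        if now - self.last >= self.interval:
            self.last = now
            self.send(value)  # e.g. queue {"b": value} for the zone's devices

sent = []
slider = SliderDebounce(sent.append, interval=0.1)
slider.update(10)  # sent immediately
slider.update(20)  # dropped: inside the debounce window
```

Either way, the goal is the same: small slider drags produce a handful of `b` messages instead of one per pixel of movement.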
---

## Presets on the tab strip
## Presets on the zone strip

- **Run and Edit mode**: click the **main part** of a preset tile to **select** that preset on all devices assigned to the current tab (same logical action as a `select` in the driver API).
- **Run and Edit mode**: click the **main part** of a preset tile to **select** that preset on all devices assigned to the current zone (same logical action as a `select` in the driver API).
- **Edit mode only**:
  - **Edit** beside a tile opens the preset editor for that preset, scoped to the current tab (so you can **Remove from tab** without deleting the preset from the profile).
  - **Drag and drop** tiles to reorder them; order is saved for that tab.
  - **Edit** beside a tile opens the preset editor for that preset, scoped to the current zone (so you can **Remove from zone** without deleting the preset from the profile).
  - **Drag and drop** tiles to reorder them; order is saved for that zone.

![Preset strip](images/tab-strip.png)
![Preset strip](images/zone-strip.png)

*The slider controls global brightness for the tab’s devices. Click the coloured area of a tile to select that preset.*
*The slider controls global brightness for the zone’s devices. Click the coloured area of a tile to select that preset.*

The **Presets** header button (Edit mode) opens a **profile-wide** list: **Add** new presets, **Edit**, **Send** (push definition over the transport), and **Delete** (removes the preset from the profile entirely).

@@ -55,10 +55,10 @@ The **Presets** header button (Edit mode) opens a **profile-wide** list: **Add**

- **Colours**: choosing a value in the colour picker **adds** a swatch when the picker closes. Swatches can be **reordered** by dragging. Changing a swatch with the picker **clears** palette linkage for that slot.
- **From Palette**: inserts a colour **linked** to the current profile’s palette. Linked slots show a **P** badge; if you change that palette entry later, presets using it update.
- **Brightness (0–255)** and **Delay (ms)**: stored on the preset and sent with the compact preset payload.
- **Try**: sends the current form values to devices on the **current tab**, then selects that preset — **without** `save` on the device (good for auditioning).
- **Default**: updates the tab’s **default preset** and sends a **default** hint for those devices; it does not force the same live selection behaviour as clicking a tile.
- **Try**: sends the current form values to devices on the **current zone**, then selects that preset — **without** `save` on the device (good for auditioning).
- **Default**: updates the zone’s **default preset** and sends a **default** hint for those devices; it does not force the same live selection behaviour as clicking a tile.
- **Save & Send**: writes the preset to the server, then pushes definitions with **save** so devices may persist them. It does **not** auto-select the preset on devices (use the strip or **Try** if you want that).
- **Remove from tab** (when you opened the editor from a tab): removes the preset from **this tab’s list only**; the preset remains in the profile for other tabs.
- **Remove from zone** (when you opened the editor from a zone): removes the preset from **this zone’s list only**; the preset remains in the profile for other zones.

![Preset editor](images/preset-editor.png)

@@ -68,21 +68,23 @@ The **Presets** header button (Edit mode) opens a **profile-wide** list: **Add**

## Profiles

- **Apply**: sets the **current profile** in your session. Tabs and presets you see are scoped to that profile.
- **Edit mode — Create**: new profiles always get a populated **default** tab. Optionally tick **DJ tab** to also create a `dj` tab (device name `dj`) with starter DJ-oriented presets.
- **Apply**: sets the **current profile** in your session. Zones and presets you see are scoped to that profile.
- **Edit mode — Create**: new profiles always get a populated **default** zone. Optionally tick **DJ zone** to also create a `dj` zone (device name `dj`) with starter DJ-oriented presets.
- **Clone** / **Delete**: available in Edit mode from the profile list.

---

## Send Presets (Edit mode)

**Send Presets** walks **every tab** in the **current profile**, collects each tab’s preset IDs, and calls **`POST /presets/send`** per tab (including each tab’s **default** preset when set). Use this to bulk-push definitions to hardware after editing, without clicking **Send** on every preset individually.
**Send Presets** walks **every zone** in the **current profile**, collects each zone’s preset IDs, and calls **`POST /presets/send`** per zone (including each zone’s **default** preset when set). Use this to bulk-push definitions to hardware after editing, without clicking **Send** on every preset individually.

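The Send Presets walk can be sketched as a loop over the profile's zones. The request-body keys (`zone_id`, `preset_ids`), the zone record shape, and the `post` callable are assumptions for illustration:

```python
def send_all_presets(profile, zones, post):
    """Call POST /presets/send once per zone of the profile."""
    for zone_id in profile.get("zones", []):
        zone = zones.get(zone_id, {})
        # Flatten the zone's preset rows, then append the default if missing.
        preset_ids = [pid for row in zone.get("presets", []) for pid in row]
        default = zone.get("default_preset")
        if default and default not in preset_ids:
            preset_ids.append(default)
        if preset_ids:
            post("/presets/send", {"zone_id": zone_id, "preset_ids": preset_ids})

calls = []
send_all_presets(
    {"zones": ["8"]},
    {"8": {"presets": [["1", "2", "3"], ["4", "5"]], "default_preset": "1"}},
    lambda path, body: calls.append((path, body)),
)
```

One HTTP call per zone keeps each push scoped to the devices that zone targets, which is why the UI loops rather than sending one profile-wide request.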
---

## Patterns

The **Patterns** dialog (Edit mode) is a **read-only reference**: pattern names and typical **delay** ranges from the pattern definitions. It does not change device behaviour by itself; patterns are chosen inside the preset editor.
The **Patterns** dialog (Edit mode) lists pattern names and typical **delay** ranges from the pattern definitions. Choosing a pattern still happens inside the preset editor.

**Wi-Fi drivers** can install new pattern modules over HTTP: the REST API exposes **`/patterns/ota/*`**, **`POST /patterns/<name>/send`**, **`POST /patterns/upload`**, and **`POST /patterns/driver`** (see [API.md](API.md)). ESP-NOW devices follow the bridge/serial path you configure for preset traffic.

---

@@ -98,15 +100,15 @@ The **Patterns** dialog (Edit mode) is a **read-only reference**: pattern names

## Mobile layout

On narrow screens, use **Menu** to reach the same actions as the desktop header (Profiles, Tabs, Presets, Help, mode toggle, etc.).
On narrow screens, use **Menu** to reach the same actions as the desktop header (Profiles, Zones, Presets, Help, mode toggle, etc.).

![Mobile menu](images/mobile.png)

*Preset tiles behave the same once a tab is selected.*
*Preset tiles behave the same once a zone is selected.*

---

## Further reading

- **[API.md](API.md)** — REST routes, session scoping, WebSocket `/ws`, and LED driver JSON (`presets`, `select`, `save`, `default`, pattern keys).
- **[API.md](API.md)** — REST routes, session scoping, WebSocket `/ws`, and LED driver JSON (`presets`, `select`, `save`, `default`, pattern keys, pattern **manifest**).
- **README** — `pipenv run run`, port 80 setup, and high-level behaviour.

BIN
docs/help.pdf
BIN
docs/help.pdf
Binary file not shown.
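The bulk-push that Send Presets performs can be sketched as a plain loop; the field names here (`zones`, `presets`, `default`, `name`) are illustrative only, not the controller's actual profile schema, and only the `POST /presets/send` route comes from the text above:

```python
def send_presets_calls(profile):
    # One POST /presets/send per zone: the zone's preset IDs, plus the
    # zone default when set (hypothetical shapes for illustration).
    calls = []
    for zone in profile["zones"]:
        ids = list(zone.get("presets", []))
        default = zone.get("default")
        if default and default not in ids:
            ids.append(default)
        calls.append(("POST", "/presets/send", {"zone": zone["name"], "presets": ids}))
    return calls

calls = send_presets_calls(
    {"zones": [{"name": "desk", "presets": ["warm", "read"], "default": "warm"}]}
)
assert calls == [("POST", "/presets/send", {"zone": "desk", "presets": ["warm", "read"]})]
```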
@@ -67,7 +67,7 @@
  box-shadow: 0 4px 6px rgba(0, 0, 0, 0.1);
}

.tab {
.zone {
  flex: 1;
  padding: 12px 24px;
  border: none;
@@ -78,16 +78,16 @@
  transition: all 0.2s;
}

.tab.active {
.zone.active {
  background: #667eea;
  color: white;
}

.tab-content {
.zone-content {
  display: none;
}

.tab-content.active {
.zone-content.active {
  display: block;
}

@@ -249,12 +249,12 @@
  </div>

  <div class="tabs">
    <button class="tab active" onclick="switchTab('devices')">Devices</button>
    <button class="tab" onclick="switchTab('groups')">Groups</button>
    <button class="zone active" onclick="switchTab('devices')">Devices</button>
    <button class="zone" onclick="switchTab('groups')">Groups</button>
  </div>

  <!-- Devices Tab -->
  <div id="devices-tab" class="tab-content active">
  <!-- Devices Zone -->
  <div id="devices-zone" class="zone-content active">
    <div class="card">
      <h2>Connected Devices</h2>
      <div class="device-item">
@@ -313,8 +313,8 @@
    </div>
  </div>

  <!-- Groups Tab -->
  <div id="groups-tab" class="tab-content">
  <!-- Groups Zone -->
  <div id="groups-zone" class="zone-content">
    <div class="card">
      <div style="display: flex; justify-content: space-between; align-items: center; margin-bottom: 20px;">
        <h2>Groups</h2>
@@ -386,12 +386,12 @@
  </div>

  <script>
    function switchTab(tab) {
      document.querySelectorAll('.tab').forEach(t => t.classList.remove('active'));
      document.querySelectorAll('.tab-content').forEach(c => c.classList.remove('active'));
    function switchTab(zone) {
      document.querySelectorAll('.zone').forEach(t => t.classList.remove('active'));
      document.querySelectorAll('.zone-content').forEach(c => c.classList.remove('active'));

      event.target.classList.add('active');
      document.getElementById(tab + '-tab').classList.add('active');
      document.getElementById(zone + '-zone').classList.add('active');
    }

    function showAddDeviceModal() {
@@ -1,112 +0,0 @@
# Benchmark: LRU eviction vs add-then-remove-after-use on ESP32.
# Run on device: mpremote run esp32/benchmark_peers.py
# (add/del_peer are timed; send() may fail if no peer is listening - timing still valid)
import espnow
import network
import time

BROADCAST = b"\xff\xff\xff\xff\xff\xff"
MAX_PEERS = 20
ITERATIONS = 50
PAYLOAD = b"x" * 32  # small payload

network.WLAN(network.STA_IF).active(True)
esp = espnow.ESPNow()
esp.active(True)
esp.add_peer(BROADCAST)

# Build 19 dummy MACs so we have 20 peers total (broadcast + 19).
def mac(i):
    return bytes([0, 0, 0, 0, 0, i])

peers_list = [mac(i) for i in range(1, 20)]
for p in peers_list:
    esp.add_peer(p)

# One "new" MAC we'll add/remove.
new_mac = bytes([0, 0, 0, 0, 0, 99])

def bench_lru():
    """LRU: ensure_peer (evict oldest + add new), send, update last_used."""
    last_used = {BROADCAST: time.ticks_ms()}
    for p in peers_list:
        last_used[p] = time.ticks_ms()
    # Pre-remove one so we have 19; ensure_peer(new) will add 20th.
    esp.del_peer(peers_list[-1])
    last_used.pop(peers_list[-1], None)
    # Now 19 peers. Each iteration: ensure_peer(new) -> add_peer(new), send, update.
    # Next iter: ensure_peer(new) -> already there, just send. So we need to force
    # eviction each time: use a different "new" each time so we always evict+add.
    t0 = time.ticks_us()
    for i in range(ITERATIONS):
        addr = bytes([0, 0, 0, 0, 0, 50 + (i % 30)])  # 30 different "new" MACs
        peers = esp.get_peers()
        peer_macs = [p[0] for p in peers]
        if addr not in peer_macs:
            if len(peer_macs) >= MAX_PEERS:
                oldest_mac = None
                oldest_ts = time.ticks_ms()
                for m in peer_macs:
                    if m == BROADCAST:
                        continue
                    ts = last_used.get(m, 0)
                    if ts <= oldest_ts:
                        oldest_ts = ts
                        oldest_mac = m
                if oldest_mac is not None:
                    esp.del_peer(oldest_mac)
                    last_used.pop(oldest_mac, None)
            esp.add_peer(addr)
        esp.send(addr, PAYLOAD)
        last_used[addr] = time.ticks_ms()
    t1 = time.ticks_us()
    return time.ticks_diff(t1, t0)

def bench_add_then_remove():
    """Add peer, send, del_peer (remove after use). At 20 we must del one first."""
    # Start full: 20 peers. To add new we del any one, add new, send, del new.
    victim = peers_list[0]
    t0 = time.ticks_us()
    for i in range(ITERATIONS):
        esp.del_peer(victim)  # make room
        esp.add_peer(new_mac)
        esp.send(new_mac, PAYLOAD)
        esp.del_peer(new_mac)
        esp.add_peer(victim)  # put victim back so we're at 20 again
    t1 = time.ticks_us()
    return time.ticks_diff(t1, t0)

def bench_send_existing():
    """Baseline: send to existing peer only (no add/del)."""
    t0 = time.ticks_us()
    for _ in range(ITERATIONS):
        esp.send(peers_list[0], PAYLOAD)
    t1 = time.ticks_us()
    return time.ticks_diff(t1, t0)

print("ESP-NOW peer benchmark ({} iterations)".format(ITERATIONS))
print()

# Baseline: send to existing peer
try:
    us = bench_send_existing()
    print("Send to existing peer only:      {:>8} us total  {:>7.1f} us/iter".format(us, us / ITERATIONS))
except Exception as e:
    print("Send existing failed:", e)
print()

# LRU: evict oldest then add new, send
try:
    us = bench_lru()
    print("LRU (evict oldest + add + send): {:>8} us total  {:>7.1f} us/iter".format(us, us / ITERATIONS))
except Exception as e:
    print("LRU failed:", e)
print()

# Add then remove after use
try:
    us = bench_add_then_remove()
    print("Add then remove after use:       {:>8} us total  {:>7.1f} us/iter".format(us, us / ITERATIONS))
except Exception as e:
    print("Add-then-remove failed:", e)
print()
print("Done.")
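The oldest-peer scan used by both the benchmark and the bridge is plain dict arithmetic and can be sanity-checked on the host. A sketch (seeding the "oldest so far" with infinity instead of the on-device `time.ticks_ms()`; peers missing from `last_used` default to timestamp 0, so they are evicted first):

```python
def pick_oldest(last_used, peer_macs, broadcast):
    # Mirror of the eviction scan: skip the broadcast peer, treat unknown
    # peers as timestamp 0 (oldest), keep the smallest timestamp seen.
    oldest_mac = None
    oldest_ts = float("inf")
    for m in peer_macs:
        if m == broadcast:
            continue
        ts = last_used.get(m, 0)
        if ts <= oldest_ts:
            oldest_ts = ts
            oldest_mac = m
    return oldest_mac

BROADCAST = b"\xff" * 6
# b"b" has the smallest timestamp, so it is the eviction candidate.
assert pick_oldest({b"a": 5, b"b": 2}, [BROADCAST, b"a", b"b"], BROADCAST) == b"b"
# Unknown peers rank as timestamp 0 and win immediately.
assert pick_oldest({b"a": 5}, [BROADCAST, b"a", b"c"], BROADCAST) == b"c"
```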
@@ -1,72 +0,0 @@
# Serial-to-ESP-NOW bridge: receives from Pi on UART, forwards to ESP-NOW peers.
# Wire format: first 6 bytes = destination MAC, rest = payload. Address is always 6 bytes.
from machine import Pin, UART
import espnow
import network
import time

UART_BAUD = 912000
BROADCAST = b"\xff\xff\xff\xff\xff\xff"
MAX_PEERS = 20
# Match led-driver / controller default settings wifi_channel (1–11)
WIFI_CHANNEL = 6

sta = network.WLAN(network.STA_IF)
sta.active(True)
sta.config(pm=network.WLAN.PM_NONE, channel=WIFI_CHANNEL)
print("WiFi STA channel:", sta.config("channel"), "(WIFI_CHANNEL=%s)" % WIFI_CHANNEL)

esp = espnow.ESPNow()
esp.active(True)
esp.add_peer(BROADCAST)

uart = UART(1, UART_BAUD, tx=Pin(21), rx=Pin(6))

# Track last send time per peer for LRU eviction (remove oldest when at limit).
last_used = {BROADCAST: time.ticks_ms()}


# ESP_ERR_ESPNOW_EXIST: peer already registered (ignore when adding).
ESP_ERR_ESPNOW_EXIST = -12395


def ensure_peer(addr):
    """Ensure addr is in the peer list. When at 20 peers, remove the oldest-used (LRU)."""
    peers = esp.get_peers()
    peer_macs = [p[0] for p in peers]
    if addr in peer_macs:
        return
    if len(peer_macs) >= MAX_PEERS:
        # Remove the peer we used least recently (oldest).
        oldest_mac = None
        oldest_ts = time.ticks_ms()
        for mac in peer_macs:
            if mac == BROADCAST:
                continue
            ts = last_used.get(mac, 0)
            if ts <= oldest_ts:
                oldest_ts = ts
                oldest_mac = mac
        if oldest_mac is not None:
            esp.del_peer(oldest_mac)
            last_used.pop(oldest_mac, None)
    try:
        esp.add_peer(addr)
    except OSError as e:
        if e.args[0] != ESP_ERR_ESPNOW_EXIST:
            raise


print("Starting ESP32 main.py")

while True:
    if uart.any():
        data = uart.read()
        if not data or len(data) < 6:
            continue
        print(f"Received data: {data}")
        addr = data[:6]
        payload = data[6:]
        ensure_peer(addr)
        esp.send(addr, payload)
        last_used[addr] = time.ticks_ms()
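Host-side framing for the bridge's wire format is one concatenation; a sketch (hypothetical helper, not part of the repo):

```python
def frame(dest_mac: bytes, payload: bytes) -> bytes:
    # Bridge wire format: 6-byte destination MAC, then the raw payload.
    if len(dest_mac) != 6:
        raise ValueError("destination MAC must be exactly 6 bytes")
    return dest_mac + payload

buf = frame(b"\xff" * 6, b'{"v":"1"}')
assert buf[:6] == b"\xff" * 6 and buf[6:] == b'{"v":"1"}'
```

The bridge splits this back apart with `data[:6]` / `data[6:]`, which is why the address must always be exactly 6 bytes with no delimiter.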
espnow-sender/README.md (new file, 7 lines)
@@ -0,0 +1,7 @@
# espnow-sender

Minimal MicroPython project for receiving JSON over Microdot WebSocket.

- WebSocket endpoint: `/ws`
- Entry point: `main.py`
- Message template: `msg.json`
espnow-sender/main.py (new file, 120 lines)
@@ -0,0 +1,120 @@
import asyncio
import json

from microdot import Microdot
from microdot.websocket import WebSocketError, with_websocket

import espnow
import network
from util import format_mac, parse_mac


app = Microdot()
_esp = None
_known_peers = set()
_ws_clients = set()


def _init_espnow():
    global _esp
    sta = network.WLAN(network.STA_IF)
    sta.active(True)
    _esp = espnow.ESPNow()
    _esp.active(True)


def _validate_envelope(obj):
    if obj.get("v") != "1":
        raise ValueError("message.v must be '1'")
    devices = obj["devices"]
    for address in devices.keys():
        parse_mac(address)
    return obj


def _send_espnow(address, payload):
    if _esp is None:
        raise ValueError("espnow is not initialized")
    mac = parse_mac(address)
    msg = json.dumps(payload, separators=(",", ":")).encode("utf-8")
    if mac not in _known_peers:
        _esp.add_peer(mac)
        _known_peers.add(mac)
    _esp.send(mac, msg)
    return mac, len(msg)


async def _broadcast_ws(obj):
    text = json.dumps(obj)
    dead = []
    for client in list(_ws_clients):
        try:
            await client.send(text)
        except Exception:
            dead.append(client)
    for client in dead:
        _ws_clients.discard(client)


async def _espnow_receive_loop():
    while True:
        host, msg = _esp.recv(0)
        if not host:
            await asyncio.sleep(0.01)
            continue
        await _broadcast_ws(
            {
                "from": format_mac(host),
                "payload": msg.decode("utf-8"),
            }
        )


@app.route("/ws")
@with_websocket
async def ws(request, ws):
    _ws_clients.add(ws)
    while True:
        try:
            raw = await ws.receive()
        except WebSocketError:
            break

        if not raw:
            break

        try:
            parsed = json.loads(raw)
            env = _validate_envelope(parsed)
            sent = []
            for address, payload in env["devices"].items():
                mac, payload_size = _send_espnow(address, payload)
                sent.append(
                    {
                        "address": format_mac(mac),
                        "bytes": payload_size,
                    }
                )
        except (ValueError, TypeError) as e:
            await ws.send(json.dumps({"ok": False, "error": str(e)}))
            continue

        await ws.send(
            json.dumps(
                {
                    "ok": True,
                    "sent": sent,
                }
            )
        )
    _ws_clients.discard(ws)


async def main(port=80):
    _init_espnow()
    asyncio.create_task(_espnow_receive_loop())
    await app.start_server(host="0.0.0.0", port=port)


if __name__ == "__main__":
    asyncio.run(main(port=80))
espnow-sender/msg.json (new file, 24 lines)
@@ -0,0 +1,24 @@
{
  "v": "1",
  "devices": {
    "ff:ff:ff:ff:ff:ff": {
      "presets": {
        "preset_id": {
          "pattern": "on",
          "colors": ["#FF0000"],
          "delay": 100,
          "brightness": 255,
          "auto": true
        }
      },
      "select": {
        "preset": "preset_id",
        "step": 0
      },
      "save": true,
      "default": "preset_id",
      "b": 255
    }
  }
}
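The template passes the envelope checks in `main.py` (`v` must be `"1"` and every `devices` key must be a parseable MAC). A host-side check, inlining the `parse_mac` rule from `util.py`:

```python
import json

# Trimmed version of msg.json, enough to exercise the envelope rules.
MSG = '{"v": "1", "devices": {"ff:ff:ff:ff:ff:ff": {"save": true, "b": 255}}}'

def parse_mac(value):
    raw = value.strip().lower().replace(":", "").replace("-", "")
    if len(raw) != 12:
        raise ValueError("address must be 12 hex chars or aa:bb:cc:dd:ee:ff")
    return bytes.fromhex(raw)

obj = json.loads(MSG)
assert obj.get("v") == "1"            # _validate_envelope's version check
macs = [parse_mac(a) for a in obj["devices"]]  # raises on a malformed key
assert macs == [b"\xff" * 6]
```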
espnow-sender/util.py (new file, 12 lines)
@@ -0,0 +1,12 @@
def parse_mac(value):
    raw = value.strip().lower().replace(":", "").replace("-", "")
    if len(raw) != 12:
        raise ValueError("address must be 12 hex chars or aa:bb:cc:dd:ee:ff")
    try:
        return bytes.fromhex(raw)
    except ValueError:
        raise ValueError("address contains non-hex characters")


def format_mac(mac_bytes):
    return ":".join("{:02x}".format(b) for b in mac_bytes)
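These helpers are portable MicroPython, so a quick round-trip check runs under host CPython too:

```python
def parse_mac(value):
    # Accept "aa:bb:cc:dd:ee:ff", "aa-bb-cc-dd-ee-ff", or 12 bare hex chars.
    raw = value.strip().lower().replace(":", "").replace("-", "")
    if len(raw) != 12:
        raise ValueError("address must be 12 hex chars or aa:bb:cc:dd:ee:ff")
    try:
        return bytes.fromhex(raw)
    except ValueError:
        raise ValueError("address contains non-hex characters")

def format_mac(mac_bytes):
    return ":".join("{:02x}".format(b) for b in mac_bytes)

# Colons, dashes, and bare hex all normalize to the same 6 bytes.
assert parse_mac("AA-BB-CC-DD-EE-FF") == parse_mac("aabbccddeeff")
assert format_mac(parse_mac("AA:BB:CC:DD:EE:FF")) == "aa:bb:cc:dd:ee:ff"
```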
Submodule led-driver updated: c42dff8975...2a768376d0
Submodule led-simulator added at 42c14361e8
Submodule led-tool updated: 3844aa9d6a...580fd11aca
led_bar_vertical_stand.scad (new file, 123 lines)
@@ -0,0 +1,123 @@
// Parametric LED bar vertical stand socket
// For a bar nominally 14 x 17 mm, 2 m long.
// This part is intended to be screwed to an MDF base.

// -------------------------
// User parameters
// -------------------------
bar_w = 14;        // Bar width (mm)
bar_d = 17;        // Bar depth (mm)
clearance = 0.4;   // Total clearance added to each axis (mm)

socket_height = 36;  // Height of printed socket body (mm)
wall = 3.2;          // Socket wall thickness (mm)
base_thickness = 5;  // Printed bottom plate thickness (mm)

// USB cable/connector side opening
usb_notch_enable = true;
usb_notch_w = 11;
usb_notch_h = 9;
usb_notch_from_bottom = 6;
usb_notch_side = "right"; // "right" or "left"

// Mounting ears for MDF screws
ear_enable = true;
ear_len = 16;
ear_w = 16;
ear_thickness = base_thickness;
screw_hole_d = 4.2;    // M4 clearance. Use 3.4 for M3.
screw_hole_edge = 5.5; // Hole center offset from ear outer corner

// Optional clamp lip at top to reduce wobble
top_lip_enable = true;
top_lip_depth = 2.0;  // Intrudes into opening on each side
top_lip_height = 3.0;

$fn = 48;

// -------------------------
// Derived
// -------------------------
inner_w = bar_w + clearance;
inner_d = bar_d + clearance;

outer_w = inner_w + wall * 2;
outer_d = inner_d + wall * 2;
outer_h = socket_height;

module screw_hole() {
    cylinder(h = ear_thickness + 0.2, d = screw_hole_d);
}

module mounting_ear(sign_y = 1) {
    translate([outer_w / 2, sign_y * (outer_d / 2), 0])
        cube([ear_len, ear_w, ear_thickness], center = false);
}

module top_lip() {
    if (top_lip_enable) {
        // Front and back lips at the top of the socket.
        translate([wall, wall, outer_h - top_lip_height])
            cube([top_lip_depth, inner_d, top_lip_height]);

        translate([outer_w - wall - top_lip_depth, wall, outer_h - top_lip_height])
            cube([top_lip_depth, inner_d, top_lip_height]);
    }
}

difference() {
    union() {
        // Main body
        cube([outer_w, outer_d, outer_h], center = false);

        // Base plate under socket for stiffness
        translate([0, 0, -base_thickness])
            cube([outer_w, outer_d, base_thickness], center = false);

        // Mounting ears
        if (ear_enable) {
            translate([0, 0, -ear_thickness]) {
                mounting_ear(1);
                mounting_ear(-1);
            }
        }

        top_lip();
    }

    // Main bar cavity
    translate([wall, wall, 0])
        cube([inner_w, inner_d, outer_h + 0.2], center = false);

    // USB side notch
    if (usb_notch_enable) {
        if (usb_notch_side == "right") {
            translate([outer_w - wall - 0.1, (outer_d - usb_notch_w) / 2, usb_notch_from_bottom])
                cube([wall + 0.3, usb_notch_w, usb_notch_h], center = false);
        } else {
            translate([-0.2, (outer_d - usb_notch_w) / 2, usb_notch_from_bottom])
                cube([wall + 0.3, usb_notch_w, usb_notch_h], center = false);
        }
    }

    // Screw holes in ears
    if (ear_enable) {
        // Upper ear hole
        translate([
            outer_w / 2 + ear_len - screw_hole_edge,
            outer_d / 2 + ear_w - screw_hole_edge,
            -ear_thickness - 0.05
        ]) screw_hole();

        // Lower ear hole
        translate([
            outer_w / 2 + ear_len - screw_hole_edge,
            -outer_d / 2 + screw_hole_edge,
            -ear_thickness - 0.05
        ]) screw_hole();
    }
}

// Print orientation helper:
// Keep the base/ears on the bed.
// If fit is tight, increase clearance to 0.5 or 0.6.
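The derived dimensions follow directly from the user parameters; with the defaults above the outer socket footprint works out as below (checked here in Python rather than OpenSCAD):

```python
bar_w, bar_d, clearance, wall = 14, 17, 0.4, 3.2

inner_w = bar_w + clearance   # 14.4 mm cavity width
inner_d = bar_d + clearance   # 17.4 mm cavity depth
outer_w = inner_w + wall * 2  # 20.8 mm outer width
outer_d = inner_d + wall * 2  # 23.8 mm outer depth

assert abs(outer_w - 20.8) < 1e-9
assert abs(outer_d - 23.8) < 1e-9
```

Because `clearance` is total per axis (not per side), bumping it to 0.6 as the footer suggests widens the cavity by 0.6 mm, not 1.2 mm.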
pyproject.toml (new file, 3 lines)
@@ -0,0 +1,3 @@
[tool.pytest.ini_options]
testpaths = ["tests"]
python_files = ["test_endpoints_pytest.py"]

@@ -1,4 +0,0 @@
[pytest]
testpaths = tests
python_files = test_endpoints_pytest.py
@@ -1,4 +0,0 @@
#!/usr/bin/env bash
# Copy esp32/main.py to the connected ESP32 as /main.py (single line, no wrap).
cd "$(dirname "$0")/.."
pipenv run mpremote fs cp esp32/main.py :/main.py
scripts/dev-run.sh (new file, 16 lines)
@@ -0,0 +1,16 @@
#!/usr/bin/env bash
set -euo pipefail

ROOT_DIR="$(cd "$(dirname "$0")/.." && pwd)"
PORT="${PORT:-80}"

# On watchfiles restarts the previous process can linger briefly.
# Proactively terminate any listener on the target port before boot.
pids="$(ss -ltnp "sport = :$PORT" 2>/dev/null | sed -n 's/.*pid=\([0-9]\+\).*/\1/p' | sort -u)"
if [ -n "${pids}" ]; then
  kill -TERM ${pids} 2>/dev/null || true
  sleep 0.3
fi

cd "$ROOT_DIR/src"
exec python main.py
@@ -10,6 +10,18 @@ if [ ! -f "scripts/led-controller.service" ]; then
  echo "Run this script from the repo root."
  exit 1
fi
export PIPENV_VENV_IN_PROJECT="${PIPENV_VENV_IN_PROJECT:-1}"
if command -v pipenv >/dev/null 2>&1; then
  PY="$(command -v python3)"
  if [ -z "$PY" ]; then
    echo "python3 not found; install python3." >&2
    exit 1
  fi
  echo "Ensuring Pipenv deps with $PY (venv in project: .venv when PIPENV_VENV_IN_PROJECT=1)…"
  # --skip-lock: install from Pipfile only (avoids lock/Python hash mismatches on device).
  pipenv install --quiet --skip-lock --python "$PY"
  pipenv --venv > scripts/.led-controller-venv
fi
chmod +x scripts/start.sh
sudo cp "scripts/led-controller.service" "$UNIT_PATH"
sudo systemctl daemon-reload
@@ -1,7 +1,8 @@
[Unit]
Description=LED Controller web server
After=network-online.target
Wants=network-online.target
# Use network.target only. Ordering after network-online.target can block `systemctl start`
# until wait-online finishes; Wi‑Fi/DHCP delays then look like a hung start job.
After=network.target

[Service]
Type=simple
@@ -12,6 +13,8 @@ Environment=PATH=/home/pi/.local/bin:/usr/local/bin:/usr/bin:/bin
ExecStart=/bin/bash /home/pi/led-controller/scripts/start.sh
Restart=on-failure
RestartSec=5
# pipenv/first bind can be slow; avoid misleading "activating" forever if misconfigured
TimeoutStartSec=120

[Install]
WantedBy=multi-user.target
scripts/pi-eth-lan-router.sh (new executable file, 253 lines)
@@ -0,0 +1,253 @@
#!/usr/bin/env bash
# Configure Raspberry Pi OS: Wi-Fi client on IF_WAN (default wlan0), Ethernet IF_LAN
# (default eth0) toward an external AP. Static LAN IP, DHCP via dnsmasq, NAT masquerade.
#
# Usage:
#   sudo ./pi-eth-lan-router.sh install
#   sudo ./pi-eth-lan-router.sh remove
#
# Environment overrides (optional):
#   IF_WAN=wlan0 IF_LAN=eth0 LAN_IP=192.168.4.1 LAN_PREFIX=24 \
#   DHCP_START=192.168.4.100 DHCP_END=192.168.4.200 \
#   DNSMASQ_DNS=1.1.1.1,8.8.8.8 \
#   sudo ./pi-eth-lan-router.sh install

set -euo pipefail

IF_WAN="${IF_WAN:-wlan0}"
IF_LAN="${IF_LAN:-eth0}"
LAN_IP="${LAN_IP:-192.168.4.1}"
LAN_PREFIX="${LAN_PREFIX:-24}"
DHCP_START="${DHCP_START:-192.168.4.100}"
DHCP_END="${DHCP_END:-192.168.4.200}"
# Comma-separated DNS for DHCP clients (Pi does not need to run a resolver).
DNSMASQ_DNS="${DNSMASQ_DNS:-1.1.1.1,8.8.8.8}"

NM_CON_NAME="pi-eth-lan-router"
MARK_BEGIN="# BEGIN pi-eth-lan-router (scripts/pi-eth-lan-router.sh)"
MARK_END="# END pi-eth-lan-router"
SYSCTL_FILE="/etc/sysctl.d/99-pi-eth-lan-router.conf"
DNSMASQ_SNIPPET="/etc/dnsmasq.d/pi-eth-lan-router.conf"
NFT_SNIPPET="/etc/nftables.d/50-pi-eth-lan-router.nft"
NFT_INCLUDE='include "/etc/nftables.d/50-pi-eth-lan-router.nft"'
NFTABLES_CONF="/etc/nftables.conf"
DHCPCD_CONF="/etc/dhcpcd.conf"

die() { echo "error: $*" >&2; exit 1; }
log() { echo "$*"; }

need_root() {
  [[ "${EUID:-0}" -eq 0 ]] || die "run as root (sudo)"
}

have_cmd() { command -v "$1" >/dev/null 2>&1; }

apt_install() {
  export DEBIAN_FRONTEND=noninteractive
  apt-get update -qq
  apt-get install -y -qq dnsmasq nftables
}

write_sysctl() {
  cat >"$SYSCTL_FILE" <<EOF
# Managed by scripts/pi-eth-lan-router.sh
net.ipv4.ip_forward=1
EOF
  sysctl --system -q 2>/dev/null || sysctl -p "$SYSCTL_FILE" || true
}

remove_sysctl() {
  rm -f "$SYSCTL_FILE"
  sysctl --system -q 2>/dev/null || true
}

write_dnsmasq() {
  local mask="255.255.255.0"
  if [[ "$LAN_PREFIX" != "24" ]]; then
    die "only LAN_PREFIX=24 is supported by this script (extend dnsmasq netmask manually)"
  fi
  cat >"$DNSMASQ_SNIPPET" <<EOF
# Managed by scripts/pi-eth-lan-router.sh
interface=$IF_LAN
bind-interfaces
dhcp-range=$DHCP_START,$DHCP_END,$mask,24h
dhcp-option=option:router,$LAN_IP
dhcp-option=option:dns-server,$DNSMASQ_DNS
EOF
}

remove_dnsmasq() {
  rm -f "$DNSMASQ_SNIPPET"
}

write_nft() {
  mkdir -p /etc/nftables.d
  cat >"$NFT_SNIPPET" <<EOF
# Managed by scripts/pi-eth-lan-router.sh
table ip pi_eth_wlan_nat {
  chain postrouting {
    type nat hook postrouting priority 100; policy accept;
    oifname "$IF_WAN" masquerade
  }
}
EOF
  if [[ -f "$NFTABLES_CONF" ]] && ! grep -qF '50-pi-eth-lan-router.nft' "$NFTABLES_CONF" 2>/dev/null; then
    printf '\n# pi-eth-lan-router\n%s\n' "$NFT_INCLUDE" >>"$NFTABLES_CONF"
  elif [[ ! -f "$NFTABLES_CONF" ]]; then
    log "warning: $NFTABLES_CONF missing; NAT was not added for boot persistence. Install/configure nftables, or add: $NFT_INCLUDE"
  fi
}

remove_nft() {
  rm -f "$NFT_SNIPPET"
  if [[ -f "$NFTABLES_CONF" ]]; then
    sed -i '/# pi-eth-lan-router/d;/50-pi-eth-lan-router\.nft/d' "$NFTABLES_CONF" || true
  fi
  nft delete table ip pi_eth_wlan_nat 2>/dev/null || true
}

apply_nft() {
  if have_cmd nft; then
    nft delete table ip pi_eth_wlan_nat 2>/dev/null || true
    nft -f "$NFT_SNIPPET"
  fi
}

configure_nm_eth() {
  have_cmd nmcli || return 1
  systemctl is-active --quiet NetworkManager 2>/dev/null || return 1

  if nmcli -t -f NAME con show --active 2>/dev/null | grep -qxF "$NM_CON_NAME"; then
    nmcli con down "$NM_CON_NAME" || true
  fi
  if nmcli -t -f NAME con show 2>/dev/null | grep -qxF "$NM_CON_NAME"; then
    nmcli con mod "$NM_CON_NAME" \
      connection.interface-name "$IF_LAN" \
      ipv4.method manual \
      ipv4.addresses "${LAN_IP}/${LAN_PREFIX}" \
      ipv4.gateway "" \
      ipv4.dns "" \
      ipv4.never-default yes \
      ipv6.method ignore
  else
    nmcli con add type ethernet con-name "$NM_CON_NAME" ifname "$IF_LAN" \
      ipv4.method manual \
      ipv4.addresses "${LAN_IP}/${LAN_PREFIX}" \
      ipv4.gateway "" \
      ipv4.dns "" \
      ipv4.never-default yes \
      ipv6.method ignore
  fi
  if ! nmcli con up "$NM_CON_NAME"; then
    log "warning: could not activate '$NM_CON_NAME' (is $IF_LAN connected?); profile saved for next boot."
  fi
  return 0
}

remove_nm_eth() {
  have_cmd nmcli || return 0
  if nmcli -t -f NAME con show 2>/dev/null | grep -qxF "$NM_CON_NAME"; then
    nmcli con delete "$NM_CON_NAME" || true
  fi
}

configure_dhcpcd_eth() {
  [[ -f "$DHCPCD_CONF" ]] || return 1
  if grep -qF "$MARK_BEGIN" "$DHCPCD_CONF" 2>/dev/null; then
    sed -i "/$MARK_BEGIN/,/$MARK_END/d" "$DHCPCD_CONF" || true
  fi
  {
    echo "$MARK_BEGIN"
    echo "interface $IF_LAN"
    echo "static ip_address=${LAN_IP}/${LAN_PREFIX}"
    echo "nohook wpa_supplicant"
    echo "$MARK_END"
  } >>"$DHCPCD_CONF"
  systemctl restart dhcpcd 2>/dev/null || true
  return 0
}

remove_dhcpcd_block() {
  [[ -f "$DHCPCD_CONF" ]] || return 0
  if grep -qF "$MARK_BEGIN" "$DHCPCD_CONF" 2>/dev/null; then
    sed -i "/$MARK_BEGIN/,/$MARK_END/d" "$DHCPCD_CONF" || true
    systemctl restart dhcpcd 2>/dev/null || true
  fi
}

configure_eth_static() {
  if configure_nm_eth; then
    log "configured $IF_LAN via NetworkManager profile '$NM_CON_NAME'"
    return 0
  fi
  if configure_dhcpcd_eth; then
    log "configured $IF_LAN via dhcpcd ($DHCPCD_CONF)"
    return 0
  fi
  die "neither NetworkManager (active) nor $DHCPCD_CONF found; set $IF_LAN to ${LAN_IP}/${LAN_PREFIX} manually"
}

remove_eth_static() {
  remove_nm_eth
  remove_dhcpcd_block
}

do_install() {
  need_root
  log "installing packages (dnsmasq, nftables)…"
  apt_install

  log "writing sysctl, dnsmasq, nftables snippets…"
  write_sysctl
  write_dnsmasq
  write_nft

  log "setting static IP on $IF_LAN…"
  configure_eth_static

  log "restarting dnsmasq…"
  systemctl enable dnsmasq
  systemctl restart dnsmasq

  log "loading NAT rules and enabling nftables…"
  apply_nft
  systemctl enable nftables 2>/dev/null || true
  systemctl restart nftables 2>/dev/null || true

  log "done. Connect $IF_LAN to the external AP (DHCP off on the AP)."
  log "Join Wi-Fi on $IF_WAN to the uplink network and complete any captive portal on the Pi."
}

do_remove() {
  need_root
  remove_eth_static
  remove_dnsmasq
  systemctl restart dnsmasq 2>/dev/null || true

  remove_nft
  systemctl restart nftables 2>/dev/null || true

  remove_sysctl
  sysctl -w net.ipv4.ip_forward=0 2>/dev/null || true

  log "removed pi-eth-lan-router configuration snippets and NM profile '$NM_CON_NAME' (if present)."
}

usage() {
  cat <<EOF
Usage: sudo $0 install|remove

WAN (Wi-Fi client):   $IF_WAN
LAN (Ethernet to AP): $IF_LAN
LAN address:          ${LAN_IP}/${LAN_PREFIX}
DHCP range:           $DHCP_START – $DHCP_END

Override with environment variables (see script header).
EOF
}

case "${1:-}" in
  install) do_install ;;
  remove)  do_remove ;;
  *) usage; exit 1 ;;
esac
@@ -1,5 +1,38 @@
#!/usr/bin/env bash
# Start the LED controller web server (port 80 by default).
cd "$(dirname "$0")/.."
# Avoid `pipenv run` on the hot path — it re-resolves the env every time and is slow on a Pi.
set -euo pipefail

ROOT="$(cd "$(dirname "$0")/.." && pwd)"
cd "$ROOT"
export PORT="${PORT:-80}"
pipenv run run
export PIPENV_VENV_IN_PROJECT="${PIPENV_VENV_IN_PROJECT:-1}"

SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
CACHE="$SCRIPT_DIR/.led-controller-venv"
PYTHON=""

if [ -x "$ROOT/.venv/bin/python" ]; then
  PYTHON="$ROOT/.venv/bin/python"
elif [ -f "$CACHE" ]; then
  _v="$(tr -d '\r\n' < "$CACHE")"
  if [ -n "$_v" ] && [ -x "$_v/bin/python" ]; then
    PYTHON="$_v/bin/python"
  fi
fi

if [ -z "$PYTHON" ] && command -v pipenv >/dev/null 2>&1; then
  _v="$(cd "$ROOT" && pipenv --venv 2>/dev/null || true)"
  if [ -n "${_v:-}" ] && [ -x "$_v/bin/python" ]; then
    PYTHON="$_v/bin/python"
    printf '%s\n' "$_v" > "$CACHE" || true
  fi
fi

if [ -z "$PYTHON" ]; then
  echo 'led-controller: no venv resolved; using pipenv run (slow). Run: cd '"$ROOT"' && PIPENV_VENV_IN_PROJECT=1 pipenv install --skip-lock --python "$(command -v python3)"' >&2
  exec pipenv run run
fi

cd "$ROOT/src"
exec "$PYTHON" -u main.py
@@ -1,29 +1,331 @@
from microdot import Microdot
from models.device import Device
from models.device import (
    Device,
    derive_device_mac,
    normalize_mac,
    validate_device_transport,
    validate_device_type,
)
from models.group import Group
from models.transport import get_current_sender
from settings import Settings
from util.brightness_combine import effective_brightness_for_mac
from models.wifi_ws_clients import (
    normalize_tcp_peer_ip,
    send_json_line_to_ip,
    tcp_client_connected,
)
from util.driver_patterns import driver_patterns_dir
from util.espnow_message import build_message
import asyncio
import json
import os
import socket
from urllib.parse import quote

# Ephemeral driver preset name (never written to Pi preset store; ``save`` not set on wire).
_IDENTIFY_PRESET_KEY = "__identify"

# Short-key payload: 10 Hz full cycle = 50 ms on + 50 ms off (driver ``blink`` toggles each ``d`` ms).
_IDENTIFY_DRIVER_PRESET = {
    "p": "blink",
    "c": ["#ff0000"],
    "d": 50,
    "b": 128,
    "a": True,
    "n1": 0,
    "n2": 0,
    "n3": 0,
    "n4": 0,
    "n5": 0,
    "n6": 0,
}


def _compact_v1_json(*, presets=None, select=None, save=False):
    """Single-line v1 object; compact so serial/ESP-NOW stays small."""
    body = {"v": "1"}
    if presets is not None:
        body["presets"] = presets
    if save:
        body["save"] = True
    if select is not None:
        body["select"] = select
    return json.dumps(body, separators=(",", ":"))
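The wire line this helper emits can be exercised standalone; a minimal sketch (the `desk` device name is a placeholder, and the preset fields mirror the constants above):

```python
import json

def compact_v1_json(*, presets=None, select=None, save=False):
    # Mirror of _compact_v1_json: build the v1 body in a fixed key order,
    # then dump without spaces so the line stays small on serial/ESP-NOW.
    body = {"v": "1"}
    if presets is not None:
        body["presets"] = presets
    if save:
        body["save"] = True
    if select is not None:
        body["select"] = select
    return json.dumps(body, separators=(",", ":"))

line = compact_v1_json(
    presets={"__identify": {"p": "blink", "c": ["#ff0000"], "d": 50}},
    select={"desk": ["__identify"]},
)
# One line, no spaces, "save" omitted when False:
# {"v":"1","presets":{"__identify":{"p":"blink","c":["#ff0000"],"d":50}},"select":{"desk":["__identify"]}}
```

Note that `save` is only put on the wire when it is truthy, which is what keeps the identify preset out of the driver's persisted store.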

# Seconds after identify blink before selecting built-in ``off`` (tests may monkeypatch).
IDENTIFY_OFF_DELAY_S = 2.0


def _validate_output_brightness(value):
    if value is None:
        return None
    try:
        b = int(value)
    except (TypeError, ValueError):
        raise ValueError("output_brightness must be an integer 0–255")
    if b < 0 or b > 255:
        raise ValueError("output_brightness must be between 0 and 255")
    return b


def _brightness_save_message_json(b_val: int) -> str:
    b_val = max(0, min(255, int(b_val)))
    return json.dumps({"v": "1", "b": b_val, "save": True}, separators=(",", ":"))

controller = Microdot()
devices = Device()
_group_registry = Group()
_pi_settings = Settings()


def _device_live_connected(dev_dict):
    """
    Wi-Fi: whether the controller has an outbound WebSocket to this device's IP.
    ESP-NOW: None (no Wi-Fi session on the Pi for that transport).
    """
    tr = (dev_dict.get("transport") or "espnow").strip().lower()
    if tr != "wifi":
        return None
    ip = normalize_tcp_peer_ip(dev_dict.get("address") or "")
    if not ip:
        return False
    return tcp_client_connected(ip)


def _device_json_with_live_status(dev_dict):
    row = dict(dev_dict)
    row["connected"] = _device_live_connected(dev_dict)
    return row


def _safe_pattern_filename(name):
    if not isinstance(name, str):
        return False
    if not name.endswith(".py"):
        return False
    if "/" in name or "\\" in name or ".." in name:
        return False
    return True
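The filename guard above is what keeps the OTA push below from being steered outside the patterns directory. A standalone copy, checked against traversal-style names:

```python
def safe_pattern_filename(name):
    # Standalone copy of _safe_pattern_filename: accept only a plain
    # "<module>.py" name with no path separators or parent references.
    if not isinstance(name, str):
        return False
    if not name.endswith(".py"):
        return False
    if "/" in name or "\\" in name or ".." in name:
        return False
    return True

# Accepted: bare module names; rejected: traversal, subpaths, non-Python files.
```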

def _http_post_pattern_source(ip, filename, code_text, reload_patterns=True, timeout_s=10.0):
    """POST source to driver /patterns/upload?name=...&reload=...; return True on 2xx."""
    if not isinstance(ip, str) or not ip.strip():
        return False
    if not isinstance(filename, str) or not filename:
        return False
    if not isinstance(code_text, str):
        return False

    name_q = quote(filename, safe="")
    reload_q = "1" if reload_patterns else "0"
    path = "/patterns/upload?name=%s&reload=%s" % (name_q, reload_q)
    body = code_text.encode("utf-8")
    req = (
        "POST %s HTTP/1.1\r\n"
        "Host: %s\r\n"
        "Content-Type: text/plain; charset=utf-8\r\n"
        "Content-Length: %d\r\n"
        "Connection: close\r\n"
        "\r\n" % (path, ip, len(body))
    ).encode("utf-8") + body

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        sock.settimeout(timeout_s)
        sock.connect((ip.strip(), 80))
        sock.sendall(req)
        data = b""
        while True:
            chunk = sock.recv(1024)
            if not chunk:
                break
            data += chunk
    except OSError:
        return False
    finally:
        try:
            sock.close()
        except Exception:
            pass

    first_line = data.split(b"\r\n", 1)[0] if data else b""
    return b" 2" in first_line
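The final `b" 2" in first_line` check is deliberately crude: it succeeds for any status line containing a space followed by a `2`. A slightly stricter status-line parse, shown as a standalone sketch rather than a change to the committed code:

```python
def status_is_2xx(first_line: bytes) -> bool:
    # Parse "HTTP/1.1 200 OK" -> 200; treat anything malformed as failure.
    parts = first_line.split(b" ", 2)
    if len(parts) < 2 or not parts[0].startswith(b"HTTP/"):
        return False
    try:
        code = int(parts[1])
    except ValueError:
        return False
    return 200 <= code <= 299
```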

async def _identify_send_off_after_delay(sender, transport, wifi_ip, dev_id, name):
    try:
        await asyncio.sleep(IDENTIFY_OFF_DELAY_S)
        off_msg = build_message(select={name: ["off"]})
        if transport == "wifi":
            await send_json_line_to_ip(wifi_ip, off_msg)
        else:
            await sender.send(off_msg, addr=dev_id)
    except Exception:
        pass


async def send_identify_to_device(dev_id: str) -> tuple[int, str]:
    """
    Send the same identify blink as ``POST /devices/<id>/identify``.

    Returns ``(http_status, "")`` on success, or ``(status, error_message)`` on failure
    (status matches the single-device route).
    """
    dev = devices.read(dev_id)
    if not dev:
        return 404, "Device not found"
    sender = get_current_sender()
    if not sender:
        return 503, "Transport not configured"
    name = str(dev.get("name") or "").strip()
    if not name:
        return 400, "Device must have a name to identify"

    transport = dev.get("transport") or "espnow"
    wifi_ip = None
    if transport == "wifi":
        wifi_ip = dev.get("address")
        if not wifi_ip:
            return 400, "Device has no IP address"

    try:
        msg = _compact_v1_json(
            presets={_IDENTIFY_PRESET_KEY: dict(_IDENTIFY_DRIVER_PRESET)},
            select={name: [_IDENTIFY_PRESET_KEY]},
        )
        if transport == "wifi":
            ok = await send_json_line_to_ip(wifi_ip, msg)
            if not ok:
                return 503, "Wi-Fi driver not connected"
        else:
            await sender.send(msg, addr=dev_id)

        asyncio.create_task(
            _identify_send_off_after_delay(sender, transport, wifi_ip, dev_id, name)
        )
    except Exception as e:
        return 503, str(e)
    return 200, ""


async def send_identify_to_group_devices(macs: list[str]) -> tuple[int, list[dict]]:
    """
    Identify every listed registry MAC in one delivery round: merged ``select`` and a single
    ESP-NOW split envelope when multiple peers share the serial bridge (avoids per-device
    ``SerialSender`` lock serialisation). Wi-Fi peers are sent in parallel as in
    ``deliver_json_messages``.
    """
    from util.driver_delivery import deliver_json_messages

    errors: list[dict] = []
    sender = get_current_sender()
    if not sender:
        return 0, [{"mac": "*", "error": "Transport not configured"}]

    merged_select: dict[str, list[str]] = {}
    valid_macs: list[str] = []
    for dev_id in macs:
        dev = devices.read(dev_id)
        if not dev:
            errors.append({"mac": dev_id, "error": "Device not found"})
            continue
        name = str(dev.get("name") or "").strip()
        if not name:
            errors.append({"mac": dev_id, "error": "Device must have a name to identify"})
            continue
        transport = (dev.get("transport") or "espnow").strip().lower()
        if transport == "wifi":
            if not dev.get("address"):
                errors.append({"mac": dev_id, "error": "Device has no IP address"})
                continue
        merged_select[name] = [_IDENTIFY_PRESET_KEY]
        valid_macs.append(dev_id)

    if not merged_select:
        return 0, errors

    try:
        msg = _compact_v1_json(
            presets={_IDENTIFY_PRESET_KEY: dict(_IDENTIFY_DRIVER_PRESET)},
            select=merged_select,
        )
        await deliver_json_messages(sender, [msg], valid_macs, devices, delay_s=0)
    except Exception as e:
        return 0, errors + [{"mac": "*", "error": str(e)}]

    for dev_id in valid_macs:
        dev = devices.read(dev_id) or {}
        name = str(dev.get("name") or "").strip()
        transport = (dev.get("transport") or "espnow").strip().lower()
        wifi_ip = dev.get("address") if transport == "wifi" else None
        asyncio.create_task(
            _identify_send_off_after_delay(sender, transport, wifi_ip, dev_id, name)
        )

    return len(valid_macs), errors

@controller.get("")
async def list_devices(request):
    """List all devices."""
    """List all devices (includes ``connected`` for live Wi-Fi WebSocket presence)."""
    devices_data = {}
    for dev_id in devices.list():
        d = devices.read(dev_id)
        if d:
            devices_data[dev_id] = d
            devices_data[dev_id] = _device_json_with_live_status(d)
    return json.dumps(devices_data), 200, {"Content-Type": "application/json"}


@controller.post("/resolve-brightness")
async def resolve_brightness_batch(request):
    """
    POST JSON ``{ \"macs\": [\"..\"], \"zone_brightness\": optional 0–255 }``.
    Returns ``{ \"values\": { mac: combined_int } }`` — global × group(s) × device × zone (optional).
    """
    try:
        data = request.json or {}
    except Exception:
        data = {}
    macs = data.get("macs")
    if not isinstance(macs, list):
        return json.dumps({"error": "macs must be an array"}), 400, {
            "Content-Type": "application/json",
        }
    zb = None
    if isinstance(data, dict) and data.get("zone_brightness") is not None:
        try:
            zb = _validate_output_brightness(data.get("zone_brightness"))
        except ValueError as e:
            return json.dumps({"error": str(e)}), 400, {"Content-Type": "application/json"}
    values = {}
    for raw in macs:
        m = normalize_mac(str(raw))
        if not m:
            continue
        values[m] = effective_brightness_for_mac(
            _pi_settings,
            _group_registry,
            devices,
            m,
            zone_brightness=zb,
        )
    return json.dumps({"values": values}), 200, {"Content-Type": "application/json"}
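The docstring describes the result as a combination of global, group, device, and optional zone levels, but the combination rule itself lives in `util.brightness_combine` and is not shown in this diff. One plausible rule, purely as an illustration and not necessarily what `effective_brightness_for_mac` implements, is to treat each 0–255 level as a scale factor and multiply:

```python
def combine_brightness(*levels):
    # Hypothetical combination rule: treat each 0-255 level as a 0-1 scale
    # factor and multiply them, rounding back to 0-255. Illustration only;
    # the real rule lives in util.brightness_combine.
    out = 1.0
    for lv in levels:
        lv = max(0, min(255, int(lv)))
        out *= lv / 255.0
    return round(out * 255)

# Full brightness everywhere stays full; a half-bright group halves the result.
```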

@controller.get("/<id>")
async def get_device(request, id):
    """Get a device by ID."""
    """Get a device by ID (includes ``connected`` for live Wi-Fi WebSocket presence)."""
    dev = devices.read(id)
    if dev:
        return json.dumps(dev), 200, {"Content-Type": "application/json"}
    return json.dumps({"error": "Device not found"}), 404
    return json.dumps(_device_json_with_live_status(dev)), 200, {
        "Content-Type": "application/json",
    }
    return json.dumps({"error": "Device not found"}), 404, {
        "Content-Type": "application/json",
    }


@controller.post("")
@@ -32,37 +334,304 @@ async def create_device(request):
    try:
        data = request.json or {}
        name = data.get("name", "").strip()
        if not name:
            return json.dumps({"error": "name is required"}), 400, {
                "Content-Type": "application/json",
            }
        try:
            device_type = validate_device_type(data.get("type", "led"))
            transport = validate_device_transport(data.get("transport", "espnow"))
        except ValueError as e:
            return json.dumps({"error": str(e)}), 400, {
                "Content-Type": "application/json",
            }
        address = data.get("address")
        mac = data.get("mac")
        if derive_device_mac(mac=mac, address=address, transport=transport) is None:
            return json.dumps(
                {
                    "error": "mac is required (12 hex digits); for Wi-Fi include mac plus IP in address"
                }
            ), 400, {"Content-Type": "application/json"}
        default_pattern = data.get("default_pattern")
        tabs = data.get("tabs")
        if isinstance(tabs, list):
            tabs = [str(t) for t in tabs]
        zl = data.get("zones")
        if isinstance(zl, list):
            zl = [str(t) for t in zl]
        else:
            tabs = []
            dev_id = devices.create(name=name, address=address, default_pattern=default_pattern, tabs=tabs)
            zl = []
        dev_id = devices.create(
            name=name,
            address=address,
            mac=mac,
            default_pattern=default_pattern,
            zones=zl,
            device_type=device_type,
            transport=transport,
        )
        dev = devices.read(dev_id)
        return json.dumps({dev_id: dev}), 201, {"Content-Type": "application/json"}
    except ValueError as e:
        msg = str(e)
        code = 409 if "already exists" in msg.lower() else 400
        return json.dumps({"error": msg}), code, {"Content-Type": "application/json"}
    except Exception as e:
        return json.dumps({"error": str(e)}), 400
        return json.dumps({"error": str(e)}), 400, {"Content-Type": "application/json"}


@controller.put("/<id>")
async def update_device(request, id):
    """Update a device."""
    try:
        data = request.json or {}
        if "tabs" in data and isinstance(data["tabs"], list):
            data["tabs"] = [str(t) for t in data["tabs"]]
        raw = request.json or {}
        data = dict(raw)
        data.pop("id", None)
        data.pop("addresses", None)
        data.pop("connected", None)
        if "name" in data:
            n = (data.get("name") or "").strip()
            if not n:
                return json.dumps({"error": "name cannot be empty"}), 400, {
                    "Content-Type": "application/json",
                }
            data["name"] = n
        if "type" in data:
            data["type"] = validate_device_type(data.get("type"))
        if "transport" in data:
            data["transport"] = validate_device_transport(data.get("transport"))
        if "zones" in data and isinstance(data["zones"], list):
            data["zones"] = [str(t) for t in data["zones"]]
        if "output_brightness" in data:
            data["output_brightness"] = _validate_output_brightness(data.get("output_brightness"))
        prev_doc = devices.read(id)
        if devices.update(id, data):
            if prev_doc and "name" in data:
                on = str(prev_doc.get("name") or "").strip()
                nn = str(data.get("name") or "").strip()
                if on and nn and on != nn:
                    from util.beat_driver_route import remap_beat_route_device_name

                    remap_beat_route_device_name(on, nn)
            return json.dumps(devices.read(id)), 200, {"Content-Type": "application/json"}
        return json.dumps({"error": "Device not found"}), 404
        return json.dumps({"error": "Device not found"}), 404, {
            "Content-Type": "application/json",
        }
    except ValueError as e:
        return json.dumps({"error": str(e)}), 400, {"Content-Type": "application/json"}
    except Exception as e:
        return json.dumps({"error": str(e)}), 400
        return json.dumps({"error": str(e)}), 400, {"Content-Type": "application/json"}


@controller.delete("/<id>")
async def delete_device(request, id):
    """Delete a device."""
    if devices.delete(id):
        return json.dumps({"message": "Device deleted successfully"}), 200
        return (
            json.dumps({"message": "Device deleted successfully"}),
            200,
            {"Content-Type": "application/json"},
        )
    return json.dumps({"error": "Device not found"}), 404
    return json.dumps({"error": "Device not found"}), 404, {
        "Content-Type": "application/json",
    }


@controller.post("/<id>/identify")
async def identify_device(request, id):
    """
    One v1 JSON object: ``presets.__identify`` (``d``=50 ms → 10 Hz blink) plus ``select`` for
    this device name — same combined shape as profile sends the driver already accepts over TCP
    / ESP-NOW. No ``save``. After ``IDENTIFY_OFF_DELAY_S``, a background task selects ``off``.
    """
    status, err = await send_identify_to_device(id)
    if status == 200:
        return json.dumps({"message": "Identify sent"}), 200, {
            "Content-Type": "application/json",
        }
    return json.dumps({"error": err}), status, {"Content-Type": "application/json"}


@controller.post("/<id>/brightness")
async def push_device_output_brightness(request, id):
    """
    Push combined brightness to the driver: global × group(s) × device × optional ``zone_brightness``
    in JSON body — single ``b`` (``v``/``b``/``save``). Wi‑Fi or ESP‑NOW.
    """
    dev = devices.read(id)
    if not dev:
        return json.dumps({"error": "Device not found"}), 404, {
            "Content-Type": "application/json",
        }
    body = request.json or {}
    zb = None
    if isinstance(body, dict) and body.get("zone_brightness") is not None:
        try:
            zb = _validate_output_brightness(body.get("zone_brightness"))
        except ValueError as e:
            return json.dumps({"error": str(e)}), 400, {"Content-Type": "application/json"}
    b_val = effective_brightness_for_mac(
        _pi_settings,
        _group_registry,
        devices,
        id,
        zone_brightness=zb,
    )

    msg = _brightness_save_message_json(b_val)
    transport = (dev.get("transport") or "espnow").strip().lower()

    if transport == "wifi":
        ip = normalize_tcp_peer_ip(str(dev.get("address") or ""))
        if not ip:
            return json.dumps({"error": "Device has no IP address"}), 400, {
                "Content-Type": "application/json",
            }
        ok = await send_json_line_to_ip(ip, msg)
        if not ok:
            return json.dumps({"error": "Wi-Fi driver not connected"}), 503, {
                "Content-Type": "application/json",
            }
    else:
        sender = get_current_sender()
        if not sender:
            return json.dumps({"error": "Transport not configured"}), 503, {
                "Content-Type": "application/json",
            }
        try:
            await sender.send(msg, addr=id)
        except Exception as e:
            return json.dumps({"error": str(e)}), 503, {"Content-Type": "application/json"}

    return json.dumps({"message": "brightness sent", "brightness": b_val}), 200, {
        "Content-Type": "application/json",
    }


@controller.post("/<id>/driver-config")
async def push_driver_config(request, id):
    """
    Push ``device_config`` to a Wi‑Fi LED driver over WebSocket.
    Body JSON: optional ``name``, ``num_leds``, ``color_order``, ``startup_mode`` (default|last|off).
    """
    dev = devices.read(id)
    if not dev:
        return json.dumps({"error": "Device not found"}), 404, {
            "Content-Type": "application/json",
        }
    if (dev.get("transport") or "").lower() != "wifi":
        return json.dumps({"error": "driver-config is only for Wi-Fi devices"}), 400, {
            "Content-Type": "application/json",
        }
    wifi_ip = str(dev.get("address") or "").strip()
    if not wifi_ip:
        return json.dumps({"error": "Device has no IP address"}), 400, {
            "Content-Type": "application/json",
        }
    body = request.json or {}
    dc = {}
    if isinstance(body.get("name"), str) and body["name"].strip():
        dc["name"] = body["name"].strip()
    if "num_leds" in body:
        try:
            n = int(body["num_leds"])
            if 1 <= n <= 2048:
                dc["num_leds"] = n
        except (TypeError, ValueError):
            pass
    if isinstance(body.get("color_order"), str):
        co = body["color_order"].strip().lower()
        if co in ("rgb", "rbg", "grb", "gbr", "brg", "bgr"):
            dc["color_order"] = co
    if isinstance(body.get("startup_mode"), str):
        sm = body["startup_mode"].strip().lower()
        if sm in ("default", "last", "off"):
            dc["startup_mode"] = sm
    if not dc:
        return json.dumps(
            {
                "error": "Provide at least one of name, num_leds, color_order, startup_mode"
            }
        ), 400, {"Content-Type": "application/json"}
    msg = json.dumps(
        {"v": "1", "device_config": dc, "save": True}, separators=(",", ":")
    )
    ok = await send_json_line_to_ip(wifi_ip, msg)
    if not ok:
        return json.dumps({"error": "Wi-Fi driver not connected"}), 503, {
            "Content-Type": "application/json",
        }
    return json.dumps({"message": "driver-config sent"}), 200, {
        "Content-Type": "application/json",
    }
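The `device_config` message the route builds can be reproduced standalone; the field filtering below mirrors the route's validation (same `num_leds` bounds of 1–2048, same colour-order and startup-mode whitelists):

```python
import json

def build_device_config_message(name=None, num_leds=None, color_order=None, startup_mode=None):
    # Mirror of push_driver_config's filtering: keep only fields that pass
    # validation, then emit the compact {"v":"1","device_config":...,"save":true} line.
    dc = {}
    if isinstance(name, str) and name.strip():
        dc["name"] = name.strip()
    if num_leds is not None:
        try:
            n = int(num_leds)
            if 1 <= n <= 2048:
                dc["num_leds"] = n
        except (TypeError, ValueError):
            pass
    if isinstance(color_order, str) and color_order.strip().lower() in (
        "rgb", "rbg", "grb", "gbr", "brg", "bgr"
    ):
        dc["color_order"] = color_order.strip().lower()
    if isinstance(startup_mode, str) and startup_mode.strip().lower() in ("default", "last", "off"):
        dc["startup_mode"] = startup_mode.strip().lower()
    if not dc:
        return None  # the route answers 400 in this case
    return json.dumps({"v": "1", "device_config": dc, "save": True}, separators=(",", ":"))
```

Unlike the identify preset, this message always carries `save: true`, so the driver persists the configuration.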

@controller.post("/<id>/patterns/push")
async def push_patterns_ota(request, id):
    """
    Push all local pattern files directly to a Wi-Fi LED driver over HTTP upload.
    """
    dev = devices.read(id)
    if not dev:
        return json.dumps({"error": "Device not found"}), 404, {
            "Content-Type": "application/json",
        }
    if (dev.get("transport") or "").lower() != "wifi":
        return json.dumps({"error": "Pattern OTA push is only supported for Wi-Fi devices"}), 400, {
            "Content-Type": "application/json",
        }
    wifi_ip = str(dev.get("address") or "").strip()
    if not wifi_ip:
        return json.dumps({"error": "Device has no IP address"}), 400, {
            "Content-Type": "application/json",
        }

    base_dir = driver_patterns_dir()
    try:
        names = sorted(os.listdir(base_dir))
    except OSError as e:
        return json.dumps({"error": str(e)}), 500, {
            "Content-Type": "application/json",
        }

    files = [n for n in names if _safe_pattern_filename(n) and n != "__init__.py"]
    if not files:
        return json.dumps({"error": "No pattern files found"}), 404, {
            "Content-Type": "application/json",
        }

    sent = []
    failed = []
    total = len(files)
    for idx, filename in enumerate(files):
        path = os.path.join(base_dir, filename)
        try:
            with open(path, "r") as f:
                code = f.read()
        except OSError:
            failed.append(filename)
            continue
        reload_patterns = idx == (total - 1)
        ok = _http_post_pattern_source(
            wifi_ip,
            filename,
            code,
            reload_patterns=reload_patterns,
            timeout_s=10.0,
        )
        if ok:
            sent.append(filename)
        else:
            failed.append(filename)

    if not sent:
        return json.dumps({"error": "Wi-Fi driver did not accept pattern uploads", "failed": failed}), 503, {
            "Content-Type": "application/json",
        }

    return json.dumps({
        "message": "Pattern files uploaded",
        "sent_count": len(sent),
        "sent": sent,
        "failed": failed,
    }), 200, {
        "Content-Type": "application/json",
    }
@@ -1,50 +1,359 @@
from microdot import Microdot
from microdot.session import with_session
import asyncio
from models.group import Group
from models.device import Device
from models.transport import get_current_sender
from models.wifi_ws_clients import normalize_tcp_peer_ip, send_json_line_to_ip
from settings import Settings
from util.brightness_combine import effective_brightness_for_mac
import json

controller = Microdot()
groups = Group()
devices = Device()
_pi_settings = Settings()

@controller.get('')
async def list_groups(request):
    """List all groups."""
    return json.dumps(groups), 200, {'Content-Type': 'application/json'}

@controller.get('/<id>')
async def get_group(request, id):
    """Get a specific group by ID."""
def _group_doc_visible_for_profile(doc, profile_id):
    if not isinstance(doc, dict):
        return False
    scoped = doc.get("profile_id")
    if scoped is None:
        scoped = doc.get("profileId")
    if scoped is None or str(scoped).strip() == "":
        return True
    if not profile_id:
        return False
    return str(scoped).strip() == str(profile_id).strip()


def _filtered_groups_dict(session):
    from controllers.zone import get_current_profile_id

    pid = get_current_profile_id(session)
    out = {}
    for gid, doc in groups.items():
        if not isinstance(doc, dict):
            continue
        if _group_doc_visible_for_profile(doc, pid):
            out[str(gid)] = doc
    return out


@controller.get("")
@with_session
async def list_groups(request, session):
    """List groups visible for the current profile (shared + profile-scoped)."""
    return json.dumps(_filtered_groups_dict(session)), 200, {"Content-Type": "application/json"}


@controller.get("/<id>")
@with_session
async def get_group(request, session, id):
    """Get a specific group by ID (404 if scoped to another profile)."""
    group = groups.read(id)
    if group:
        return json.dumps(group), 200, {'Content-Type': 'application/json'}
    return json.dumps({"error": "Group not found"}), 404
    if not group or not isinstance(group, dict):
        return json.dumps({"error": "Group not found"}), 404
    from controllers.zone import get_current_profile_id

@controller.post('')
async def create_group(request):
    """Create a new group."""
    if not _group_doc_visible_for_profile(group, get_current_profile_id(session)):
        return json.dumps({"error": "Group not found"}), 404
    return json.dumps(group), 200, {"Content-Type": "application/json"}


def _sanitize_group_profile_id_write(data, session):
    """Allow ``profile_id`` only for the active profile, or null to share across profiles."""
    if not isinstance(data, dict):
        return
    from controllers.zone import get_current_profile_id

    cur = get_current_profile_id(session)
    if "profile_id" not in data and "profileId" not in data:
        return
    raw = data.get("profile_id")
    if raw is None and "profileId" in data:
        raw = data.get("profileId")
    if raw is None or raw == "":
        data.pop("profileId", None)
        data["profile_id"] = None
        return
    if not cur or str(raw).strip() != str(cur).strip():
        data.pop("profileId", None)
        data.pop("profile_id", None)
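The profile-scoping sanitiser has three distinct outcomes: keep a write naming the active profile, normalise an explicit null/empty value to "shared", and silently drop a write naming any other profile. A standalone copy with the session lookup replaced by an explicit `current_profile_id` argument:

```python
def sanitize_group_profile_id_write(data, current_profile_id):
    # Standalone copy of _sanitize_group_profile_id_write; the session-based
    # get_current_profile_id lookup is replaced by an explicit argument.
    if not isinstance(data, dict):
        return
    if "profile_id" not in data and "profileId" not in data:
        return
    raw = data.get("profile_id")
    if raw is None and "profileId" in data:
        raw = data.get("profileId")
    if raw is None or raw == "":
        # Explicit null/empty means "shared across profiles".
        data.pop("profileId", None)
        data["profile_id"] = None
        return
    if not current_profile_id or str(raw).strip() != str(current_profile_id).strip():
        # Writing another profile's id is silently dropped.
        data.pop("profileId", None)
        data.pop("profile_id", None)
```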
@controller.post("")
|
||||
@with_session
|
||||
async def create_group(request, session):
|
||||
"""Create a new group (omit ``profile_id`` for shared; or ``profile_scoped``: true for this profile only)."""
|
||||
try:
|
||||
data = request.json or {}
|
||||
data = dict(request.json or {})
|
||||
name = data.get("name", "")
|
||||
profile_scoped = bool(data.pop("profile_scoped", False))
|
||||
_sanitize_group_profile_id_write(data, session)
|
||||
group_id = groups.create(name)
|
||||
if data:
|
||||
groups.update(group_id, data)
|
||||
return json.dumps(groups.read(group_id)), 201, {'Content-Type': 'application/json'}
|
||||
if profile_scoped:
|
||||
from controllers.zone import get_current_profile_id
|
||||
|
||||
cur = get_current_profile_id(session)
|
||||
if cur:
|
||||
groups.update(group_id, {"profile_id": str(cur)})
|
||||
return json.dumps(groups.read(group_id)), 201, {"Content-Type": "application/json"}
|
||||
except Exception as e:
|
||||
return json.dumps({"error": str(e)}), 400
|
||||
|
||||
@controller.put('/<id>')
|
||||
async def update_group(request, id):
|
||||
|
||||
@controller.put("/<id>")
|
||||
@with_session
|
||||
async def update_group(request, session, id):
|
||||
"""Update an existing group."""
|
||||
try:
|
||||
data = request.json
|
||||
if not isinstance(data, dict):
|
||||
return json.dumps({"error": "Invalid JSON"}), 400, {"Content-Type": "application/json"}
|
||||
data = dict(data)
|
||||
_sanitize_group_profile_id_write(data, session)
|
||||
if groups.update(id, data):
|
||||
return json.dumps(groups.read(id)), 200, {'Content-Type': 'application/json'}
|
||||
g = groups.read(id)
|
||||
if g:
|
||||
return json.dumps(g), 200, {"Content-Type": "application/json"}
|
||||
return json.dumps({"error": "Group not found"}), 404
|
||||
except Exception as e:
|
||||
return json.dumps({"error": str(e)}), 400
|
||||
|
||||
@controller.delete('/<id>')
|
||||
async def delete_group(request, id):
|
||||
"""Delete a group."""
|
||||
@controller.delete("/<id>")
|
||||
@with_session
|
||||
async def delete_group(request, session, id):
|
||||
"""Delete a group (not allowed for another profile's scoped group)."""
|
||||
g = groups.read(id)
|
||||
if not g or not isinstance(g, dict):
|
||||
return json.dumps({"error": "Group not found"}), 404
|
||||
from controllers.zone import get_current_profile_id
|
||||
|
||||
if not _group_doc_visible_for_profile(g, get_current_profile_id(session)):
|
||||
return json.dumps({"error": "Group not found"}), 404
|
||||
if groups.delete(id):
|
||||
return json.dumps({"message": "Group deleted successfully"}), 200
|
||||
return json.dumps({"error": "Group not found"}), 404
|
||||
|
||||
|
||||
def _group_driver_config_payload(doc):
|
||||
"""Build ``device_config`` dict from stored group Wi‑Fi defaults (non-empty only)."""
|
||||
dc = {}
|
||||
if not isinstance(doc, dict):
|
||||
return dc
|
||||
nm = doc.get("wifi_driver_display_name")
|
||||
if isinstance(nm, str) and nm.strip():
|
||||
dc["name"] = nm.strip()
|
||||
nled = doc.get("wifi_driver_num_leds")
|
||||
if nled is not None:
|
||||
try:
|
||||
n = int(nled)
|
||||
if 1 <= n <= 2048:
|
||||
dc["num_leds"] = n
|
||||
except (TypeError, ValueError):
|
||||
pass
|
||||
co = doc.get("wifi_color_order")
|
||||
if isinstance(co, str):
|
||||
c = co.strip().lower()
|
||||
if c in ("rgb", "rbg", "grb", "gbr", "brg", "bgr"):
|
||||
dc["color_order"] = c
|
||||
sm = doc.get("wifi_startup_mode")
|
||||
if isinstance(sm, str):
|
||||
s = sm.strip().lower()
|
||||
if s in ("default", "last", "off"):
|
||||
dc["startup_mode"] = s
|
||||
return dc
|
||||
|
||||
|
||||
def _read_group_for_session(session, id):
|
||||
g = groups.read(id)
|
||||
if not g or not isinstance(g, dict):
|
||||
return None
|
||||
from controllers.zone import get_current_profile_id
|
||||
|
||||
if not _group_doc_visible_for_profile(g, get_current_profile_id(session)):
|
||||
return None
|
||||
return g
|
||||
|
||||
|
||||
@controller.post("/<id>/driver-config")
|
||||
@with_session
|
||||
async def push_group_driver_config(request, session, id):
|
||||
"""
|
||||
    Push group Wi‑Fi defaults to every Wi‑Fi device listed in the group (TCP WebSocket).

    Uses stored ``wifi_*`` fields on the group; optional JSON body may override for this send only.
    """
    gdoc = _read_group_for_session(session, id)
    if not gdoc:
        return json.dumps({"error": "Group not found"}), 404

    body = request.json or {}
    merged = dict(gdoc)
    if isinstance(body, dict):
        for k in (
            "wifi_driver_display_name",
            "wifi_driver_num_leds",
            "wifi_color_order",
            "wifi_startup_mode",
        ):
            if k in body:
                merged[k] = body[k]
    dc = _group_driver_config_payload(merged)
    if not dc:
        return json.dumps(
            {"error": "No driver defaults on this group (set display name, LEDs, colour order, or power-on pattern)"}
        ), 400, {"Content-Type": "application/json"}

    mac_list = gdoc.get("devices") if isinstance(gdoc.get("devices"), list) else []
    sent = 0
    errors = []
    msg = json.dumps(
        {"v": "1", "device_config": dc, "save": True}, separators=(",", ":")
    )
    tasks = []
    meta_macs = []
    for mac in mac_list:
        m = str(mac).strip().lower().replace(":", "").replace("-", "")
        if len(m) != 12:
            continue
        dev = devices.read(m)
        if not dev:
            errors.append({"mac": m, "error": "not in registry"})
            continue
        if (dev.get("transport") or "").lower() != "wifi":
            continue
        ip = normalize_tcp_peer_ip(str(dev.get("address") or ""))
        if not ip:
            errors.append({"mac": m, "error": "no IP"})
            continue
        tasks.append(send_json_line_to_ip(ip, msg))
        meta_macs.append(m)
    if tasks:
        results = await asyncio.gather(*tasks, return_exceptions=True)
        for m, r in zip(meta_macs, results):
            if r is True:
                sent += 1
            elif isinstance(r, Exception):
                errors.append({"mac": m, "error": str(r)})
            else:
                errors.append({"mac": m, "error": "driver not connected"})

    return json.dumps(
        {"message": "driver-config sent", "sent": sent, "errors": errors}
    ), 200, {"Content-Type": "application/json"}
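The member loop accepts `AA:BB:..`, `aa-bb-..`, or bare-hex MACs by normalizing before the registry lookup. A standalone sketch of that normalization (the helper name here is illustrative, not from the codebase):

```python
def normalize_mac_key(raw):
    """Lower-case hex MAC with ':' and '-' stripped; '' unless exactly 12 chars remain."""
    m = str(raw).strip().lower().replace(":", "").replace("-", "")
    return m if len(m) == 12 else ""

print(normalize_mac_key("AA:BB:CC:DD:EE:FF"))  # aabbccddeeff
print(normalize_mac_key("aa-bb-cc-dd-ee-ff"))  # aabbccddeeff
print(normalize_mac_key("not-a-mac"))          # (empty string)
```

Note that, like the endpoint, this only checks length, not that the 12 characters are hex digits.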


def _brightness_save_message_json(b_val: int) -> str:
    b_val = max(0, min(255, int(b_val)))
    return json.dumps({"v": "1", "b": b_val, "save": True}, separators=(",", ":"))
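The clamp and compact separators can be checked in isolation (re-implemented here for illustration):

```python
import json

def brightness_save_message_json(b_val):
    # Clamp to the driver's 0-255 range, then emit compact v1 JSON.
    b_val = max(0, min(255, int(b_val)))
    return json.dumps({"v": "1", "b": b_val, "save": True}, separators=(",", ":"))

print(brightness_save_message_json(300))  # {"v":"1","b":255,"save":true}
print(brightness_save_message_json(-5))   # {"v":"1","b":0,"save":true}
```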


@controller.post("/<id>/brightness")
@with_session
async def push_group_output_brightness(request, session, id):
    """
    Push combined brightness (global × group(s) × device) to each member — one ``b`` per device.
    """
    gdoc = _read_group_for_session(session, id)
    if not gdoc:
        return json.dumps({"error": "Group not found"}), 404

    mac_list = gdoc.get("devices") if isinstance(gdoc.get("devices"), list) else []
    sent = 0
    errors = []
    sender = get_current_sender()

    async def _push_brightness_one(m: str, dev: dict) -> tuple[str, bool, str | None]:
        b_val = effective_brightness_for_mac(
            _pi_settings,
            groups,
            devices,
            m,
            zone_brightness=None,
        )
        msg = _brightness_save_message_json(b_val)
        transport = (dev.get("transport") or "espnow").strip().lower()
        if transport == "wifi":
            ip = normalize_tcp_peer_ip(str(dev.get("address") or ""))
            if not ip:
                return m, False, "no IP"
            ok = await send_json_line_to_ip(ip, msg)
            return m, bool(ok), None if ok else "driver not connected"
        if not sender:
            return m, False, "transport not configured"
        try:
            await sender.send(msg, addr=m)
            return m, True, None
        except Exception as e:
            return m, False, str(e)

    tasks: list = []
    for mac in mac_list:
        m = str(mac).strip().lower().replace(":", "").replace("-", "")
        if len(m) != 12:
            continue
        dev = devices.read(m)
        if not dev:
            errors.append({"mac": m, "error": "not in registry"})
            continue
        tasks.append(_push_brightness_one(m, dev))

    if tasks:
        results = await asyncio.gather(*tasks, return_exceptions=True)
        for r in results:
            if isinstance(r, Exception):
                errors.append({"mac": "*", "error": str(r)})
                continue
            m, ok, err = r
            if ok:
                sent += 1
            elif err:
                errors.append({"mac": m, "error": err})

    return json.dumps(
        {"message": "brightness sent", "sent": sent, "errors": errors}
    ), 200, {"Content-Type": "application/json"}
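The fan-out shape above (one coroutine per member, `asyncio.gather(..., return_exceptions=True)`, then a tally pass) can be exercised with stand-in coroutines:

```python
import asyncio

async def push_one(mac, ok):
    # Stand-in for a per-device send; raises for one device to show error capture.
    if not ok:
        raise RuntimeError("send failed")
    return mac, True, None

async def main():
    tasks = [push_one("aabbccddeeff", True), push_one("112233445566", False)]
    # return_exceptions=True keeps one failing device from aborting the whole batch.
    results = await asyncio.gather(*tasks, return_exceptions=True)
    sent, errors = 0, []
    for r in results:
        if isinstance(r, Exception):
            errors.append(str(r))
            continue
        mac, ok, err = r
        if ok:
            sent += 1
    return sent, errors

print(asyncio.run(main()))  # (1, ['send failed'])
```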


@controller.post("/<id>/identify")
@with_session
async def identify_group_devices(request, session, id):
    """
    Run the same identify blink as ``POST /devices/<id>/identify`` for every registry member
    in parallel so all drivers in the group blink together.
    """
    _ = request
    gdoc = _read_group_for_session(session, id)
    if not gdoc:
        return json.dumps({"error": "Group not found"}), 404, {"Content-Type": "application/json"}

    mac_list = gdoc.get("devices") if isinstance(gdoc.get("devices"), list) else []
    if not mac_list:
        return json.dumps({"error": "Group has no devices"}), 400, {"Content-Type": "application/json"}

    from controllers.device import send_identify_to_group_devices

    normalized: list[str] = []
    errors: list[dict] = []
    for mac in mac_list:
        m = str(mac).strip().lower().replace(":", "").replace("-", "")
        if len(m) != 12:
            errors.append({"mac": str(mac), "error": "invalid MAC"})
            continue
        normalized.append(m)

    if not normalized:
        return json.dumps(
            {"message": "identify group done", "sent": 0, "errors": errors}
        ), 200, {"Content-Type": "application/json"}

    sent, batch_errors = await send_identify_to_group_devices(normalized)
    errors.extend(batch_errors)

    return json.dumps(
        {"message": "identify group done", "sent": sent, "errors": errors}
    ), 200, {"Content-Type": "application/json"}
189
src/controllers/led_tool.py
Normal file
@@ -0,0 +1,189 @@
import json
import os
import subprocess
import sys

from microdot import Microdot
from serial.tools import list_ports

controller = Microdot()


def _repo_root() -> str:
    return os.path.abspath(os.path.join(os.path.dirname(__file__), "..", ".."))


def _led_cli_path() -> str:
    return os.path.join(_repo_root(), "led-tool", "cli.py")


def _build_led_cli_command(port: str, payload: dict):
    cmd = [sys.executable, _led_cli_path(), "--port", port]

    flag_map = (
        ("name", "--name"),
        ("led_pin", "--pin"),
        ("num_leds", "--leds"),
        ("brightness", "--brightness"),
        ("transport", "--transport"),
        ("ssid", "--ssid"),
        ("password", "--wifi-password"),
        ("wifi_channel", "--wifi-channel"),
        ("default", "--default"),
    )

    for key, flag in flag_map:
        value = payload.get(key)
        if value is None:
            continue
        value_str = str(value).strip()
        if value_str == "":
            continue
        cmd.extend([flag, value_str])

    return cmd
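A trimmed re-implementation of the flag mapping shows how `None` and blank values are dropped so the CLI keeps its own defaults (the path, port, and flag subset below are illustrative):

```python
import sys

FLAG_MAP = (
    ("name", "--name"),
    ("num_leds", "--leds"),
    ("brightness", "--brightness"),
)

def build_cli_command(cli_path, port, payload):
    # Skip None and empty-string values; everything else is stringified.
    cmd = [sys.executable, cli_path, "--port", port]
    for key, flag in FLAG_MAP:
        value = payload.get(key)
        if value is None:
            continue
        value_str = str(value).strip()
        if value_str == "":
            continue
        cmd.extend([flag, value_str])
    return cmd

cmd = build_cli_command("cli.py", "/dev/ttyUSB0", {"name": "Desk", "num_leds": 60, "brightness": ""})
# cmd[2:] == ["--port", "/dev/ttyUSB0", "--name", "Desk", "--leds", "60"]
```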


def _run_led_cli_command(cmd, cli_path: str, timeout_s=180):
    try:
        result = subprocess.run(
            cmd,
            capture_output=True,
            text=True,
            timeout=timeout_s,
            cwd=os.path.dirname(cli_path),
        )
    except subprocess.TimeoutExpired:
        return (
            json.dumps({"error": "led-tool command timed out after %s seconds" % timeout_s}),
            504,
            {"Content-Type": "application/json"},
        )
    except Exception as exc:
        return (
            json.dumps({"error": str(exc)}),
            500,
            {"Content-Type": "application/json"},
        )

    return (
        json.dumps(
            {
                "ok": result.returncode == 0,
                "returncode": result.returncode,
                "stdout": result.stdout,
                "stderr": result.stderr,
                "command": cmd,
            }
        ),
        200,
        {"Content-Type": "application/json"},
    )


def _extract_settings_from_stdout(stdout: str):
    text = (stdout or "").strip()
    if not text:
        return None
    try:
        parsed = json.loads(text)
        return parsed if isinstance(parsed, dict) else None
    except Exception:
        return None
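The settings extractor only trusts a top-level JSON object; everything else maps to `None` (re-implemented here for illustration):

```python
import json

def extract_settings_from_stdout(stdout):
    # Only a top-level JSON object counts as settings; lists, scalars, and
    # unparseable text all yield None.
    text = (stdout or "").strip()
    if not text:
        return None
    try:
        parsed = json.loads(text)
        return parsed if isinstance(parsed, dict) else None
    except Exception:
        return None

print(extract_settings_from_stdout('{"num_leds": 60}'))  # {'num_leds': 60}
print(extract_settings_from_stdout("[1, 2]"))            # None
print(extract_settings_from_stdout("not json"))          # None
```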


@controller.get("/ports")
async def list_serial_ports(request):
    ports = []
    for info in list_ports.comports():
        ports.append(
            {
                "device": info.device,
                "description": info.description,
                "hwid": info.hwid,
            }
        )
    return (
        json.dumps(
            {
                "ports": ports,
                "led_cli_exists": os.path.exists(_led_cli_path()),
            }
        ),
        200,
        {"Content-Type": "application/json"},
    )


@controller.post("/settings")
async def apply_settings(request):
    data = request.json or {}
    port = str(data.get("port") or "").strip()
    if not port:
        return (
            json.dumps({"error": "port is required"}),
            400,
            {"Content-Type": "application/json"},
        )

    cli_path = _led_cli_path()
    if not os.path.exists(cli_path):
        return (
            json.dumps({"error": "led-tool/cli.py not found"}),
            500,
            {"Content-Type": "application/json"},
        )

    cmd = _build_led_cli_command(port, data) + ["--follow"]
    return _run_led_cli_command(cmd, cli_path, timeout_s=None)


@controller.post("/reset")
@controller.post("/reset/")
async def reset_device(request):
    data = request.json or {}
    port = str(data.get("port") or "").strip()
    if not port:
        return (
            json.dumps({"error": "port is required"}),
            400,
            {"Content-Type": "application/json"},
        )

    cli_path = _led_cli_path()
    if not os.path.exists(cli_path):
        return (
            json.dumps({"error": "led-tool/cli.py not found"}),
            500,
            {"Content-Type": "application/json"},
        )

    cmd = [sys.executable, cli_path, "--port", port, "--reset", "--follow"]
    return _run_led_cli_command(cmd, cli_path, timeout_s=None)


@controller.get("/settings")
async def read_settings(request):
    port = str(request.args.get("port") or "").strip()
    if not port:
        return (
            json.dumps({"error": "port is required"}),
            400,
            {"Content-Type": "application/json"},
        )

    cli_path = _led_cli_path()
    if not os.path.exists(cli_path):
        return (
            json.dumps({"error": "led-tool/cli.py not found"}),
            500,
            {"Content-Type": "application/json"},
        )

    cmd = [sys.executable, cli_path, "--port", port, "--show"]
    body, status, headers = _run_led_cli_command(cmd, cli_path)
    if status != 200:
        return body, status, headers
    data = json.loads(body)
    data["settings"] = _extract_settings_from_stdout(data.get("stdout") or "")
    return json.dumps(data), status, headers
@@ -1,19 +1,113 @@
from microdot import Microdot
from models.pattern import Pattern
from models.device import Device
from util.driver_patterns import (
    driver_patterns_dir,
    is_firmware_builtin_pattern_module,
    normalize_pattern_py_filename,
)
import json
import sys
import re
import os
import socket
from urllib.parse import quote

controller = Microdot()
patterns = Pattern()


def _project_root():
    """Project root (parent of ``src/``). CWD is often ``src/`` when running ``main.py``."""
    here = os.path.dirname(os.path.abspath(__file__))
    return os.path.abspath(os.path.join(here, "..", ".."))


def _safe_pattern_filename(name):
    if not isinstance(name, str):
        return False
    if not name.endswith(".py"):
        return False
    if "/" in name or "\\" in name or ".." in name:
        return False
    return True
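The filename guard in isolation (same checks, standalone name): it rejects non-strings, non-`.py` names, and anything resembling path traversal.

```python
def safe_pattern_filename(name):
    # Only plain "<basename>.py" strings pass; separators and ".." are refused.
    if not isinstance(name, str):
        return False
    if not name.endswith(".py"):
        return False
    if "/" in name or "\\" in name or ".." in name:
        return False
    return True

print(safe_pattern_filename("sparkle.py"))  # True
print(safe_pattern_filename("../evil.py"))  # False
print(safe_pattern_filename("sub/pat.py"))  # False
```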


_PATTERN_KEY_RE = re.compile(r"^[a-zA-Z_][a-zA-Z0-9_]{0,63}$")


def _normalize_pattern_key(raw):
    """Pattern id / module basename (no .py)."""
    if not isinstance(raw, str):
        return ""
    s = raw.strip()
    if s.lower().endswith(".py"):
        s = s[:-3].strip()
    return s


def _valid_pattern_key(key):
    return bool(key and _PATTERN_KEY_RE.match(key))
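Key normalization plus the identifier regex, exercised standalone:

```python
import re

PATTERN_KEY_RE = re.compile(r"^[a-zA-Z_][a-zA-Z0-9_]{0,63}$")

def normalize_pattern_key(raw):
    # Trim whitespace and a trailing ".py" so "sparkle.py" and "sparkle" share one key.
    if not isinstance(raw, str):
        return ""
    s = raw.strip()
    if s.lower().endswith(".py"):
        s = s[:-3].strip()
    return s

print(normalize_pattern_key(" sparkle.py "))     # sparkle
print(bool(PATTERN_KEY_RE.match("my_pattern")))  # True
print(bool(PATTERN_KEY_RE.match("2fast")))       # False
```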


def _http_post_pattern_source(ip, filename, code_text, reload_patterns=True, timeout_s=10.0):
    """POST source to driver /patterns/upload?name=...&reload=...; return True on 2xx."""
    if not isinstance(ip, str) or not ip.strip():
        return False
    if not isinstance(filename, str) or not filename:
        return False
    if not isinstance(code_text, str):
        return False

    name_q = quote(filename, safe="")
    reload_q = "1" if reload_patterns else "0"
    path = "/patterns/upload?name=%s&reload=%s" % (name_q, reload_q)
    body = code_text.encode("utf-8")
    req = (
        "POST %s HTTP/1.1\r\n"
        "Host: %s\r\n"
        "Content-Type: text/plain; charset=utf-8\r\n"
        "Content-Length: %d\r\n"
        "Connection: close\r\n"
        "\r\n" % (path, ip, len(body))
    ).encode("utf-8") + body

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        sock.settimeout(timeout_s)
        sock.connect((ip.strip(), 80))
        sock.sendall(req)
        data = b""
        while True:
            chunk = sock.recv(1024)
            if not chunk:
                break
            data += chunk
    except OSError:
        return False
    finally:
        try:
            sock.close()
        except Exception:
            pass

    first_line = data.split(b"\r\n", 1)[0] if data else b""
    # Accept any 2xx status.
    return b" 2" in first_line
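The hand-rolled HTTP/1.1 framing can be verified without a socket by building the request bytes alone (the IP and filename below are examples):

```python
from urllib.parse import quote

def build_upload_request(ip, filename, code_text, reload_patterns=True):
    # HTTP/1.1 framing is newline-sensitive: CRLF after each header,
    # a blank line to end the headers, then the raw body.
    path = "/patterns/upload?name=%s&reload=%s" % (
        quote(filename, safe=""),
        "1" if reload_patterns else "0",
    )
    body = code_text.encode("utf-8")
    head = (
        "POST %s HTTP/1.1\r\n"
        "Host: %s\r\n"
        "Content-Type: text/plain; charset=utf-8\r\n"
        "Content-Length: %d\r\n"
        "Connection: close\r\n"
        "\r\n" % (path, ip, len(body))
    )
    return head.encode("utf-8") + body

req = build_upload_request("192.0.2.10", "sparkle.py", "x = 1\n")
# The request line carries the quoted filename; Content-Length matches the body.
```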

def load_pattern_definitions():
    """Load pattern definitions from pattern.json file."""
    try:
        # Try different paths for local development vs MicroPython
        paths = ['db/pattern.json', 'pattern.json', '/db/pattern.json']
        root = _project_root()
        paths = [
            os.path.join(root, "db", "pattern.json"),
            os.path.join(root, "pattern.json"),
            "db/pattern.json",
            "pattern.json",
            "/db/pattern.json",
        ]
        for path in paths:
            try:
                with open(path, 'r') as f:
                with open(path, "r") as f:
                    return json.load(f)
            except OSError:
                continue
@@ -22,16 +116,341 @@ def load_pattern_definitions():
        print(f"Error loading pattern.json: {e}")
        return {}


def load_driver_pattern_names():
    """List available pattern module names from led-driver/src/patterns."""
    try:
        names = []
        for filename in os.listdir(driver_patterns_dir()):
            if not _safe_pattern_filename(filename) or filename == "__init__.py":
                continue
            names.append(filename[:-3])
        names.sort()
        return names
    except OSError:
        return []


def build_runtime_pattern_map():
    """
    Runtime pattern map for UI menus.
    Keep pattern DB metadata as primary, then add any local driver pattern files
    missing from the DB so new OTA files still appear in menus.
    """
    definitions = load_pattern_definitions()
    available = load_driver_pattern_names()
    result = {}
    for name, meta in definitions.items():
        result[name] = dict(meta) if isinstance(meta, dict) else {}
    for name in available:
        if name not in result:
            result[name] = {}
    return result

@controller.get('/definitions')
async def get_pattern_definitions(request):
    """Get pattern definitions from pattern.json."""
    definitions = load_pattern_definitions()
    """Get definitions for patterns currently available on the driver."""
    definitions = build_runtime_pattern_map()
    return json.dumps(definitions), 200, {'Content-Type': 'application/json'}
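The merge rule (DB metadata wins, driver-only files appear with empty metadata) in a standalone form, with the two lookups passed in as plain arguments:

```python
def build_runtime_pattern_map(definitions, available):
    # DB metadata is primary; any driver-only module name is added with {}.
    result = {}
    for name, meta in definitions.items():
        result[name] = dict(meta) if isinstance(meta, dict) else {}
    for name in available:
        if name not in result:
            result[name] = {}
    return result

m = build_runtime_pattern_map({"sparkle": {"max_colors": 3}}, ["sparkle", "comet"])
# m == {"sparkle": {"max_colors": 3}, "comet": {}}
```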


@controller.get('/ota/manifest')
async def ota_manifest(request):
    """Manifest of driver pattern source files for OTA pulls."""
    base_dir = driver_patterns_dir()
    host = request.headers.get("Host", "")
    if not host:
        return json.dumps({"error": "Missing Host header"}), 400, {
            "Content-Type": "application/json"
        }
    try:
        names = sorted(os.listdir(base_dir))
    except OSError as e:
        return json.dumps({"error": str(e)}), 500, {"Content-Type": "application/json"}

    files = []
    for name in names:
        if not _safe_pattern_filename(name) or name == "__init__.py":
            continue
        files.append({
            "name": name,
            "url": "http://%s/patterns/ota/file/%s" % (host, name),
        })

    return json.dumps({"files": files}), 200, {"Content-Type": "application/json"}


@controller.get('/ota/file/<name>')
async def ota_pattern_file(request, name):
    """Serve one driver pattern source file for OTA pulls."""
    fname = normalize_pattern_py_filename(name)
    if not fname or not _safe_pattern_filename(fname) or fname == "__init__.py":
        return json.dumps({"error": "Invalid filename"}), 400, {
            "Content-Type": "application/json"
        }
    if is_firmware_builtin_pattern_module(fname):
        return json.dumps(
            {
                "error": "on and off are built into the driver firmware; there is no module file to serve.",
            }
        ), 400, {
            "Content-Type": "application/json"
        }
    base = driver_patterns_dir()
    path = os.path.join(base, fname)
    try:
        with open(path, "r") as f:
            content = f.read()
    except OSError:
        return json.dumps(
            {
                "error": "Pattern file not found",
                "path": path,
                "hint": "Ensure led-driver is present or set LED_CONTROLLER_PATTERNS_DIR.",
            }
        ), 404, {
            "Content-Type": "application/json"
        }
    return content, 200, {"Content-Type": "text/plain; charset=utf-8"}


@controller.post('/<name>/send')
async def send_pattern_to_device(request, name):
    """Push one pattern source file directly to Wi-Fi driver(s) over HTTP."""
    if not isinstance(name, str):
        return json.dumps({"error": "Invalid pattern name"}), 400, {
            "Content-Type": "application/json"
        }
    filename = normalize_pattern_py_filename(name)
    if not filename or not _safe_pattern_filename(filename) or filename == "__init__.py":
        return json.dumps({"error": "Invalid pattern filename"}), 400, {
            "Content-Type": "application/json"
        }
    if is_firmware_builtin_pattern_module(filename):
        return json.dumps(
            {
                "error": "on and off are built into the driver firmware; send does not apply.",
            }
        ), 400, {
            "Content-Type": "application/json"
        }

    devices = Device()
    body = request.json or {}
    requested_device_id = str(body.get("device_id") or "").strip()

    base = driver_patterns_dir()
    path = os.path.join(base, filename)
    if not os.path.exists(path):
        return json.dumps(
            {
                "error": "Pattern file not found",
                "path": path,
                "hint": "Ensure led-driver is present or set LED_CONTROLLER_PATTERNS_DIR.",
            }
        ), 404, {
            "Content-Type": "application/json"
        }

    try:
        with open(path, "r") as f:
            source = f.read()
    except OSError as e:
        return json.dumps({"error": str(e)}), 500, {"Content-Type": "application/json"}
    target_ids = []
    if requested_device_id:
        dev = devices.read(requested_device_id)
        if not dev:
            return json.dumps({"error": "Device not found"}), 404, {
                "Content-Type": "application/json"
            }
        if (dev.get("transport") or "").lower() != "wifi":
            return json.dumps({"error": "Pattern send is only supported for Wi-Fi devices"}), 400, {
                "Content-Type": "application/json"
            }
        target_ids = [requested_device_id]
    else:
        for did in devices.list():
            dev = devices.read(did) or {}
            if (dev.get("transport") or "").lower() == "wifi":
                target_ids.append(str(did))
    if not target_ids:
        return json.dumps({"error": "No Wi-Fi devices found"}), 404, {
            "Content-Type": "application/json"
        }

    sent_ids = []
    for did in target_ids:
        dev = devices.read(did) or {}
        ip = str(dev.get("address") or "").strip()
        if not ip:
            continue
        ok = _http_post_pattern_source(ip, filename, source, reload_patterns=True, timeout_s=10.0)
        if ok:
            sent_ids.append(did)

    if not sent_ids:
        return json.dumps({"error": "No Wi-Fi drivers accepted pattern upload"}), 503, {
            "Content-Type": "application/json"
        }
    return json.dumps({"message": "Pattern sent", "pattern": filename, "device_ids": sent_ids, "sent_count": len(sent_ids)}), 200, {
        "Content-Type": "application/json"
    }


@controller.post('/upload')
async def upload_pattern_file(request):
    """
    Upload a pattern source file to led-controller local storage.

    Body JSON:
        {
            "name": "sparkle.py" | "sparkle",
            "code": "class Sparkle: ...",
            "overwrite": true | false   # optional, default true
        }
    """
    data = request.json or {}
    raw_name = data.get("name") or data.get("filename")
    code = data.get("code")
    overwrite = data.get("overwrite", True)
    overwrite = bool(overwrite)

    if not isinstance(raw_name, str) or not raw_name.strip():
        return json.dumps({"error": "name is required"}), 400, {
            "Content-Type": "application/json"
        }
    filename = raw_name.strip()
    if not filename.endswith(".py"):
        filename += ".py"
    if not _safe_pattern_filename(filename) or filename == "__init__.py":
        return json.dumps({"error": "invalid pattern filename"}), 400, {
            "Content-Type": "application/json"
        }
    if is_firmware_builtin_pattern_module(filename):
        return json.dumps(
            {"error": "on and off are built into the driver firmware; use a different pattern name."}
        ), 400, {
            "Content-Type": "application/json"
        }
    if not isinstance(code, str) or not code.strip():
        return json.dumps({"error": "code is required"}), 400, {
            "Content-Type": "application/json"
        }

    path = os.path.join(driver_patterns_dir(), filename)
    exists = os.path.exists(path)
    if exists and not overwrite:
        return json.dumps({"error": "pattern file already exists", "name": filename}), 409, {
            "Content-Type": "application/json"
        }

    try:
        with open(path, "w") as f:
            f.write(code)
    except OSError as e:
        return json.dumps({"error": str(e)}), 500, {"Content-Type": "application/json"}

    return json.dumps({
        "message": "Pattern uploaded",
        "name": filename,
        "overwrote": bool(exists),
    }), 201, {"Content-Type": "application/json"}


@controller.post('/driver')
async def create_driver_pattern(request):
    """
    Create a driver pattern: save ``.py`` under led-driver/src/patterns and
    metadata in db/pattern.json (Pattern model).

    Body JSON:
        name, code (required),
        min_delay, max_delay, max_colors (optional numbers),
        has_background (optional bool),
        supports_manual (optional bool, default true if omitted in db),
        n1..n8 (optional string labels),
        overwrite (optional, default true).
    """
    data = request.json or {}
    key = _normalize_pattern_key(data.get("name") or "")
    if not _valid_pattern_key(key):
        return json.dumps({
            "error": "name must be a valid Python identifier (e.g. sparkle, my_pattern)",
        }), 400, {"Content-Type": "application/json"}
    if is_firmware_builtin_pattern_module(key):
        return json.dumps(
            {"error": "on and off are built into the driver firmware; use a different pattern name."}
        ), 400, {
            "Content-Type": "application/json"
        }

    code = data.get("code")
    if not isinstance(code, str) or not code.strip():
        return json.dumps({"error": "code is required (upload a .py file or paste source)"}), 400, {
            "Content-Type": "application/json"
        }

    overwrite = bool(data.get("overwrite", True))

    filename = key + ".py"
    py_path = os.path.join(driver_patterns_dir(), filename)
    if os.path.exists(py_path) and not overwrite:
        return json.dumps({"error": "pattern file already exists", "name": filename}), 409, {
            "Content-Type": "application/json"
        }

    meta = {}
    for fld in ("min_delay", "max_delay", "max_colors"):
        if fld not in data:
            continue
        try:
            meta[fld] = int(data[fld])
        except (TypeError, ValueError):
            return json.dumps({"error": "%s must be an integer" % fld}), 400, {
                "Content-Type": "application/json"
            }

    if "has_background" in data:
        meta["has_background"] = bool(data.get("has_background"))

    if "supports_manual" in data:
        meta["supports_manual"] = bool(data.get("supports_manual"))

    for i in range(1, 9):
        nk = "n%d" % i
        if nk not in data:
            continue
        lab = data[nk]
        if lab is None:
            continue
        s = str(lab).strip()
        if s:
            meta[nk] = s

    try:
        with open(py_path, "w") as f:
            f.write(code)
    except OSError as e:
        return json.dumps({"error": str(e)}), 500, {"Content-Type": "application/json"}

    if patterns.read(key):
        patterns.update(key, meta)
    else:
        patterns.create(key, meta)

    return json.dumps({
        "message": "Pattern created",
        "name": key,
        "file": filename,
        "metadata": patterns.read(key),
    }), 201, {"Content-Type": "application/json"}
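The metadata coercion rules (ints for delay/colour fields, bools for flags, trimmed `n1..n8` labels, blanks dropped) condensed into a testable sketch; note the real endpoint returns a 400 instead of raising on a bad integer:

```python
def coerce_pattern_meta(data):
    # Integers for delay/colour fields, booleans for flags, trimmed n1..n8 labels.
    meta = {}
    for fld in ("min_delay", "max_delay", "max_colors"):
        if fld in data:
            meta[fld] = int(data[fld])
    if "has_background" in data:
        meta["has_background"] = bool(data["has_background"])
    for i in range(1, 9):
        nk = "n%d" % i
        lab = data.get(nk)
        if lab is None:
            continue
        s = str(lab).strip()
        if s:
            meta[nk] = s
    return meta

meta = coerce_pattern_meta({"min_delay": "10", "has_background": 0, "n1": " Speed ", "n2": "  "})
# meta == {"min_delay": 10, "has_background": False, "n1": "Speed"}
```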


@controller.get('')
async def list_patterns(request):
    """List all patterns."""
    return json.dumps(patterns), 200, {'Content-Type': 'application/json'}
    """List patterns for UI (DB metadata + local driver additions)."""
    return json.dumps(build_runtime_pattern_map()), 200, {'Content-Type': 'application/json'}


@controller.get('/<id>')

@@ -2,15 +2,29 @@ from microdot import Microdot
from microdot.session import with_session
from models.preset import Preset
from models.profile import Profile
from models.pallet import Palette
from models.device import Device, normalize_mac
from models.transport import get_current_sender
from util.driver_delivery import deliver_json_messages, deliver_preset_broadcast_then_per_device
from util.espnow_message import build_message, build_preset_dict
import asyncio
import json

controller = Microdot()
presets = Preset()
profiles = Profile()


def _palette_colors_for_profile(profile_id):
    prof = profiles.read(str(profile_id))
    if not isinstance(prof, dict):
        return None
    pid = prof.get("palette_id") or prof.get("paletteId")
    if not pid:
        return None
    cols = Palette().read(str(pid))
    return cols if isinstance(cols, list) else None


def get_current_profile_id(session=None):
    """Get the current active profile ID from session or fallback to first."""
    profile_list = profiles.list()
@@ -125,13 +139,17 @@ async def delete_preset(request, *args, **kwargs):
@with_session
async def send_presets(request, session):
    """
    Send one or more presets to the LED driver (via serial transport).
    Send one or more presets to LED drivers (serial/ESP-NOW and/or TCP Wi-Fi clients).

    Body JSON:
        {"preset_ids": ["1", "2", ...]} or {"ids": ["1", "2", ...]}
    Optional "targets": ["aabbccddeeff", ...] — registry MACs. When set: preset
    chunks are ESP-NOW broadcast once each; Wi-Fi drivers get the same chunks
    over TCP; if "default" is set, each target then gets a unicast default
    message (serial or TCP) with that device name in "targets".
    Omit targets for broadcast-only serial (legacy).

    The controller looks up each preset, converts to API format, chunks into
    <= 240-byte messages, and sends them over the configured transport.
    Optional "destination_mac" / "to": single MAC when targets is omitted.
    """
    try:
        data = request.json or {}
@@ -144,11 +162,11 @@ async def send_presets(request, session):
        save_flag = data.get('save', True)
        save_flag = bool(save_flag)
        default_id = data.get('default')
        # Optional 12-char hex MAC to send to one device; omit for default (e.g. broadcast).
        destination_mac = data.get('destination_mac') or data.get('to')

        # Build API-compliant preset map keyed by preset ID, include name
        current_profile_id = get_current_profile_id(session)
        palette_colors = _palette_colors_for_profile(current_profile_id)
        presets_by_name = {}
        for pid in preset_ids:
            preset_data = presets.read(str(pid))
@@ -157,7 +175,7 @@ async def send_presets(request, session):
            if str(preset_data.get("profile_id")) != str(current_profile_id):
                continue
            preset_key = str(pid)
            preset_payload = build_preset_dict(preset_data)
            preset_payload = build_preset_dict(preset_data, palette_colors)
            preset_payload["name"] = preset_data.get("name", "")
            presets_by_name[preset_key] = preset_payload

@@ -171,23 +189,13 @@ async def send_presets(request, session):
        if not sender:
            return json.dumps({"error": "Transport not configured"}), 503, {'Content-Type': 'application/json'}

        async def send_chunk(chunk_presets, is_last):
            # Save/default should only be sent with the final presets chunk.
            msg = build_message(
                presets=chunk_presets,
                save=save_flag and is_last,
                default=default_id if is_last else None,
            )
            await sender.send(msg, addr=destination_mac)

        MAX_BYTES = 240
        send_delay_s = 0.1
        entries = list(presets_by_name.items())
        total_presets = len(entries)
        messages_sent = 0

        batch = {}
        last_msg = None
        chunk_messages = []
        for name, preset_obj in entries:
            test_batch = dict(batch)
            test_batch[name] = preset_obj
@@ -196,28 +204,144 @@ async def send_presets(request, session):

            if size <= MAX_BYTES or not batch:
                batch = test_batch
                last_msg = test_msg
            else:
                try:
                    await send_chunk(batch, False)
                except Exception:
                    return json.dumps({"error": "Send failed"}), 503, {'Content-Type': 'application/json'}
                await asyncio.sleep(send_delay_s)
                messages_sent += 1
                chunk_messages.append(
                    build_message(
                        presets=dict(batch),
                        save=False,
                        default=None,
                    )
                )
                batch = {name: preset_obj}
                last_msg = build_message(presets=batch, save=save_flag, default=default_id)

        if batch:
            try:
                await send_chunk(batch, True)
            except Exception:
                return json.dumps({"error": "Send failed"}), 503, {'Content-Type': 'application/json'}
            await asyncio.sleep(send_delay_s)
            messages_sent += 1
            chunk_messages.append(
                build_message(
                    presets=dict(batch),
                    save=save_flag,
                    default=default_id,
                )
            )
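The greedy byte-capped batching can be sketched standalone; the real code measures the full built message rather than the bare preset map, so the sizes here are an approximation:

```python
import json

MAX_BYTES = 240

def chunk_presets(entries):
    # Greedily pack presets into batches whose compact JSON stays within the
    # byte cap; a lone oversized preset still gets its own batch rather than
    # being dropped.
    batches, batch = [], {}
    for name, preset_obj in entries:
        test_batch = dict(batch)
        test_batch[name] = preset_obj
        size = len(json.dumps(test_batch, separators=(",", ":")).encode("utf-8"))
        if size <= MAX_BYTES or not batch:
            batch = test_batch
        else:
            batches.append(batch)
            batch = {name: preset_obj}
    if batch:
        batches.append(batch)
    return batches

entries = [("p%d" % i, {"c": "x" * 60}) for i in range(5)]
batches = chunk_presets(entries)
# Every batch serializes within MAX_BYTES and no preset is lost.
```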

        target_list = None
        raw_targets = data.get("targets")
        if isinstance(raw_targets, list) and raw_targets:
            target_list = []
            for t in raw_targets:
                m = normalize_mac(str(t))
                if m:
                    target_list.append(m)
            target_list = list(dict.fromkeys(target_list))
            if not target_list:
                target_list = None
        elif destination_mac:
            dm = normalize_mac(str(destination_mac))
            target_list = [dm] if dm else None
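`list(dict.fromkeys(...))` is the order-preserving dedupe used on the target list:

```python
macs = ["aabbccddeeff", "112233445566", "aabbccddeeff"]
# dict.fromkeys keeps first-seen order while dropping repeats.
deduped = list(dict.fromkeys(macs))
print(deduped)  # ['aabbccddeeff', '112233445566']
```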

        try:
            if target_list:
                deliveries = await deliver_preset_broadcast_then_per_device(
                    sender,
                    chunk_messages,
                    target_list,
                    Device(),
                    str(default_id) if default_id is not None else None,
                    delay_s=send_delay_s,
                )
            else:
                deliveries, _chunks = await deliver_json_messages(
                    sender,
                    chunk_messages,
                    None,
                    Device(),
                    delay_s=send_delay_s,
                )
        except Exception:
            return json.dumps({"error": "Send failed"}), 503, {'Content-Type': 'application/json'}

        return json.dumps({
            "message": "Presets sent",
            "presets_sent": total_presets,
            "messages_sent": messages_sent
            "messages_sent": deliveries,
        }), 200, {'Content-Type': 'application/json'}


@controller.post('/push')
@with_session
async def push_driver_messages(request, session):
    """
    Deliver one or more raw v1 JSON objects to devices (ESP-NOW and/or TCP).

    Body:
        {"sequence": [{ "v": "1", ... }, ...], "targets": ["mac", ...]}
        or a single {"payload": {...}, "targets": [...]}.
    """
    try:
        data = request.json or {}
    except Exception:
        return json.dumps({"error": "Invalid JSON"}), 400, {'Content-Type': 'application/json'}

    seq = data.get("sequence")
    if not seq and data.get("payload") is not None:
        seq = [data["payload"]]
    if not isinstance(seq, list) or not seq:
        return json.dumps({"error": "sequence or payload required"}), 400, {'Content-Type': 'application/json'}

    raw_targets = data.get("targets")
    target_list = None
    if isinstance(raw_targets, list) and raw_targets:
        target_list = []
        for t in raw_targets:
            m = normalize_mac(str(t))
            if m:
                target_list.append(m)
        target_list = list(dict.fromkeys(target_list))
        if not target_list:
            target_list = None

    sender = get_current_sender()
    if not sender:
        return json.dumps({"error": "Transport not configured"}), 503, {'Content-Type': 'application/json'}

    messages = []
    for item in seq:
        if isinstance(item, dict):
            messages.append(json.dumps(item))
        elif isinstance(item, str):
            messages.append(item)
        else:
            return json.dumps({"error": "sequence items must be objects or strings"}), 400, {'Content-Type': 'application/json'}

    delay_s = data.get("delay_s", 0.05)
    try:
        delay_s = float(delay_s)
    except (TypeError, ValueError):
        delay_s = 0.05
|
||||
|
||||
try:
|
||||
deliveries, _chunks = await deliver_json_messages(
|
||||
sender,
|
||||
messages,
|
||||
target_list,
|
||||
Device(),
|
||||
delay_s=delay_s,
|
||||
)
|
||||
except Exception:
|
||||
return json.dumps({"error": "Send failed"}), 503, {'Content-Type': 'application/json'}
|
||||
|
||||
try:
|
||||
from util import sequence_playback as seq_pb
|
||||
from util.beat_driver_route import sync_beat_route_from_push_sequence
|
||||
|
||||
preserve = bool(seq_pb.playback_status().get("active"))
|
||||
sync_beat_route_from_push_sequence(
|
||||
seq, target_macs=target_list, preserve_parallel_lane_routes=preserve
|
||||
)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
return json.dumps({
|
||||
"message": "Delivered",
|
||||
"deliveries": deliveries,
|
||||
}), 200, {'Content-Type': 'application/json'}
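The `/push` body parsing (accept either `sequence` or a single `payload`, with a defensive `delay_s` fallback) can be mirrored as a pure function for testing. This is a sketch of the parsing logic only, not the handler itself:

```python
def parse_push_body(data):
    """Mirror /push request parsing: accept `sequence` or a single
    `payload`, and coerce `delay_s` to float with a 0.05 s fallback."""
    seq = data.get("sequence")
    if not seq and data.get("payload") is not None:
        seq = [data["payload"]]
    if not isinstance(seq, list) or not seq:
        raise ValueError("sequence or payload required")
    try:
        delay_s = float(data.get("delay_s", 0.05))
    except (TypeError, ValueError):
        delay_s = 0.05
    return seq, delay_s

print(parse_push_body({"payload": {"v": "1"}}))
```

Note that a string `delay_s` such as `"0.2"` is accepted because `float()` parses it, while `None` or junk silently falls back to the default rather than failing the request.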


@@ -1,13 +1,13 @@
 from microdot import Microdot
 from microdot.session import with_session
 from models.profile import Profile
-from models.tab import Tab
+from models.zone import Zone
 from models.preset import Preset
 import json
 
 controller = Microdot()
 profiles = Profile()
-tabs = Tab()
+zones = Zone()
 presets = Preset()
 
 @controller.get('')
@@ -83,20 +83,20 @@ async def create_profile(request):
     try:
         data = dict(request.json or {})
         name = data.get("name", "")
-        seed_raw = data.get("seed_dj_tab", False)
+        seed_raw = data.get("seed_dj_zone", False)
         if isinstance(seed_raw, str):
-            seed_dj_tab = seed_raw.strip().lower() in ("1", "true", "yes", "on")
+            seed_dj_zone = seed_raw.strip().lower() in ("1", "true", "yes", "on")
         else:
-            seed_dj_tab = bool(seed_raw)
+            seed_dj_zone = bool(seed_raw)
         # Request-only flag: do not persist on profile records.
-        data.pop("seed_dj_tab", None)
+        data.pop("seed_dj_zone", None)
         profile_id = profiles.create(name)
         # Avoid persisting request-only fields.
         data.pop("name", None)
         if data:
             profiles.update(profile_id, data)
 
-        # New profiles always start with a default tab pre-populated with starter presets.
+        # New profiles always start with a default zone pre-populated with starter presets.
         default_preset_ids = []
         default_preset_defs = [
             {
@@ -124,6 +124,15 @@ async def create_profile(request):
                 "auto": True,
                 "n1": 2,
             },
+            {
+                "name": "Colour Cycle",
+                "pattern": "colour_cycle",
+                "colors": ["#FF0000", "#00FF00", "#0000FF"],
+                "brightness": 255,
+                "delay": 100,
+                "auto": True,
+                "n1": 1,
+            },
             {
                 "name": "transition",
                 "pattern": "transition",
@@ -132,6 +141,39 @@ async def create_profile(request):
                 "delay": 500,
                 "auto": True,
             },
+            {
+                "name": "flicker",
+                "pattern": "flicker",
+                "colors": ["#FFB84D"],
+                "brightness": 255,
+                "delay": 80,
+                "auto": True,
+                "n1": 30,
+            },
+            {
+                "name": "flame",
+                "pattern": "flame",
+                "colors": [],
+                "brightness": 255,
+                "delay": 50,
+                "auto": True,
+                "n1": 35,
+                "n2": 2600,
+                "n3": 0,
+                "n4": 0,
+            },
+            {
+                "name": "twinkle",
+                "pattern": "twinkle",
+                "colors": ["#78C8FF", "#508CFF", "#B478FF", "#64DCE8", "#A0C8FF"],
+                "brightness": 255,
+                "delay": 55,
+                "auto": True,
+                "n1": 72,
+                "n2": 140,
+                "n3": 2,
+                "n4": 6,
+            },
         ]
 
         for preset_data in default_preset_defs:
@@ -139,18 +181,18 @@ async def create_profile(request):
             presets.update(pid, preset_data)
             default_preset_ids.append(str(pid))
 
-        default_tab_id = tabs.create(name="default", names=["1"], presets=[default_preset_ids])
-        tabs.update(default_tab_id, {
+        default_tab_id = zones.create(name="default", names=["1"], presets=[default_preset_ids])
+        zones.update(default_tab_id, {
             "presets_flat": default_preset_ids,
             "default_preset": default_preset_ids[0] if default_preset_ids else None,
         })
 
         profile = profiles.read(profile_id) or {}
-        profile_tabs = profile.get("tabs", []) if isinstance(profile.get("tabs", []), list) else []
+        profile_tabs = profile.get("zones", []) if isinstance(profile.get("zones", []), list) else []
         profile_tabs.append(str(default_tab_id))
 
-        if seed_dj_tab:
-            # Seed a DJ-focused tab with three starter presets.
+        if seed_dj_zone:
+            # Seed a DJ-focused zone with three starter presets.
             seeded_preset_ids = []
             preset_defs = [
                 {
@@ -182,15 +224,15 @@ async def create_profile(request):
                 presets.update(pid, preset_data)
                 seeded_preset_ids.append(str(pid))
 
-            dj_tab_id = tabs.create(name="dj", names=["dj"], presets=[seeded_preset_ids])
-            tabs.update(dj_tab_id, {
+            dj_tab_id = zones.create(name="dj", names=["dj"], presets=[seeded_preset_ids])
+            zones.update(dj_tab_id, {
                 "presets_flat": seeded_preset_ids,
                 "default_preset": seeded_preset_ids[0] if seeded_preset_ids else None,
             })
 
             profile_tabs.append(str(dj_tab_id))
 
-        profiles.update(profile_id, {"tabs": profile_tabs})
+        profiles.update(profile_id, {"zones": profile_tabs})
 
        profile_data = profiles.read(profile_id)
        return json.dumps({profile_id: profile_data}), 201, {'Content-Type': 'application/json'}
@@ -208,7 +250,7 @@ async def clone_profile(request, id):
         data = request.json or {}
         source_name = source.get("name") or f"Profile {id}"
         new_name = data.get("name") or source_name
-        profile_type = source.get("type", "tabs")
+        profile_type = source.get("type", "zones")
 
         def allocate_id(model, cache):
             if "next" not in cache:
@@ -255,28 +297,28 @@ async def clone_profile(request, id):
             palette_colors = []
 
         # Clone tabs and presets used by those tabs
-        source_tabs = source.get("tabs")
+        source_tabs = source.get("zones")
         if not isinstance(source_tabs, list) or len(source_tabs) == 0:
-            source_tabs = source.get("tab_order", [])
+            source_tabs = source.get("zone_order", [])
         source_tabs = source_tabs or []
         cloned_tab_ids = []
        preset_id_map = {}
        new_tabs = {}
        new_presets = {}
-        for tab_id in source_tabs:
-            tab = tabs.read(tab_id)
-            if not tab:
+        for zone_id in source_tabs:
+            zone = zones.read(zone_id)
+            if not zone:
                 continue
-            tab_name = tab.get("name") or f"Tab {tab_id}"
+            tab_name = zone.get("name") or f"Zone {zone_id}"
             clone_name = tab_name
-            mapped_presets = map_preset_container(tab.get("presets"), preset_id_map, preset_cache, new_profile_id, new_presets)
-            clone_id = allocate_id(tabs, tab_cache)
+            mapped_presets = map_preset_container(zone.get("presets"), preset_id_map, preset_cache, new_profile_id, new_presets)
+            clone_id = allocate_id(zones, tab_cache)
             clone_data = {
                 "name": clone_name,
-                "names": tab.get("names") or [],
+                "names": zone.get("names") or [],
                 "presets": mapped_presets if mapped_presets is not None else []
             }
-            extra = {k: v for k, v in tab.items() if k not in ("name", "names", "presets")}
+            extra = {k: v for k, v in zone.items() if k not in ("name", "names", "presets")}
             if "presets_flat" in extra:
                 extra["presets_flat"] = map_preset_container(extra.get("presets_flat"), preset_id_map, preset_cache, new_profile_id, new_presets)
             if extra:
@@ -287,7 +329,7 @@ async def clone_profile(request, id):
         new_profile_data = {
             "name": new_name,
             "type": profile_type,
-            "tabs": cloned_tab_ids,
+            "zones": cloned_tab_ids,
             "scenes": list(source.get("scenes", [])) if isinstance(source.get("scenes", []), list) else [],
             "palette_id": str(new_palette_id),
         }
@@ -297,12 +339,12 @@ async def clone_profile(request, id):
         for pid, pdata in new_presets.items():
             presets[pid] = pdata
         for tid, tdata in new_tabs.items():
-            tabs[tid] = tdata
+            zones[tid] = tdata
         profiles[str(new_profile_id)] = new_profile_data
 
         profiles._palette_model.save()
         presets.save()
-        tabs.save()
+        zones.save()
         profiles.save()
 
         return json.dumps({new_profile_id: new_profile_data}), 201, {'Content-Type': 'application/json'}
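The clone route relies on `map_preset_container` to rewrite nested preset-ID lists so the copy never shares IDs with the source, reusing the same new ID whenever a preset appears in several places. A simplified sketch of just that remapping idea follows; the real helper also clones the preset data and tags it with `new_profile_id`, and the one-element `next_id` list here is a hypothetical stand-in for the `allocate_id` counter:

```python
def map_preset_container(container, id_map, next_id):
    """Walk a (possibly nested) list of preset IDs and replace each with
    a freshly allocated clone ID, reusing the same new ID on repeats."""
    if container is None:
        return None
    mapped = []
    for item in container:
        if isinstance(item, list):
            # Nested grouping (e.g. presets per strip) is remapped recursively.
            mapped.append(map_preset_container(item, id_map, next_id))
        else:
            key = str(item)
            if key not in id_map:
                id_map[key] = str(next_id[0])
                next_id[0] += 1
            mapped.append(id_map[key])
    return mapped

id_map, counter = {}, [100]
print(map_preset_container([["1", "2"], ["2", "3"]], id_map, counter))
```

Sharing `id_map` across both `presets` and `presets_flat` is what keeps the two views of a zone pointing at the same cloned presets.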

@@ -1,51 +1,207 @@
 from microdot import Microdot
-from models.squence import Sequence
+from microdot.session import with_session
+from models.sequence import Sequence
+from models.profile import Profile
+from models.transport import get_current_sender
 import json
 
 controller = Microdot()
 sequences = Sequence()
+profiles = Profile()
 
-@controller.get('')
-async def list_sequences(request):
-    """List all sequences."""
-    return json.dumps(sequences), 200, {'Content-Type': 'application/json'}
-
-@controller.get('/<id>')
-async def get_sequence(request, id):
-    """Get a specific sequence by ID."""
-    sequence = sequences.read(id)
-    if sequence:
-        return json.dumps(sequence), 200, {'Content-Type': 'application/json'}
+def get_current_profile_id(session=None):
+    """Get the current active profile ID from session or fallback to first."""
+    profile_list = profiles.list()
+    session_profile = None
+    if session is not None:
+        session_profile = session.get("current_profile")
+    if session_profile and session_profile in profile_list:
+        return session_profile
+    if profile_list:
+        return profile_list[0]
+    return None
+
+
+@controller.get("")
+@with_session
+async def list_sequences(request, session):
+    """List sequences for the current profile."""
+    current_profile_id = get_current_profile_id(session)
+    if not current_profile_id:
+        return json.dumps({}), 200, {"Content-Type": "application/json"}
+    scoped = {
+        sid: sdata
+        for sid, sdata in sequences.items()
+        if isinstance(sdata, dict)
+        and str(sdata.get("profile_id")) == str(current_profile_id)
+    }
+    return json.dumps(scoped), 200, {"Content-Type": "application/json"}
+
+
+@controller.get("/<id>")
+@with_session
+async def get_sequence(request, session, id):
+    """Get a specific sequence by ID (current profile only)."""
+    current_profile_id = get_current_profile_id(session)
+    seq = sequences.read(id)
+    if (
+        seq
+        and current_profile_id
+        and str(seq.get("profile_id")) == str(current_profile_id)
+    ):
+        return json.dumps(seq), 200, {"Content-Type": "application/json"}
     return json.dumps({"error": "Sequence not found"}), 404
 
-@controller.post('')
-async def create_sequence(request):
-    """Create a new sequence."""
-    try:
-        data = request.json or {}
-        group_name = data.get("group_name", "")
-        preset_names = data.get("presets", None)
-        sequence_id = sequences.create(group_name, preset_names)
-        if data:
-            sequences.update(sequence_id, data)
-        return json.dumps(sequences.read(sequence_id)), 201, {'Content-Type': 'application/json'}
-    except Exception as e:
-        return json.dumps({"error": str(e)}), 400
-
-@controller.put('/<id>')
-async def update_sequence(request, id):
-    """Update an existing sequence."""
+@controller.post("")
+@with_session
+async def create_sequence(request, session):
+    """Create a new sequence for the current profile."""
+    try:
+        try:
+            data = request.json or {}
+        except Exception:
+            return (
+                json.dumps({"error": "Invalid JSON"}),
+                400,
+                {"Content-Type": "application/json"},
+            )
+        current_profile_id = get_current_profile_id(session)
+        if not current_profile_id:
+            return (
+                json.dumps({"error": "No profile available"}),
+                404,
+                {"Content-Type": "application/json"},
+            )
+        sequence_id = sequences.create(current_profile_id)
+        if not isinstance(data, dict):
+            data = {}
+        data = dict(data)
+        data["profile_id"] = str(current_profile_id)
+        if sequences.update(sequence_id, data):
+            seq_data = sequences.read(sequence_id)
+            return (
+                json.dumps({sequence_id: seq_data}),
+                201,
+                {"Content-Type": "application/json"},
+            )
+        return (
+            json.dumps({"error": "Failed to create sequence"}),
+            400,
+            {"Content-Type": "application/json"},
+        )
+    except Exception as e:
+        return json.dumps({"error": str(e)}), 400, {"Content-Type": "application/json"}
+
+
+@controller.put("/<id>")
+@with_session
+async def update_sequence(request, session, id):
+    """Update an existing sequence (current profile only)."""
     try:
+        current_profile_id = get_current_profile_id(session)
+        seq = sequences.read(id)
+        if not seq or str(seq.get("profile_id")) != str(current_profile_id):
+            return json.dumps({"error": "Sequence not found"}), 404
         data = request.json
+        if not isinstance(data, dict):
+            return (
+                json.dumps({"error": "Invalid JSON"}),
+                400,
+                {"Content-Type": "application/json"},
+            )
+        data = dict(data)
+        data["profile_id"] = str(current_profile_id)
         if sequences.update(id, data):
-            return json.dumps(sequences.read(id)), 200, {'Content-Type': 'application/json'}
+            try:
+                from util.sequence_playback import stop_if_playing_sequence
+
+                stop_if_playing_sequence(str(id))
+            except Exception:
+                pass
+            return json.dumps(sequences.read(id)), 200, {"Content-Type": "application/json"}
         return json.dumps({"error": "Sequence not found"}), 404
     except Exception as e:
-        return json.dumps({"error": str(e)}), 400
+        return json.dumps({"error": str(e)}), 400, {"Content-Type": "application/json"}
 
-@controller.delete('/<id>')
-async def delete_sequence(request, id):
-    """Delete a sequence."""
+
+@controller.delete("/<id>")
+@with_session
+async def delete_sequence(request, session, id):
+    """Delete a sequence (current profile only)."""
+    current_profile_id = get_current_profile_id(session)
+    seq = sequences.read(id)
+    if not seq or str(seq.get("profile_id")) != str(current_profile_id):
+        return json.dumps({"error": "Sequence not found"}), 404
+    try:
+        from util.sequence_playback import stop_if_playing_sequence
+
+        stop_if_playing_sequence(str(id))
+    except Exception:
+        pass
     if sequences.delete(id):
-        return json.dumps({"message": "Sequence deleted successfully"}), 200
+        return (
+            json.dumps({"message": "Sequence deleted successfully"}),
+            200,
+            {"Content-Type": "application/json"},
+        )
     return json.dumps({"error": "Sequence not found"}), 404
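The profile scoping used by `list_sequences` boils down to a dict comprehension that skips non-dict entries and compares `profile_id` as strings, which tolerates int/str mixtures in the stored data. As a standalone sketch:

```python
def scope_to_profile(sequences, current_profile_id):
    """Keep only dict entries whose profile_id matches the active profile
    (compared as strings, so 7 and "7" are treated the same)."""
    return {
        sid: sdata
        for sid, sdata in sequences.items()
        if isinstance(sdata, dict)
        and str(sdata.get("profile_id")) == str(current_profile_id)
    }

store = {
    "1": {"profile_id": 7, "name": "a"},
    "2": {"profile_id": "8", "name": "b"},
    "3": "corrupt",
}
print(scope_to_profile(store, "7"))
```

The `isinstance` guard also quietly drops corrupt (non-dict) records instead of raising while listing.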
+
+
+@controller.post("/stop")
+@with_session
+async def stop_sequence_playback(request, session):
+    """Stop server-driven zone sequence playback."""
+    _ = request
+    try:
+        from util.sequence_playback import stop
+
+        stop()
+        return json.dumps({"ok": True}), 200, {"Content-Type": "application/json"}
+    except Exception as e:
+        return json.dumps({"error": str(e)}), 500, {"Content-Type": "application/json"}
+
+
+@controller.post("/<id>/play")
+@with_session
+async def play_sequence(request, session, id):
+    """Start server-driven playback for a sequence in a zone (body: {\"zone_id\": \"...\"})."""
+    if not get_current_sender():
+        return (
+            json.dumps({"error": "Transport not configured"}),
+            503,
+            {"Content-Type": "application/json"},
+        )
+    current_profile_id = get_current_profile_id(session)
+    if not current_profile_id:
+        return (
+            json.dumps({"error": "No profile available"}),
+            404,
+            {"Content-Type": "application/json"},
+        )
+    try:
+        data = request.json or {}
+    except Exception:
+        data = {}
+    if not isinstance(data, dict):
+        data = {}
+    zone_id = data.get("zone_id") or data.get("zoneId")
+    if zone_id is None or str(zone_id).strip() == "":
+        return (
+            json.dumps({"error": "zone_id required"}),
+            400,
+            {"Content-Type": "application/json"},
+        )
+    zone_id = str(zone_id).strip()
+    try:
+        from util.sequence_playback import start
+
+        await start(zone_id, str(id), str(current_profile_id), data if isinstance(data, dict) else None)
+        return json.dumps({"ok": True}), 200, {"Content-Type": "application/json"}
+    except ValueError as e:
+        return json.dumps({"error": str(e)}), 400, {"Content-Type": "application/json"}
+    except RuntimeError as e:
+        return json.dumps({"error": str(e)}), 503, {"Content-Type": "application/json"}
+    except Exception as e:
+        return json.dumps({"error": str(e)}), 500, {"Content-Type": "application/json"}
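The `zone_id` extraction in the play route accepts both snake_case and camelCase keys and rejects missing or blank values. As a standalone sketch of just that parsing step:

```python
def extract_zone_id(data):
    """Mirror the /<id>/play body parsing: accept zone_id or zoneId,
    strip whitespace, and treat empty/blank values as missing."""
    if not isinstance(data, dict):
        data = {}
    zone_id = data.get("zone_id") or data.get("zoneId")
    if zone_id is None or str(zone_id).strip() == "":
        return None
    return str(zone_id).strip()

print(extract_zone_id({"zoneId": " 5 "}))
```

Because `or` treats empty strings as falsy, a blank `zone_id` falls through to `zoneId` before the final blank check rejects it.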

@@ -1,7 +1,11 @@
-from microdot import Microdot, send_file
-from settings import Settings
+import asyncio
 import json
+
+from microdot import Microdot, send_file
+
+from models import wifi_ws_clients
+from settings import Settings
 
 controller = Microdot()
 settings = Settings()
 
@@ -63,17 +67,36 @@ def _validate_wifi_channel(value):
     return ch
 
 
+def _validate_global_brightness(value):
+    """Return int 0–255 or raise ValueError."""
+    v = int(value)
+    if v < 0 or v > 255:
+        raise ValueError("global_brightness must be between 0 and 255")
+    return v
+
+
 @controller.put('/settings')
 async def update_settings(request):
     """Update general settings."""
     try:
         data = request.json
+        global_brightness_changed = False
         for key, value in data.items():
             if key == 'wifi_channel' and value is not None:
                 settings[key] = _validate_wifi_channel(value)
+            elif key == 'global_brightness' and value is not None:
+                settings[key] = _validate_global_brightness(value)
+                global_brightness_changed = True
             else:
                 settings[key] = value
         settings.save()
+        if global_brightness_changed:
+            try:
+                asyncio.get_running_loop().create_task(
+                    wifi_ws_clients.broadcast_global_brightness_to_tcp_drivers()
+                )
+            except RuntimeError:
+                pass
         return json.dumps({"message": "Settings updated successfully"}), 200, {'Content-Type': 'application/json'}
     except ValueError as e:
         return json.dumps({"error": str(e)}), 400

@@ -1,346 +0,0 @@
-from microdot import Microdot, send_file
-from microdot.session import with_session
-from models.tab import Tab
-from models.profile import Profile
-import json
-import os
-import time
-
-controller = Microdot()
-tabs = Tab()
-profiles = Profile()
-
-def get_current_profile_id(session=None):
-    """Get the current active profile ID from session or fallback to first."""
-    profile_list = profiles.list()
-    session_profile = None
-    if session is not None:
-        session_profile = session.get('current_profile')
-    if session_profile and session_profile in profile_list:
-        return session_profile
-    if profile_list:
-        return profile_list[0]
-    return None
-
-def get_profile_tab_order(profile_id):
-    """Get the tab order for a profile."""
-    if not profile_id:
-        return []
-    profile = profiles.read(profile_id)
-    if profile:
-        # Support both "tab_order" (old) and "tabs" (new) format
-        return profile.get("tabs", profile.get("tab_order", []))
-    return []
-
-def get_current_tab_id(request, session=None):
-    """Get the current tab ID from cookie."""
-    # Read from cookie first
-    current_tab = request.cookies.get('current_tab')
-    if current_tab:
-        return current_tab
-
-    # Fallback to first tab in current profile
-    profile_id = get_current_profile_id(session)
-    if profile_id:
-        profile = profiles.read(profile_id)
-        if profile:
-            # Support both "tabs" (new) and "tab_order" (old) format
-            tabs_list = profile.get("tabs", profile.get("tab_order", []))
-            if tabs_list:
-                return tabs_list[0]
-    return None
-
-def _render_tabs_list_fragment(request, session):
-    """Helper function to render tabs list HTML fragment."""
-    profile_id = get_current_profile_id(session)
-    # #region agent log
-    try:
-        os.makedirs('/home/pi/led-controller/.cursor', exist_ok=True)
-        with open('/home/pi/led-controller/.cursor/debug.log', 'a') as _log:
-            _log.write(json.dumps({
-                "sessionId": "debug-session",
-                "runId": "tabs-pre-fix",
-                "hypothesisId": "H1",
-                "location": "src/controllers/tab.py:_render_tabs_list_fragment",
-                "message": "tabs list fragment",
-                "data": {
-                    "profile_id": profile_id,
-                    "profile_count": len(profiles.list())
-                },
-                "timestamp": int(time.time() * 1000)
-            }) + "\n")
-    except Exception:
-        pass
-    # #endregion
-    if not profile_id:
-        return '<div class="tabs-list">No profile selected</div>', 200, {'Content-Type': 'text/html'}
-
-    tab_order = get_profile_tab_order(profile_id)
-    current_tab_id = get_current_tab_id(request, session)
-
-    html = '<div class="tabs-list">'
-    for tab_id in tab_order:
-        tab_data = tabs.read(tab_id)
-        if tab_data:
-            active_class = 'active' if str(tab_id) == str(current_tab_id) else ''
-            tab_name = tab_data.get('name', 'Tab ' + str(tab_id))
-            html += (
-                '<button class="tab-button ' + active_class + '" '
-                'hx-get="/tabs/' + str(tab_id) + '/content-fragment" '
-                'hx-target="#tab-content" '
-                'hx-swap="innerHTML" '
-                'hx-push-url="true" '
-                'hx-trigger="click" '
-                'onclick="document.querySelectorAll(\'.tab-button\').forEach(b => b.classList.remove(\'active\')); this.classList.add(\'active\');">'
-                + tab_name +
-                '</button>'
-            )
-    html += '</div>'
-    return html, 200, {'Content-Type': 'text/html'}
-
-def _render_tab_content_fragment(request, session, id):
-    """Helper function to render tab content HTML fragment."""
-    # Handle 'current' as a special case
-    if id == 'current':
-        current_tab_id = get_current_tab_id(request, session)
-        if not current_tab_id:
-            accept_header = request.headers.get('Accept', '')
-            wants_html = 'text/html' in accept_header
-            if wants_html:
-                return '<div class="error">No current tab set</div>', 404, {'Content-Type': 'text/html'}
-            return json.dumps({"error": "No current tab set"}), 404
-        id = current_tab_id
-
-    tab = tabs.read(id)
-    if not tab:
-        return '<div>Tab not found</div>', 404, {'Content-Type': 'text/html'}
-
-    # Set this tab as the current tab in session
-    session['current_tab'] = str(id)
-    session.save()
-
-    # If this is a direct page load (not HTMX), return full UI so CSS loads.
-    if not request.headers.get('HX-Request'):
-        return send_file('templates/index.html')
-
-    tab_name = tab.get('name', 'Tab ' + str(id))
-
-    html = (
-        '<div class="presets-section" data-tab-id="' + str(id) + '">'
-        '<h3>Presets</h3>'
-        '<div class="profiles-actions" style="margin-bottom: 1rem;"></div>'
-        '<div id="presets-list-tab" class="presets-list">'
-        '<!-- Presets will be loaded here -->'
-        '</div>'
-        '</div>'
-    )
-    return html, 200, {'Content-Type': 'text/html'}
-
-@controller.get('')
-@with_session
-async def list_tabs(request, session):
-    """List all tabs with current tab info."""
-    profile_id = get_current_profile_id(session)
-    current_tab_id = get_current_tab_id(request, session)
-
-    # Get tab order for current profile
-    tab_order = get_profile_tab_order(profile_id) if profile_id else []
-
-    # Build tabs list with metadata
-    tabs_data = {}
-    for tab_id in tabs.list():
-        tab_data = tabs.read(tab_id)
-        if tab_data:
-            tabs_data[tab_id] = tab_data
-
-    return json.dumps({
-        "tabs": tabs_data,
-        "tab_order": tab_order,
-        "current_tab_id": current_tab_id,
-        "profile_id": profile_id
-    }), 200, {'Content-Type': 'application/json'}
-
-# Get current tab - returns JSON with tab data and content info
-@controller.get('/current')
-@with_session
-async def get_current_tab(request, session):
-    """Get the current tab from session."""
-    current_tab_id = get_current_tab_id(request, session)
-    if not current_tab_id:
-        return json.dumps({"error": "No current tab set", "tab": None, "tab_id": None}), 404
-
-    tab = tabs.read(current_tab_id)
-    if tab:
-        return json.dumps({
-            "tab": tab,
-            "tab_id": current_tab_id
-        }), 200, {'Content-Type': 'application/json'}
-    return json.dumps({"error": "Tab not found", "tab": None, "tab_id": None}), 404
-
-@controller.post('/<id>/set-current')
-async def set_current_tab(request, id):
-    """Set a tab as the current tab in cookie."""
-    tab = tabs.read(id)
-    if not tab:
-        return json.dumps({"error": "Tab not found"}), 404
-
-    # Set cookie with current tab
-    response_data = json.dumps({"message": "Current tab set", "tab_id": id})
-    response = response_data, 200, {
-        'Content-Type': 'application/json',
-        'Set-Cookie': f'current_tab={id}; Path=/; Max-Age=31536000'  # 1 year expiry
-    }
-    return response
-
-@controller.get('/<id>')
-async def get_tab(request, id):
-    """Get a specific tab by ID."""
-    tab = tabs.read(id)
-    if tab:
-        return json.dumps(tab), 200, {'Content-Type': 'application/json'}
-    return json.dumps({"error": "Tab not found"}), 404
-
-@controller.put('/<id>')
-async def update_tab(request, id):
-    """Update an existing tab."""
-    try:
-        data = request.json
-        if tabs.update(id, data):
-            return json.dumps(tabs.read(id)), 200, {'Content-Type': 'application/json'}
-        return json.dumps({"error": "Tab not found"}), 404
-    except Exception as e:
-        return json.dumps({"error": str(e)}), 400
-
-@controller.delete('/<id>')
-@with_session
-async def delete_tab(request, session, id):
-    """Delete a tab."""
-    try:
-        # Handle 'current' tab ID
-        if id == 'current':
-            current_tab_id = get_current_tab_id(request, session)
-            if current_tab_id:
-                id = current_tab_id
-            else:
-                return json.dumps({"error": "No current tab to delete"}), 404
-
-        if tabs.delete(id):
-            # Remove from profile's tabs
-            profile_id = get_current_profile_id(session)
-            if profile_id:
-                profile = profiles.read(profile_id)
-                if profile:
-                    # Support both "tabs" (new) and "tab_order" (old) format
-                    tabs_list = profile.get('tabs', profile.get('tab_order', []))
-                    if id in tabs_list:
-                        tabs_list.remove(id)
-                        profile['tabs'] = tabs_list
-                        # Remove old tab_order if it exists
-                        if 'tab_order' in profile:
-                            del profile['tab_order']
-                        profiles.update(profile_id, profile)
-
-            # Clear cookie if the deleted tab was the current tab
-            current_tab_id = get_current_tab_id(request, session)
-            if current_tab_id == id:
-                response_data = json.dumps({"message": "Tab deleted successfully"})
-                response = response_data, 200, {
-                    'Content-Type': 'application/json',
-                    'Set-Cookie': 'current_tab=; Path=/; Max-Age=0'  # Clear cookie
-                }
-                return response
-
-            return json.dumps({"message": "Tab deleted successfully"}), 200, {'Content-Type': 'application/json'}
-
-        return json.dumps({"error": "Tab not found"}), 404
-    except Exception as e:
-        import sys
-        try:
-            sys.print_exception(e)
-        except:
-            pass
-        return json.dumps({"error": str(e)}), 500, {'Content-Type': 'application/json'}
-
-@controller.post('')
-@with_session
-async def create_tab(request, session):
-    """Create a new tab."""
-    try:
-        # Handle form data or JSON
-        if request.form:
-            name = request.form.get('name', '').strip()
-            ids_str = request.form.get('ids', '1').strip()
-            names = [id.strip() for id in ids_str.split(',') if id.strip()]
-            preset_ids = None
-        else:
-            data = request.json or {}
-            name = data.get("name", "")
-            names = data.get("names", None)
-            preset_ids = data.get("presets", None)
-
-        if not name:
-            return json.dumps({"error": "Tab name cannot be empty"}), 400
-
-        tab_id = tabs.create(name, names, preset_ids)
-
-        # Add to current profile's tabs
-        profile_id = get_current_profile_id(session)
-        if profile_id:
-            profile = profiles.read(profile_id)
-            if profile:
-                # Support both "tabs" (new) and "tab_order" (old) format
-                tabs_list = profile.get('tabs', profile.get('tab_order', []))
-                if tab_id not in tabs_list:
-                    tabs_list.append(tab_id)
-                    profile['tabs'] = tabs_list
-                    # Remove old tab_order if it exists
-                    if 'tab_order' in profile:
-                        del profile['tab_order']
-                    profiles.update(profile_id, profile)
-
-        # Return JSON response with tab ID
-        tab_data = tabs.read(tab_id)
-        return json.dumps({tab_id: tab_data}), 201, {'Content-Type': 'application/json'}
-    except Exception as e:
-        import sys
-        sys.print_exception(e)
-        return json.dumps({"error": str(e)}), 400
-
-@controller.post('/<id>/clone')
-@with_session
-async def clone_tab(request, session, id):
-    """Clone an existing tab and add it to the current profile."""
-    try:
-        source = tabs.read(id)
-        if not source:
-            return json.dumps({"error": "Tab not found"}), 404
-
-        data = request.json or {}
-        source_name = source.get("name") or f"Tab {id}"
-        new_name = data.get("name") or f"{source_name} Copy"
-        clone_id = tabs.create(new_name, source.get("names"), source.get("presets"))
-        extra = {k: v for k, v in source.items() if k not in ("name", "names", "presets")}
-        if extra:
-            tabs.update(clone_id, extra)
-
-        profile_id = get_current_profile_id(session)
-        if profile_id:
-            profile = profiles.read(profile_id)
-            if profile:
-                tabs_list = profile.get('tabs', profile.get('tab_order', []))
-                if clone_id not in tabs_list:
-                    tabs_list.append(clone_id)
-                    profile['tabs'] = tabs_list
-                    if 'tab_order' in profile:
-                        del profile['tab_order']
-                    profiles.update(profile_id, profile)
-
-        tab_data = tabs.read(clone_id)
-        return json.dumps({clone_id: tab_data}), 201, {'Content-Type': 'application/json'}
-    except Exception as e:
-        import sys
-        try:
-            sys.print_exception(e)
-        except:
-            pass
-        return json.dumps({"error": str(e)}), 400
|
||||
377  src/controllers/zone.py  Normal file
@@ -0,0 +1,377 @@
from microdot import Microdot, send_file
from microdot.session import with_session
from models.zone import Zone
from models.profile import Profile
import json

controller = Microdot()
zones = Zone()
profiles = Profile()


def get_current_profile_id(session=None):
    """Get the current active profile ID from session or fallback to first."""
    profile_list = profiles.list()
    session_profile = None
    if session is not None:
        session_profile = session.get("current_profile")
    if session_profile and session_profile in profile_list:
        return session_profile
    if profile_list:
        return profile_list[0]
    return None


def _profile_zone_id_list(profile):
    """Ordered zone ids for a profile (``zones``, legacy ``tabs``, or ``zone_order``)."""
    if not profile or not isinstance(profile, dict):
        return []
    z = profile.get("zones")
    if isinstance(z, list) and z:
        return list(z)
    t = profile.get("tabs")
    if isinstance(t, list) and t:
        return list(t)
    o = profile.get("zone_order")
    if isinstance(o, list) and o:
        return list(o)
    return []


def get_profile_zone_order(profile_id):
    if not profile_id:
        return []
    profile = profiles.read(profile_id)
    return _profile_zone_id_list(profile)


def _set_profile_zone_order(profile, ids):
    profile["zones"] = list(ids)
    profile.pop("tabs", None)
    profile.pop("zone_order", None)


def get_current_zone_id(request, session=None):
    """Cookie ``current_zone``, legacy ``current_tab``, then first zone in profile."""
    z = request.cookies.get("current_zone") or request.cookies.get("current_tab")
    if z:
        return z
    profile_id = get_current_profile_id(session)
    if profile_id:
        profile = profiles.read(profile_id)
        order = _profile_zone_id_list(profile)
        if order:
            return order[0]
    return None


def _render_zones_list_fragment(request, session):
    """Render zone strip HTML for HTMX / JS."""
    profile_id = get_current_profile_id(session)
    if not profile_id:
        return (
            '<div class="zones-list">No profile selected</div>',
            200,
            {"Content-Type": "text/html"},
        )

    zone_order = get_profile_zone_order(profile_id)
    current_zone_id = get_current_zone_id(request, session)

    html = '<div class="zones-list">'
    for zid in zone_order:
        zdata = zones.read(zid)
        if zdata:
            active_class = "active" if str(zid) == str(current_zone_id) else ""
            zname = zdata.get("name", "Zone " + str(zid))
            html += (
                '<button class="zone-button ' + active_class + '" '
                'hx-get="/zones/' + str(zid) + '/content-fragment" '
                'hx-target="#zone-content" '
                'hx-swap="innerHTML" '
                'hx-push-url="true" '
                'hx-trigger="click" '
                'onclick="document.querySelectorAll(\'.zone-button\').forEach(b => b.classList.remove(\'active\')); this.classList.add(\'active\');">'
                + zname
                + "</button>"
            )
    html += "</div>"
    return html, 200, {"Content-Type": "text/html"}


def _render_zone_content_fragment(request, session, id):
    if id == "current":
        current_zone_id = get_current_zone_id(request, session)
        if not current_zone_id:
            accept_header = request.headers.get("Accept", "")
            wants_html = "text/html" in accept_header
            if wants_html:
                return (
                    '<div class="error">No current zone set</div>',
                    404,
                    {"Content-Type": "text/html"},
                )
            return json.dumps({"error": "No current zone set"}), 404
        id = current_zone_id

    z = zones.read(id)
    if not z:
        return '<div>Zone not found</div>', 404, {"Content-Type": "text/html"}

    session["current_zone"] = str(id)
    session.save()

    if not request.headers.get("HX-Request"):
        return send_file("templates/index.html")

    html = (
        '<div class="presets-section" data-zone-id="' + str(id) + '">'
        "<h3>Presets</h3>"
        '<div class="profiles-actions" style="margin-bottom: 1rem;"></div>'
        '<div id="presets-list-zone" class="presets-list">'
        "<!-- Presets will be loaded here -->"
        "</div>"
        "</div>"
    )
    return html, 200, {"Content-Type": "text/html"}


@controller.get("/<id>/content-fragment")
|
||||
@with_session
|
||||
async def zone_content_fragment(request, session, id):
|
||||
return _render_zone_content_fragment(request, session, id)
|
||||
|
||||
|
||||
@controller.get("")
|
||||
@with_session
|
||||
async def list_zones(request, session):
|
||||
profile_id = get_current_profile_id(session)
|
||||
current_zone_id = get_current_zone_id(request, session)
|
||||
zone_order = get_profile_zone_order(profile_id) if profile_id else []
|
||||
|
||||
zones_data = {}
|
||||
for zid in zones.list():
|
||||
zdata = zones.read(zid)
|
||||
if zdata:
|
||||
zones_data[zid] = zdata
|
||||
|
||||
return (
|
||||
json.dumps(
|
||||
{
|
||||
"zones": zones_data,
|
||||
"zone_order": zone_order,
|
||||
"current_zone_id": current_zone_id,
|
||||
"profile_id": profile_id,
|
||||
}
|
||||
),
|
||||
200,
|
||||
{"Content-Type": "application/json"},
|
||||
)
|
||||
|
||||
|
||||
@controller.get("/current")
|
||||
@with_session
|
||||
async def get_current_zone(request, session):
|
||||
current_zone_id = get_current_zone_id(request, session)
|
||||
if not current_zone_id:
|
||||
return (
|
||||
json.dumps({"error": "No current zone set", "zone": None, "zone_id": None}),
|
||||
404,
|
||||
)
|
||||
|
||||
z = zones.read(current_zone_id)
|
||||
if z:
|
||||
return (
|
||||
json.dumps({"zone": z, "zone_id": current_zone_id}),
|
||||
200,
|
||||
{"Content-Type": "application/json"},
|
||||
)
|
||||
return (
|
||||
json.dumps({"error": "Zone not found", "zone": None, "zone_id": None}),
|
||||
404,
|
||||
)
|
||||
|
||||
|
||||
@controller.post("/<id>/set-current")
|
||||
async def set_current_zone(request, id):
|
||||
z = zones.read(id)
|
||||
if not z:
|
||||
return json.dumps({"error": "Zone not found"}), 404
|
||||
|
||||
response_data = json.dumps({"message": "Current zone set", "zone_id": id})
|
||||
return (
|
||||
response_data,
|
||||
200,
|
||||
{
|
||||
"Content-Type": "application/json",
|
||||
"Set-Cookie": (
|
||||
f"current_zone={id}; Path=/; Max-Age=31536000; SameSite=Lax"
|
||||
),
|
||||
},
|
||||
)
|
||||
|
||||
|
||||
@controller.get("/<id>")
|
||||
async def get_zone(request, id):
|
||||
z = zones.read(id)
|
||||
if z:
|
||||
return json.dumps(z), 200, {"Content-Type": "application/json"}
|
||||
return json.dumps({"error": "Zone not found"}), 404
|
||||
|
||||
|
||||
@controller.put("/<id>")
|
||||
async def update_zone(request, id):
|
||||
try:
|
||||
data = request.json
|
||||
if zones.update(id, data):
|
||||
return json.dumps(zones.read(id)), 200, {"Content-Type": "application/json"}
|
||||
return json.dumps({"error": "Zone not found"}), 404
|
||||
except Exception as e:
|
||||
return json.dumps({"error": str(e)}), 400
|
||||
|
||||
|
||||
@controller.delete("/<id>")
|
||||
@with_session
|
||||
async def delete_zone(request, session, id):
|
||||
try:
|
||||
if id == "current":
|
||||
current_zone_id = get_current_zone_id(request, session)
|
||||
if current_zone_id:
|
||||
id = current_zone_id
|
||||
else:
|
||||
return json.dumps({"error": "No current zone to delete"}), 404
|
||||
|
||||
if zones.delete(id):
|
||||
profile_id = get_current_profile_id(session)
|
||||
if profile_id:
|
||||
profile = profiles.read(profile_id)
|
||||
if profile:
|
||||
zlist = _profile_zone_id_list(profile)
|
||||
if id in zlist:
|
||||
zlist.remove(id)
|
||||
_set_profile_zone_order(profile, zlist)
|
||||
profiles.update(profile_id, profile)
|
||||
|
||||
current_zone_id = get_current_zone_id(request, session)
|
||||
if current_zone_id == id:
|
||||
response_data = json.dumps({"message": "Zone deleted successfully"})
|
||||
return (
|
||||
response_data,
|
||||
200,
|
||||
{
|
||||
"Content-Type": "application/json",
|
||||
"Set-Cookie": (
|
||||
"current_zone=; Path=/; Max-Age=0; SameSite=Lax"
|
||||
),
|
||||
},
|
||||
)
|
||||
|
||||
return json.dumps({"message": "Zone deleted successfully"}), 200, {
|
||||
"Content-Type": "application/json"
|
||||
}
|
||||
|
||||
return json.dumps({"error": "Zone not found"}), 404
|
||||
except Exception as e:
|
||||
import sys
|
||||
|
||||
try:
|
||||
sys.print_exception(e)
|
||||
except Exception:
|
||||
pass
|
||||
return json.dumps({"error": str(e)}), 500, {"Content-Type": "application/json"}
|
||||
|
||||
|
||||
@controller.post("")
|
||||
@with_session
|
||||
async def create_zone(request, session):
|
||||
try:
|
||||
if request.form:
|
||||
name = request.form.get("name", "").strip()
|
||||
ids_str = request.form.get("ids", "1").strip()
|
||||
names = [i.strip() for i in ids_str.split(",") if i.strip()]
|
||||
preset_ids = None
|
||||
group_ids = []
|
||||
content_kind = None
|
||||
else:
|
||||
data = request.json or {}
|
||||
name = data.get("name", "")
|
||||
names = data.get("names")
|
||||
if names is None:
|
||||
names = data.get("ids")
|
||||
preset_ids = data.get("presets", None)
|
||||
group_ids = data.get("group_ids")
|
||||
if group_ids is None:
|
||||
group_ids = []
|
||||
if isinstance(group_ids, list):
|
||||
group_ids = [str(x) for x in group_ids if x is not None]
|
||||
else:
|
||||
group_ids = []
|
||||
raw_kind = data.get("content_kind")
|
||||
content_kind = raw_kind if raw_kind in ("presets", "sequences") else None
|
||||
|
||||
if not name:
|
||||
return json.dumps({"error": "Zone name cannot be empty"}), 400
|
||||
|
||||
zid = zones.create(name, names, preset_ids, group_ids, content_kind)
|
||||
|
||||
profile_id = get_current_profile_id(session)
|
||||
if profile_id:
|
||||
profile = profiles.read(profile_id)
|
||||
if profile:
|
||||
zlist = _profile_zone_id_list(profile)
|
||||
if zid not in zlist:
|
||||
zlist.append(zid)
|
||||
_set_profile_zone_order(profile, zlist)
|
||||
profiles.update(profile_id, profile)
|
||||
|
||||
zdata = zones.read(zid)
|
||||
return json.dumps({zid: zdata}), 201, {"Content-Type": "application/json"}
|
||||
except Exception as e:
|
||||
import sys
|
||||
|
||||
sys.print_exception(e)
|
||||
return json.dumps({"error": str(e)}), 400
|
||||
|
||||
|
||||
@controller.post("/<id>/clone")
|
||||
@with_session
|
||||
async def clone_zone(request, session, id):
|
||||
try:
|
||||
source = zones.read(id)
|
||||
if not source:
|
||||
return json.dumps({"error": "Zone not found"}), 404
|
||||
|
||||
data = request.json or {}
|
||||
source_name = source.get("name") or f"Zone {id}"
|
||||
new_name = data.get("name") or f"{source_name} Copy"
|
||||
clone_id = zones.create(
|
||||
new_name,
|
||||
source.get("names"),
|
||||
source.get("presets"),
|
||||
source.get("group_ids"),
|
||||
)
|
||||
extra = {k: v for k, v in source.items() if k not in ("name", "names", "presets")}
|
||||
if extra:
|
||||
zones.update(clone_id, extra)
|
||||
|
||||
profile_id = get_current_profile_id(session)
|
||||
if profile_id:
|
||||
profile = profiles.read(profile_id)
|
||||
if profile:
|
||||
zlist = _profile_zone_id_list(profile)
|
||||
if clone_id not in zlist:
|
||||
zlist.append(clone_id)
|
||||
_set_profile_zone_order(profile, zlist)
|
||||
profiles.update(profile_id, profile)
|
||||
|
||||
zdata = zones.read(clone_id)
|
||||
return json.dumps({clone_id: zdata}), 201, {"Content-Type": "application/json"}
|
||||
except Exception as e:
|
||||
import sys
|
||||
|
||||
try:
|
||||
sys.print_exception(e)
|
||||
except Exception:
|
||||
pass
|
||||
return json.dumps({"error": str(e)}), 400
|
||||
|
||||
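The profile helpers above resolve a profile's ordered zone ids by falling back across three key names. A standalone sketch of that fallback order (the function name here is illustrative, not the module's API):

```python
def resolve_zone_ids(profile):
    # Fallback order mirrors _profile_zone_id_list: "zones" first,
    # then legacy "tabs", then legacy "zone_order"; a missing, non-list,
    # or empty value falls through to the next key.
    if not isinstance(profile, dict):
        return []
    for key in ("zones", "tabs", "zone_order"):
        v = profile.get(key)
        if isinstance(v, list) and v:
            return list(v)
    return []

print(resolve_zone_ids({"zones": ["z1", "z2"]}))        # ['z1', 'z2']
print(resolve_zone_ids({"tabs": ["t1"], "zones": []}))  # ['t1']
print(resolve_zone_ids({"zone_order": ["a"]}))          # ['a']
print(resolve_zone_ids(None))                           # []
```

Returning `list(v)` rather than `v` matters: callers like `delete_zone` mutate the result before writing it back with `_set_profile_zone_order`.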
504  src/main.py
@@ -1,6 +1,12 @@
import asyncio
import errno
import json
import os
import secrets
import signal
import socket
import threading
import traceback
from microdot import Microdot, send_file
from microdot.websocket import with_websocket
from microdot.session import Session
@@ -10,12 +16,231 @@ import controllers.preset as preset
import controllers.profile as profile
import controllers.group as group
import controllers.sequence as sequence
import controllers.tab as tab
import controllers.zone as zone
import controllers.palette as palette
import controllers.scene as scene
import controllers.pattern as pattern
import controllers.settings as settings_controller
import controllers.device as device_controller
import controllers.led_tool as led_tool_controller
from models.transport import get_sender, set_sender, get_current_sender
from models.device import Device, normalize_mac
from models import wifi_ws_clients as tcp_client_registry
from util.device_status_broadcaster import (
    broadcast_device_tcp_snapshot_to,
    broadcast_device_tcp_status,
    register_device_status_ws,
    unregister_device_status_ws,
)
from util.audio_detector import AudioBeatDetector
_tcp_device_lock = threading.Lock()

DISCOVERY_UDP_PORT = 8766


def _live_reload_enabled() -> bool:
    v = os.environ.get("LED_CONTROLLER_LIVE_RELOAD", "").strip().lower()
    return v not in ("", "0", "false", "no")


def _register_udp_device_sync(
    device_name: str, peer_ip: str, mac, device_type=None
) -> None:
    with _tcp_device_lock:
        try:
            d = Device()
            did, persisted = d.upsert_wifi_tcp_client(
                device_name, peer_ip, mac, device_type=device_type
            )
            if did and persisted:
                print(
                    f"UDP device registered: mac={did} name={device_name!r} ip={peer_ip!r}"
                )
        except Exception as e:
            print(f"UDP device registry failed: {e}")
            traceback.print_exception(type(e), e, e.__traceback__)


async def _handle_udp_discovery(sock, udp_holder=None) -> None:
    while True:
        try:
            data, addr = await asyncio.get_running_loop().sock_recvfrom(sock, 2048)
        except asyncio.CancelledError:
            raise
        except OSError as e:
            if udp_holder and udp_holder.get("closing"):
                break
            print(f"[UDP] recv failed: {e!r}")
            continue
        except Exception as e:
            print(f"[UDP] recv failed: {e!r}")
            continue
        peer_ip = addr[0] if addr else ""
        line = data.split(b"\n", 1)[0].strip()
        if line:
            try:
                parsed = json.loads(line.decode("utf-8"))
                if isinstance(parsed, dict):
                    dns = str(parsed.get("device_name") or "").strip()
                    mac = parsed.get("mac") or parsed.get("device_mac") or parsed.get(
                        "sta_mac"
                    )
                    device_type = parsed.get("type") or parsed.get("device_type")
                    if dns and normalize_mac(mac):
                        _register_udp_device_sync(dns, peer_ip, mac, device_type)
                    if str(parsed.get("v") or "") == "1":
                        tcp_client_registry.ensure_driver_connection(peer_ip)
            except (UnicodeError, ValueError, TypeError):
                pass
        try:
            await asyncio.get_running_loop().sock_sendto(sock, data, addr)
        except Exception as e:
            print(f"[UDP] echo send failed: {e!r}")


def _prime_wifi_outbound_driver_connections() -> None:
    """
    For each Wi-Fi device in the registry with a usable IPv4, start (or keep) the
    outbound WebSocket task. The client loop reconnects automatically if the link
    drops. Presets are not pushed automatically; use Send Presets / profile apply.
    """
    n = 0
    try:
        dev = Device()
        for mac_key, doc in list(dev.items()):
            if not isinstance(doc, dict):
                continue
            if doc.get("transport") != "wifi":
                continue
            ip = _ipv4_address(str(doc.get("address") or ""))
            if not ip:
                continue
            tcp_client_registry.ensure_driver_connection(ip)
            n += 1
    except Exception as e:
        print(f"[startup] Wi-Fi driver connection prime failed: {e!r}")
        traceback.print_exception(type(e), e, e.__traceback__)
        return
    if n:
        print(f"[startup] primed outbound WebSocket for {n} Wi-Fi driver(s)")


def _ipv4_address(addr: str) -> str | None:
    """Return dotted IPv4 string or None (hostnames skipped for UDP nudge)."""
    s = (addr or "").strip()
    if not s:
        return None
    parts = s.split(".")
    if len(parts) != 4:
        return None
    try:
        nums = [int(p) for p in parts]
    except ValueError:
        return None
    if not all(0 <= n <= 255 for n in nums):
        return None
    return s


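The dotted-quad check above accepts only four decimal parts in 0..255, so hostnames and malformed strings are skipped. A standalone sketch of the same rule for quick experimentation (the helper name is illustrative):

```python
def ipv4_or_none(addr):
    # Same rules as _ipv4_address: exactly four dot-separated decimal
    # parts, each in 0..255; anything else (hostnames, empty) -> None.
    s = (addr or "").strip()
    parts = s.split(".")
    if len(parts) != 4:
        return None
    try:
        nums = [int(p) for p in parts]
    except ValueError:
        return None
    return s if all(0 <= n <= 255 for n in nums) else None

print(ipv4_or_none("192.168.1.20"))  # 192.168.1.20
print(ipv4_or_none("esp32.local"))   # None
print(ipv4_or_none("300.1.1.1"))     # None
```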
async def _periodic_wifi_driver_hello_loop(settings, udp_holder) -> None:
    """
    While a registered Wi-Fi driver has no outbound WebSocket, send a short JSON hello on
    the UDP discovery port so the device can announce itself and we can reconnect.
    """
    try:
        interval = float(settings.get("wifi_driver_hello_interval_s", 10.0))
    except (TypeError, ValueError):
        interval = 10.0
    if interval <= 0:
        return

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setblocking(False)
    loop = asyncio.get_running_loop()
    try:
        while True:
            await asyncio.sleep(interval)
            if udp_holder.get("closing"):
                break
            try:
                dev = Device()
            except Exception as e:
                print(f"[hello] device list failed: {e!r}")
                continue
            for _mac_key, doc in list(dev.items()):
                if not isinstance(doc, dict):
                    continue
                if doc.get("transport") != "wifi":
                    continue
                ip = _ipv4_address(str(doc.get("address") or ""))
                if not ip:
                    continue
                if tcp_client_registry.tcp_client_connected(ip):
                    continue
                name = (doc.get("name") or "").strip()
                mac = normalize_mac(doc.get("id") or _mac_key)
                if not name or not mac:
                    continue
                line = (
                    json.dumps(
                        {"m": "hello", "device_name": name, "mac": mac},
                        separators=(",", ":"),
                    )
                    + "\n"
                )
                try:
                    await loop.sock_sendto(
                        sock, line.encode("utf-8"), (ip, DISCOVERY_UDP_PORT)
                    )
                except OSError as e:
                    print(f"[hello] UDP to {ip!r} failed: {e!r}")
    finally:
        try:
            sock.close()
        except OSError:
            pass


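The hello payload is built with compact separators so each datagram is a single short, newline-terminated line. A quick check of the exact wire format (the field values here are illustrative):

```python
import json

# Compact JSON line as produced by the hello loop: no spaces after
# "," or ":", one trailing newline per datagram.
line = json.dumps(
    {"m": "hello", "device_name": "strip-1", "mac": "aabbccddeeff"},
    separators=(",", ":"),
) + "\n"
print(line)  # {"m":"hello","device_name":"strip-1","mac":"aabbccddeeff"}
```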
async def _run_udp_discovery_server(udp_holder=None) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setblocking(False)
    try:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    except (AttributeError, OSError):
        pass
    try:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    except (AttributeError, OSError):
        pass
    sock.bind(("0.0.0.0", DISCOVERY_UDP_PORT))
    if udp_holder is not None:
        udp_holder["sock"] = sock
    print(f"UDP discovery listening on 0.0.0.0:{DISCOVERY_UDP_PORT}")
    try:
        await _handle_udp_discovery(sock, udp_holder)
    finally:
        if udp_holder is not None:
            udp_holder.pop("sock", None)
        try:
            sock.close()
        except Exception:
            pass


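The discovery server echoes every datagram back to its sender, which is what lets a device confirm the controller is reachable. A minimal loopback sketch of that echo handshake (the ephemeral port and device fields are illustrative; the real server binds 0.0.0.0:8766 and does far more per datagram):

```python
import json
import socket

# Stand-in echo endpoint on an ephemeral loopback port.
srv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
srv.bind(("127.0.0.1", 0))
port = srv.getsockname()[1]
srv.settimeout(2)

# Device-side hello: one newline-terminated JSON line.
cli = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
cli.settimeout(2)
hello = json.dumps({"m": "hello", "device_name": "strip-1", "mac": "aabbccddeeff"}) + "\n"
cli.sendto(hello.encode("utf-8"), ("127.0.0.1", port))

data, addr = srv.recvfrom(2048)
srv.sendto(data, addr)  # echo back, as _handle_udp_discovery does

echoed, _ = cli.recvfrom(2048)
print(echoed == hello.encode("utf-8"))  # True
srv.close()
cli.close()
```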
async def _send_bridge_wifi_channel(settings, sender):
    """Tell the serial ESP32 bridge to set STA channel (settings wifi_channel); not forwarded as ESP-NOW."""
    try:
        ch = int(settings.get("wifi_channel", 6))
    except (TypeError, ValueError):
        ch = 6
    ch = max(1, min(11, ch))
    payload = json.dumps({"m": "bridge", "ch": ch}, separators=(",", ":"))
    try:
        await sender.send(payload, addr="ffffffffffff")
        print(f"[startup] bridge Wi-Fi channel -> {ch}")
    except Exception as e:
        print(f"[startup] bridge channel message failed: {e}")


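The channel value is coerced defensively before being sent to the bridge. A quick sketch of the clamp used above (the function name is illustrative):

```python
def clamp_channel(raw, default=6):
    # Mirror the coercion in _send_bridge_wifi_channel: fall back to the
    # default on bad input, then clamp into the 1..11 STA channel range.
    try:
        ch = int(raw)
    except (TypeError, ValueError):
        ch = default
    return max(1, min(11, ch))

print(clamp_channel("13"))  # 11
print(clamp_channel(None))  # 6
print(clamp_channel(0))     # 1
```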
async def main(port=80):
@@ -28,6 +253,29 @@ async def main(port=80):
    set_sender(sender)

    app = Microdot()
    audio_detector = AudioBeatDetector()
    try:
        from util import audio_detector as audio_detector_module

        audio_detector_module.set_shared_beat_detector(audio_detector)
    except Exception as e:
        print(f"[startup] audio detector shared registration skipped: {e!r}")
    try:
        from util.audio_run_persist import coerce_audio_device, read_audio_run_state

        persisted = read_audio_run_state()
        if persisted.get("enabled"):
            dev = coerce_audio_device(persisted.get("device"))
            audio_detector.start(device=dev)
            print("[startup] audio beat detector started from saved run state")
    except Exception as e:
        print(f"[startup] audio auto-start skipped: {e!r}")
    from util import beat_driver_route

    beat_driver_route.set_beat_route_main_loop(asyncio.get_running_loop())
    from util import sequence_playback as seq_pb

    seq_pb.ensure_beat_consumer_started()

    # Initialize sessions with a secret key from settings
    secret_key = settings.get('session_secret_key', 'led-controller-secret-key-change-in-production')
@@ -40,7 +288,7 @@ async def main(port=80):
        ('/profiles', profile, 'profile'),
        ('/groups', group, 'group'),
        ('/sequences', sequence, 'sequence'),
        ('/tabs', tab, 'tab'),
        ('/zones', zone, 'zone'),
        ('/palettes', palette, 'palette'),
        ('/scenes', scene, 'scene'),
    ]
@@ -50,29 +298,136 @@ async def main(port=80):
    app.mount(profile.controller, '/profiles')
    app.mount(group.controller, '/groups')
    app.mount(sequence.controller, '/sequences')
    app.mount(tab.controller, '/tabs')
    app.mount(zone.controller, '/zones')
    app.mount(palette.controller, '/palettes')
    app.mount(scene.controller, '/scenes')
    app.mount(pattern.controller, '/patterns')
    app.mount(settings_controller.controller, '/settings')

    app.mount(device_controller.controller, '/devices')
    app.mount(led_tool_controller.controller, '/led-tool')

    tcp_client_registry.set_settings(settings)
    tcp_client_registry.set_tcp_status_broadcaster(broadcast_device_tcp_status)

    live_reload = _live_reload_enabled()
    dev_build_id = secrets.token_hex(12) if live_reload else None
    if live_reload:
        print(
            "[dev] LED_CONTROLLER_LIVE_RELOAD: browser refreshes when the server process restarts"
        )

    if dev_build_id:

        @app.route("/__dev/build-id")
        def dev_build_id_route(request):
            _ = request
            return (
                dev_build_id,
                200,
                {
                    "Content-Type": "text/plain; charset=utf-8",
                    "Cache-Control": "no-store",
                },
            )

    # Serve index.html at root (cwd is src/ when run via pipenv run run)
    @app.route("/")
    def index(request):
        """Serve the main web UI."""
        if dev_build_id:
            try:
                with open("templates/index.html", encoding="utf-8") as f:
                    html = f.read()
                tag = '<script src="/static/dev-live-reload.js" defer></script>'
                if "</body>" in html:
                    html = html.replace("</body>", tag + "\n</body>", 1)
                return html, 200, {"Content-Type": "text/html; charset=utf-8"}
            except OSError:
                pass
        return send_file("templates/index.html")

    # Serve settings page
    @app.route('/settings')
    def settings_page(request):
        """Serve the settings page."""
        return send_file('templates/settings.html')

    # Favicon: avoid 404 in browser console (no file needed)
    @app.route('/favicon.ico')
    def favicon(request):
        return '', 204

    @app.route('/api/audio/devices')
    async def audio_devices(request):
        _ = request
        try:
            return {
                "devices": audio_detector.list_input_devices(),
                "diagnostics": audio_detector.diagnostics(),
            }
        except Exception as e:
            return {"error": str(e)}, 500

    @app.route('/api/audio/start', methods=['POST'])
    async def audio_start(request):
        payload = request.json if isinstance(request.json, dict) else {}
        device = payload.get("device", None)
        if device in ("", None):
            device = None
        else:
            try:
                device = int(device)
            except (TypeError, ValueError):
                pass
        try:
            audio_detector.start(device=device)
            from util.audio_run_persist import write_audio_run_state

            write_audio_run_state(enabled=True, device=device)
            return {"ok": True, "status": audio_detector.status()}
        except Exception as e:
            return {"ok": False, "error": str(e)}, 500

    @app.route('/api/audio/stop', methods=['POST'])
    async def audio_stop(request):
        _ = request
        audio_detector.stop()
        from util.audio_run_persist import write_audio_run_state

        write_audio_run_state(enabled=False)
        return {"ok": True, "status": audio_detector.status()}

    @app.route('/api/audio/status')
    async def audio_status(request):
        _ = request
        from util import beat_driver_route
        from util import sequence_playback

        st = audio_detector.status()
        st["sequence"] = sequence_playback.playback_status()
        st["manual_beat_stride"] = beat_driver_route.manual_beat_stride_status()
        seq = st.get("sequence")
        beat_readout = ""
        if isinstance(seq, dict) and str(seq.get("beat_readout") or "").strip():
            beat_readout = str(seq.get("beat_readout") or "").strip()
        elif st.get("running"):
            mb = st.get("manual_beat_stride")
            if isinstance(mb, dict) and mb.get("active"):
                try:
                    n = int(mb.get("stride_n") or 1)
                except (TypeError, ValueError):
                    n = 1
                n = max(1, min(64, n))
                try:
                    bi = int(mb.get("beat_in_stride") or 1)
                except (TypeError, ValueError):
                    bi = 1
                pos = min(n, max(1, bi))
                beat_readout = f"{pos}/{n}"
            else:
                try:
                    bs = int(st.get("beat_seq") or 0)
                except (TypeError, ValueError):
                    bs = 0
                if bs > 0:
                    beat_readout = str(bs)
        st["beat_readout"] = beat_readout
        return {"status": st}

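The beat readout above clamps the stride values before formatting them as `pos/n`. A standalone sketch of that clamping logic (the function name is illustrative):

```python
def stride_readout(stride_n, beat_in_stride):
    # Mirrors the audio_status readout: clamp the stride length into
    # 1..64, clamp the position into 1..n, and format as "pos/n".
    try:
        n = int(stride_n or 1)
    except (TypeError, ValueError):
        n = 1
    n = max(1, min(64, n))
    try:
        bi = int(beat_in_stride or 1)
    except (TypeError, ValueError):
        bi = 1
    pos = min(n, max(1, bi))
    return f"{pos}/{n}"

print(stride_readout(8, 3))     # 3/8
print(stride_readout(100, 0))   # 1/64
print(stride_readout(None, 5))  # 1/1
```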
    # Static file route
    @app.route("/static/<path:path>")
    def static_handler(request, path):
@@ -85,41 +440,112 @@ async def main(port=80):
    @app.route('/ws')
    @with_websocket
    async def ws(request, ws):
        await register_device_status_ws(ws)
        await broadcast_device_tcp_snapshot_to(ws)
        try:
            while True:
                data = await ws.receive()
                print(data)
                if data:
                    try:
                        parsed = json.loads(data)
                        print("WS received JSON:", parsed)
                        # Optional "to": 12-char hex MAC; rest is payload (sent with that address).
                        addr = parsed.pop("to", None)
                        payload = json.dumps(parsed) if parsed else data
                        await sender.send(payload, addr=addr)
                    except json.JSONDecodeError:
                        # Not JSON: send raw with default address
                        try:
                            await sender.send(data)
                        except Exception:
                            try:
                                await ws.send(json.dumps({"error": "Send failed"}))
                            except Exception:
                                pass
                    except Exception:
                        try:
                            await ws.send(json.dumps({"error": "Send failed"}))
                        except Exception:
                            pass
                else:
                    break
        finally:
            await unregister_device_status_ws(ws)


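The `/ws` handler treats an optional `"to"` key as the 12-char hex MAC address and re-serializes the remaining keys as the payload. A standalone sketch of that envelope split (the function name is illustrative):

```python
import json

def split_envelope(raw):
    # Mirrors the /ws handler: pop the optional "to" address; if the
    # message is not JSON, fall through to the raw payload with no address.
    try:
        parsed = json.loads(raw)
    except json.JSONDecodeError:
        return None, raw
    addr = parsed.pop("to", None)
    payload = json.dumps(parsed) if parsed else raw
    return addr, payload

print(split_envelope('{"to": "aabbccddeeff", "m": "set", "v": 1}'))
# ('aabbccddeeff', '{"m": "set", "v": 1}')
print(split_envelope("PING"))  # (None, 'PING')
```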
    # Touch Device singleton early so db/device.json exists before first UDP hello.
    Device()
    await _send_bridge_wifi_channel(settings, sender)
    _prime_wifi_outbound_driver_connections()

    udp_holder = {"closing": False}
    loop = asyncio.get_running_loop()

    def _graceful_shutdown(*_args):
        print("[server] shutting down...")
        udp_holder["closing"] = True
        try:
            audio_detector.stop()
        except Exception:
            pass
        u = udp_holder.get("sock")
        if u is not None:
            try:
                u.close()
            except OSError:
                pass
        tcp_client_registry.cancel_all_driver_tasks()
        if getattr(app, "server", None) is not None:
            app.shutdown()

    shutdown_handlers_registered = False
    try:
        try:
            for sig in (signal.SIGINT, signal.SIGTERM):
                loop.add_signal_handler(sig, _graceful_shutdown)
            shutdown_handlers_registered = True
        except (NotImplementedError, RuntimeError):
            pass

        # Await HTTP + UDP discovery; bind failures (e.g. port 80 in use) surface here.
        try:
            await asyncio.gather(
                app.start_server(host="0.0.0.0", port=port),
                _run_udp_discovery_server(udp_holder),
                _periodic_wifi_driver_hello_loop(settings, udp_holder),
            )
        except OSError as e:
            if e.errno == errno.EADDRINUSE:
                print(
                    f"[server] bind failed (address already in use): {e!s}\n"
                    f"[server] HTTP is configured for port {port} (env PORT). "
                    f"Stop the other process or use a free port, e.g. PORT=8080 pipenv run run"
                )
            raise
    finally:
        try:
            audio_detector.stop()
        except Exception:
            pass
        srv = getattr(app, "server", None)
        if srv is not None:
            try:
                srv.close()
                await srv.wait_closed()
            except Exception:
                pass
        try:
            app.server = None
        except Exception:
            pass
        if shutdown_handlers_registered:
            for sig in (signal.SIGINT, signal.SIGTERM):
                try:
                    loop.remove_signal_handler(sig)
                except (NotImplementedError, OSError, ValueError):
                    pass


if __name__ == "__main__":
    import os
Some files were not shown because too many files have changed in this diff.