Compare commits
51 Commits
fd618d7714...main

Commits (author and date columns were empty in this view):
6c9e06f33b, c1c3e5d71b, c64dd736f2, cad0aa7e59, 0ae39ab94b, 822d9d8e01, 1db905eaae, 3d6ef5c7b4, 78a4ce009c, 7ccab6fbc4, 827eb97203, 3cca0cffc5, d36828bde2, ed0048c795, b316edbaf9, c1b0c41ef2, 3bb75d49de, 3d77cb448a, 49383c0003, 7d821b9c1c, 9b7e387ea6, b4f0d1891e, 0da30b6d6b, 6cbb728d9a, ff92451a76, 60485bc06a, f6f299c3e5, 66485f5c59, 5f9ff9bcc9, 35730b36f0, d516833cc3, 220be64dec, b433477c64, 43b7047c57, 167417d1ec, fb8141b320, 96712dda88, f5a7b42e7c, 1b1e9d727e, 668d29b786, e5f42e099e, a9edda38ef, edec5ff460, 264eb7296f, fbd4295302, 7bdb324ebc, 28b19b5219, 75ddd559c9, 5a1067263a, e67de6215a, 7179b6531e
.cursor/rules/led-driver.mdc — new file, 45 lines
@@ -0,0 +1,45 @@
---
description: led-driver — MicroPython ESP32: mpremote, imports, layout, I/O, no pycache in src
globs: led-driver/**
alwaysApply: false
---

# led-driver (MicroPython / ESP32)

## Device and tests

1. Validate **MicroPython behaviour** under **`led-driver/`** with **`mpremote connect <PORT> …`** on the chip. Host **`python3`** does **not** prove the firmware build.

2. **Execution target is fixed:** treat **`led-driver/`** code as firmware that runs **only on MicroPython ESP32 devices**. Do **not** run `led-driver/src/main.py` (or other firmware modules) with host CPython as a normal execution path.

3. **Flow:** `mpremote connect <PORT> cp <local> :<on-flash>` then `run <script>.py`. Inline commands only — no **`.sh`** wrappers unless the user asks. Default serial placeholder: **`/dev/ttyACM0`**.

4. Checks that **import and run** code from **`led-driver/src/`** belong in **`led-driver/tests/`** and run with **`mpremote run …`**. **Do not** add **`pytest`** under **`led-controller/tests/`** that **`sys.path`**-loads **`led-driver/src`** and runs those modules on CPython.
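The copy-then-run flow above can be driven from a host-side helper; a minimal sketch, assuming mpremote is installed and a board is attached (the test file name is illustrative):

```python
import subprocess

PORT = "/dev/ttyACM0"  # default serial placeholder from the rule

def mpremote(*args: str) -> list:
    """Build an inline `mpremote connect <PORT> ...` command (no .sh wrapper)."""
    return ["mpremote", "connect", PORT, *args]

# copy a module to flash, then run a device-side check
cp_cmd = mpremote("cp", "src/main.py", ":main.py")
run_cmd = mpremote("run", "tests/test_patterns.py")  # illustrative file name
# subprocess.run(cp_cmd, check=True)   # uncomment with a device attached
# subprocess.run(run_cmd, check=True)
```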
## Import layout

5. **No** **`sys.path.insert`**, **`__file__`** path stitching, or other import-path hacks under **`led-driver/`**. Use the device flash search path, or a host **`PYTHONPATH`** / layout you control.

6. **No** “import fixer” code — fix copy order, flash paths, or env instead.

## Imports (fail loudly)

7. If a dependency does not load, **crash** and fix deployment or the filesystem. **Do not** catch **`ImportError`** / **`ModuleNotFoundError`** around **`import`** / **`from … import`** for app/firmware modules (`settings`, `utils`, `network`, `machine`, …).

8. **Allowed — stdlib name pairs only** (MicroPython vs CPython): one **`except ImportError`**, then **one** fallback import, **no** extra logic in the **`except`**:
   - `uos` → `os`
   - `ubinascii` → `binascii`
   - `utime` → `time`

   Not for “maybe the file exists on flash” — only different **stdlib** names.

9. **No** large inline reimplementations after **`except ImportError`** — deploy the real module.
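The allowed stdlib-pair shape, shown here for the `utime` → `time` pair; this exact pattern also runs unchanged on host CPython:

```python
# One except ImportError, one fallback import, no extra logic in the except.
try:
    import utime as time  # MicroPython stdlib name
except ImportError:
    import time  # CPython stdlib name
```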
## I/O

10. Non-blocking **recv** / **accept**: use plain **`except OSError:`** (or **break** on empty). **No** errno / EAGAIN / EWOULDBLOCK tables or **`getattr(errno, …)`** unless fixing a **documented** target bug.

11. Minimal **`try` / `except OSError`** around optional socket options (e.g. **`SO_REUSEADDR`**) is fine.
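Both I/O rules can be exercised on host CPython too; a minimal sketch (the buffer size is arbitrary):

```python
import socket

# Non-blocking recv: plain `except OSError`, no errno tables.
a, b = socket.socketpair()
b.setblocking(False)
data = None
try:
    data = b.recv(512)  # nothing pending: raises an OSError subclass
except OSError:
    pass  # no data ready; carry on
a.close()
b.close()

# Optional socket option wrapped in a minimal try / except OSError.
srv = socket.socket()
try:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
except OSError:
    pass  # option unavailable on this target
srv.close()
```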
## Host Python and `src/`

12. **Do not** leave **`__pycache__/`** or **`.pyc`** under **`led-driver/src/`** from host runs. Remove them if created; **`.gitignore`** already ignores them. Prefer **`PYTHONDONTWRITEBYTECODE=1`** or **`-B`** when host Python must touch **`led-driver/src/`**.
.cursor/rules/pattern-workflow.mdc — new file, 14 lines
@@ -0,0 +1,14 @@
---
description: Require test pattern, pattern metadata, and test preset for new patterns
alwaysApply: true
---

# Pattern workflow requirements

1. When creating a new pattern under `led-driver/src/patterns/`, also add or update a corresponding test file in `led-driver/tests/patterns/`.

2. When adding a new pattern, ensure led-controller has `db/pattern.json`; if it does not exist, create it. Add the new pattern's metadata and parameter mappings there. Optionally set **`supports_manual`** to `false` when the pattern is a poor fit for manual mode or audio beat triggers (smooth/blended animations); omit it or set it to `true` otherwise.

3. When adding a new pattern, add at least one test preset entry in `db/preset.json` in led-controller that uses the new pattern.

4. For any pattern that supports both auto and manual modes, keep behaviour parity unless explicitly requested otherwise: background colour handling, colour-cycling order, and parameter timing semantics (e.g. `n2`/`n3` meaning) must match between auto and manual paths.
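Rule 2's `supports_manual` convention (absent means allowed) can be read with a small helper; a sketch, where the helper name and sample metadata are illustrative:

```python
import json

def supports_manual(meta: dict, pattern: str) -> bool:
    """Absent or true -> manual mode allowed; explicit false opts out."""
    return bool(meta.get(pattern, {}).get("supports_manual", True))

# sample metadata in the db/pattern.json shape
meta = json.loads('{"on": {"max_colors": 1}, "transition": {"supports_manual": false}}')
```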
.cursor/rules/strict-user-scope.mdc — new file, 16 lines
@@ -0,0 +1,16 @@
---
description: enforce strict user-scoped changes only
alwaysApply: true
---

# Strict User Scope

1. Only implement exactly what the user asked for in the current message.

2. Do not add extra refactors, cleanups, renames, architecture changes, or behavioural changes unless the user explicitly asked for them.

3. If a potential improvement is noticed, mention it briefly and ask before changing code.

4. For revert/undo requests, perform the narrowest possible revert and do not modify anything else.

5. Keep edits minimal and local to the requested area.
.cursor/rules/submodules-led-driver-tool.mdc — new file, 18 lines
@@ -0,0 +1,18 @@
---
description: Keep led-driver and led-tool git submodules in sync when updating led-controller
alwaysApply: true
---

# Submodule pointers (`led-driver`, `led-tool`)

This repo tracks **`led-driver`** and **`led-tool`** as git submodules (see `.gitmodules`).

When **led-controller** work should ship with matching firmware or CLI behaviour—or when you finish changes **inside** those submodule directories—**record the new submodule commits in the parent repo**:

1. In each submodule, commit and push to its remote if there are local commits (or ensure the checkout is the intended revision).
2. From the **led-controller** root: `git add led-driver led-tool` after their HEADs point at the right commits.
3. Include the parent-repo commit that bumps the gitlinks (so CI and clones get consistent trees).

**Do not** leave submodule directories dirty or forgotten while presenting the parent repo as “done”: either commit the submodule pointer update in led-controller, or leave an explicit note if the user must push submodule remotes first.

If the user only asked for a submodule bump with no code edits, a single `chore(submodules): bump led-driver and led-tool` style commit is appropriate (see commit rule).
.gitignore (vendored) — 16 lines changed
@@ -1,5 +1,7 @@
# Python
__pycache__/
# led-driver/src is MicroPython source — never keep host __pycache__ there (see .cursor/rules/led-driver.mdc)
led-driver/src/__pycache__/
*.py[cod]
*$py.class
*.so
@@ -23,8 +25,22 @@ ENV/
Thumbs.db

# Project specific
scripts/.led-controller-venv
docs/.help-print.html
settings.json
# Track shared JSON + preset binaries; ignore other db/*.json (e.g. device, zone) locally
db/*
!db/group.json
!db/palette.json
!db/pattern.json
!db/preset.json
!db/profile.json
!db/scene.json
!db/sequence.json
!db/presets/
!db/presets/*.bin
*.log
*.db
*.sqlite
.pytest_cache/
.ropeproject/
.gitmodules (vendored) — 3 lines changed
@@ -4,3 +4,6 @@
[submodule "led-tool"]
	path = led-tool
	url = git@git.technical.kiwi:technicalkiwi/led-tool.git
[submodule "led-simulator"]
	path = led-simulator
	url = git@git.technical.kiwi:technicalkiwi/led-simulator.git
Pipfile — 16 lines changed
@@ -13,17 +13,21 @@ requests = "*"
selenium = "*"
adafruit-ampy = "*"
microdot = "*"
websockets = "*"
numpy = "*"
sounddevice = "*"

[dev-packages]
pytest = "*"

[requires]
-python_version = "3.12"
+python_version = "3.11"

[scripts]
-web = "python /home/pi/led-controller/tests/web.py"
-watch = "python -m watchfiles 'python tests/web.py' src tests"
-install = "pipenv install"
+web = "python tests/web.py"
+watch = "python -m watchfiles \"python tests/web.py\" src tests"
run = "sh -c 'cd src && python main.py'"
-dev = "watchfiles \"sh -c 'cd src && python main.py'\" src"
help-pdf = "sh scripts/build_help_pdf.sh"
+dev = "python -m watchfiles \"sh -c 'cd src && LED_CONTROLLER_LIVE_RELOAD=1 python main.py'\" src"
test = "python -m pytest"
test-browser = "sh -c 'python tests/web.py > /tmp/led-controller-web.log 2>&1 & pid=$!; trap \"kill $pid\" EXIT; sleep 2; LED_CONTROLLER_RUN_BROWSER_TESTS=1 LED_CONTROLLER_DEVICE_IP=http://127.0.0.1:5000 python -m pytest tests/test_browser.py'"
test-browser-device = "sh -c 'LED_CONTROLLER_RUN_BROWSER_TESTS=1 python -m pytest tests/test_browser.py'"
Pipfile.lock (generated) — 846 lines changed; diff suppressed because it is too large.
README.md — 16 lines changed
@@ -1,23 +1,26 @@
# led-controller

-LED controller web app for managing profiles, tabs, presets, and colour palettes, and sending commands to LED devices over the serial -> ESP-NOW bridge.
+LED controller web app for managing profiles, **zones**, presets, and colour palettes, and sending commands to LED devices. Outbound paths include:
+
+- **Serial → ESP-NOW bridge**: JSON lines over UART to an ESP32 that forwards ESP-NOW frames (configure `serial_port` and baud in `settings.json` / Settings model).
+- **Wi-Fi LED drivers**: TCP JSON lines (default port **8765** on the Pi; drivers discover the controller via **UDP 8766** broadcast).
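The TCP path frames each command as one JSON object per line; a minimal host-side sketch, where the field names are illustrative (the real command schema is defined by the app):

```python
import json
import socket

def frame(command: dict) -> bytes:
    """Encode one command as a single JSON line for the TCP path."""
    return (json.dumps(command) + "\n").encode()

msg = frame({"pattern": "on", "brightness": 100})
# with a reachable Wi-Fi driver:
# with socket.create_connection(("10.1.1.215", 8765)) as conn:
#     conn.sendall(msg)
```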
## Run

- One-time setup for port 80 without root: `sudo scripts/setup-port80.sh`
-- Start app: `pipenv run run`
+- Start app: `pipenv run run` (override the listen port with the **`PORT`** environment variable)
- Dev watcher (auto-restart on `src/` changes): `pipenv run dev`
- Regenerate **`docs/help.pdf`** from **`docs/help.md`**: `pipenv run help-pdf` (requires **pandoc** and **chromium** on the host)

## UI modes

-- **Run mode**: focused control view. Select tabs/presets and apply profiles. Editing actions are hidden.
-- **Edit mode**: management view. Shows Tabs, Presets, Patterns, Colour Palette, and Send Presets controls, plus per-tile preset edit/remove and drag-reorder.
+- **Run mode**: focused control view. Select zones/presets and apply profiles. Editing actions are hidden.
+- **Edit mode**: management view. Shows **Zones**, Presets, Patterns, Colour Palette, and Send Presets controls, plus per-tile preset edit/remove and drag-reorder.

## Profiles

- Applying a profile updates session scope and refreshes the active zone content.
-- In **Run mode**, Profiles supports apply-only behavior (no create/clone/delete).
+- In **Run mode**, Profiles supports apply-only behaviour (no create/clone/delete).
- In **Edit mode**, Profiles supports create/clone/delete.
- Creating a profile always creates a populated `default` zone (starter presets).
- Optional **DJ zone** seeding creates:
@@ -35,3 +38,6 @@ LED controller web app for managing profiles, tabs, presets, and colour palettes

- Main API reference: `docs/API.md`

+## Driver pattern modules
+
+Pattern **`.py`** sources live under **`led-driver/src/patterns`**. The Pi app resolves that path via `util.driver_patterns.driver_patterns_dir()`. If you deploy without that tree next to the app, set **`LED_CONTROLLER_PATTERNS_DIR`** to the directory that contains those files.
@@ -1 +0,0 @@
-{"aabbccddeeff": {"id": "aabbccddeeff", "name": "one", "type": "led", "transport": "espnow", "address": "aabbccddeeff", "default_pattern": null, "zones": []}, "f0f5bdfd78b8": {"id": "f0f5bdfd78b8", "name": "a", "type": "led", "transport": "wifi", "address": "10.1.1.215", "default_pattern": null, "zones": []}}

@@ -1 +1 @@
-{"1": {"name": "Main Group", "devices": ["1", "2", "3"]}, "2": {"name": "Accent Group", "devices": ["4", "5"]}}
+{"1": {"name": "group1", "devices": ["e8f60a16fb00", "e8f60a170794"], "wifi_driver_display_name": "desk", "wifi_driver_num_leds": 59, "wifi_color_order": "rgb", "wifi_startup_mode": "default", "pattern": "on", "colors": ["000000", "FF0000"], "brightness": 100, "delay": 100, "step_offset": 0, "step_increment": 1, "n1": 0, "n2": 0, "n3": 0, "n4": 0, "n5": 0, "n6": 0, "n7": 0, "n8": 0, "output_brightness": 255}, "2": {"name": "group2", "devices": ["188b0e1560a8"], "wifi_driver_display_name": null, "wifi_driver_num_leds": null, "wifi_color_order": "rgb", "wifi_startup_mode": "default", "output_brightness": 255, "pattern": "on", "colors": ["000000", "FF0000"], "brightness": 100, "delay": 100, "step_offset": 0, "step_increment": 1, "n1": 0, "n2": 0, "n3": 0, "n4": 0, "n5": 0, "n6": 0, "n7": 0, "n8": 0}}
||||
343
db/pattern.json
343
db/pattern.json
@@ -1,54 +1,291 @@
|
||||
{
|
||||
"on": {
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 1
|
||||
},
|
||||
"off": {
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 0
|
||||
},
|
||||
"rainbow": {
|
||||
"n1": "Step Rate",
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 0
|
||||
},
|
||||
"transition": {
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 10
|
||||
},
|
||||
"chase": {
|
||||
"n1": "Colour 1 Length",
|
||||
"n2": "Colour 2 Length",
|
||||
"n3": "Step 1",
|
||||
"n4": "Step 2",
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 2
|
||||
},
|
||||
"pulse": {
|
||||
"n1": "Attack",
|
||||
"n2": "Hold",
|
||||
"n3": "Decay",
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 10
|
||||
},
|
||||
"circle": {
|
||||
"n1": "Head Rate",
|
||||
"n2": "Max Length",
|
||||
"n3": "Tail Rate",
|
||||
"n4": "Min Length",
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 2
|
||||
},
|
||||
"blink": {
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 10
|
||||
}
|
||||
}
|
||||
"on": {
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 1,
|
||||
"supports_manual": true
|
||||
},
|
||||
"off": {
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 0,
|
||||
"supports_manual": true
|
||||
},
|
||||
"rainbow": {
|
||||
"n1": "Step Rate",
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 0,
|
||||
"supports_manual": true
|
||||
},
|
||||
"colour_cycle": {
|
||||
"n1": "Step Rate",
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 10,
|
||||
"supports_manual": true
|
||||
},
|
||||
"transition": {
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 10,
|
||||
"supports_manual": false
|
||||
},
|
||||
"chase": {
|
||||
"n1": "Colour 1 Length",
|
||||
"n2": "Colour 2 Length",
|
||||
"n3": "Step 1",
|
||||
"n4": "Step 2",
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 2,
|
||||
"has_background": true,
|
||||
"supports_manual": true
|
||||
},
|
||||
"pulse": {
|
||||
"n1": "Attack",
|
||||
"n2": "Hold",
|
||||
"n3": "Decay",
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 10,
|
||||
"has_background": true,
|
||||
"supports_manual": true
|
||||
},
|
||||
"circle": {
|
||||
"n1": "Head Rate",
|
||||
"n2": "Max Length",
|
||||
"n3": "Tail Rate",
|
||||
"n4": "Min Length",
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 2,
|
||||
"has_background": true,
|
||||
"supports_manual": false
|
||||
},
|
||||
"blink": {
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 10,
|
||||
"has_background": true,
|
||||
"supports_manual": false
|
||||
},
|
||||
"flicker": {
|
||||
"n1": "Min brightness",
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 10,
|
||||
"supports_manual": true
|
||||
},
|
||||
"flame": {
|
||||
"n1": "Min brightness",
|
||||
"n2": "Breath period (ms)",
|
||||
"n3": "Spark gap min (ms, 0=default 10–30 s, -1=off)",
|
||||
"n4": "Spark gap max (ms)",
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 10,
|
||||
"supports_manual": false
|
||||
},
|
||||
"twinkle": {
|
||||
"n1": "Twinkle activity (1–255, higher = more changes)",
|
||||
"n2": "Density (0–255, higher = more of the strip lit)",
|
||||
"n3": "Min adjacent LEDs per twinkle (same as max for fixed length)",
|
||||
"n4": "Max adjacent LEDs per twinkle (same as min for fixed length)",
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 10,
|
||||
"has_background": true,
|
||||
"supports_manual": true
|
||||
},
|
||||
"radiate": {
|
||||
"n1": "Node spacing (LEDs)",
|
||||
"n2": "Out time (ms)",
|
||||
"n3": "In time (ms)",
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 10,
|
||||
"has_background": true,
|
||||
"supports_manual": true
|
||||
},
|
||||
"meteor_rain": {
|
||||
"n1": "Tail length",
|
||||
"n2": "Speed (LEDs per frame)",
|
||||
"n3": "Fade amount (1-255)",
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 10,
|
||||
"supports_manual": true
|
||||
},
|
||||
"scanner": {
|
||||
"n1": "Eye width",
|
||||
"n2": "End pause (frames)",
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 10,
|
||||
"has_background": true,
|
||||
"supports_manual": true
|
||||
},
|
||||
"gradient_scroll": {
|
||||
"n1": "Scroll step rate",
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"max_colors": 10,
|
||||
"supports_manual": true
|
||||
},
|
||||
"comet_dual": {
|
||||
"n1": "Tail length",
|
||||
"n2": "Speed",
|
||||
"n3": "Gap",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"has_background": true,
|
||||
"supports_manual": true
|
||||
},
|
||||
"sparkle_trail": {
|
||||
"n1": "Spark density",
|
||||
"n2": "Decay",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"supports_manual": true
|
||||
},
|
||||
"wave": {
|
||||
"n1": "Wavelength",
|
||||
"n2": "Amplitude",
|
||||
"n3": "Drift speed",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"supports_manual": false
|
||||
},
|
||||
"plasma": {
|
||||
"n1": "Scale",
|
||||
"n2": "Speed",
|
||||
"n3": "Contrast",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"supports_manual": false
|
||||
},
|
||||
"segment_chase": {
|
||||
"n1": "Segment size",
|
||||
"n2": "Phase step",
|
||||
"n3": "Segment phase offset",
|
||||
"n4": "Gap per segment",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"has_background": true,
|
||||
"supports_manual": true
|
||||
},
|
||||
"bar_graph": {
|
||||
"n1": "Level percent",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"has_background": true,
|
||||
"supports_manual": false
|
||||
},
|
||||
"breathing_dual": {
|
||||
"n1": "Phase offset",
|
||||
"n2": "Ease",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"supports_manual": false
|
||||
},
|
||||
"strobe_burst": {
|
||||
"n1": "Burst count",
|
||||
"n2": "Burst gap",
|
||||
"n3": "Cooldown",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"has_background": true,
|
||||
"supports_manual": true
|
||||
},
|
||||
"rain_drops": {
|
||||
"n1": "Drop rate",
|
||||
"n2": "Ripple width",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"has_background": true,
|
||||
"supports_manual": true
|
||||
},
|
||||
"fireflies": {
|
||||
"n1": "Count",
|
||||
"n2": "Twinkle speed",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"has_background": true,
|
||||
"supports_manual": true
|
||||
},
|
||||
"clock_sweep": {
|
||||
"n1": "Hand width",
|
||||
"n2": "Marker interval",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"has_background": true,
|
||||
"supports_manual": true
|
||||
},
|
||||
"marquee": {
|
||||
"n1": "On length",
|
||||
"n2": "Off length",
|
||||
"n3": "Step",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"has_background": true,
|
||||
"supports_manual": true
|
||||
},
|
||||
"aurora": {
|
||||
"n1": "Band count",
|
||||
"n2": "Shimmer",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"supports_manual": false
|
||||
},
|
||||
"snowfall": {
|
||||
"n1": "Flake density",
|
||||
"n2": "Fall speed",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"has_background": true,
|
||||
"supports_manual": true
|
||||
},
|
||||
"heartbeat": {
|
||||
"n1": "Pulse 1 ms",
|
||||
"n2": "Pulse 2 ms",
|
||||
"n3": "Pause ms",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"has_background": true,
|
||||
"supports_manual": true
|
||||
},
|
||||
"orbit": {
|
||||
"n1": "Orbit count",
|
||||
"n2": "Base speed",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"has_background": true,
|
||||
"supports_manual": true
|
||||
},
|
||||
"palette_morph": {
|
||||
"n1": "Morph ms",
|
||||
"n2": "Warp rate",
|
||||
"n3": "Turbulence",
|
||||
"max_colors": 10,
|
||||
"min_delay": 10,
|
||||
"max_delay": 10000,
|
||||
"supports_manual": false
|
||||
}
|
||||
}
|
||||
|
||||
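Every entry above carries `min_delay`, `max_delay`, and `max_colors`; a small consistency check in that spirit (hypothetical helper, not part of the repo):

```python
import json

REQUIRED = ("min_delay", "max_delay", "max_colors")

def missing_required(pattern_db_text: str) -> list:
    """Return names of patterns lacking any required metadata key."""
    db = json.loads(pattern_db_text)
    return [name for name, meta in db.items()
            if any(key not in meta for key in REQUIRED)]

sample = '{"on": {"min_delay": 10, "max_delay": 10000, "max_colors": 1}, "bad": {"min_delay": 10}}'
```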
File diff suppressed because one or more lines are too long.

New binary preset files added under db/presets/ (contents not shown): 1.bin–15.bin and 30.bin–62.bin.
@@ -1 +1 @@
|
||||
{"1": {"group_name": "Main Group", "presets": ["1", "2"], "sequence_duration": 3000, "sequence_transition": 500, "sequence_loop": true, "sequence_repeat_count": 0, "sequence_active": false, "sequence_index": 0, "sequence_start_time": 0}, "2": {"group_name": "Accent Group", "presets": ["2", "3"], "sequence_duration": 2000, "sequence_transition": 300, "sequence_loop": true, "sequence_repeat_count": 0, "sequence_active": false, "sequence_index": 0, "sequence_start_time": 0}}
{"1": {"group_name": "Main Group", "presets": ["1", "2"], "sequence_duration": 3000, "sequence_transition": 500, "sequence_loop": true, "sequence_repeat_count": 0, "sequence_active": false, "sequence_index": 0, "sequence_start_time": 0, "steps": [], "step_duration_ms": 3000, "loop": true, "name": "Main Group", "profile_id": "1", "lanes": [[{"preset_id": "42", "beats": 6}, {"preset_id": "5", "beats": 2}], [{"preset_id": "6", "beats": 1}]], "group_ids": ["1"], "advance_mode": "beats", "lanes_group_ids": [["1"], ["2"]]}, "2": {"group_name": "Accent Group", "presets": ["2", "3"], "sequence_duration": 2000, "sequence_transition": 300, "sequence_loop": true, "sequence_repeat_count": 0, "sequence_active": false, "sequence_index": 0, "sequence_start_time": 0, "steps": [{"preset_id": "2", "group_ids": [], "beats": 1}, {"preset_id": "3", "group_ids": [], "beats": 1}], "step_duration_ms": 2000, "loop": true, "name": "Accent Group", "profile_id": "1", "lanes": [[{"preset_id": "2", "group_ids": [], "beats": 1}, {"preset_id": "3", "group_ids": [], "beats": 1}]], "group_ids": [], "advance_mode": "time", "lanes_group_ids": [[]]}}
@@ -1 +0,0 @@
{"1": {"name": "default", "names": ["e", "c", "d", "a"], "presets": [["4", "2", "7"], ["3", "14", "5"], ["8", "10", "11"], ["9", "12", "1"], ["13", "37", "6"]], "presets_flat": ["4", "2", "7", "3", "14", "5", "8", "10", "11", "9", "12", "1", "13", "37", "6"], "default_preset": "15"}, "2": {"name": "default", "names": ["1", "2", "3", "4", "5", "6", "7", "8", "0", "a"], "presets": [["16", "17", "18"], ["19", "20", "21"], ["22", "23", "24"], ["25", "26", "27"], ["28", "29", "30"]], "presets_flat": ["16", "17", "18", "19", "20", "21", "22", "23", "24", "25", "26", "27", "28", "29", "30"]}, "3": {"name": "default", "names": ["1"], "presets": [], "default_preset": null}, "4": {"name": "default", "names": ["1"], "presets": [], "default_preset": null}, "5": {"name": "dj", "names": ["dj"], "presets": [["31", "32", "33"]], "default_preset": "31", "presets_flat": ["31", "32", "33"]}, "6": {"name": "default", "names": ["1"], "presets": [], "default_preset": null}, "7": {"name": "dj", "names": ["dj"], "presets": [["34", "35", "36"]], "default_preset": "34", "presets_flat": ["34", "35", "36"]}, "8": {"name": "test", "names": ["11"], "presets": [["1", "2", "3"], ["4", "5"]], "default_preset": "1", "presets_flat": ["1", "2", "3", "4", "5"]}}
53
dev.py
53
dev.py
@@ -1,53 +0,0 @@
#!/usr/bin/env python3

import subprocess
import serial
import sys

print(sys.argv)

# Extract port (first arg if it's not a command)
commands = ["src", "lib", "ls", "reset", "follow", "db"]
port = None
if len(sys.argv) > 1 and sys.argv[1] not in commands:
    port = sys.argv[1]


for cmd in sys.argv[1:]:
    print(cmd)
    match cmd:
        case "src":
            if port:
                subprocess.call(["mpremote", "connect", port, "fs", "cp", "-r", ".", ":"], cwd="src")
            else:
                print("Error: Port required for 'src' command")
        case "lib":
            if port:
                subprocess.call(["mpremote", "connect", port, "fs", "cp", "-r", "lib", ":"])
            else:
                print("Error: Port required for 'lib' command")
        case "ls":
            if port:
                subprocess.call(["mpremote", "connect", port, "fs", "ls", ":"])
            else:
                print("Error: Port required for 'ls' command")
        case "reset":
            if port:
                with serial.Serial(port, baudrate=115200) as ser:
                    ser.write(b'\x03\x03\x04')
            else:
                print("Error: Port required for 'reset' command")
        case "follow":
            if port:
                with serial.Serial(port, baudrate=115200) as ser:
                    while True:
                        if ser.in_waiting > 0:  # Check if there is data in the buffer
                            data = ser.readline().decode('utf-8').strip()  # Read and decode the data
                            print(data)
            else:
                print("Error: Port required for 'follow' command")
        case "db":
            if port:
                subprocess.call(["mpremote", "connect", port, "fs", "cp", "-r", "db", ":"])
            else:
                print("Error: Port required for 'db' command")
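The removed `dev.py` helper above is a thin wrapper over `mpremote`. As a sketch (not part of the repo), the same invocations can be built and inspected without spawning anything; the default port `/dev/ttyACM0` follows the project's serial placeholder, and `mpremote_cmd` is a hypothetical helper name:

```python
# Sketch: build the mpremote argv lists that dev.py spawned via subprocess.
# Port default and the helper name are assumptions, not project code.

def mpremote_cmd(action, port="/dev/ttyACM0"):
    """Return the mpremote argv for one dev.py action."""
    base = ["mpremote", "connect", port, "fs"]
    if action == "src":
        return base + ["cp", "-r", ".", ":"]  # dev.py ran this with cwd="src"
    if action in ("lib", "db"):
        return base + ["cp", "-r", action, ":"]
    if action == "ls":
        return base + ["ls", ":"]
    raise ValueError("unknown action: " + action)

print(mpremote_cmd("ls"))
```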
53
docs/API.md
@@ -2,10 +2,12 @@

This document covers:

1. **HTTP and WebSocket** exposed by the Raspberry Pi app (`src/main.py`) — profiles, presets, transport send, and related resources.
2. **LED driver JSON** — the compact message format sent over the serial→ESP-NOW bridge to devices (same logical API as ESP-NOW payloads).
1. **HTTP and WebSocket** exposed by the Raspberry Pi app (`src/main.py`) — profiles, zones, presets, transport send, pattern OTA helpers, and related resources.
2. **LED driver JSON** — the compact **v1** message format. It is sent over the **serial → ESP-NOW bridge** to ESP32 peers and as **single JSON text messages** over the **outbound WebSocket** to **Wi-Fi** drivers (same logical fields).

Default listen address: `0.0.0.0`. Port defaults to **80**; override with the `PORT` environment variable (see `pipenv run run`).
Default HTTP listen address: `0.0.0.0`. Port defaults to **80**; override with the **`PORT`** environment variable (see `pipenv run run`).

**Serial:** UART path and baud come from settings (defaults include `serial_port` such as `/dev/ttyS0` and `serial_baudrate`). **Wi-Fi drivers:** **UDP** on port **8766** is the **discovery** channel: each driver's JSON hello (**`device_name`**, **MAC**, optional **`type`**) **creates or updates** that device in **`db/device.json`** (keyed by MAC); the Pi echoes the datagram. After a valid hello with **`v`:** **`"1"`**, the Pi also opens an **outbound WebSocket** to that IP (**`wifi_driver_ws_port`**, default **80**; **`wifi_driver_ws_path`**, default **`/ws`**) for v1 commands; presets are not pushed automatically on connect (use **Send Presets** / profile apply). The Pi may send periodic UDP **hello** nudges to known Wi-Fi device IPs when the WebSocket is down (**`wifi_driver_hello_interval_s`** in settings).

All JSON APIs use `Content-Type: application/json` for bodies and responses unless noted.

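As a sketch of the discovery hello described in the serial/Wi-Fi paragraph above: the docs name the `device_name` and optional `type` fields and a `v` of `"1"`; the key used for the MAC (`"mac"` here) and the helper name are assumptions for illustration only:

```python
import json

# Sketch of a driver-side UDP discovery hello. `device_name`, `type`, and
# `v` come from the docs above; the "mac" key name is an assumption.
def build_hello(device_name, mac, dev_type="led"):
    return json.dumps(
        {"v": "1", "device_name": device_name, "mac": mac, "type": dev_type}
    ).encode("utf-8")

datagram = build_hello("kitchen", "a1b2c3d4e5f6")
# A real driver would send this datagram to UDP port 8766 on the Pi and
# treat the echoed datagram as an acknowledgement.
print(json.loads(datagram))
```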
@@ -16,7 +18,7 @@ All JSON APIs use `Content-Type: application/json` for bodies and responses unle
The main UI has two modes controlled by the mode toggle:

- **Run mode**: optimized for operation (zone/preset selection and profile apply).
- **Edit mode**: shows editing/management controls (tabs, presets, patterns, colour palette, send presets, profile management actions, **Devices** registry for LED driver names/MACs, and related tools).
- **Edit mode**: shows editing/management controls (zones, presets, patterns, colour palette, send presets, profile management actions, **Devices** registry for LED driver names/MACs, and related tools).

Profiles are available in both modes, but behavior differs:

@@ -40,7 +42,7 @@ Profiles are selected with **`POST /profiles/<id>/apply`**, which sets `current_
| Method | Path | Description |
|--------|------|-------------|
| GET | `/` | Main UI (`templates/index.html`) |
| GET | `/settings` | Settings page (`templates/settings.html`) |
| GET | `/settings/page` | Standalone settings page (`templates/settings.html`) |
| GET | `/favicon.ico` | Empty response (204) |
| GET | `/static/<path>` | Static files under `src/static/` |

@@ -50,10 +52,12 @@

Connect to **`ws://<host>:<port>/ws`**.

- Send **JSON**: the object is forwarded to the transport (serial bridge → ESP-NOW) as JSON. Optional key **`to`**: 12-character hex MAC address; if present it is removed from the object and the payload is sent to that peer; otherwise the default destination is used.
- Send **JSON**: the object is forwarded through the **serial sender** (6-byte MAC prefix + payload to the ESP-NOW bridge). Optional key **`to`**: 12-character hex MAC address; if present it is removed from the object and the payload is sent to that peer; otherwise the default destination from settings is used.
- Send **non-JSON text**: forwarded as raw bytes with the default address.
- On send failure, the server may reply with `{"error": "Send failed"}`.

Wi-Fi devices are not targeted by `/ws` directly; use **`POST /presets/send`**, device routes, or **`POST /patterns/<name>/send`** as appropriate.

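The `to`-key handling described above can be sketched in a few lines; the function name is hypothetical and only illustrates the documented behaviour (strip `to` when it is a 12-character hex MAC, otherwise fall back to the default destination):

```python
import json

# Sketch of the documented /ws handling: pop the optional `to` key and
# return (destination MAC bytes or None, remaining payload object).
def split_destination(obj):
    to = obj.pop("to", None)
    if isinstance(to, str) and len(to) == 12:
        return bytes.fromhex(to), obj
    return None, obj  # None means "use the default destination"

dest, body = split_destination({"to": "aabbccddeeff", "v": "1", "b": 128})
print(dest.hex(), json.dumps(body))
```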
---

## HTTP API by resource
@@ -68,7 +72,7 @@ Below, `<id>` values are string identifiers used by the JSON stores (numeric str
| PUT | `/settings/settings` | Merge keys into settings and save. Returns `{"message": "Settings updated successfully"}`. |
| GET | `/settings/wifi/ap` | Saved Wi-Fi AP fields: `saved_ssid`, `saved_password`, `saved_channel`, `active` (Pi: `active` is always false). |
| POST | `/settings/wifi/ap` | Body: `ssid` (required), `password`, `channel` (1–11). Persists AP-related settings. |
| GET | `/settings/page` | Serves `templates/settings.html` (same page as `GET /settings` from the root app, for convenience). |
| GET | `/settings/page` | Serves `templates/settings.html`. |

### Devices — `/devices`

@@ -77,11 +81,11 @@ Registry in `db/device.json`: storage key **`<id>`** (string, e.g. `"1"`) maps t

| Field | Description |
|-------|-------------|
| **`id`** | Same as the storage key (stable handle for URLs). |
| **`name`** | Shown in tabs and used in `select` keys. |
| **`name`** | Shown in the UI and used in `select` keys. |
| **`type`** | `led` (only value today; extensible). |
| **`transport`** | `espnow` or `wifi`. |
| **`address`** | For **`espnow`**: optional 12-character lowercase hex MAC. For **`wifi`**: optional IP or hostname string. |
| **`default_pattern`**, **`tabs`** | Optional, as before. |
| **`default_pattern`**, **`zones`** | Optional. Legacy **`tabs`** may still appear in old files and is migrated away on load. |

Existing records without `type` / `transport` / `id` are backfilled on load (`led`, `espnow`, and `id` = key).

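The load-time backfill described above amounts to filling three defaults per record; a minimal sketch (the function name is not project code):

```python
# Sketch of the documented backfill: records missing `type` / `transport` /
# `id` get `led`, `espnow`, and the storage key, respectively.
def backfill(devices):
    for key, dev in devices.items():
        dev.setdefault("type", "led")
        dev.setdefault("transport", "espnow")
        dev.setdefault("id", key)
    return devices

print(backfill({"1": {"name": "strip"}}))
```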
@@ -89,7 +93,7 @@ Existing records without `type` / `transport` / `id` are backfilled on load (`le
|--------|------|-------------|
| GET | `/devices` | Map of device id → device object. |
| GET | `/devices/<id>` | One device, 404 if missing. |
| POST | `/devices` | Create. Body: **`name`** (required), **`type`** (default `led`), **`transport`** (default `espnow`), optional **`address`**, **`default_pattern`**, **`tabs`**. Returns `{ "<id>": { ... } }`, 201. |
| POST | `/devices` | Create. Body: **`name`** (required), **`type`** (default `led`), **`transport`** (default `espnow`), optional **`address`**, **`default_pattern`**, **`zones`**. Returns `{ "<id>": { ... } }`, 201. |
| PUT | `/devices/<id>` | Partial update. **`name`** cannot be cleared. **`id`** in the body is ignored. **`type`** / **`transport`** validated; **`address`** normalised for the resulting transport. |
| DELETE | `/devices/<id>` | Remove device. |

@@ -102,7 +106,7 @@ Existing records without `type` / `transport` / `id` are backfilled on load (`le
| GET | `/profiles/<id>` | Single profile. If `<id>` is `current`, same as `/profiles/current`. |
| POST | `/profiles` | Create profile. Body may include `name` and other fields. Optional `seed_dj_zone` (request-only) seeds a DJ zone + presets. New profiles always get a populated `default` zone. Returns `{ "<id>": { ... } }` with status 201. |
| POST | `/profiles/<id>/apply` | Sets session current profile to `<id>`. |
| POST | `/profiles/<id>/clone` | Clone profile (tabs, palettes, presets). Body may include `name`. |
| POST | `/profiles/<id>/clone` | Clone profile (zones, palettes, presets). Body may include `name`. |
| PUT | `/profiles/current` | Update the current profile (from session). |
| PUT | `/profiles/<id>` | Update profile by id. |
| DELETE | `/profiles/<id>` | Delete profile. |
@@ -143,11 +147,11 @@ Stored preset records can include:
- `colors`: resolved hex colours for editor/display.
- `palette_refs`: optional array of palette indexes parallel to `colors`. If a slot contains an integer index, the colour is linked to the current profile palette at that index.

### Tabs — `/zones`
### Zones — `/zones`

| Method | Path | Description |
|--------|------|-------------|
| GET | `/zones` | `tabs`, `zone_order`, `current_zone_id`, `profile_id` for the session-backed profile. |
| GET | `/zones` | `zones` (map of zone id → zone object), `zone_order`, `current_zone_id`, `profile_id` for the session-backed profile. |
| GET | `/zones/current` | Current zone from cookie/session. |
| POST | `/zones` | Create zone; optional JSON `name`, `names`, `presets`; can append to current profile's zone list. |
| GET | `/zones/<id>` | Zone JSON. |
@@ -198,20 +202,33 @@ Stored preset records can include:

### Patterns — `/patterns`

Pattern metadata lives in **`db/pattern.json`**; driver source files live under **`led-driver/src/patterns/`**. Several routes expose a **runtime map** (metadata merged with on-disk `.py` names so new files appear in menus).

| Method | Path | Description |
|--------|------|-------------|
| GET | `/patterns/definitions` | Contents of `pattern.json` (pattern metadata for the UI). |
| GET | `/patterns` | All pattern records. |
| GET | `/patterns/<id>` | One pattern. |
| GET | `/patterns` | Runtime pattern map (object keyed by pattern id). |
| GET | `/patterns/definitions` | Same runtime map (intended for UI "definitions" clients). |
| GET | `/patterns/ota/manifest` | JSON `{"files":[{"name":"blink.py","url":"http://<Host>/patterns/ota/file/blink.py"},...]}` for OTA pulls. Requires **`Host`** header. |
| GET | `/patterns/ota/file/<name>` | Raw **`.py`** source for one driver pattern (`name` must be a safe filename, e.g. `rainbow.py`). |
| POST | `/patterns/<name>/send` | Push a **manifest** JSON line to **Wi-Fi** devices so they pull one pattern file over HTTP. Body may include **`device_id`** to target one device; otherwise all Wi-Fi devices with an **`address`** are tried. **`<name>`** may be with or without `.py`. |
| POST | `/patterns/upload` | Body JSON: **`name`**, **`code`**, optional **`overwrite`** (default true). Writes **`led-driver/src/patterns/<name>.py`**. |
| POST | `/patterns/driver` | Body JSON: **`name`** (identifier), **`code`**, optional metadata (`min_delay`, `max_delay`, `max_colors`, `n1`…`n8`, **`overwrite`**). Creates/updates both the **`.py`** file and **`db/pattern.json`** via the Pattern model. |
| GET | `/patterns/<id>` | One pattern record from the Pattern model (metadata only). |
| POST | `/patterns` | Create (`name`, optional `data`). |
| PUT | `/patterns/<id>` | Update. |
| DELETE | `/patterns/<id>` | Delete. |

**Devices — pattern OTA push**

| Method | Path | Description |
|--------|------|-------------|
| POST | `/devices/<id>/patterns/push` | Wi-Fi only. Asks the driver at **`address`** to pull pattern files from this server. Optional body **`manifest`**: either a **URL string** pointing at a manifest JSON document, or a **manifest object** (same shape as in driver messages). If omitted, a default manifest is built from the request **`Host`** header. |

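As a sketch of the OTA manifest shape quoted in the table above, built from a `Host` header value; the helper name and host are illustrative only:

```python
# Sketch: build the `/patterns/ota/manifest` document shape shown above
# for a list of pattern filenames. Helper name and host are assumptions.
def build_manifest(host, names):
    return {
        "files": [
            {"name": n, "url": "http://%s/patterns/ota/file/%s" % (host, n)}
            for n in names
        ]
    }

m = build_manifest("192.168.1.10", ["blink.py"])
print(m["files"][0]["url"])
```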
---

## LED driver message format (transport / ESP-NOW)
## LED driver message format (transport / ESP-NOW / Wi-Fi)

Messages are JSON objects. The Pi **`build_message()`** helper (`src/util/espnow_message.py`) produces the same shape sent over serial and forwarded by the ESP32 bridge.
Messages are JSON objects. The Pi **`build_message()`** helper (`src/util/espnow_message.py`) produces the same shape sent over serial and forwarded by the ESP32 bridge, and the same logical object can be sent as a **single JSON text message** to a Wi-Fi driver over the **WebSocket**.

### Top-level fields

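The serial framing mentioned above (6-byte destination MAC prefix followed by the UTF-8 JSON payload, one write per frame) can be sketched as follows; the message body here is illustrative, not a full v1 message:

```python
import json

# Sketch of the serial framing: 6-byte destination MAC + UTF-8 JSON payload.
# The body fields here are illustrative only.
def frame(dest_mac_hex, body):
    return bytes.fromhex(dest_mac_hex) + json.dumps(body).encode("utf-8")

pkt = frame("aabbccddeeff", {"v": "1", "b": 128})
print(pkt[:6].hex(), pkt[6:].decode())
```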
@@ -350,7 +350,7 @@ Manage connected devices and create/manage device groups.

#### Layout
- **Header:** Title with "Add Device" button
- **Tabs:** Devices and Groups tabs
- **Zones:** Devices and Groups zones (zone buttons / zone strip)
- **Content Area:** Zone-specific content

#### Devices Zone
@@ -1774,7 +1774,7 @@ peak_mem = usqlite.mem_peak()
- Buttons respond to clicks
- Sliders update values
- Modals open/close
- Tabs switch correctly
- Zone buttons switch correctly
- Preset selector works
- Preset creation form validates input
- Preset cards display correctly

20
docs/help.md
@@ -1,6 +1,6 @@
# LED controller — user guide

This page describes the **main web UI** served from the Raspberry Pi app: profiles, tabs, presets, colour palettes, and sending commands to LED devices over the serial → ESP-NOW bridge.
This page describes the **main web UI** served from the Raspberry Pi app: profiles, **zones**, presets, colour palettes, and sending commands to LED devices. Traffic may go over the **serial → ESP-NOW bridge** or **Wi-Fi** (TCP to drivers on the LAN), depending on each device's transport.

For HTTP routes and the wire format the driver expects, see **[API.md](API.md)**. For running the app locally, see the project **README**.

@@ -12,24 +12,24 @@ Figures below are **schematic** (layout and ideas), not pixel-perfect screenshot

The header has a mode toggle (desktop and mobile menu). The **label on the button is the mode you switch to** when you press it.

*The active zone is highlighted. Extra management buttons appear only in Edit mode.*

| Mode | Purpose |
|------|--------|
| **Run mode** | Day-to-day control: choose a zone, tap presets, apply profiles. Management buttons are hidden. |
| **Edit mode** | Full setup: tabs, presets, patterns, colour palette, **Send Presets**, profile create/clone/delete, preset reordering, and per-tile **Edit** on the strip. |
| **Edit mode** | Full setup: zones, presets, patterns, colour palette, **Send Presets**, profile create/clone/delete, preset reordering, and per-tile **Edit** on the strip. |

**Profiles** is available in both modes: in Run mode you can only **apply** a profile; in Edit mode you can also **create**, **clone**, and **delete** profiles.

---

## Tabs
## Zones

- **Select a zone**: click its button in the top bar. The main area shows that zone's preset strip and controls.
- **Edit mode — open zone settings**: **right-click** a zone button to change its name, **device IDs** (comma-separated), and which presets appear on the zone. Device identifiers are matched to each device's **name** when the app builds `select` messages for the driver.
- **Tabs modal** (Edit mode): create new tabs from the header **Tabs** button. New tabs need a name and device ID list (defaults to `1` if you leave a simple placeholder).
- **Zones modal** (Edit mode): create new zones from the header **Zones** button. New zones need a name and device ID list (defaults to `1` if you leave a simple placeholder).
- **Brightness slider** (per zone): adjusts **global** brightness sent to devices (`b` in the driver message), with a short debounce so small drags do not flood the link.

---
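The brightness debounce mentioned above can be sketched as a simple rate gate: drop sends that arrive within a minimum interval of the last accepted one. The class name and the 200 ms interval are assumptions, not values from the app:

```python
import time

# Sketch of a send debounce: only let a value through when `interval`
# seconds have passed since the last accepted send. Interval is assumed.
class Debouncer:
    def __init__(self, interval=0.2, clock=time.monotonic):
        self.interval = interval
        self.clock = clock  # injectable for testing
        self._last = None

    def ready(self):
        now = self.clock()
        if self._last is None or now - self._last >= self.interval:
            self._last = now
            return True
        return False
```

Slider handlers would call `ready()` per drag event and only transmit the `b` value when it returns `True`.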
@@ -68,7 +68,7 @@ The **Presets** header button (Edit mode) opens a **profile-wide** list: **Add**

## Profiles

- **Apply**: sets the **current profile** in your session. Tabs and presets you see are scoped to that profile.
- **Apply**: sets the **current profile** in your session. Zones and presets you see are scoped to that profile.
- **Edit mode — Create**: new profiles always get a populated **default** zone. Optionally tick **DJ zone** to also create a `dj` zone (device name `dj`) with starter DJ-oriented presets.
- **Clone** / **Delete**: available in Edit mode from the profile list.

@@ -82,7 +82,9 @@ The **Presets** header button (Edit mode) opens a **profile-wide** list: **Add**

## Patterns

The **Patterns** dialog (Edit mode) is a **read-only reference**: pattern names and typical **delay** ranges from the pattern definitions. It does not change device behaviour by itself; patterns are chosen inside the preset editor.
The **Patterns** dialog (Edit mode) lists pattern names and typical **delay** ranges from the pattern definitions. Choosing a pattern still happens inside the preset editor.

**Wi-Fi drivers** can install new pattern modules over HTTP: the REST API exposes **`/patterns/ota/*`**, **`POST /patterns/<name>/send`**, **`POST /patterns/upload`**, and **`POST /patterns/driver`** (see [API.md](API.md)). ESP-NOW devices follow the bridge/serial path you configure for preset traffic.

---

@@ -98,7 +100,7 @@ The **Patterns** dialog (Edit mode) is a **read-only reference**: pattern names

## Mobile layout

On narrow screens, use **Menu** to reach the same actions as the desktop header (Profiles, Tabs, Presets, Help, mode toggle, etc.).
On narrow screens, use **Menu** to reach the same actions as the desktop header (Profiles, Zones, Presets, Help, mode toggle, etc.).

@@ -108,5 +110,5 @@ On narrow screens, use **Menu** to reach the same actions as the desktop header

## Further reading

- **[API.md](API.md)** — REST routes, session scoping, WebSocket `/ws`, and LED driver JSON (`presets`, `select`, `save`, `default`, pattern keys).
- **[API.md](API.md)** — REST routes, session scoping, WebSocket `/ws`, and LED driver JSON (`presets`, `select`, `save`, `default`, pattern keys, pattern **manifest**).
- **README** — `pipenv run run`, port 80 setup, and high-level behaviour.
BIN
docs/help.pdf
Binary file not shown.
@@ -1,112 +0,0 @@
# Benchmark: LRU eviction vs add-then-remove-after-use on ESP32.
# Run on device: mpremote run esp32/benchmark_peers.py
# (add/del_peer are timed; send() may fail if no peer is listening - timing still valid)
import espnow
import network
import time

BROADCAST = b"\xff\xff\xff\xff\xff\xff"
MAX_PEERS = 20
ITERATIONS = 50
PAYLOAD = b"x" * 32  # small payload

network.WLAN(network.STA_IF).active(True)
esp = espnow.ESPNow()
esp.active(True)
esp.add_peer(BROADCAST)

# Build 19 dummy MACs so we have 20 peers total (broadcast + 19).
def mac(i):
    return bytes([0, 0, 0, 0, 0, i])

peers_list = [mac(i) for i in range(1, 20)]
for p in peers_list:
    esp.add_peer(p)

# One "new" MAC we'll add/remove.
new_mac = bytes([0, 0, 0, 0, 0, 99])


def bench_lru():
    """LRU: ensure_peer (evict oldest + add new), send, update last_used."""
    last_used = {BROADCAST: time.ticks_ms()}
    for p in peers_list:
        last_used[p] = time.ticks_ms()
    # Pre-remove one so we have 19; ensure_peer(new) will add 20th.
    esp.del_peer(peers_list[-1])
    last_used.pop(peers_list[-1], None)
    # Now 19 peers. Each iteration: ensure_peer(new) -> add_peer(new), send, update.
    # Next iter: ensure_peer(new) -> already there, just send. So we need to force
    # eviction each time: use a different "new" each time so we always evict+add.
    t0 = time.ticks_us()
    for i in range(ITERATIONS):
        addr = bytes([0, 0, 0, 0, 0, 50 + (i % 30)])  # 30 different "new" MACs
        peers = esp.get_peers()
        peer_macs = [p[0] for p in peers]
        if addr not in peer_macs:
            if len(peer_macs) >= MAX_PEERS:
                oldest_mac = None
                oldest_ts = time.ticks_ms()
                for m in peer_macs:
                    if m == BROADCAST:
                        continue
                    ts = last_used.get(m, 0)
                    if ts <= oldest_ts:
                        oldest_ts = ts
                        oldest_mac = m
                if oldest_mac is not None:
                    esp.del_peer(oldest_mac)
                    last_used.pop(oldest_mac, None)
            esp.add_peer(addr)
        esp.send(addr, PAYLOAD)
        last_used[addr] = time.ticks_ms()
    t1 = time.ticks_us()
    return time.ticks_diff(t1, t0)


def bench_add_then_remove():
    """Add peer, send, del_peer (remove after use). At 20 we must del one first."""
    # Start full: 20 peers. To add new we del any one, add new, send, del new.
    victim = peers_list[0]
    t0 = time.ticks_us()
    for i in range(ITERATIONS):
        esp.del_peer(victim)  # make room
        esp.add_peer(new_mac)
        esp.send(new_mac, PAYLOAD)
        esp.del_peer(new_mac)
        esp.add_peer(victim)  # put victim back so we're at 20 again
    t1 = time.ticks_us()
    return time.ticks_diff(t1, t0)


def bench_send_existing():
    """Baseline: send to existing peer only (no add/del)."""
    t0 = time.ticks_us()
    for _ in range(ITERATIONS):
        esp.send(peers_list[0], PAYLOAD)
    t1 = time.ticks_us()
    return time.ticks_diff(t1, t0)


print("ESP-NOW peer benchmark ({} iterations)".format(ITERATIONS))
print()

# Baseline: send to existing peer
try:
    us = bench_send_existing()
    print("Send to existing peer only: {:>8} us total {:>7.1f} us/iter".format(us, us / ITERATIONS))
except Exception as e:
    print("Send existing failed:", e)
print()

# LRU: evict oldest then add new, send
try:
    us = bench_lru()
    print("LRU (evict oldest + add + send): {:>8} us total {:>7.1f} us/iter".format(us, us / ITERATIONS))
except Exception as e:
    print("LRU failed:", e)
print()

# Add then remove after use
try:
    us = bench_add_then_remove()
    print("Add then remove after use: {:>8} us total {:>7.1f} us/iter".format(us, us / ITERATIONS))
except Exception as e:
    print("Add-then-remove failed:", e)
print()
print("Done.")
253
esp32/main.py
@@ -1,253 +0,0 @@
# Serial-to-ESP-NOW bridge: JSON in both directions on UART + ESP-NOW.
#
# Pi → UART (two supported forms):
#   A) Legacy: 6 bytes destination MAC + UTF-8 JSON payload (one write = one frame).
#   B) Newline JSON: one object per line, UTF-8, ending with \n
#      - Multicast via ESP32: {"m":"split","peers":["12hex",...],"body":{...}}
#      - Unicast / broadcast: {"to":"12hex","v":"1",...} (all keys except to/dest go to peers)
#
# ESP-NOW → Pi: newline-delimited JSON, one object per packet:
#   {"dir":"espnow_rx","from":"<12hex>","payload":{...}}      if body was JSON
#   {"dir":"espnow_rx","from":"<12hex>","payload_text":"..."} if UTF-8 not JSON
#   {"dir":"espnow_rx","from":"<12hex>","payload_b64":"..."}  if binary
from machine import Pin, UART
import espnow
import json
import network
import time
import ubinascii

UART_BAUD = 912000
BROADCAST = b"\xff\xff\xff\xff\xff\xff"
MAX_PEERS = 20
WIFI_CHANNEL = 6

sta = network.WLAN(network.STA_IF)
sta.active(True)
sta.config(pm=network.WLAN.PM_NONE, channel=WIFI_CHANNEL)
print("WiFi STA channel:", sta.config("channel"), "(WIFI_CHANNEL=%s)" % WIFI_CHANNEL)

esp = espnow.ESPNow()
esp.active(True)
esp.add_peer(BROADCAST)

uart = UART(1, UART_BAUD, tx=Pin(21), rx=Pin(6))

last_used = {BROADCAST: time.ticks_ms()}
uart_rx_buf = b""

ESP_ERR_ESPNOW_EXIST = -12395


def ensure_peer(addr):
    peers = esp.get_peers()
    peer_macs = [p[0] for p in peers]
    if addr in peer_macs:
        return
    if len(peer_macs) >= MAX_PEERS:
        oldest_mac = None
        oldest_ts = time.ticks_ms()
        for mac in peer_macs:
            if mac == BROADCAST:
                continue
            ts = last_used.get(mac, 0)
            if ts <= oldest_ts:
                oldest_ts = ts
                oldest_mac = mac
        if oldest_mac is not None:
            esp.del_peer(oldest_mac)
            last_used.pop(oldest_mac, None)
    try:
        esp.add_peer(addr)
    except OSError as e:
        if e.args[0] != ESP_ERR_ESPNOW_EXIST:
            raise


def try_apply_bridge_config(obj):
    """Pi sends {"m":"bridge","ch":1..11} — set STA channel only; do not ESP-NOW forward."""
    if not isinstance(obj, dict) or obj.get("m") != "bridge":
        return False
    ch = obj.get("ch")
    if ch is None:
        ch = obj.get("wifi_channel")
    if ch is None:
        return True
    try:
        n = int(ch)
        if 1 <= n <= 11:
            sta.config(pm=network.WLAN.PM_NONE, channel=n)
            print("Bridge STA channel ->", n)
    except Exception as e:
        print("bridge config:", e)
    return True


def send_split_from_obj(obj):
    """obj has m=split, peers=[12hex,...], body=dict."""
    body = obj.get("body")
    if body is None:
        return
    try:
        out = json.dumps(body).encode("utf-8")
    except (TypeError, ValueError):
        return
    for peer in obj.get("peers") or []:
        if not isinstance(peer, str) or len(peer) != 12:
            continue
        try:
            mac = bytes.fromhex(peer)
        except ValueError:
            continue
        if len(mac) != 6:
            continue
        ensure_peer(mac)
        esp.send(mac, out)
        last_used[mac] = time.ticks_ms()


def process_broadcast_payload_split_or_flood(payload):
    try:
        text = payload.decode("utf-8")
        obj = json.loads(text)
    except Exception:
        obj = None
    if isinstance(obj, dict) and try_apply_bridge_config(obj):
        return
    if (
        isinstance(obj, dict)
        and obj.get("m") == "split"
        and isinstance(obj.get("peers"), list)
    ):
        send_split_from_obj(obj)
        return
    ensure_peer(BROADCAST)
    esp.send(BROADCAST, payload)
    last_used[BROADCAST] = time.ticks_ms()


def process_legacy_uart_frame(data):
    if not data or len(data) < 6:
        return
    addr = data[:6]
    payload = data[6:]
    if addr == BROADCAST:
        process_broadcast_payload_split_or_flood(payload)
        return
    ensure_peer(addr)
    esp.send(addr, payload)
    last_used[addr] = time.ticks_ms()


def handle_json_command_line(obj):
    if not isinstance(obj, dict):
        return
    if try_apply_bridge_config(obj):
        return
    if obj.get("m") == "split" and isinstance(obj.get("peers"), list):
        send_split_from_obj(obj)
        return
    to = obj.get("to") or obj.get("dest")
    if isinstance(to, str) and len(to) == 12:
        try:
            mac = bytes.fromhex(to)
        except ValueError:
            return
        if len(mac) != 6:
            return
        body = {k: v for k, v in obj.items() if k not in ("to", "dest")}
        if not body:
            return
        try:
            out = json.dumps(body).encode("utf-8")
        except (TypeError, ValueError):
            return
        ensure_peer(mac)
        esp.send(mac, out)
        last_used[mac] = time.ticks_ms()


def drain_uart_json_lines():
    """Parse leading newline-delimited JSON objects from uart_rx_buf; leave rest."""
    global uart_rx_buf
    while True:
        s = uart_rx_buf.lstrip()
        if not s:
            uart_rx_buf = b""
            return
        if s[0] != ord("{"):
            uart_rx_buf = s
            return
        nl = s.find(b"\n")
        if nl < 0:
            uart_rx_buf = s
            return
        line = s[:nl].strip()
        uart_rx_buf = s[nl + 1 :]
        if line:
            try:
                text = line.decode("utf-8")
                obj = json.loads(text)
                handle_json_command_line(obj)
            except Exception as e:
                print("UART JSON line error:", e)
        # continue; there may be another JSON line in buffer


def drain_uart_legacy_frame():
    """If buffer does not start with '{', treat whole buffer as one 6-byte MAC + JSON frame."""
    global uart_rx_buf
    s = uart_rx_buf
    if not s or s[0] == ord("{"):
        return
    if len(s) < 6:
        return
    data = s
    uart_rx_buf = b""
    process_legacy_uart_frame(data)


def forward_espnow_to_uart(mac, msg):
    peer_hex = ubinascii.hexlify(mac).decode()
    try:
        text = msg.decode("utf-8")
        try:
            payload = json.loads(text)
            line_obj = {"dir": "espnow_rx", "from": peer_hex, "payload": payload}
        except ValueError:
            line_obj = {"dir": "espnow_rx", "from": peer_hex, "payload_text": text}
    except UnicodeDecodeError:
        line_obj = {
            "dir": "espnow_rx",
            "from": peer_hex,
            "payload_b64": ubinascii.b64encode(msg).decode(),
        }
    try:
        line = json.dumps(line_obj) + "\n"
        uart.write(line.encode("utf-8"))
    except Exception as e:
        print("UART TX error:", e)


print("Starting ESP32 bridge (UART JSON + legacy MAC+JSON, ESP-NOW RX → UART JSON lines)")
|
||||
|
||||
while True:
|
||||
idle = True
|
||||
if uart.any():
|
||||
idle = False
|
||||
uart_rx_buf += uart.read()
|
||||
drain_uart_json_lines()
|
||||
drain_uart_legacy_frame()
|
||||
|
||||
try:
|
||||
peer, msg = esp.recv(0)
|
||||
except OSError:
|
||||
peer, msg = None, None
|
||||
|
||||
if peer is not None and msg is not None:
|
||||
idle = False
|
||||
if len(peer) == 6:
|
||||
forward_espnow_to_uart(peer, msg)
|
||||
|
||||
if idle:
|
||||
time.sleep_ms(1)
|
||||
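The two UART command shapes the bridge accepts (per handle_json_command_line above) can be produced on the host side. A minimal sketch, assuming the host writes these frames to the bridge's serial port; the encode_* helper names are hypothetical, and the split-body handling mirrors the unicast rule (routing keys stripped before the payload goes on air):

```python
import json


def encode_unicast(mac_hex, body):
    # One newline-delimited JSON line; the bridge strips the "to"/"dest"
    # routing key and forwards the remaining body over ESP-NOW.
    if len(mac_hex) != 12:
        raise ValueError("mac must be 12 hex chars")
    line = dict(body)
    line["to"] = mac_hex
    return (json.dumps(line) + "\n").encode("utf-8")


def encode_split(peer_hexes, body):
    # Fan-out form: {"m": "split", "peers": [...]} asks the bridge to send
    # the body to every listed peer in one pass.
    line = dict(body)
    line["m"] = "split"
    line["peers"] = list(peer_hexes)
    return (json.dumps(line) + "\n").encode("utf-8")
```

Frames built this way parse back through the same newline-delimited JSON path that drain_uart_json_lines implements.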
7 espnow-sender/README.md Normal file
@@ -0,0 +1,7 @@
# espnow-sender

Minimal MicroPython project that receives JSON over a Microdot WebSocket and forwards it over ESP-NOW.

- WebSocket endpoint: `/ws`
- Entry point: `main.py`
- Message template: `msg.json`
120 espnow-sender/main.py Normal file
@@ -0,0 +1,120 @@
import asyncio
import json

from microdot import Microdot
from microdot.websocket import WebSocketError, with_websocket

import espnow
import network
from util import format_mac, parse_mac


app = Microdot()
_esp = None
_known_peers = set()
_ws_clients = set()


def _init_espnow():
    global _esp
    sta = network.WLAN(network.STA_IF)
    sta.active(True)
    _esp = espnow.ESPNow()
    _esp.active(True)


def _validate_envelope(obj):
    if obj.get("v") != "1":
        raise ValueError("message.v must be '1'")
    devices = obj["devices"]
    for address in devices.keys():
        parse_mac(address)
    return obj


def _send_espnow(address, payload):
    if _esp is None:
        raise ValueError("espnow is not initialized")
    mac = parse_mac(address)
    msg = json.dumps(payload, separators=(",", ":")).encode("utf-8")
    if mac not in _known_peers:
        _esp.add_peer(mac)
        _known_peers.add(mac)
    _esp.send(mac, msg)
    return mac, len(msg)


async def _broadcast_ws(obj):
    text = json.dumps(obj)
    dead = []
    for client in list(_ws_clients):
        try:
            await client.send(text)
        except Exception:
            dead.append(client)
    for client in dead:
        _ws_clients.discard(client)


async def _espnow_receive_loop():
    while True:
        host, msg = _esp.recv(0)
        if not host:
            await asyncio.sleep(0.01)
            continue
        await _broadcast_ws(
            {
                "from": format_mac(host),
                "payload": msg.decode("utf-8"),
            }
        )


@app.route("/ws")
@with_websocket
async def ws(request, ws):
    _ws_clients.add(ws)
    while True:
        try:
            raw = await ws.receive()
        except WebSocketError:
            break

        if not raw:
            break

        try:
            parsed = json.loads(raw)
            env = _validate_envelope(parsed)
            sent = []
            for address, payload in env["devices"].items():
                mac, payload_size = _send_espnow(address, payload)
                sent.append(
                    {
                        "address": format_mac(mac),
                        "bytes": payload_size,
                    }
                )
        except (ValueError, TypeError) as e:
            await ws.send(json.dumps({"ok": False, "error": str(e)}))
            continue

        await ws.send(
            json.dumps(
                {
                    "ok": True,
                    "sent": sent,
                }
            )
        )
    _ws_clients.discard(ws)


async def main(port=80):
    _init_espnow()
    asyncio.create_task(_espnow_receive_loop())
    await app.start_server(host="0.0.0.0", port=port)


if __name__ == "__main__":
    asyncio.run(main(port=80))
24 espnow-sender/msg.json Normal file
@@ -0,0 +1,24 @@
{
  "v": "1",
  "devices": {
    "ff:ff:ff:ff:ff:ff": {
      "presets": {
        "preset_id": {
          "pattern": "on",
          "colors": ["#FF0000"],
          "delay": 100,
          "brightness": 255,
          "auto": true
        }
      },
      "select": {
        "preset": "preset_id",
        "step": 0
      },
      "save": true,
      "default": "preset_id",
      "b": 255
    }
  }
}
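The template above has to pass the sender's envelope rules (version tag "1", every device key a parseable MAC). A host-side sketch of that check, mirroring `_validate_envelope` plus the MAC rules from `util.py`; `check_envelope` is illustrative, not part of the project:

```python
import json


def check_envelope(text):
    # Mirror of the sender's validation: v == "1" and each device key
    # must normalise to exactly 6 MAC bytes.
    obj = json.loads(text)
    if obj.get("v") != "1":
        raise ValueError("message.v must be '1'")
    for address in obj["devices"]:
        raw = address.strip().lower().replace(":", "").replace("-", "")
        try:
            mac = bytes.fromhex(raw)
        except ValueError:
            raise ValueError("bad device address: " + address)
        if len(mac) != 6:
            raise ValueError("bad device address: " + address)
    return obj
```

Running this over `msg.json` before sending it down the WebSocket catches a bad address without a device round-trip.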
12 espnow-sender/util.py Normal file
@@ -0,0 +1,12 @@
def parse_mac(value):
    raw = value.strip().lower().replace(":", "").replace("-", "")
    if len(raw) != 12:
        raise ValueError("address must be 12 hex chars or aa:bb:cc:dd:ee:ff")
    try:
        return bytes.fromhex(raw)
    except ValueError:
        raise ValueError("address contains non-hex characters")


def format_mac(mac_bytes):
    return ":".join("{:02x}".format(b) for b in mac_bytes)
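Both helpers are plain Python and behave identically under CPython, so the MAC normalisation is easy to sanity-check on the host (functions copied verbatim):

```python
def parse_mac(value):
    # Accept colon, dash, or bare 12-hex-char forms; return the 6 raw bytes.
    raw = value.strip().lower().replace(":", "").replace("-", "")
    if len(raw) != 12:
        raise ValueError("address must be 12 hex chars or aa:bb:cc:dd:ee:ff")
    try:
        return bytes.fromhex(raw)
    except ValueError:
        raise ValueError("address contains non-hex characters")


def format_mac(mac_bytes):
    # Render 6 bytes back to the canonical lowercase colon form.
    return ":".join("{:02x}".format(b) for b in mac_bytes)


# Roundtrip: all three input forms normalise to the same bytes.
assert parse_mac("AA-BB-CC-DD-EE-FF") == parse_mac("aabbccddeeff")
assert format_mac(parse_mac("aa:bb:cc:dd:ee:ff")) == "aa:bb:cc:dd:ee:ff"
```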
Submodule led-driver updated: cef9e00819...2a768376d0
Submodule led-simulator added at 42c14361e8
Submodule led-tool updated: e86312437c...580fd11aca
123 led_bar_vertical_stand.scad Normal file
@@ -0,0 +1,123 @@
// Parametric LED bar vertical stand socket
// For a bar nominally 14 x 17 mm, 2 m long.
// This part is intended to be screwed to an MDF base.

// -------------------------
// User parameters
// -------------------------
bar_w = 14;          // Bar width (mm)
bar_d = 17;          // Bar depth (mm)
clearance = 0.4;     // Total clearance added to each axis (mm)

socket_height = 36;  // Height of printed socket body (mm)
wall = 3.2;          // Socket wall thickness (mm)
base_thickness = 5;  // Printed bottom plate thickness (mm)

// USB cable/connector side opening
usb_notch_enable = true;
usb_notch_w = 11;
usb_notch_h = 9;
usb_notch_from_bottom = 6;
usb_notch_side = "right"; // "right" or "left"

// Mounting ears for MDF screws
ear_enable = true;
ear_len = 16;
ear_w = 16;
ear_thickness = base_thickness;
screw_hole_d = 4.2;    // M4 clearance. Use 3.4 for M3.
screw_hole_edge = 5.5; // Hole center offset from ear outer corner

// Optional clamp lip at top to reduce wobble
top_lip_enable = true;
top_lip_depth = 2.0;   // Intrudes into opening on each side
top_lip_height = 3.0;

$fn = 48;

// -------------------------
// Derived
// -------------------------
inner_w = bar_w + clearance;
inner_d = bar_d + clearance;

outer_w = inner_w + wall * 2;
outer_d = inner_d + wall * 2;
outer_h = socket_height;

module screw_hole() {
    cylinder(h = ear_thickness + 0.2, d = screw_hole_d);
}

module mounting_ear(sign_y = 1) {
    translate([outer_w / 2, sign_y * (outer_d / 2), 0])
        cube([ear_len, ear_w, ear_thickness], center = false);
}

module top_lip() {
    if (top_lip_enable) {
        // Front and back lips at the top of the socket.
        translate([wall, wall, outer_h - top_lip_height])
            cube([top_lip_depth, inner_d, top_lip_height]);

        translate([outer_w - wall - top_lip_depth, wall, outer_h - top_lip_height])
            cube([top_lip_depth, inner_d, top_lip_height]);
    }
}

difference() {
    union() {
        // Main body
        cube([outer_w, outer_d, outer_h], center = false);

        // Base plate under socket for stiffness
        translate([0, 0, -base_thickness])
            cube([outer_w, outer_d, base_thickness], center = false);

        // Mounting ears
        if (ear_enable) {
            translate([0, 0, -ear_thickness]) {
                mounting_ear(1);
                mounting_ear(-1);
            }
        }

        top_lip();
    }

    // Main bar cavity
    translate([wall, wall, 0])
        cube([inner_w, inner_d, outer_h + 0.2], center = false);

    // USB side notch
    if (usb_notch_enable) {
        if (usb_notch_side == "right") {
            translate([outer_w - wall - 0.1, (outer_d - usb_notch_w) / 2, usb_notch_from_bottom])
                cube([wall + 0.3, usb_notch_w, usb_notch_h], center = false);
        } else {
            translate([-0.2, (outer_d - usb_notch_w) / 2, usb_notch_from_bottom])
                cube([wall + 0.3, usb_notch_w, usb_notch_h], center = false);
        }
    }

    // Screw holes in ears
    if (ear_enable) {
        // Upper ear hole
        translate([
            outer_w / 2 + ear_len - screw_hole_edge,
            outer_d / 2 + ear_w - screw_hole_edge,
            -ear_thickness - 0.05
        ]) screw_hole();

        // Lower ear hole
        translate([
            outer_w / 2 + ear_len - screw_hole_edge,
            -outer_d / 2 + screw_hole_edge,
            -ear_thickness - 0.05
        ]) screw_hole();
    }
}

// Print orientation helper:
// Keep the base/ears on the bed.
// If fit is tight, increase clearance to 0.5 or 0.6.
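For the default parameters, the Derived block works out to a 14.4 x 17.4 mm cavity in a 20.8 x 23.8 mm outer body. A quick arithmetic check (Python stands in for OpenSCAD here; same expressions, same defaults):

```python
# Derived dimensions for the defaults in led_bar_vertical_stand.scad.
bar_w, bar_d, clearance, wall = 14, 17, 0.4, 3.2

inner_w = bar_w + clearance      # 14.4 mm cavity width
inner_d = bar_d + clearance      # 17.4 mm cavity depth
outer_w = inner_w + wall * 2     # 20.8 mm outer width
outer_d = inner_d + wall * 2     # 23.8 mm outer depth

assert abs(outer_w - 20.8) < 1e-9
assert abs(outer_d - 23.8) < 1e-9
```

Note `clearance` is added once per axis (total play), not per side, which is why the comment in the .scad file calls it "total clearance".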
3 pyproject.toml Normal file
@@ -0,0 +1,3 @@
[tool.pytest.ini_options]
testpaths = ["tests"]
python_files = ["test_endpoints_pytest.py"]
@@ -1,4 +0,0 @@
[pytest]
testpaths = tests
python_files = test_endpoints_pytest.py
@@ -1,4 +0,0 @@
#!/usr/bin/env bash
# Copy esp32/main.py to the connected ESP32 as /main.py (single line, no wrap).
cd "$(dirname "$0")/.."
pipenv run mpremote fs cp esp32/main.py :/main.py
16 scripts/dev-run.sh Normal file
@@ -0,0 +1,16 @@
#!/usr/bin/env bash
set -euo pipefail

ROOT_DIR="$(cd "$(dirname "$0")/.." && pwd)"
PORT="${PORT:-80}"

# On watchfiles restarts the previous process can linger briefly.
# Proactively terminate any listener on the target port before boot.
pids="$(ss -ltnp "sport = :$PORT" 2>/dev/null | sed -n 's/.*pid=\([0-9]\+\).*/\1/p' | sort -u)"
if [ -n "${pids}" ]; then
  kill -TERM ${pids} 2>/dev/null || true
  sleep 0.3
fi

cd "$ROOT_DIR/src"
exec python main.py
@@ -10,6 +10,18 @@ if [ ! -f "scripts/led-controller.service" ]; then
  echo "Run this script from the repo root."
  exit 1
fi
export PIPENV_VENV_IN_PROJECT="${PIPENV_VENV_IN_PROJECT:-1}"
if command -v pipenv >/dev/null 2>&1; then
  PY="$(command -v python3)"
  if [ -z "$PY" ]; then
    echo "python3 not found; install python3." >&2
    exit 1
  fi
  echo "Ensuring Pipenv deps with $PY (venv in project: .venv when PIPENV_VENV_IN_PROJECT=1)…"
  # --skip-lock: install from Pipfile only (avoids lock/Python hash mismatches on device).
  pipenv install --quiet --skip-lock --python "$PY"
  pipenv --venv > scripts/.led-controller-venv
fi
chmod +x scripts/start.sh
sudo cp "scripts/led-controller.service" "$UNIT_PATH"
sudo systemctl daemon-reload
@@ -1,7 +1,8 @@
[Unit]
Description=LED Controller web server
After=network-online.target
Wants=network-online.target
# Use network.target only. Ordering after network-online.target can block `systemctl start`
# until wait-online finishes; Wi-Fi/DHCP delays then look like a hung start job.
After=network.target

[Service]
Type=simple
@@ -12,6 +13,8 @@ Environment=PATH=/home/pi/.local/bin:/usr/local/bin:/usr/bin:/bin
ExecStart=/bin/bash /home/pi/led-controller/scripts/start.sh
Restart=on-failure
RestartSec=5
# pipenv/first bind can be slow; avoid misleading "activating" forever if misconfigured
TimeoutStartSec=120

[Install]
WantedBy=multi-user.target
253 scripts/pi-eth-lan-router.sh Executable file
@@ -0,0 +1,253 @@
#!/usr/bin/env bash
# Configure Raspberry Pi OS: Wi-Fi client on IF_WAN (default wlan0), Ethernet IF_LAN
# (default eth0) toward an external AP. Static LAN IP, DHCP via dnsmasq, NAT masquerade.
#
# Usage:
#   sudo ./pi-eth-lan-router.sh install
#   sudo ./pi-eth-lan-router.sh remove
#
# Environment overrides (optional):
#   IF_WAN=wlan0 IF_LAN=eth0 LAN_IP=192.168.4.1 LAN_PREFIX=24 \
#   DHCP_START=192.168.4.100 DHCP_END=192.168.4.200 \
#   DNSMASQ_DNS=1.1.1.1,8.8.8.8 \
#   sudo ./pi-eth-lan-router.sh install

set -euo pipefail

IF_WAN="${IF_WAN:-wlan0}"
IF_LAN="${IF_LAN:-eth0}"
LAN_IP="${LAN_IP:-192.168.4.1}"
LAN_PREFIX="${LAN_PREFIX:-24}"
DHCP_START="${DHCP_START:-192.168.4.100}"
DHCP_END="${DHCP_END:-192.168.4.200}"
# Comma-separated DNS for DHCP clients (Pi does not need to run a resolver).
DNSMASQ_DNS="${DNSMASQ_DNS:-1.1.1.1,8.8.8.8}"

NM_CON_NAME="pi-eth-lan-router"
MARK_BEGIN="# BEGIN pi-eth-lan-router (scripts/pi-eth-lan-router.sh)"
MARK_END="# END pi-eth-lan-router"
SYSCTL_FILE="/etc/sysctl.d/99-pi-eth-lan-router.conf"
DNSMASQ_SNIPPET="/etc/dnsmasq.d/pi-eth-lan-router.conf"
NFT_SNIPPET="/etc/nftables.d/50-pi-eth-lan-router.nft"
NFT_INCLUDE='include "/etc/nftables.d/50-pi-eth-lan-router.nft"'
NFTABLES_CONF="/etc/nftables.conf"
DHCPCD_CONF="/etc/dhcpcd.conf"

die() { echo "error: $*" >&2; exit 1; }
log() { echo "$*"; }

need_root() {
  [[ "${EUID:-0}" -eq 0 ]] || die "run as root (sudo)"
}

have_cmd() { command -v "$1" >/dev/null 2>&1; }

apt_install() {
  export DEBIAN_FRONTEND=noninteractive
  apt-get update -qq
  apt-get install -y -qq dnsmasq nftables
}

write_sysctl() {
  cat >"$SYSCTL_FILE" <<EOF
# Managed by scripts/pi-eth-lan-router.sh
net.ipv4.ip_forward=1
EOF
  sysctl --system -q 2>/dev/null || sysctl -p "$SYSCTL_FILE" || true
}

remove_sysctl() {
  rm -f "$SYSCTL_FILE"
  sysctl --system -q 2>/dev/null || true
}

write_dnsmasq() {
  local mask="255.255.255.0"
  if [[ "$LAN_PREFIX" != "24" ]]; then
    die "only LAN_PREFIX=24 is supported by this script (extend dnsmasq netmask manually)"
  fi
  cat >"$DNSMASQ_SNIPPET" <<EOF
# Managed by scripts/pi-eth-lan-router.sh
interface=$IF_LAN
bind-interfaces
dhcp-range=$DHCP_START,$DHCP_END,$mask,24h
dhcp-option=option:router,$LAN_IP
dhcp-option=option:dns-server,$DNSMASQ_DNS
EOF
}

remove_dnsmasq() {
  rm -f "$DNSMASQ_SNIPPET"
}

write_nft() {
  mkdir -p /etc/nftables.d
  cat >"$NFT_SNIPPET" <<EOF
# Managed by scripts/pi-eth-lan-router.sh
table ip pi_eth_wlan_nat {
  chain postrouting {
    type nat hook postrouting priority 100; policy accept;
    oifname "$IF_WAN" masquerade
  }
}
EOF
  if [[ -f "$NFTABLES_CONF" ]] && ! grep -qF '50-pi-eth-lan-router.nft' "$NFTABLES_CONF" 2>/dev/null; then
    printf '\n# pi-eth-lan-router\n%s\n' "$NFT_INCLUDE" >>"$NFTABLES_CONF"
  elif [[ ! -f "$NFTABLES_CONF" ]]; then
    log "warning: $NFTABLES_CONF missing; NAT was not added for boot persistence. Install/configure nftables, or add: $NFT_INCLUDE"
  fi
}

remove_nft() {
  rm -f "$NFT_SNIPPET"
  if [[ -f "$NFTABLES_CONF" ]]; then
    sed -i '/# pi-eth-lan-router/d;/50-pi-eth-lan-router\.nft/d' "$NFTABLES_CONF" || true
  fi
  nft delete table ip pi_eth_wlan_nat 2>/dev/null || true
}

apply_nft() {
  if have_cmd nft; then
    nft delete table ip pi_eth_wlan_nat 2>/dev/null || true
    nft -f "$NFT_SNIPPET"
  fi
}

configure_nm_eth() {
  have_cmd nmcli || return 1
  systemctl is-active --quiet NetworkManager 2>/dev/null || return 1

  if nmcli -t -f NAME con show --active 2>/dev/null | grep -qxF "$NM_CON_NAME"; then
    nmcli con down "$NM_CON_NAME" || true
  fi
  if nmcli -t -f NAME con show 2>/dev/null | grep -qxF "$NM_CON_NAME"; then
    nmcli con mod "$NM_CON_NAME" \
      connection.interface-name "$IF_LAN" \
      ipv4.method manual \
      ipv4.addresses "${LAN_IP}/${LAN_PREFIX}" \
      ipv4.gateway "" \
      ipv4.dns "" \
      ipv4.never-default yes \
      ipv6.method ignore
  else
    nmcli con add type ethernet con-name "$NM_CON_NAME" ifname "$IF_LAN" \
      ipv4.method manual \
      ipv4.addresses "${LAN_IP}/${LAN_PREFIX}" \
      ipv4.gateway "" \
      ipv4.dns "" \
      ipv4.never-default yes \
      ipv6.method ignore
  fi
  if ! nmcli con up "$NM_CON_NAME"; then
    log "warning: could not activate '$NM_CON_NAME' (is $IF_LAN connected?); profile saved for next boot."
  fi
  return 0
}

remove_nm_eth() {
  have_cmd nmcli || return 0
  if nmcli -t -f NAME con show 2>/dev/null | grep -qxF "$NM_CON_NAME"; then
    nmcli con delete "$NM_CON_NAME" || true
  fi
}

configure_dhcpcd_eth() {
  [[ -f "$DHCPCD_CONF" ]] || return 1
  if grep -qF "$MARK_BEGIN" "$DHCPCD_CONF" 2>/dev/null; then
    sed -i "/$MARK_BEGIN/,/$MARK_END/d" "$DHCPCD_CONF" || true
  fi
  {
    echo "$MARK_BEGIN"
    echo "interface $IF_LAN"
    echo "static ip_address=${LAN_IP}/${LAN_PREFIX}"
    echo "nohook wpa_supplicant"
    echo "$MARK_END"
  } >>"$DHCPCD_CONF"
  systemctl restart dhcpcd 2>/dev/null || true
  return 0
}

remove_dhcpcd_block() {
  [[ -f "$DHCPCD_CONF" ]] || return 0
  if grep -qF "$MARK_BEGIN" "$DHCPCD_CONF" 2>/dev/null; then
    sed -i "/$MARK_BEGIN/,/$MARK_END/d" "$DHCPCD_CONF" || true
    systemctl restart dhcpcd 2>/dev/null || true
  fi
}

configure_eth_static() {
  if configure_nm_eth; then
    log "configured $IF_LAN via NetworkManager profile '$NM_CON_NAME'"
    return 0
  fi
  if configure_dhcpcd_eth; then
    log "configured $IF_LAN via dhcpcd ($DHCPCD_CONF)"
    return 0
  fi
  die "neither NetworkManager (active) nor $DHCPCD_CONF found; set $IF_LAN to ${LAN_IP}/${LAN_PREFIX} manually"
}

remove_eth_static() {
  remove_nm_eth
  remove_dhcpcd_block
}

do_install() {
  need_root
  log "installing packages (dnsmasq, nftables)…"
  apt_install

  log "writing sysctl, dnsmasq, nftables snippets…"
  write_sysctl
  write_dnsmasq
  write_nft

  log "setting static IP on $IF_LAN…"
  configure_eth_static

  log "restarting dnsmasq…"
  systemctl enable dnsmasq
  systemctl restart dnsmasq

  log "loading NAT rules and enabling nftables…"
  apply_nft
  systemctl enable nftables 2>/dev/null || true
  systemctl restart nftables 2>/dev/null || true

  log "done. Connect $IF_LAN to the external AP (DHCP off on the AP)."
  log "Join Wi-Fi on $IF_WAN to the uplink network and complete any captive portal on the Pi."
}

do_remove() {
  need_root
  remove_eth_static
  remove_dnsmasq
  systemctl restart dnsmasq 2>/dev/null || true

  remove_nft
  systemctl restart nftables 2>/dev/null || true

  remove_sysctl
  sysctl -w net.ipv4.ip_forward=0 2>/dev/null || true

  log "removed pi-eth-lan-router configuration snippets and NM profile '$NM_CON_NAME' (if present)."
}

usage() {
  cat <<EOF
Usage: sudo $0 install|remove

WAN (Wi-Fi client):   $IF_WAN
LAN (Ethernet to AP): $IF_LAN
LAN address:          ${LAN_IP}/${LAN_PREFIX}
DHCP range:           $DHCP_START – $DHCP_END

Override with environment variables (see script header).
EOF
}

case "${1:-}" in
  install) do_install ;;
  remove) do_remove ;;
  *) usage; exit 1 ;;
esac
@@ -1,5 +1,38 @@
#!/usr/bin/env bash
# Start the LED controller web server (port 80 by default).
cd "$(dirname "$0")/.."
# Avoid `pipenv run` on the hot path — it re-resolves the env every time and is slow on a Pi.
set -euo pipefail

ROOT="$(cd "$(dirname "$0")/.." && pwd)"
cd "$ROOT"
export PORT="${PORT:-80}"
pipenv run run
export PIPENV_VENV_IN_PROJECT="${PIPENV_VENV_IN_PROJECT:-1}"

SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
CACHE="$SCRIPT_DIR/.led-controller-venv"
PYTHON=""

if [ -x "$ROOT/.venv/bin/python" ]; then
  PYTHON="$ROOT/.venv/bin/python"
elif [ -f "$CACHE" ]; then
  _v="$(tr -d '\r\n' < "$CACHE")"
  if [ -n "$_v" ] && [ -x "$_v/bin/python" ]; then
    PYTHON="$_v/bin/python"
  fi
fi

if [ -z "$PYTHON" ] && command -v pipenv >/dev/null 2>&1; then
  _v="$(cd "$ROOT" && pipenv --venv 2>/dev/null || true)"
  if [ -n "${_v:-}" ] && [ -x "$_v/bin/python" ]; then
    PYTHON="$_v/bin/python"
    printf '%s\n' "$_v" > "$CACHE" || true
  fi
fi

if [ -z "$PYTHON" ]; then
  echo 'led-controller: no venv resolved; using pipenv run (slow). Run: cd '"$ROOT"' && PIPENV_VENV_IN_PROJECT=1 pipenv install --skip-lock --python "$(command -v python3)"' >&2
  exec pipenv run run
fi

cd "$ROOT/src"
exec "$PYTHON" -u main.py
@@ -2,18 +2,26 @@ from microdot import Microdot
from models.device import (
    Device,
    derive_device_mac,
    normalize_mac,
    validate_device_transport,
    validate_device_type,
)
from models.group import Group
from models.transport import get_current_sender
from models.tcp_clients import (
from settings import Settings
from util.brightness_combine import effective_brightness_for_mac
from models.wifi_ws_clients import (
    normalize_tcp_peer_ip,
    send_json_line_to_ip,
    tcp_client_connected,
)
from util.driver_patterns import driver_patterns_dir
from util.espnow_message import build_message
import asyncio
import json
import os
import socket
from urllib.parse import quote

# Ephemeral driver preset name (never written to Pi preset store; ``save`` not set on wire).
_IDENTIFY_PRESET_KEY = "__identify"
@@ -48,14 +56,34 @@ def _compact_v1_json(*, presets=None, select=None, save=False):
# Seconds after identify blink before selecting built-in ``off`` (tests may monkeypatch).
IDENTIFY_OFF_DELAY_S = 2.0


def _validate_output_brightness(value):
    if value is None:
        return None
    try:
        b = int(value)
    except (TypeError, ValueError):
        raise ValueError("output_brightness must be an integer 0–255")
    if b < 0 or b > 255:
        raise ValueError("output_brightness must be between 0 and 255")
    return b


def _brightness_save_message_json(b_val: int) -> str:
    b_val = max(0, min(255, int(b_val)))
    return json.dumps({"v": "1", "b": b_val, "save": True}, separators=(",", ":"))
controller = Microdot()
devices = Device()
_group_registry = Group()
_pi_settings = Settings()


def _device_live_connected(dev_dict):
    """
    Wi-Fi: whether a TCP client is registered for this device's address (IP).
    ESP-NOW: None (no TCP session on the Pi for that transport).
    Wi-Fi: whether the controller has an outbound WebSocket to this device's IP.
    ESP-NOW: None (no Wi-Fi session on the Pi for that transport).
    """
    tr = (dev_dict.get("transport") or "espnow").strip().lower()
    if tr != "wifi":
@@ -72,6 +100,61 @@ def _device_json_with_live_status(dev_dict):
    return row


def _safe_pattern_filename(name):
    if not isinstance(name, str):
        return False
    if not name.endswith(".py"):
        return False
    if "/" in name or "\\" in name or ".." in name:
        return False
    return True
def _http_post_pattern_source(ip, filename, code_text, reload_patterns=True, timeout_s=10.0):
    """POST source to driver /patterns/upload?name=...&reload=...; return True on 2xx."""
    if not isinstance(ip, str) or not ip.strip():
        return False
    if not isinstance(filename, str) or not filename:
        return False
    if not isinstance(code_text, str):
        return False

    name_q = quote(filename, safe="")
    reload_q = "1" if reload_patterns else "0"
    path = "/patterns/upload?name=%s&reload=%s" % (name_q, reload_q)
    body = code_text.encode("utf-8")
    req = (
        "POST %s HTTP/1.1\r\n"
        "Host: %s\r\n"
        "Content-Type: text/plain; charset=utf-8\r\n"
        "Content-Length: %d\r\n"
        "Connection: close\r\n"
        "\r\n" % (path, ip, len(body))
    ).encode("utf-8") + body

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        sock.settimeout(timeout_s)
        sock.connect((ip.strip(), 80))
        sock.sendall(req)
        data = b""
        while True:
            chunk = sock.recv(1024)
            if not chunk:
                break
            data += chunk
    except OSError:
        return False
    finally:
        try:
            sock.close()
        except Exception:
            pass

    first_line = data.split(b"\r\n", 1)[0] if data else b""
    return b" 2" in first_line
async def _identify_send_off_after_delay(sender, transport, wifi_ip, dev_id, name):
|
||||
try:
|
||||
await asyncio.sleep(IDENTIFY_OFF_DELAY_S)
|
||||
@@ -84,9 +167,110 @@ async def _identify_send_off_after_delay(sender, transport, wifi_ip, dev_id, nam
|
||||
pass
|
||||
|
||||
|
||||
async def send_identify_to_device(dev_id: str) -> tuple[int, str]:
|
||||
"""
|
||||
Send the same identify blink as ``POST /devices/<id>/identify``.
|
||||
|
||||
Returns ``(http_status, "")`` on success, or ``(status, error_message)`` on failure
|
||||
(status matches the single-device route).
|
||||
"""
|
||||
dev = devices.read(dev_id)
|
||||
if not dev:
|
||||
return 404, "Device not found"
|
||||
sender = get_current_sender()
|
||||
if not sender:
|
||||
return 503, "Transport not configured"
|
||||
name = str(dev.get("name") or "").strip()
|
||||
if not name:
|
||||
return 400, "Device must have a name to identify"
|
||||
|
||||
transport = dev.get("transport") or "espnow"
|
||||
wifi_ip = None
|
||||
if transport == "wifi":
|
||||
wifi_ip = dev.get("address")
|
||||
if not wifi_ip:
|
||||
return 400, "Device has no IP address"
|
||||
|
||||
try:
|
||||
msg = _compact_v1_json(
|
||||
presets={_IDENTIFY_PRESET_KEY: dict(_IDENTIFY_DRIVER_PRESET)},
|
||||
select={name: [_IDENTIFY_PRESET_KEY]},
|
||||
)
|
||||
if transport == "wifi":
|
||||
ok = await send_json_line_to_ip(wifi_ip, msg)
|
||||
if not ok:
|
||||
return 503, "Wi-Fi driver not connected"
|
||||
else:
|
||||
await sender.send(msg, addr=dev_id)
|
||||
|
||||
asyncio.create_task(
|
||||
_identify_send_off_after_delay(sender, transport, wifi_ip, dev_id, name)
|
||||
)
|
||||
except Exception as e:
|
||||
return 503, str(e)
|
||||
return 200, ""
|
||||
|
||||
|
||||
async def send_identify_to_group_devices(macs: list[str]) -> tuple[int, list[dict]]:
    """
    Identify every listed registry MAC in one delivery round: merged ``select`` and a single
    ESP-NOW split envelope when multiple peers share the serial bridge (avoids per-device
    ``SerialSender`` lock serialisation). Wi-Fi peers are sent in parallel as in
    ``deliver_json_messages``.
    """
    from util.driver_delivery import deliver_json_messages

    errors: list[dict] = []
    sender = get_current_sender()
    if not sender:
        return 0, [{"mac": "*", "error": "Transport not configured"}]

    merged_select: dict[str, list[str]] = {}
    valid_macs: list[str] = []
    for dev_id in macs:
        dev = devices.read(dev_id)
        if not dev:
            errors.append({"mac": dev_id, "error": "Device not found"})
            continue
        name = str(dev.get("name") or "").strip()
        if not name:
            errors.append({"mac": dev_id, "error": "Device must have a name to identify"})
            continue
        transport = (dev.get("transport") or "espnow").strip().lower()
        if transport == "wifi":
            if not dev.get("address"):
                errors.append({"mac": dev_id, "error": "Device has no IP address"})
                continue
        merged_select[name] = [_IDENTIFY_PRESET_KEY]
        valid_macs.append(dev_id)

    if not merged_select:
        return 0, errors

    try:
        msg = _compact_v1_json(
            presets={_IDENTIFY_PRESET_KEY: dict(_IDENTIFY_DRIVER_PRESET)},
            select=merged_select,
        )
        await deliver_json_messages(sender, [msg], valid_macs, devices, delay_s=0)
    except Exception as e:
        return 0, errors + [{"mac": "*", "error": str(e)}]

    for dev_id in valid_macs:
        dev = devices.read(dev_id) or {}
        name = str(dev.get("name") or "").strip()
        transport = (dev.get("transport") or "espnow").strip().lower()
        wifi_ip = dev.get("address") if transport == "wifi" else None
        asyncio.create_task(
            _identify_send_off_after_delay(sender, transport, wifi_ip, dev_id, name)
        )

    return len(valid_macs), errors
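
The merged identify envelope this function assembles can be sketched as follows. `_compact_v1_json` is not shown in this diff, so the helper below is a hypothetical stand-in that only compact-serialises the `v`/`presets`/`select` shape the route builds (one shared preset definition, one `select` entry per device name):

```python
import json

def compact_v1_json(presets, select):
    # Hypothetical stand-in for _compact_v1_json: one compact envelope
    # carrying the shared identify preset and a per-name select map.
    return json.dumps(
        {"v": "1", "presets": presets, "select": select},
        separators=(",", ":"),
    )

# One merged select for two named drivers, one preset definition:
msg = compact_v1_json(
    presets={"__identify__": {"pattern": "blink"}},
    select={"desk": ["__identify__"], "shelf": ["__identify__"]},
)
```

Merging into a single envelope is what lets one ESP-NOW split delivery reach every serial-bridge peer instead of serialising per-device sends behind the `SerialSender` lock.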


@controller.get("")
async def list_devices(request):
    """List all devices (includes ``connected`` for live Wi-Fi TCP presence)."""
    """List all devices (includes ``connected`` for live Wi-Fi WebSocket presence)."""
    devices_data = {}
    for dev_id in devices.list():
        d = devices.read(dev_id)
@@ -95,9 +279,45 @@ async def list_devices(request):
    return json.dumps(devices_data), 200, {"Content-Type": "application/json"}


@controller.post("/resolve-brightness")
async def resolve_brightness_batch(request):
    """
    POST JSON ``{ "macs": [".."], "zone_brightness": optional 0–255 }``.
    Returns ``{ "values": { mac: combined_int } }`` — global × group(s) × device × zone (optional).
    """
    try:
        data = request.json or {}
    except Exception:
        data = {}
    macs = data.get("macs")
    if not isinstance(macs, list):
        return json.dumps({"error": "macs must be an array"}), 400, {
            "Content-Type": "application/json",
        }
    zb = None
    if isinstance(data, dict) and data.get("zone_brightness") is not None:
        try:
            zb = _validate_output_brightness(data.get("zone_brightness"))
        except ValueError as e:
            return json.dumps({"error": str(e)}), 400, {"Content-Type": "application/json"}
    values = {}
    for raw in macs:
        m = normalize_mac(str(raw))
        if not m:
            continue
        values[m] = effective_brightness_for_mac(
            _pi_settings,
            _group_registry,
            devices,
            m,
            zone_brightness=zb,
        )
    return json.dumps({"values": values}), 200, {"Content-Type": "application/json"}
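
`effective_brightness_for_mac` lives in `util.brightness_combine` and is not shown in this diff. One plausible reading of the "global × group(s) × device × zone" docstring is to treat each 0–255 level as a 0–1 factor and multiply them; the sketch below is that assumption, not the library's actual code:

```python
def combine_brightness(*levels):
    # Hypothetical sketch: fold 0-255 levels (global, group, device, zone)
    # into one 0-255 value by multiplying their 0-1 factors.
    f = 1.0
    for lv in levels:
        f *= max(0, min(255, int(lv))) / 255.0
    return round(f * 255)

combine_brightness(255, 255, 128)  # only the device level limits output
```

Under this model any single level at 0 blacks out the driver, and full-scale levels are identity factors, which matches how layered brightness controls usually compose.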


@controller.get("/<id>")
async def get_device(request, id):
    """Get a device by ID (includes ``connected`` for live Wi-Fi TCP presence)."""
    """Get a device by ID (includes ``connected`` for live Wi-Fi WebSocket presence)."""
    dev = devices.read(id)
    if dev:
        return json.dumps(_device_json_with_live_status(dev)), 200, {
@@ -180,7 +400,17 @@ async def update_device(request, id):
    data["transport"] = validate_device_transport(data.get("transport"))
    if "zones" in data and isinstance(data["zones"], list):
        data["zones"] = [str(t) for t in data["zones"]]
    if "output_brightness" in data:
        data["output_brightness"] = _validate_output_brightness(data.get("output_brightness"))
    prev_doc = devices.read(id)
    if devices.update(id, data):
        if prev_doc and "name" in data:
            on = str(prev_doc.get("name") or "").strip()
            nn = str(data.get("name") or "").strip()
            if on and nn and on != nn:
                from util.beat_driver_route import remap_beat_route_device_name

                remap_beat_route_device_name(on, nn)
        return json.dumps(devices.read(id)), 200, {"Content-Type": "application/json"}
    return json.dumps({"error": "Device not found"}), 404, {
        "Content-Type": "application/json",
@@ -212,50 +442,196 @@ async def identify_device(request, id):
    this device name — same combined shape as profile sends the driver already accepts over TCP
    / ESP-NOW. No ``save``. After ``IDENTIFY_OFF_DELAY_S``, a background task selects ``off``.
    """
    status, err = await send_identify_to_device(id)
    if status == 200:
        return json.dumps({"message": "Identify sent"}), 200, {
            "Content-Type": "application/json",
        }
    return json.dumps({"error": err}), status, {"Content-Type": "application/json"}
|
||||
@controller.post("/<id>/brightness")
|
||||
async def push_device_output_brightness(request, id):
|
||||
"""
|
||||
Push combined brightness to the driver: global × group(s) × device × optional ``zone_brightness``
|
||||
in JSON body — single ``b`` (``v``/``b``/``save``). Wi‑Fi or ESP‑NOW.
|
||||
"""
|
||||
dev = devices.read(id)
|
||||
if not dev:
|
||||
return json.dumps({"error": "Device not found"}), 404, {
|
||||
"Content-Type": "application/json",
|
||||
}
|
||||
sender = get_current_sender()
|
||||
if not sender:
|
||||
return json.dumps({"error": "Transport not configured"}), 503, {
|
||||
"Content-Type": "application/json",
|
||||
}
|
||||
name = str(dev.get("name") or "").strip()
|
||||
if not name:
|
||||
return json.dumps({"error": "Device must have a name to identify"}), 400, {
|
||||
"Content-Type": "application/json",
|
||||
}
|
||||
body = request.json or {}
|
||||
zb = None
|
||||
if isinstance(body, dict) and body.get("zone_brightness") is not None:
|
||||
try:
|
||||
zb = _validate_output_brightness(body.get("zone_brightness"))
|
||||
except ValueError as e:
|
||||
return json.dumps({"error": str(e)}), 400, {"Content-Type": "application/json"}
|
||||
b_val = effective_brightness_for_mac(
|
||||
_pi_settings,
|
||||
_group_registry,
|
||||
devices,
|
||||
id,
|
||||
zone_brightness=zb,
|
||||
)
|
||||
|
||||
msg = _brightness_save_message_json(b_val)
|
||||
transport = (dev.get("transport") or "espnow").strip().lower()
|
||||
|
||||
transport = dev.get("transport") or "espnow"
|
||||
wifi_ip = None
|
||||
if transport == "wifi":
|
||||
wifi_ip = dev.get("address")
|
||||
if not wifi_ip:
|
||||
ip = normalize_tcp_peer_ip(str(dev.get("address") or ""))
|
||||
if not ip:
|
||||
return json.dumps({"error": "Device has no IP address"}), 400, {
|
||||
"Content-Type": "application/json",
|
||||
}
|
||||
|
||||
try:
|
||||
msg = _compact_v1_json(
|
||||
presets={_IDENTIFY_PRESET_KEY: dict(_IDENTIFY_DRIVER_PRESET)},
|
||||
select={name: [_IDENTIFY_PRESET_KEY]},
|
||||
)
|
||||
if transport == "wifi":
|
||||
ok = await send_json_line_to_ip(wifi_ip, msg)
|
||||
if not ok:
|
||||
return json.dumps({"error": "Wi-Fi driver not connected"}), 503, {
|
||||
"Content-Type": "application/json",
|
||||
}
|
||||
else:
|
||||
ok = await send_json_line_to_ip(ip, msg)
|
||||
if not ok:
|
||||
return json.dumps({"error": "Wi-Fi driver not connected"}), 503, {
|
||||
"Content-Type": "application/json",
|
||||
}
|
||||
else:
|
||||
sender = get_current_sender()
|
||||
if not sender:
|
||||
return json.dumps({"error": "Transport not configured"}), 503, {
|
||||
"Content-Type": "application/json",
|
||||
}
|
||||
try:
|
||||
await sender.send(msg, addr=id)
|
||||
except Exception as e:
|
||||
return json.dumps({"error": str(e)}), 503, {"Content-Type": "application/json"}
|
||||
|
||||
asyncio.create_task(
|
||||
_identify_send_off_after_delay(sender, transport, wifi_ip, id, name)
|
||||
)
|
||||
except Exception as e:
|
||||
return json.dumps({"error": str(e)}), 503, {"Content-Type": "application/json"}
|
||||
return json.dumps({"message": "Identify sent"}), 200, {
|
||||
return json.dumps({"message": "brightness sent", "brightness": b_val}), 200, {
|
||||
"Content-Type": "application/json",
|
||||
}


@controller.post("/<id>/driver-config")
async def push_driver_config(request, id):
    """
    Push ``device_config`` to a Wi‑Fi LED driver over WebSocket.
    Body JSON: optional ``name``, ``num_leds``, ``color_order``, ``startup_mode`` (default|last|off).
    """
    dev = devices.read(id)
    if not dev:
        return json.dumps({"error": "Device not found"}), 404, {
            "Content-Type": "application/json",
        }
    if (dev.get("transport") or "").lower() != "wifi":
        return json.dumps({"error": "driver-config is only for Wi-Fi devices"}), 400, {
            "Content-Type": "application/json",
        }
    wifi_ip = str(dev.get("address") or "").strip()
    if not wifi_ip:
        return json.dumps({"error": "Device has no IP address"}), 400, {
            "Content-Type": "application/json",
        }
    body = request.json or {}
    dc = {}
    if isinstance(body.get("name"), str) and body["name"].strip():
        dc["name"] = body["name"].strip()
    if "num_leds" in body:
        try:
            n = int(body["num_leds"])
            if 1 <= n <= 2048:
                dc["num_leds"] = n
        except (TypeError, ValueError):
            pass
    if isinstance(body.get("color_order"), str):
        co = body["color_order"].strip().lower()
        if co in ("rgb", "rbg", "grb", "gbr", "brg", "bgr"):
            dc["color_order"] = co
    if isinstance(body.get("startup_mode"), str):
        sm = body["startup_mode"].strip().lower()
        if sm in ("default", "last", "off"):
            dc["startup_mode"] = sm
    if not dc:
        return json.dumps(
            {
                "error": "Provide at least one of name, num_leds, color_order, startup_mode"
            }
        ), 400, {"Content-Type": "application/json"}
    msg = json.dumps(
        {"v": "1", "device_config": dc, "save": True}, separators=(",", ":")
    )
    ok = await send_json_line_to_ip(wifi_ip, msg)
    if not ok:
        return json.dumps({"error": "Wi-Fi driver not connected"}), 503, {
            "Content-Type": "application/json",
        }
    return json.dumps({"message": "driver-config sent"}), 200, {
        "Content-Type": "application/json",
    }
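
The envelope this route writes to the driver can be reproduced in isolation. A sketch with example field values (the field set mirrors the validation above; the values themselves are made up):

```python
import json

# Only validated fields are included, and "save": True asks the driver
# to persist the config across reboots.
dc = {"name": "desk", "num_leds": 60, "color_order": "grb", "startup_mode": "last"}
msg = json.dumps({"v": "1", "device_config": dc, "save": True}, separators=(",", ":"))
```

The compact separators keep the JSON line small, which matters for the constrained MicroPython WebSocket receiver on the ESP32 side.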


@controller.post("/<id>/patterns/push")
async def push_patterns_ota(request, id):
    """
    Push all local pattern files directly to a Wi-Fi LED driver over HTTP upload.
    """
    dev = devices.read(id)
    if not dev:
        return json.dumps({"error": "Device not found"}), 404, {
            "Content-Type": "application/json",
        }
    if (dev.get("transport") or "").lower() != "wifi":
        return json.dumps({"error": "Pattern OTA push is only supported for Wi-Fi devices"}), 400, {
            "Content-Type": "application/json",
        }
    wifi_ip = str(dev.get("address") or "").strip()
    if not wifi_ip:
        return json.dumps({"error": "Device has no IP address"}), 400, {
            "Content-Type": "application/json",
        }

    base_dir = driver_patterns_dir()
    try:
        names = sorted(os.listdir(base_dir))
    except OSError as e:
        return json.dumps({"error": str(e)}), 500, {
            "Content-Type": "application/json",
        }

    files = [n for n in names if _safe_pattern_filename(n) and n != "__init__.py"]
    if not files:
        return json.dumps({"error": "No pattern files found"}), 404, {
            "Content-Type": "application/json",
        }

    sent = []
    failed = []
    total = len(files)
    for idx, filename in enumerate(files):
        path = os.path.join(base_dir, filename)
        try:
            with open(path, "r") as f:
                code = f.read()
        except OSError:
            failed.append(filename)
            continue
        reload_patterns = idx == (total - 1)
        ok = _http_post_pattern_source(
            wifi_ip,
            filename,
            code,
            reload_patterns=reload_patterns,
            timeout_s=10.0,
        )
        if ok:
            sent.append(filename)
        else:
            failed.append(filename)

    if not sent:
        return json.dumps({"error": "Wi-Fi driver did not accept pattern uploads", "failed": failed}), 503, {
            "Content-Type": "application/json",
        }

    return json.dumps({
        "message": "Pattern files uploaded",
        "sent_count": len(sent),
        "sent": sent,
        "failed": failed,
    }), 200, {
        "Content-Type": "application/json",
    }
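
The `reload_patterns = idx == (total - 1)` line above asks the driver to reload its pattern registry only once, after the final upload, instead of after every file. The flag sequence is easy to check in isolation:

```python
files = ["a.py", "b.py", "c.py"]
total = len(files)
# Only the last upload in the batch triggers a pattern reload.
flags = [idx == (total - 1) for idx, _ in enumerate(files)]
```

Deferring the reload keeps the ESP32 from re-importing every pattern module N times during an N-file push.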

@@ -1,50 +1,359 @@
from microdot import Microdot
from microdot.session import with_session
import asyncio
from models.group import Group
from models.device import Device
from models.transport import get_current_sender
from models.wifi_ws_clients import normalize_tcp_peer_ip, send_json_line_to_ip
from settings import Settings
from util.brightness_combine import effective_brightness_for_mac
import json

controller = Microdot()
groups = Group()
devices = Device()
_pi_settings = Settings()

@controller.get('')
async def list_groups(request):
    """List all groups."""
    return json.dumps(groups), 200, {'Content-Type': 'application/json'}

@controller.get('/<id>')
async def get_group(request, id):
    """Get a specific group by ID."""
def _group_doc_visible_for_profile(doc, profile_id):
    if not isinstance(doc, dict):
        return False
    scoped = doc.get("profile_id")
    if scoped is None:
        scoped = doc.get("profileId")
    if scoped is None or str(scoped).strip() == "":
        return True
    if not profile_id:
        return False
    return str(scoped).strip() == str(profile_id).strip()


def _filtered_groups_dict(session):
    from controllers.zone import get_current_profile_id

    pid = get_current_profile_id(session)
    out = {}
    for gid, doc in groups.items():
        if not isinstance(doc, dict):
            continue
        if _group_doc_visible_for_profile(doc, pid):
            out[str(gid)] = doc
    return out
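
The visibility rule used throughout this file (unscoped group documents are shared with everyone, scoped documents match only their own profile) can be exercised standalone:

```python
def visible(doc, profile_id):
    # Mirrors _group_doc_visible_for_profile: unscoped docs are shared,
    # scoped docs only match their own profile id (both key spellings).
    if not isinstance(doc, dict):
        return False
    scoped = doc.get("profile_id")
    if scoped is None:
        scoped = doc.get("profileId")
    if scoped is None or str(scoped).strip() == "":
        return True
    if not profile_id:
        return False
    return str(scoped).strip() == str(profile_id).strip()
```

Accepting both `profile_id` and the camel-case `profileId` makes the check robust to documents written by older or front-end code paths.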


@controller.get("")
@with_session
async def list_groups(request, session):
    """List groups visible for the current profile (shared + profile-scoped)."""
    return json.dumps(_filtered_groups_dict(session)), 200, {"Content-Type": "application/json"}


@controller.get("/<id>")
@with_session
async def get_group(request, session, id):
    """Get a specific group by ID (404 if scoped to another profile)."""
    group = groups.read(id)
    if group:
        return json.dumps(group), 200, {'Content-Type': 'application/json'}
    return json.dumps({"error": "Group not found"}), 404
    if not group or not isinstance(group, dict):
        return json.dumps({"error": "Group not found"}), 404
    from controllers.zone import get_current_profile_id

@controller.post('')
async def create_group(request):
    """Create a new group."""
    if not _group_doc_visible_for_profile(group, get_current_profile_id(session)):
        return json.dumps({"error": "Group not found"}), 404
    return json.dumps(group), 200, {"Content-Type": "application/json"}


def _sanitize_group_profile_id_write(data, session):
    """Allow ``profile_id`` only for the active profile, or null to share across profiles."""
    if not isinstance(data, dict):
        return
    from controllers.zone import get_current_profile_id

    cur = get_current_profile_id(session)
    if "profile_id" not in data and "profileId" not in data:
        return
    raw = data.get("profile_id")
    if raw is None and "profileId" in data:
        raw = data.get("profileId")
    if raw is None or raw == "":
        data.pop("profileId", None)
        data["profile_id"] = None
        return
    if not cur or str(raw).strip() != str(cur).strip():
        data.pop("profileId", None)
        data.pop("profile_id", None)
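
The write-side rule is subtle: a null or empty `profile_id` is normalised to an explicit shared marker, while any id that does not match the active profile is silently dropped. A standalone sketch without the session lookup (`current` stands in for `get_current_profile_id(session)`):

```python
def sanitize(data, current):
    # Sketch of _sanitize_group_profile_id_write: null/empty means shared,
    # a mismatched or unauthenticated id is stripped from the write.
    if "profile_id" not in data and "profileId" not in data:
        return data
    raw = data.get("profile_id")
    if raw is None and "profileId" in data:
        raw = data.get("profileId")
    if raw is None or raw == "":
        data.pop("profileId", None)
        data["profile_id"] = None
        return data
    if not current or str(raw).strip() != str(current).strip():
        data.pop("profileId", None)
        data.pop("profile_id", None)
    return data
```

Dropping (rather than rejecting) a foreign `profile_id` means a client can never scope a group to a profile it is not currently using.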


@controller.post("")
@with_session
async def create_group(request, session):
    """Create a new group (omit ``profile_id`` for shared; or ``profile_scoped``: true for this profile only)."""
    try:
        data = request.json or {}
        data = dict(request.json or {})
        name = data.get("name", "")
        profile_scoped = bool(data.pop("profile_scoped", False))
        _sanitize_group_profile_id_write(data, session)
        group_id = groups.create(name)
        if data:
            groups.update(group_id, data)
        return json.dumps(groups.read(group_id)), 201, {'Content-Type': 'application/json'}
        if profile_scoped:
            from controllers.zone import get_current_profile_id

            cur = get_current_profile_id(session)
            if cur:
                groups.update(group_id, {"profile_id": str(cur)})
        return json.dumps(groups.read(group_id)), 201, {"Content-Type": "application/json"}
    except Exception as e:
        return json.dumps({"error": str(e)}), 400


@controller.put('/<id>')
async def update_group(request, id):

@controller.put("/<id>")
@with_session
async def update_group(request, session, id):
    """Update an existing group."""
    try:
        data = request.json
        if not isinstance(data, dict):
            return json.dumps({"error": "Invalid JSON"}), 400, {"Content-Type": "application/json"}
        data = dict(data)
        _sanitize_group_profile_id_write(data, session)
        if groups.update(id, data):
            return json.dumps(groups.read(id)), 200, {'Content-Type': 'application/json'}
            g = groups.read(id)
            if g:
                return json.dumps(g), 200, {"Content-Type": "application/json"}
        return json.dumps({"error": "Group not found"}), 404
    except Exception as e:
        return json.dumps({"error": str(e)}), 400


@controller.delete('/<id>')
async def delete_group(request, id):
    """Delete a group."""
@controller.delete("/<id>")
@with_session
async def delete_group(request, session, id):
    """Delete a group (not allowed for another profile's scoped group)."""
    g = groups.read(id)
    if not g or not isinstance(g, dict):
        return json.dumps({"error": "Group not found"}), 404
    from controllers.zone import get_current_profile_id

    if not _group_doc_visible_for_profile(g, get_current_profile_id(session)):
        return json.dumps({"error": "Group not found"}), 404
    if groups.delete(id):
        return json.dumps({"message": "Group deleted successfully"}), 200
    return json.dumps({"error": "Group not found"}), 404


def _group_driver_config_payload(doc):
    """Build ``device_config`` dict from stored group Wi‑Fi defaults (non-empty only)."""
    dc = {}
    if not isinstance(doc, dict):
        return dc
    nm = doc.get("wifi_driver_display_name")
    if isinstance(nm, str) and nm.strip():
        dc["name"] = nm.strip()
    nled = doc.get("wifi_driver_num_leds")
    if nled is not None:
        try:
            n = int(nled)
            if 1 <= n <= 2048:
                dc["num_leds"] = n
        except (TypeError, ValueError):
            pass
    co = doc.get("wifi_color_order")
    if isinstance(co, str):
        c = co.strip().lower()
        if c in ("rgb", "rbg", "grb", "gbr", "brg", "bgr"):
            dc["color_order"] = c
    sm = doc.get("wifi_startup_mode")
    if isinstance(sm, str):
        s = sm.strip().lower()
        if s in ("default", "last", "off"):
            dc["startup_mode"] = s
    return dc


def _read_group_for_session(session, id):
    g = groups.read(id)
    if not g or not isinstance(g, dict):
        return None
    from controllers.zone import get_current_profile_id

    if not _group_doc_visible_for_profile(g, get_current_profile_id(session)):
        return None
    return g


@controller.post("/<id>/driver-config")
@with_session
async def push_group_driver_config(request, session, id):
    """
    Push group Wi‑Fi defaults to every Wi‑Fi device listed in the group (TCP WebSocket).
    Uses stored ``wifi_*`` fields on the group; optional JSON body may override for this send only.
    """
    gdoc = _read_group_for_session(session, id)
    if not gdoc:
        return json.dumps({"error": "Group not found"}), 404

    body = request.json or {}
    merged = dict(gdoc)
    if isinstance(body, dict):
        for k in (
            "wifi_driver_display_name",
            "wifi_driver_num_leds",
            "wifi_color_order",
            "wifi_startup_mode",
        ):
            if k in body:
                merged[k] = body[k]
    dc = _group_driver_config_payload(merged)
    if not dc:
        return json.dumps(
            {"error": "No driver defaults on this group (set display name, LEDs, colour order, or power-on pattern)"}
        ), 400, {"Content-Type": "application/json"}

    mac_list = gdoc.get("devices") if isinstance(gdoc.get("devices"), list) else []
    sent = 0
    errors = []
    msg = json.dumps(
        {"v": "1", "device_config": dc, "save": True}, separators=(",", ":")
    )
    tasks = []
    meta_macs = []
    for mac in mac_list:
        m = str(mac).strip().lower().replace(":", "").replace("-", "")
        if len(m) != 12:
            continue
        dev = devices.read(m)
        if not dev:
            errors.append({"mac": m, "error": "not in registry"})
            continue
        if (dev.get("transport") or "").lower() != "wifi":
            continue
        ip = normalize_tcp_peer_ip(str(dev.get("address") or ""))
        if not ip:
            errors.append({"mac": m, "error": "no IP"})
            continue
        tasks.append(send_json_line_to_ip(ip, msg))
        meta_macs.append(m)
    if tasks:
        results = await asyncio.gather(*tasks, return_exceptions=True)
        for m, r in zip(meta_macs, results):
            if r is True:
                sent += 1
            elif isinstance(r, Exception):
                errors.append({"mac": m, "error": str(r)})
            else:
                errors.append({"mac": m, "error": "driver not connected"})

    return json.dumps(
        {"message": "driver-config sent", "sent": sent, "errors": errors}
    ), 200, {"Content-Type": "application/json"}
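
The fan-out pattern above (`asyncio.gather(..., return_exceptions=True)` zipped back against the MAC list) is worth isolating, since it is reused by the brightness and identify routes. A self-contained sketch with a fake sender (`send_one` is hypothetical):

```python
import asyncio

async def send_one(mac, outcome):
    # Stand-in for send_json_line_to_ip: True/False result, may raise.
    if outcome == "raise":
        raise OSError("socket closed")
    return outcome

async def fan_out():
    macs = ["aa", "bb", "cc"]
    results = await asyncio.gather(
        *(send_one(m, v) for m, v in zip(macs, [True, False, "raise"])),
        return_exceptions=True,
    )
    sent, errors = 0, []
    for m, r in zip(macs, results):
        if r is True:
            sent += 1
        elif isinstance(r, Exception):
            errors.append({"mac": m, "error": str(r)})
        else:
            errors.append({"mac": m, "error": "driver not connected"})
    return sent, errors

sent, errors = asyncio.run(fan_out())
```

`return_exceptions=True` keeps one failed socket from cancelling the whole batch, and because `gather` preserves input order the results can be re-associated with their MACs by position.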


def _brightness_save_message_json(b_val: int) -> str:
    b_val = max(0, min(255, int(b_val)))
    return json.dumps({"v": "1", "b": b_val, "save": True}, separators=(",", ":"))
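
The clamp-then-serialise behaviour of this helper is easy to verify: out-of-range inputs are pinned to the 0–255 byte range before the compact JSON line is built.

```python
import json

def brightness_save_message_json(b_val):
    # Same logic as _brightness_save_message_json: clamp to 0-255,
    # then emit the compact v1 brightness-with-save line.
    b_val = max(0, min(255, int(b_val)))
    return json.dumps({"v": "1", "b": b_val, "save": True}, separators=(",", ":"))

brightness_save_message_json(300)  # → '{"v":"1","b":255,"save":true}'
```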


@controller.post("/<id>/brightness")
@with_session
async def push_group_output_brightness(request, session, id):
    """
    Push combined brightness (global × group(s) × device) to each member — one ``b`` per device.
    """
    gdoc = _read_group_for_session(session, id)
    if not gdoc:
        return json.dumps({"error": "Group not found"}), 404

    mac_list = gdoc.get("devices") if isinstance(gdoc.get("devices"), list) else []
    sent = 0
    errors = []
    sender = get_current_sender()

    async def _push_brightness_one(m: str, dev: dict) -> tuple[str, bool, str | None]:
        b_val = effective_brightness_for_mac(
            _pi_settings,
            groups,
            devices,
            m,
            zone_brightness=None,
        )
        msg = _brightness_save_message_json(b_val)
        transport = (dev.get("transport") or "espnow").strip().lower()
        if transport == "wifi":
            ip = normalize_tcp_peer_ip(str(dev.get("address") or ""))
            if not ip:
                return m, False, "no IP"
            ok = await send_json_line_to_ip(ip, msg)
            return m, bool(ok), None if ok else "driver not connected"
        if not sender:
            return m, False, "transport not configured"
        try:
            await sender.send(msg, addr=m)
            return m, True, None
        except Exception as e:
            return m, False, str(e)

    tasks: list = []
    for mac in mac_list:
        m = str(mac).strip().lower().replace(":", "").replace("-", "")
        if len(m) != 12:
            continue
        dev = devices.read(m)
        if not dev:
            errors.append({"mac": m, "error": "not in registry"})
            continue
        tasks.append(_push_brightness_one(m, dev))

    if tasks:
        results = await asyncio.gather(*tasks, return_exceptions=True)
        for r in results:
            if isinstance(r, Exception):
                errors.append({"mac": "*", "error": str(r)})
                continue
            m, ok, err = r
            if ok:
                sent += 1
            elif err:
                errors.append({"mac": m, "error": err})

    return json.dumps(
        {"message": "brightness sent", "sent": sent, "errors": errors}
    ), 200, {"Content-Type": "application/json"}


@controller.post("/<id>/identify")
@with_session
async def identify_group_devices(request, session, id):
    """
    Run the same identify blink as ``POST /devices/<id>/identify`` for every registry member
    in parallel so all drivers in the group blink together.
    """
    _ = request
    gdoc = _read_group_for_session(session, id)
    if not gdoc:
        return json.dumps({"error": "Group not found"}), 404, {"Content-Type": "application/json"}

    mac_list = gdoc.get("devices") if isinstance(gdoc.get("devices"), list) else []
    if not mac_list:
        return json.dumps({"error": "Group has no devices"}), 400, {"Content-Type": "application/json"}

    from controllers.device import send_identify_to_group_devices

    normalized: list[str] = []
    errors: list[dict] = []
    for mac in mac_list:
        m = str(mac).strip().lower().replace(":", "").replace("-", "")
        if len(m) != 12:
            errors.append({"mac": str(mac), "error": "invalid MAC"})
            continue
        normalized.append(m)

    if not normalized:
        return json.dumps(
            {"message": "identify group done", "sent": 0, "errors": errors}
        ), 200, {"Content-Type": "application/json"}

    sent, batch_errors = await send_identify_to_group_devices(normalized)
    errors.extend(batch_errors)

    return json.dumps(
        {"message": "identify group done", "sent": sent, "errors": errors}
    ), 200, {"Content-Type": "application/json"}
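
The inline MAC normalisation repeated in the group routes reduces to a small pure function: lowercase, strip the usual separators, and require exactly twelve remaining characters before using the value as a registry key.

```python
def normalize_mac_12(raw):
    # Same normalisation the group routes apply before registry lookups.
    m = str(raw).strip().lower().replace(":", "").replace("-", "")
    return m if len(m) == 12 else None

normalize_mac_12("AA:BB:CC:DD:EE:FF")  # → "aabbccddeeff"
```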

189
src/controllers/led_tool.py
Normal file
@@ -0,0 +1,189 @@
import json
import os
import subprocess
import sys

from microdot import Microdot
from serial.tools import list_ports

controller = Microdot()


def _repo_root() -> str:
    return os.path.abspath(os.path.join(os.path.dirname(__file__), "..", ".."))


def _led_cli_path() -> str:
    return os.path.join(_repo_root(), "led-tool", "cli.py")


def _build_led_cli_command(port: str, payload: dict):
    cmd = [sys.executable, _led_cli_path(), "--port", port]

    flag_map = (
        ("name", "--name"),
        ("led_pin", "--pin"),
        ("num_leds", "--leds"),
        ("brightness", "--brightness"),
        ("transport", "--transport"),
        ("ssid", "--ssid"),
        ("password", "--wifi-password"),
        ("wifi_channel", "--wifi-channel"),
        ("default", "--default"),
    )

    for key, flag in flag_map:
        value = payload.get(key)
        if value is None:
            continue
        value_str = str(value).strip()
        if value_str == "":
            continue
        cmd.extend([flag, value_str])

    return cmd
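
The flag-mapping loop skips `None` and empty-string values entirely, so absent form fields never produce bare flags. A standalone sketch (with the executable path passed in explicitly, and only a subset of the flag map):

```python
def build_cmd(cli, port, payload):
    # Sketch of _build_led_cli_command: None and empty-string values
    # are skipped so absent fields produce no CLI flags at all.
    flag_map = (("name", "--name"), ("num_leds", "--leds"), ("brightness", "--brightness"))
    cmd = [cli, "--port", port]
    for key, flag in flag_map:
        value = payload.get(key)
        if value is None:
            continue
        value_str = str(value).strip()
        if value_str == "":
            continue
        cmd.extend([flag, value_str])
    return cmd

cmd = build_cmd("cli.py", "/dev/ttyACM0", {"name": "desk", "num_leds": 30, "brightness": None})
```

Building the command as a list (rather than a shell string) also means `subprocess.run` needs no shell and device names with spaces cannot be misinterpreted.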


def _run_led_cli_command(cmd, cli_path: str, timeout_s=180):
    try:
        result = subprocess.run(
            cmd,
            capture_output=True,
            text=True,
            timeout=timeout_s,
            cwd=os.path.dirname(cli_path),
        )
    except subprocess.TimeoutExpired:
        return (
            json.dumps({"error": f"led-tool command timed out after {timeout_s} seconds"}),
            504,
            {"Content-Type": "application/json"},
        )
    except Exception as exc:
        return (
            json.dumps({"error": str(exc)}),
            500,
            {"Content-Type": "application/json"},
        )

    return (
        json.dumps(
            {
                "ok": result.returncode == 0,
                "returncode": result.returncode,
                "stdout": result.stdout,
                "stderr": result.stderr,
                "command": cmd,
            }
        ),
        200,
        {"Content-Type": "application/json"},
    )


def _extract_settings_from_stdout(stdout: str):
    text = (stdout or "").strip()
    if not text:
        return None
    try:
        parsed = json.loads(text)
        return parsed if isinstance(parsed, dict) else None
    except Exception:
        return None
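
This parser deliberately returns `None` for anything that is not a single JSON object, which covers empty output, boot noise, and top-level arrays alike:

```python
import json

def extract_settings(stdout):
    # Mirrors _extract_settings_from_stdout: only a JSON object counts;
    # empty output, non-JSON noise, and arrays all yield None.
    text = (stdout or "").strip()
    if not text:
        return None
    try:
        parsed = json.loads(text)
        return parsed if isinstance(parsed, dict) else None
    except Exception:
        return None
```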


@controller.get("/ports")
async def list_serial_ports(request):
    ports = []
    for info in list_ports.comports():
        ports.append(
            {
                "device": info.device,
                "description": info.description,
                "hwid": info.hwid,
            }
        )
    return (
        json.dumps(
            {
                "ports": ports,
                "led_cli_exists": os.path.exists(_led_cli_path()),
            }
        ),
        200,
        {"Content-Type": "application/json"},
    )


@controller.post("/settings")
async def apply_settings(request):
    data = request.json or {}
    port = str(data.get("port") or "").strip()
    if not port:
        return (
            json.dumps({"error": "port is required"}),
            400,
            {"Content-Type": "application/json"},
        )

    cli_path = _led_cli_path()
    if not os.path.exists(cli_path):
        return (
            json.dumps({"error": "led-tool/cli.py not found"}),
            500,
            {"Content-Type": "application/json"},
        )

    cmd = _build_led_cli_command(port, data) + ["--follow"]
    return _run_led_cli_command(cmd, cli_path, timeout_s=None)


@controller.post("/reset")
@controller.post("/reset/")
async def reset_device(request):
    data = request.json or {}
    port = str(data.get("port") or "").strip()
    if not port:
        return (
            json.dumps({"error": "port is required"}),
            400,
            {"Content-Type": "application/json"},
        )

    cli_path = _led_cli_path()
    if not os.path.exists(cli_path):
        return (
            json.dumps({"error": "led-tool/cli.py not found"}),
            500,
            {"Content-Type": "application/json"},
        )

    cmd = [sys.executable, cli_path, "--port", port, "--reset", "--follow"]
    return _run_led_cli_command(cmd, cli_path, timeout_s=None)


@controller.get("/settings")
async def read_settings(request):
    port = str(request.args.get("port") or "").strip()
    if not port:
        return (
            json.dumps({"error": "port is required"}),
            400,
            {"Content-Type": "application/json"},
        )

    cli_path = _led_cli_path()
    if not os.path.exists(cli_path):
        return (
            json.dumps({"error": "led-tool/cli.py not found"}),
            500,
            {"Content-Type": "application/json"},
        )

    cmd = [sys.executable, cli_path, "--port", port, "--show"]
    body, status, headers = _run_led_cli_command(cmd, cli_path)
    if status != 200:
        return body, status, headers
    data = json.loads(body)
    data["settings"] = _extract_settings_from_stdout(data.get("stdout") or "")
    return json.dumps(data), status, headers

@@ -1,19 +1,113 @@
from microdot import Microdot
from models.pattern import Pattern
from models.device import Device
from util.driver_patterns import (
    driver_patterns_dir,
    is_firmware_builtin_pattern_module,
    normalize_pattern_py_filename,
)
import json
import sys
import re
import os
import socket
from urllib.parse import quote

controller = Microdot()
patterns = Pattern()


def _project_root():
    """Project root (parent of ``src/``). CWD is often ``src/`` when running ``main.py``."""
    here = os.path.dirname(os.path.abspath(__file__))
    return os.path.abspath(os.path.join(here, "..", ".."))


def _safe_pattern_filename(name):
    if not isinstance(name, str):
        return False
    if not name.endswith(".py"):
        return False
    if "/" in name or "\\" in name or ".." in name:
        return False
    return True
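
Because uploaded names end up as flash filenames on the driver, the checks above are a path-traversal guard as much as a format check: only a bare `.py` basename with no separators or parent references passes.

```python
def safe_pattern_filename(name):
    # Same checks as _safe_pattern_filename: .py suffix, no path
    # separators, no parent-directory escapes.
    if not isinstance(name, str):
        return False
    if not name.endswith(".py"):
        return False
    if "/" in name or "\\" in name or ".." in name:
        return False
    return True
```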


_PATTERN_KEY_RE = re.compile(r"^[a-zA-Z_][a-zA-Z0-9_]{0,63}$")


def _normalize_pattern_key(raw):
    """Pattern id / module basename (no .py)."""
    if not isinstance(raw, str):
        return ""
    s = raw.strip()
    if s.lower().endswith(".py"):
        s = s[:-3].strip()
    return s


def _valid_pattern_key(key):
    return bool(key and _PATTERN_KEY_RE.match(key))
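The two helpers pair up: strip an optional `.py` suffix, then require an identifier-style key. A standalone sketch (names without the leading underscore are illustrative copies):

```python
import re

# Mirrors the key helpers above: optional ``.py`` suffix is stripped,
# then the key must look like a short Python identifier.
PATTERN_KEY_RE = re.compile(r"^[a-zA-Z_][a-zA-Z0-9_]{0,63}$")

def normalize_pattern_key(raw):
    if not isinstance(raw, str):
        return ""
    s = raw.strip()
    if s.lower().endswith(".py"):
        s = s[:-3].strip()
    return s

def valid_pattern_key(key):
    return bool(key and PATTERN_KEY_RE.match(key))

print(normalize_pattern_key("  sparkle.PY "))  # 'sparkle'
print(valid_pattern_key("my_pattern"))         # True
print(valid_pattern_key("9lives"))             # False: leading digit
```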


def _http_post_pattern_source(ip, filename, code_text, reload_patterns=True, timeout_s=10.0):
    """POST source to driver /patterns/upload?name=...&reload=...; return True on 2xx."""
    if not isinstance(ip, str) or not ip.strip():
        return False
    if not isinstance(filename, str) or not filename:
        return False
    if not isinstance(code_text, str):
        return False

    name_q = quote(filename, safe="")
    reload_q = "1" if reload_patterns else "0"
    path = "/patterns/upload?name=%s&reload=%s" % (name_q, reload_q)
    body = code_text.encode("utf-8")
    req = (
        "POST %s HTTP/1.1\r\n"
        "Host: %s\r\n"
        "Content-Type: text/plain; charset=utf-8\r\n"
        "Content-Length: %d\r\n"
        "Connection: close\r\n"
        "\r\n" % (path, ip, len(body))
    ).encode("utf-8") + body

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        sock.settimeout(timeout_s)
        sock.connect((ip.strip(), 80))
        sock.sendall(req)
        data = b""
        while True:
            chunk = sock.recv(1024)
            if not chunk:
                break
            data += chunk
    except OSError:
        return False
    finally:
        try:
            sock.close()
        except Exception:
            pass

    first_line = data.split(b"\r\n", 1)[0] if data else b""
    # Accept any 2xx status.
    return b" 2" in first_line
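The final check is a loose substring test for a 2xx status line. A small sketch (the `is_2xx` helpers here are illustrative, not part of the codebase) shows what it accepts, plus a stricter variant that parses the status-code field explicitly:

```python
# Same substring test the upload helper applies to the first response line.
def is_2xx(first_line: bytes) -> bool:
    return b" 2" in first_line

print(is_2xx(b"HTTP/1.1 200 OK"))         # True
print(is_2xx(b"HTTP/1.1 201 Created"))    # True
print(is_2xx(b"HTTP/1.1 404 Not Found"))  # False
print(is_2xx(b""))                        # False: no response at all

# Stricter alternative: split the status line and inspect the code field.
def is_2xx_strict(first_line: bytes) -> bool:
    parts = first_line.split()
    return len(parts) >= 2 and parts[1].startswith(b"2")

print(is_2xx_strict(b"HTTP/1.1 503 Service Unavailable"))  # False
```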


def load_pattern_definitions():
    """Load pattern definitions from pattern.json file."""
    try:
        # Try different paths for local development vs MicroPython
        root = _project_root()
        paths = [
            os.path.join(root, "db", "pattern.json"),
            os.path.join(root, "pattern.json"),
            "db/pattern.json",
            "pattern.json",
            "/db/pattern.json",
        ]
        for path in paths:
            try:
                with open(path, "r") as f:
                    return json.load(f)
            except OSError:
                continue
@@ -22,16 +116,341 @@ def load_pattern_definitions():
        print(f"Error loading pattern.json: {e}")
    return {}


def load_driver_pattern_names():
    """List available pattern module names from led-driver/src/patterns."""
    try:
        names = []
        for filename in os.listdir(driver_patterns_dir()):
            if not _safe_pattern_filename(filename) or filename == "__init__.py":
                continue
            names.append(filename[:-3])
        names.sort()
        return names
    except OSError:
        return []


def build_runtime_pattern_map():
    """
    Runtime pattern map for UI menus.
    Keep pattern DB metadata as primary, then add any local driver pattern files
    missing from the DB so new OTA files still appear in menus.
    """
    definitions = load_pattern_definitions()
    available = load_driver_pattern_names()
    result = {}
    for name, meta in definitions.items():
        result[name] = dict(meta) if isinstance(meta, dict) else {}
    for name in available:
        if name not in result:
            result[name] = {}
    return result
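The merge rule can be sketched with hypothetical inputs: DB metadata stays primary, and driver-only modules still appear, just with empty metadata:

```python
# Hypothetical inputs: two DB entries, two modules found on disk
# (one already in the DB, one new).
definitions = {"sparkle": {"max_colors": 3}, "flame": {"n1": 35}}
available = ["sparkle", "twinkle"]

result = {}
for name, meta in definitions.items():
    result[name] = dict(meta) if isinstance(meta, dict) else {}
for name in available:
    if name not in result:
        result[name] = {}

print(result)
# {'sparkle': {'max_colors': 3}, 'flame': {'n1': 35}, 'twinkle': {}}
```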


@controller.get('/definitions')
async def get_pattern_definitions(request):
    """Get definitions for patterns currently available on the driver."""
    definitions = build_runtime_pattern_map()
    return json.dumps(definitions), 200, {'Content-Type': 'application/json'}


@controller.get('/ota/manifest')
async def ota_manifest(request):
    """Manifest of driver pattern source files for OTA pulls."""
    base_dir = driver_patterns_dir()
    host = request.headers.get("Host", "")
    if not host:
        return json.dumps({"error": "Missing Host header"}), 400, {
            "Content-Type": "application/json"
        }
    try:
        names = sorted(os.listdir(base_dir))
    except OSError as e:
        return json.dumps({"error": str(e)}), 500, {"Content-Type": "application/json"}

    files = []
    for name in names:
        if not _safe_pattern_filename(name) or name == "__init__.py":
            continue
        files.append({
            "name": name,
            "url": "http://%s/patterns/ota/file/%s" % (host, name),
        })

    return json.dumps({"files": files}), 200, {"Content-Type": "application/json"}
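The manifest entries the loop builds can be sketched in isolation; the host value and the `safe` helper here are illustrative stand-ins for the Host header and `_safe_pattern_filename`:

```python
host = "192.168.1.50"  # hypothetical controller address from the Host header
names = ["__init__.py", "sparkle.py", "notes.txt", "flame.py"]

def safe(name):
    # simplified stand-in for _safe_pattern_filename
    return name.endswith(".py") and "/" not in name and ".." not in name

files = [
    {"name": n, "url": "http://%s/patterns/ota/file/%s" % (host, n)}
    for n in sorted(names)
    if safe(n) and n != "__init__.py"
]
print([f["name"] for f in files])  # ['flame.py', 'sparkle.py']
```

Drivers can then pull each `url` over plain HTTP, so the manifest is self-describing for OTA clients.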


@controller.get('/ota/file/<name>')
async def ota_pattern_file(request, name):
    """Serve one driver pattern source file for OTA pulls."""
    fname = normalize_pattern_py_filename(name)
    if not fname or not _safe_pattern_filename(fname) or fname == "__init__.py":
        return json.dumps({"error": "Invalid filename"}), 400, {
            "Content-Type": "application/json"
        }
    if is_firmware_builtin_pattern_module(fname):
        return json.dumps(
            {
                "error": "on and off are built into the driver firmware; there is no module file to serve.",
            }
        ), 400, {
            "Content-Type": "application/json"
        }
    base = driver_patterns_dir()
    path = os.path.join(base, fname)
    try:
        with open(path, "r") as f:
            content = f.read()
    except OSError:
        return json.dumps(
            {
                "error": "Pattern file not found",
                "path": path,
                "hint": "Ensure led-driver is present or set LED_CONTROLLER_PATTERNS_DIR.",
            }
        ), 404, {
            "Content-Type": "application/json"
        }
    return content, 200, {"Content-Type": "text/plain; charset=utf-8"}


@controller.post('/<name>/send')
async def send_pattern_to_device(request, name):
    """Push one pattern source file directly to Wi-Fi driver(s) over HTTP."""
    if not isinstance(name, str):
        return json.dumps({"error": "Invalid pattern name"}), 400, {
            "Content-Type": "application/json"
        }
    filename = normalize_pattern_py_filename(name)
    if not filename or not _safe_pattern_filename(filename) or filename == "__init__.py":
        return json.dumps({"error": "Invalid pattern filename"}), 400, {
            "Content-Type": "application/json"
        }
    if is_firmware_builtin_pattern_module(filename):
        return json.dumps(
            {
                "error": "on and off are built into the driver firmware; send does not apply.",
            }
        ), 400, {
            "Content-Type": "application/json"
        }

    devices = Device()
    body = request.json or {}
    requested_device_id = str(body.get("device_id") or "").strip()

    base = driver_patterns_dir()
    path = os.path.join(base, filename)
    if not os.path.exists(path):
        return json.dumps(
            {
                "error": "Pattern file not found",
                "path": path,
                "hint": "Ensure led-driver is present or set LED_CONTROLLER_PATTERNS_DIR.",
            }
        ), 404, {
            "Content-Type": "application/json"
        }

    try:
        with open(path, "r") as f:
            source = f.read()
    except OSError as e:
        return json.dumps({"error": str(e)}), 500, {"Content-Type": "application/json"}
    target_ids = []
    if requested_device_id:
        dev = devices.read(requested_device_id)
        if not dev:
            return json.dumps({"error": "Device not found"}), 404, {
                "Content-Type": "application/json"
            }
        if (dev.get("transport") or "").lower() != "wifi":
            return json.dumps({"error": "Pattern send is only supported for Wi-Fi devices"}), 400, {
                "Content-Type": "application/json"
            }
        target_ids = [requested_device_id]
    else:
        for did in devices.list():
            dev = devices.read(did) or {}
            if (dev.get("transport") or "").lower() == "wifi":
                target_ids.append(str(did))
        if not target_ids:
            return json.dumps({"error": "No Wi-Fi devices found"}), 404, {
                "Content-Type": "application/json"
            }

    sent_ids = []
    for did in target_ids:
        dev = devices.read(did) or {}
        ip = str(dev.get("address") or "").strip()
        if not ip:
            continue
        ok = _http_post_pattern_source(ip, filename, source, reload_patterns=True, timeout_s=10.0)
        if ok:
            sent_ids.append(did)

    if not sent_ids:
        return json.dumps({"error": "No Wi-Fi drivers accepted pattern upload"}), 503, {
            "Content-Type": "application/json"
        }
    return json.dumps({"message": "Pattern sent", "pattern": filename, "device_ids": sent_ids, "sent_count": len(sent_ids)}), 200, {
        "Content-Type": "application/json"
    }


@controller.post('/upload')
async def upload_pattern_file(request):
    """
    Upload a pattern source file to led-controller local storage.

    Body JSON:
    {
        "name": "sparkle.py" | "sparkle",
        "code": "class Sparkle: ...",
        "overwrite": true | false   # optional, default true
    }
    """
    data = request.json or {}
    raw_name = data.get("name") or data.get("filename")
    code = data.get("code")
    overwrite = data.get("overwrite", True)
    overwrite = bool(overwrite)

    if not isinstance(raw_name, str) or not raw_name.strip():
        return json.dumps({"error": "name is required"}), 400, {
            "Content-Type": "application/json"
        }
    filename = raw_name.strip()
    if not filename.endswith(".py"):
        filename += ".py"
    if not _safe_pattern_filename(filename) or filename == "__init__.py":
        return json.dumps({"error": "invalid pattern filename"}), 400, {
            "Content-Type": "application/json"
        }
    if is_firmware_builtin_pattern_module(filename):
        return json.dumps(
            {"error": "on and off are built into the driver firmware; use a different pattern name."}
        ), 400, {
            "Content-Type": "application/json"
        }
    if not isinstance(code, str) or not code.strip():
        return json.dumps({"error": "code is required"}), 400, {
            "Content-Type": "application/json"
        }

    path = os.path.join(driver_patterns_dir(), filename)
    exists = os.path.exists(path)
    if exists and not overwrite:
        return json.dumps({"error": "pattern file already exists", "name": filename}), 409, {
            "Content-Type": "application/json"
        }

    try:
        with open(path, "w") as f:
            f.write(code)
    except OSError as e:
        return json.dumps({"error": str(e)}), 500, {"Content-Type": "application/json"}

    return json.dumps({
        "message": "Pattern uploaded",
        "name": filename,
        "overwrote": bool(exists),
    }), 201, {"Content-Type": "application/json"}


@controller.post('/driver')
async def create_driver_pattern(request):
    """
    Create a driver pattern: save ``.py`` under led-driver/src/patterns and
    metadata in db/pattern.json (Pattern model).

    Body JSON:
        name, code (required),
        min_delay, max_delay, max_colors (optional numbers),
        has_background (optional bool),
        supports_manual (optional bool, default true if omitted in db),
        n1..n8 (optional string labels),
        overwrite (optional, default true).
    """
    data = request.json or {}
    key = _normalize_pattern_key(data.get("name") or "")
    if not _valid_pattern_key(key):
        return json.dumps({
            "error": "name must be a valid Python identifier (e.g. sparkle, my_pattern)",
        }), 400, {"Content-Type": "application/json"}
    if is_firmware_builtin_pattern_module(key):
        return json.dumps(
            {"error": "on and off are built into the driver firmware; use a different pattern name."}
        ), 400, {
            "Content-Type": "application/json"
        }

    code = data.get("code")
    if not isinstance(code, str) or not code.strip():
        return json.dumps({"error": "code is required (upload a .py file or paste source)"}), 400, {
            "Content-Type": "application/json"
        }

    overwrite = bool(data.get("overwrite", True))

    filename = key + ".py"
    py_path = os.path.join(driver_patterns_dir(), filename)
    if os.path.exists(py_path) and not overwrite:
        return json.dumps({"error": "pattern file already exists", "name": filename}), 409, {
            "Content-Type": "application/json"
        }

    meta = {}
    for fld in ("min_delay", "max_delay", "max_colors"):
        if fld not in data:
            continue
        try:
            meta[fld] = int(data[fld])
        except (TypeError, ValueError):
            return json.dumps({"error": "%s must be an integer" % fld}), 400, {
                "Content-Type": "application/json"
            }

    if "has_background" in data:
        meta["has_background"] = bool(data.get("has_background"))

    if "supports_manual" in data:
        meta["supports_manual"] = bool(data.get("supports_manual"))

    for i in range(1, 9):
        nk = "n%d" % i
        if nk not in data:
            continue
        lab = data[nk]
        if lab is None:
            continue
        s = str(lab).strip()
        if s:
            meta[nk] = s

    try:
        with open(py_path, "w") as f:
            f.write(code)
    except OSError as e:
        return json.dumps({"error": str(e)}), 500, {"Content-Type": "application/json"}

    if patterns.read(key):
        patterns.update(key, meta)
    else:
        patterns.create(key, meta)

    return json.dumps({
        "message": "Pattern created",
        "name": key,
        "file": filename,
        "metadata": patterns.read(key),
    }), 201, {"Content-Type": "application/json"}


@controller.get('')
async def list_patterns(request):
    """List patterns for UI (DB metadata + local driver additions)."""
    return json.dumps(build_runtime_pattern_map()), 200, {'Content-Type': 'application/json'}


@controller.get('/<id>')
@@ -2,6 +2,7 @@ from microdot import Microdot
from microdot.session import with_session
from models.preset import Preset
from models.profile import Profile
from models.pallet import Palette
from models.device import Device, normalize_mac
from models.transport import get_current_sender
from util.driver_delivery import deliver_json_messages, deliver_preset_broadcast_then_per_device
@@ -12,6 +13,18 @@ controller = Microdot()
presets = Preset()
profiles = Profile()


def _palette_colors_for_profile(profile_id):
    prof = profiles.read(str(profile_id))
    if not isinstance(prof, dict):
        return None
    pid = prof.get("palette_id") or prof.get("paletteId")
    if not pid:
        return None
    cols = Palette().read(str(pid))
    return cols if isinstance(cols, list) else None


def get_current_profile_id(session=None):
    """Get the current active profile ID from session or fallback to first."""
    profile_list = profiles.list()
@@ -153,6 +166,7 @@ async def send_presets(request, session):

    # Build API-compliant preset map keyed by preset ID, include name
    current_profile_id = get_current_profile_id(session)
    palette_colors = _palette_colors_for_profile(current_profile_id)
    presets_by_name = {}
    for pid in preset_ids:
        preset_data = presets.read(str(pid))
@@ -161,7 +175,7 @@ async def send_presets(request, session):
        if str(preset_data.get("profile_id")) != str(current_profile_id):
            continue
        preset_key = str(pid)
        preset_payload = build_preset_dict(preset_data, palette_colors)
        preset_payload["name"] = preset_data.get("name", "")
        presets_by_name[preset_key] = preset_payload

@@ -315,6 +329,17 @@ async def push_driver_messages(request, session):
    except Exception:
        return json.dumps({"error": "Send failed"}), 503, {'Content-Type': 'application/json'}

    try:
        from util import sequence_playback as seq_pb
        from util.beat_driver_route import sync_beat_route_from_push_sequence

        preserve = bool(seq_pb.playback_status().get("active"))
        sync_beat_route_from_push_sequence(
            seq, target_macs=target_list, preserve_parallel_lane_routes=preserve
        )
    except Exception:
        pass

    return json.dumps({
        "message": "Delivered",
        "deliveries": deliveries,
@@ -124,6 +124,15 @@ async def create_profile(request):
            "auto": True,
            "n1": 2,
        },
        {
            "name": "Colour Cycle",
            "pattern": "colour_cycle",
            "colors": ["#FF0000", "#00FF00", "#0000FF"],
            "brightness": 255,
            "delay": 100,
            "auto": True,
            "n1": 1,
        },
        {
            "name": "transition",
            "pattern": "transition",
@@ -132,6 +141,39 @@ async def create_profile(request):
            "delay": 500,
            "auto": True,
        },
        {
            "name": "flicker",
            "pattern": "flicker",
            "colors": ["#FFB84D"],
            "brightness": 255,
            "delay": 80,
            "auto": True,
            "n1": 30,
        },
        {
            "name": "flame",
            "pattern": "flame",
            "colors": [],
            "brightness": 255,
            "delay": 50,
            "auto": True,
            "n1": 35,
            "n2": 2600,
            "n3": 0,
            "n4": 0,
        },
        {
            "name": "twinkle",
            "pattern": "twinkle",
            "colors": ["#78C8FF", "#508CFF", "#B478FF", "#64DCE8", "#A0C8FF"],
            "brightness": 255,
            "delay": 55,
            "auto": True,
            "n1": 72,
            "n2": 140,
            "n3": 2,
            "n4": 6,
        },
    ]

    for preset_data in default_preset_defs:
@@ -1,51 +1,207 @@
from microdot import Microdot
from microdot.session import with_session
from models.sequence import Sequence
from models.profile import Profile
from models.transport import get_current_sender
import json

controller = Microdot()
sequences = Sequence()
profiles = Profile()


def get_current_profile_id(session=None):
    """Get the current active profile ID from session or fallback to first."""
    profile_list = profiles.list()
    session_profile = None
    if session is not None:
        session_profile = session.get("current_profile")
    if session_profile and session_profile in profile_list:
        return session_profile
    if profile_list:
        return profile_list[0]
    return None


@controller.get("")
@with_session
async def list_sequences(request, session):
    """List sequences for the current profile."""
    current_profile_id = get_current_profile_id(session)
    if not current_profile_id:
        return json.dumps({}), 200, {"Content-Type": "application/json"}
    scoped = {
        sid: sdata
        for sid, sdata in sequences.items()
        if isinstance(sdata, dict)
        and str(sdata.get("profile_id")) == str(current_profile_id)
    }
    return json.dumps(scoped), 200, {"Content-Type": "application/json"}


@controller.get("/<id>")
@with_session
async def get_sequence(request, session, id):
    """Get a specific sequence by ID (current profile only)."""
    current_profile_id = get_current_profile_id(session)
    seq = sequences.read(id)
    if (
        seq
        and current_profile_id
        and str(seq.get("profile_id")) == str(current_profile_id)
    ):
        return json.dumps(seq), 200, {"Content-Type": "application/json"}
    return json.dumps({"error": "Sequence not found"}), 404


@controller.post("")
@with_session
async def create_sequence(request, session):
    """Create a new sequence for the current profile."""
    try:
        try:
            data = request.json or {}
        except Exception:
            return (
                json.dumps({"error": "Invalid JSON"}),
                400,
                {"Content-Type": "application/json"},
            )
        current_profile_id = get_current_profile_id(session)
        if not current_profile_id:
            return (
                json.dumps({"error": "No profile available"}),
                404,
                {"Content-Type": "application/json"},
            )
        sequence_id = sequences.create(current_profile_id)
        if not isinstance(data, dict):
            data = {}
        data = dict(data)
        data["profile_id"] = str(current_profile_id)
        if sequences.update(sequence_id, data):
            seq_data = sequences.read(sequence_id)
            return (
                json.dumps({sequence_id: seq_data}),
                201,
                {"Content-Type": "application/json"},
            )
        return (
            json.dumps({"error": "Failed to create sequence"}),
            400,
            {"Content-Type": "application/json"},
        )
    except Exception as e:
        return json.dumps({"error": str(e)}), 400, {"Content-Type": "application/json"}


@controller.put("/<id>")
@with_session
async def update_sequence(request, session, id):
    """Update an existing sequence (current profile only)."""
    try:
        current_profile_id = get_current_profile_id(session)
        seq = sequences.read(id)
        if not seq or str(seq.get("profile_id")) != str(current_profile_id):
            return json.dumps({"error": "Sequence not found"}), 404
        data = request.json
        if not isinstance(data, dict):
            return (
                json.dumps({"error": "Invalid JSON"}),
                400,
                {"Content-Type": "application/json"},
            )
        data = dict(data)
        data["profile_id"] = str(current_profile_id)
        if sequences.update(id, data):
            try:
                from util.sequence_playback import stop_if_playing_sequence

                stop_if_playing_sequence(str(id))
            except Exception:
                pass
            return json.dumps(sequences.read(id)), 200, {"Content-Type": "application/json"}
        return json.dumps({"error": "Sequence not found"}), 404
    except Exception as e:
        return json.dumps({"error": str(e)}), 400, {"Content-Type": "application/json"}


@controller.delete("/<id>")
@with_session
async def delete_sequence(request, session, id):
    """Delete a sequence (current profile only)."""
    current_profile_id = get_current_profile_id(session)
    seq = sequences.read(id)
    if not seq or str(seq.get("profile_id")) != str(current_profile_id):
        return json.dumps({"error": "Sequence not found"}), 404
    try:
        from util.sequence_playback import stop_if_playing_sequence

        stop_if_playing_sequence(str(id))
    except Exception:
        pass
    if sequences.delete(id):
        return (
            json.dumps({"message": "Sequence deleted successfully"}),
            200,
            {"Content-Type": "application/json"},
        )
    return json.dumps({"error": "Sequence not found"}), 404


@controller.post("/stop")
@with_session
async def stop_sequence_playback(request, session):
    """Stop server-driven zone sequence playback."""
    _ = request
    try:
        from util.sequence_playback import stop

        stop()
        return json.dumps({"ok": True}), 200, {"Content-Type": "application/json"}
    except Exception as e:
        return json.dumps({"error": str(e)}), 500, {"Content-Type": "application/json"}


@controller.post("/<id>/play")
@with_session
async def play_sequence(request, session, id):
    """Start server-driven playback for a sequence in a zone (body: {\"zone_id\": \"...\"})."""
    if not get_current_sender():
        return (
            json.dumps({"error": "Transport not configured"}),
            503,
            {"Content-Type": "application/json"},
        )
    current_profile_id = get_current_profile_id(session)
    if not current_profile_id:
        return (
            json.dumps({"error": "No profile available"}),
            404,
            {"Content-Type": "application/json"},
        )
    try:
        data = request.json or {}
    except Exception:
        data = {}
    if not isinstance(data, dict):
        data = {}
    zone_id = data.get("zone_id") or data.get("zoneId")
    if zone_id is None or str(zone_id).strip() == "":
        return (
            json.dumps({"error": "zone_id required"}),
            400,
            {"Content-Type": "application/json"},
        )
    zone_id = str(zone_id).strip()
    try:
        from util.sequence_playback import start

        await start(zone_id, str(id), str(current_profile_id), data if isinstance(data, dict) else None)
        return json.dumps({"ok": True}), 200, {"Content-Type": "application/json"}
    except ValueError as e:
        return json.dumps({"error": str(e)}), 400, {"Content-Type": "application/json"}
    except RuntimeError as e:
        return json.dumps({"error": str(e)}), 503, {"Content-Type": "application/json"}
    except Exception as e:
        return json.dumps({"error": str(e)}), 500, {"Content-Type": "application/json"}
@@ -1,7 +1,11 @@
import asyncio
import json

from microdot import Microdot, send_file

from models import wifi_ws_clients
from settings import Settings

controller = Microdot()
settings = Settings()

@@ -63,17 +67,36 @@ def _validate_wifi_channel(value):
    return ch


def _validate_global_brightness(value):
    """Return int 0–255 or raise ValueError."""
    v = int(value)
    if v < 0 or v > 255:
        raise ValueError("global_brightness must be between 0 and 255")
    return v
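The validator can be exercised standalone; the module-level copy below is illustrative:

```python
# Copy of the brightness validator above: coerce to int and reject
# values outside the 0-255 LED brightness range.
def validate_global_brightness(value):
    v = int(value)
    if v < 0 or v > 255:
        raise ValueError("global_brightness must be between 0 and 255")
    return v

print(validate_global_brightness("128"))  # 128 (string input is coerced)
try:
    validate_global_brightness(300)
except ValueError as e:
    print(e)  # global_brightness must be between 0 and 255
```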


@controller.put('/settings')
async def update_settings(request):
    """Update general settings."""
    try:
        data = request.json
        global_brightness_changed = False
        for key, value in data.items():
            if key == 'wifi_channel' and value is not None:
                settings[key] = _validate_wifi_channel(value)
            elif key == 'global_brightness' and value is not None:
                settings[key] = _validate_global_brightness(value)
                global_brightness_changed = True
            else:
                settings[key] = value
        settings.save()
        if global_brightness_changed:
            try:
                asyncio.get_running_loop().create_task(
                    wifi_ws_clients.broadcast_global_brightness_to_tcp_drivers()
                )
            except RuntimeError:
                pass
        return json.dumps({"message": "Settings updated successfully"}), 200, {'Content-Type': 'application/json'}
    except ValueError as e:
        return json.dumps({"error": str(e)}), 400
@@ -290,6 +290,8 @@ async def create_zone(request, session):
        ids_str = request.form.get("ids", "1").strip()
        names = [i.strip() for i in ids_str.split(",") if i.strip()]
        preset_ids = None
        group_ids = []
        content_kind = None
    else:
        data = request.json or {}
        name = data.get("name", "")
@@ -297,11 +299,20 @@ async def create_zone(request, session):
        if names is None:
            names = data.get("ids")
        preset_ids = data.get("presets", None)
        group_ids = data.get("group_ids")
        if group_ids is None:
            group_ids = []
        if isinstance(group_ids, list):
            group_ids = [str(x) for x in group_ids if x is not None]
        else:
            group_ids = []
        raw_kind = data.get("content_kind")
        content_kind = raw_kind if raw_kind in ("presets", "sequences") else None

    if not name:
        return json.dumps({"error": "Zone name cannot be empty"}), 400

    zid = zones.create(name, names, preset_ids, group_ids, content_kind)

    profile_id = get_current_profile_id(session)
    if profile_id:
@@ -333,7 +344,12 @@ async def clone_zone(request, session, id):
    data = request.json or {}
    source_name = source.get("name") or f"Zone {id}"
    new_name = data.get("name") or f"{source_name} Copy"
    clone_id = zones.create(
        new_name,
        source.get("names"),
        source.get("presets"),
        source.get("group_ids"),
    )
    extra = {k: v for k, v in source.items() if k not in ("name", "names", "presets")}
    if extra:
        zones.update(clone_id, extra)
575
src/main.py
@@ -2,6 +2,8 @@ import asyncio
import errno
import json
import os
import secrets
import signal
import socket
import threading
import traceback
@@ -20,198 +22,210 @@ import controllers.scene as scene
import controllers.pattern as pattern
import controllers.settings as settings_controller
import controllers.device as device_controller
import controllers.led_tool as led_tool_controller
from models.transport import get_sender, set_sender, get_current_sender
from models.device import Device, normalize_mac
from models import tcp_clients as tcp_client_registry
from models import wifi_ws_clients as tcp_client_registry
from util.device_status_broadcaster import (
    broadcast_device_tcp_snapshot_to,
    broadcast_device_tcp_status,
    register_device_status_ws,
    unregister_device_status_ws,
)
from util.audio_detector import AudioBeatDetector

_tcp_device_lock = threading.Lock()

# Wi-Fi drivers send one hello line then stay quiet; periodic outbound data makes dead peers
# fail drain() within this interval (keepalive alone is often slow or ineffective).
TCP_LIVENESS_PING_INTERVAL_S = 12.0

# Keepalive or lossy Wi-Fi can still surface OSError(110) / TimeoutError on recv or wait_closed.
_TCP_PEER_GONE = (
    BrokenPipeError,
    ConnectionResetError,
    ConnectionAbortedError,
    ConnectionRefusedError,
    TimeoutError,
    OSError,
)
DISCOVERY_UDP_PORT = 8766
def _tcp_socket_from_writer(writer):
    sock = writer.get_extra_info("socket")
    if sock is not None:
        return sock
    transport = getattr(writer, "transport", None)
    if transport is not None:
        return transport.get_extra_info("socket")
    return None


def _live_reload_enabled() -> bool:
    v = os.environ.get("LED_CONTROLLER_LIVE_RELOAD", "").strip().lower()
    return v not in ("", "0", "false", "no")


def _enable_tcp_keepalive(writer) -> None:
    """
    Detect vanished peers (power off, Wi-Fi drop) without waiting for a send() failure.
    Linux: shorten time before the first keepalive probe; other platforms: SO_KEEPALIVE only.
    """
    sock = _tcp_socket_from_writer(writer)
    if sock is None:
        return
    try:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    except OSError:
        return
    if hasattr(socket, "TCP_KEEPIDLE"):
        try:
            sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 120)
        except OSError:
            pass
    if hasattr(socket, "TCP_KEEPINTVL"):
        try:
            sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 15)
        except OSError:
            pass
    if hasattr(socket, "TCP_KEEPCNT"):
        try:
            sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 4)
        except OSError:
            pass
    # Do not set TCP_USER_TIMEOUT: a short value causes Errno 110 on recv for Wi-Fi peers
    # when ACKs are delayed (ESP power save, lossy links). Liveness pings already clear dead
    # sessions via drain().


async def _tcp_liveness_ping_loop(writer, peer_ip: str) -> None:
    """Send a bare newline so ``drain()`` fails soon after the peer disappears."""
    while True:
        await asyncio.sleep(TCP_LIVENESS_PING_INTERVAL_S)
        if writer.is_closing():
            return
        try:
            writer.write(b"\n")
            await writer.drain()
        except Exception as exc:
            print(f"[TCP] liveness ping failed {peer_ip!r}: {exc!r}")
            tcp_client_registry.unregister_tcp_writer(peer_ip, writer)
            try:
                writer.close()
            except Exception:
                pass
            return
def _register_tcp_device_sync(
def _register_udp_device_sync(
    device_name: str, peer_ip: str, mac, device_type=None
) -> None:
    with _tcp_device_lock:
        try:
            d = Device()
            did = d.upsert_wifi_tcp_client(
            did, persisted = d.upsert_wifi_tcp_client(
                device_name, peer_ip, mac, device_type=device_type
            )
            if did:
            if did and persisted:
                print(
                    f"TCP device registered: mac={did} name={device_name!r} ip={peer_ip!r}"
                    f"UDP device registered: mac={did} name={device_name!r} ip={peer_ip!r}"
                )
        except Exception as e:
            print(f"TCP device registry failed: {e}")
            print(f"UDP device registry failed: {e}")
            traceback.print_exception(type(e), e, e.__traceback__)
async def _handle_tcp_client(reader, writer):
    """Read newline-delimited JSON from Wi-Fi LED drivers; forward to serial bridge."""
    peer = writer.get_extra_info("peername")
    peer_ip = peer[0] if peer else ""
    peer_label = f"{peer_ip}:{peer[1]}" if peer and len(peer) > 1 else peer_ip or "?"
    print(f"[TCP] client connected {peer_label}")
    _enable_tcp_keepalive(writer)
    tcp_client_registry.register_tcp_writer(peer_ip, writer)
    ping_task = asyncio.create_task(_tcp_liveness_ping_loop(writer, peer_ip))
    sender = get_current_sender()
    buf = b""
    try:
        while True:
            try:
                chunk = await reader.read(4096)
            except asyncio.CancelledError:
                raise
            except _TCP_PEER_GONE as e:
                print(f"[TCP] read ended ({peer_label}): {e!r}")
                tcp_client_registry.unregister_tcp_writer(peer_ip, writer)
                break
            if not chunk:
                break
            buf += chunk
            while b"\n" in buf:
                raw_line, buf = buf.split(b"\n", 1)
                line = raw_line.strip()
                if not line:
                    continue
                try:
                    text = line.decode("utf-8")
                except UnicodeDecodeError:
                    print(
                        f"[TCP] recv {peer_label} (non-UTF-8, {len(line)} bytes): {line!r}"
                    )
                    continue
                print(f"[TCP] recv {peer_label}: {text}")
                try:
                    parsed = json.loads(text)
                except json.JSONDecodeError:
                    if sender:
                        try:
                            await sender.send(text)
                        except Exception:
                            pass
                    continue
                if isinstance(parsed, dict):
                    dns = str(parsed.get("device_name") or "").strip()
                    mac = parsed.get("mac") or parsed.get("device_mac") or parsed.get("sta_mac")
                    device_type = parsed.get("type") or parsed.get("device_type")
                    if dns and normalize_mac(mac):
                        _register_tcp_device_sync(
                            dns, peer_ip, mac, device_type=device_type
                        )
                    addr = parsed.pop("to", None)
                    payload = json.dumps(parsed) if parsed else "{}"
                    if sender:
                        try:
                            await sender.send(payload, addr=addr)
                        except Exception as e:
                            print(f"TCP forward to bridge failed: {e}")
                elif sender:
                    try:
                        await sender.send(text)
                    except Exception:
                        pass
    finally:
        # Drop registry + broadcast connected:false before awaiting ping/close so the UI
        # does not stay green if ping or wait_closed blocks on a timed-out peer.
        outcome = tcp_client_registry.unregister_tcp_writer(peer_ip, writer)
        if outcome == "superseded":
            print(
                f"[TCP] TCP session ended (same IP already has a newer connection): {peer_label}"
            )
        ping_task.cancel()
async def _handle_udp_discovery(sock, udp_holder=None) -> None:
    while True:
        try:
            await ping_task
        except asyncio.CancelledError:
            pass
        try:
            writer.close()
            await writer.wait_closed()
            data, addr = await asyncio.get_running_loop().sock_recvfrom(sock, 2048)
        except asyncio.CancelledError:
            raise
        except _TCP_PEER_GONE:
            tcp_client_registry.unregister_tcp_writer(peer_ip, writer)
        except OSError as e:
            if udp_holder and udp_holder.get("closing"):
                break
            print(f"[UDP] recv failed: {e!r}")
            continue
        except Exception as e:
            print(f"[UDP] recv failed: {e!r}")
            continue
        peer_ip = addr[0] if addr else ""
        line = data.split(b"\n", 1)[0].strip()
        if line:
            try:
                parsed = json.loads(line.decode("utf-8"))
                if isinstance(parsed, dict):
                    dns = str(parsed.get("device_name") or "").strip()
                    mac = parsed.get("mac") or parsed.get("device_mac") or parsed.get(
                        "sta_mac"
                    )
                    device_type = parsed.get("type") or parsed.get("device_type")
                    if dns and normalize_mac(mac):
                        _register_udp_device_sync(dns, peer_ip, mac, device_type)
                    if str(parsed.get("v") or "") == "1":
                        tcp_client_registry.ensure_driver_connection(peer_ip)
            except (UnicodeError, ValueError, TypeError):
                pass
        try:
            await asyncio.get_running_loop().sock_sendto(sock, data, addr)
        except Exception as e:
            print(f"[UDP] echo send failed: {e!r}")
def _prime_wifi_outbound_driver_connections() -> None:
    """
    For each Wi-Fi device in the registry with a usable IPv4, start (or keep) the
    outbound WebSocket task. The client loop reconnects automatically if the link
    drops. Presets are not pushed automatically; use Send Presets / profile apply.
    """
    n = 0
    try:
        dev = Device()
        for mac_key, doc in list(dev.items()):
            if not isinstance(doc, dict):
                continue
            if doc.get("transport") != "wifi":
                continue
            ip = _ipv4_address(str(doc.get("address") or ""))
            if not ip:
                continue
            tcp_client_registry.ensure_driver_connection(ip)
            n += 1
    except Exception as e:
        print(f"[startup] Wi-Fi driver connection prime failed: {e!r}")
        traceback.print_exception(type(e), e, e.__traceback__)
        return
    if n:
        print(f"[startup] primed outbound WebSocket for {n} Wi-Fi driver(s)")
def _ipv4_address(addr: str) -> str | None:
    """Return dotted IPv4 string or None (hostnames skipped for UDP nudge)."""
    s = (addr or "").strip()
    if not s:
        return None
    parts = s.split(".")
    if len(parts) != 4:
        return None
    try:
        nums = [int(p) for p in parts]
    except ValueError:
        return None
    if not all(0 <= n <= 255 for n in nums):
        return None
    return s
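The dotted-quad check in `_ipv4_address` can be exercised on its own. The sketch below is a standalone mirror of the function body (a copy for illustration, not an import from the project):

```python
def ipv4_or_none(addr):
    """Mirror of _ipv4_address: accept only a dotted quad with octets 0-255."""
    s = (addr or "").strip()
    if not s:
        return None
    parts = s.split(".")
    if len(parts) != 4:
        return None
    try:
        nums = [int(p) for p in parts]
    except ValueError:
        return None
    if not all(0 <= n <= 255 for n in nums):
        return None
    return s

print(ipv4_or_none("192.168.1.20"))    # dotted quad passes through unchanged
print(ipv4_or_none("led-driver.lan"))  # hostnames are skipped: None
print(ipv4_or_none("192.168.1.300"))   # octet out of range: None
```

Hostnames returning None is deliberate: the UDP hello nudge only targets literal IPv4 addresses.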
async def _periodic_wifi_driver_hello_loop(settings, udp_holder) -> None:
    """
    While a registered Wi-Fi driver has no outbound WebSocket, send a short JSON hello on
    the UDP discovery port so the device can announce itself and we can reconnect.
    """
    try:
        interval = float(settings.get("wifi_driver_hello_interval_s", 10.0))
    except (TypeError, ValueError):
        interval = 10.0
    if interval <= 0:
        return

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setblocking(False)
    loop = asyncio.get_running_loop()
    try:
        while True:
            await asyncio.sleep(interval)
            if udp_holder.get("closing"):
                break
            try:
                dev = Device()
            except Exception as e:
                print(f"[hello] device list failed: {e!r}")
                continue
            for _mac_key, doc in list(dev.items()):
                if not isinstance(doc, dict):
                    continue
                if doc.get("transport") != "wifi":
                    continue
                ip = _ipv4_address(str(doc.get("address") or ""))
                if not ip:
                    continue
                if tcp_client_registry.tcp_client_connected(ip):
                    continue
                name = (doc.get("name") or "").strip()
                mac = normalize_mac(doc.get("id") or _mac_key)
                if not name or not mac:
                    continue
                line = (
                    json.dumps(
                        {"m": "hello", "device_name": name, "mac": mac},
                        separators=(",", ":"),
                    )
                    + "\n"
                )
                try:
                    await loop.sock_sendto(
                        sock, line.encode("utf-8"), (ip, DISCOVERY_UDP_PORT)
                    )
                except OSError as e:
                    print(f"[hello] UDP to {ip!r} failed: {e!r}")
    finally:
        try:
            sock.close()
        except OSError:
            pass
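The hello line is deliberately compact: `json.dumps` with `separators=(",", ":")` drops the padding the default encoder inserts after `,` and `:`, and a trailing newline delimits the payload. For example (the device name "porch" is a made-up value):

```python
import json

# Compact separators drop the default ", " / ": " padding from the encoded line.
line = json.dumps(
    {"m": "hello", "device_name": "porch", "mac": "aabbccddeeff"},
    separators=(",", ":"),
) + "\n"
print(line, end="")
```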
async def _run_udp_discovery_server(udp_holder=None) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setblocking(False)
    try:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    except (AttributeError, OSError):
        pass
    try:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    except (AttributeError, OSError):
        pass
    sock.bind(("0.0.0.0", DISCOVERY_UDP_PORT))
    if udp_holder is not None:
        udp_holder["sock"] = sock
    print(f"UDP discovery listening on 0.0.0.0:{DISCOVERY_UDP_PORT}")
    try:
        await _handle_udp_discovery(sock, udp_holder)
    finally:
        if udp_holder is not None:
            udp_holder.pop("sock", None)
        try:
            sock.close()
        except Exception:
            pass
async def _send_bridge_wifi_channel(settings, sender):
@@ -229,17 +243,6 @@ async def _send_bridge_wifi_channel(settings, sender):
        print(f"[startup] bridge channel message failed: {e}")


async def _run_tcp_server(settings):
    if not settings.get("tcp_enabled", True):
        print("TCP server disabled (tcp_enabled=false)")
        return
    port = int(settings.get("tcp_port", 8765))
    server = await asyncio.start_server(_handle_tcp_client, "0.0.0.0", port)
    print(f"TCP server listening on 0.0.0.0:{port}")
    async with server:
        await server.serve_forever()
async def main(port=80):
    settings = Settings()
    print(settings)
@@ -250,6 +253,29 @@ async def main(port=80):
    set_sender(sender)

    app = Microdot()
    audio_detector = AudioBeatDetector()
    try:
        from util import audio_detector as audio_detector_module

        audio_detector_module.set_shared_beat_detector(audio_detector)
    except Exception as e:
        print(f"[startup] audio detector shared registration skipped: {e!r}")
    try:
        from util.audio_run_persist import coerce_audio_device, read_audio_run_state

        persisted = read_audio_run_state()
        if persisted.get("enabled"):
            dev = coerce_audio_device(persisted.get("device"))
            audio_detector.start(device=dev)
            print("[startup] audio beat detector started from saved run state")
    except Exception as e:
        print(f"[startup] audio auto-start skipped: {e!r}")
    from util import beat_driver_route

    beat_driver_route.set_beat_route_main_loop(asyncio.get_running_loop())
    from util import sequence_playback as seq_pb

    seq_pb.ensure_beat_consumer_started()

    # Initialize sessions with a secret key from settings
    secret_key = settings.get('session_secret_key', 'led-controller-secret-key-change-in-production')
@@ -278,26 +304,130 @@ async def main(port=80):
    app.mount(pattern.controller, '/patterns')
    app.mount(settings_controller.controller, '/settings')
    app.mount(device_controller.controller, '/devices')
    app.mount(led_tool_controller.controller, '/led-tool')

    tcp_client_registry.set_settings(settings)
    tcp_client_registry.set_tcp_status_broadcaster(broadcast_device_tcp_status)

    live_reload = _live_reload_enabled()
    dev_build_id = secrets.token_hex(12) if live_reload else None
    if live_reload:
        print(
            "[dev] LED_CONTROLLER_LIVE_RELOAD: browser refreshes when the server process restarts"
        )

    if dev_build_id:

        @app.route("/__dev/build-id")
        def dev_build_id_route(request):
            _ = request
            return (
                dev_build_id,
                200,
                {
                    "Content-Type": "text/plain; charset=utf-8",
                    "Cache-Control": "no-store",
                },
            )

    # Serve index.html at root (cwd is src/ when run via pipenv run run)
    @app.route('/')
    @app.route("/")
    def index(request):
        """Serve the main web UI."""
        return send_file('templates/index.html')

    # Serve settings page
    @app.route('/settings')
    def settings_page(request):
        """Serve the settings page."""
        return send_file('templates/settings.html')

        if dev_build_id:
            try:
                with open("templates/index.html", encoding="utf-8") as f:
                    html = f.read()
                tag = '<script src="/static/dev-live-reload.js" defer></script>'
                if "</body>" in html:
                    html = html.replace("</body>", tag + "\n</body>", 1)
                return html, 200, {"Content-Type": "text/html; charset=utf-8"}
            except OSError:
                pass
        return send_file("templates/index.html")

    # Favicon: avoid 404 in browser console (no file needed)
    @app.route('/favicon.ico')
    def favicon(request):
        return '', 204

    @app.route('/api/audio/devices')
    async def audio_devices(request):
        _ = request
        try:
            return {
                "devices": audio_detector.list_input_devices(),
                "diagnostics": audio_detector.diagnostics(),
            }
        except Exception as e:
            return {"error": str(e)}, 500

    @app.route('/api/audio/start', methods=['POST'])
    async def audio_start(request):
        payload = request.json if isinstance(request.json, dict) else {}
        device = payload.get("device", None)
        if device in ("", None):
            device = None
        else:
            try:
                device = int(device)
            except (TypeError, ValueError):
                pass
        try:
            audio_detector.start(device=device)
            from util.audio_run_persist import write_audio_run_state

            write_audio_run_state(enabled=True, device=device)
            return {"ok": True, "status": audio_detector.status()}
        except Exception as e:
            return {"ok": False, "error": str(e)}, 500

    @app.route('/api/audio/stop', methods=['POST'])
    async def audio_stop(request):
        _ = request
        audio_detector.stop()
        from util.audio_run_persist import write_audio_run_state

        write_audio_run_state(enabled=False)
        return {"ok": True, "status": audio_detector.status()}

    @app.route('/api/audio/status')
    async def audio_status(request):
        _ = request
        from util import beat_driver_route
        from util import sequence_playback

        st = audio_detector.status()
        st["sequence"] = sequence_playback.playback_status()
        st["manual_beat_stride"] = beat_driver_route.manual_beat_stride_status()
        seq = st.get("sequence")
        beat_readout = ""
        if isinstance(seq, dict) and str(seq.get("beat_readout") or "").strip():
            beat_readout = str(seq.get("beat_readout") or "").strip()
        elif st.get("running"):
            mb = st.get("manual_beat_stride")
            if isinstance(mb, dict) and mb.get("active"):
                try:
                    n = int(mb.get("stride_n") or 1)
                except (TypeError, ValueError):
                    n = 1
                n = max(1, min(64, n))
                try:
                    bi = int(mb.get("beat_in_stride") or 1)
                except (TypeError, ValueError):
                    bi = 1
                pos = min(n, max(1, bi))
                beat_readout = f"{pos}/{n}"
            else:
                try:
                    bs = int(st.get("beat_seq") or 0)
                except (TypeError, ValueError):
                    bs = 0
                if bs > 0:
                    beat_readout = str(bs)
        st["beat_readout"] = beat_readout
        return {"status": st}
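The stride readout in `audio_status` clamps `stride_n` to 1..64 and pins the beat position inside the stride before formatting `pos/n`. That arithmetic in isolation (`stride_readout` is a hypothetical helper name, not a function in the codebase):

```python
def stride_readout(stride_n, beat_in_stride):
    """Clamp the stride to 1..64 and the position to 1..stride, then format "pos/n"."""
    try:
        n = int(stride_n or 1)
    except (TypeError, ValueError):
        n = 1
    n = max(1, min(64, n))
    try:
        bi = int(beat_in_stride or 1)
    except (TypeError, ValueError):
        bi = 1
    pos = min(n, max(1, bi))
    return f"{pos}/{n}"

print(stride_readout(4, 2))     # 2/4
print(stride_readout(999, 70))  # both values clamped: 64/64
print(stride_readout(None, 0))  # falsy inputs fall back to 1: 1/1
```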
    # Static file route
    @app.route("/static/<path:path>")
    def static_handler(request, path):
@@ -345,28 +475,77 @@ async def main(port=80):


    # Touch Device singleton early so db/device.json exists before first TCP hello.
    # Touch Device singleton early so db/device.json exists before first UDP hello.
    Device()
    await _send_bridge_wifi_channel(settings, sender)
    _prime_wifi_outbound_driver_connections()

    # Await HTTP + driver TCP together so bind failures (e.g. port 80 in use) surface
    # here instead of as an unretrieved Task exception; the UI WebSocket drops if HTTP
    # never starts, which clears Wi-Fi presence dots.
    udp_holder = {"closing": False}
    loop = asyncio.get_running_loop()

    def _graceful_shutdown(*_args):
        print("[server] shutting down...")
        udp_holder["closing"] = True
        try:
            audio_detector.stop()
        except Exception:
            pass
        u = udp_holder.get("sock")
        if u is not None:
            try:
                u.close()
            except OSError:
                pass
        tcp_client_registry.cancel_all_driver_tasks()
        if getattr(app, "server", None) is not None:
            app.shutdown()

    shutdown_handlers_registered = False
    try:
        await asyncio.gather(
            app.start_server(host="0.0.0.0", port=port),
            _run_tcp_server(settings),
        )
    except OSError as e:
        if e.errno == errno.EADDRINUSE:
            tcp_p = int(settings.get("tcp_port", 8765))
            print(
                f"[server] bind failed (address already in use): {e!s}\n"
                f"[server] HTTP is configured for port {port} (env PORT); "
                f"Wi-Fi LED drivers use tcp_port {tcp_p}. "
                f"Stop the other process or use a free port, e.g. PORT=8080 pipenv run run"
    try:
        for sig in (signal.SIGINT, signal.SIGTERM):
            loop.add_signal_handler(sig, _graceful_shutdown)
        shutdown_handlers_registered = True
    except (NotImplementedError, RuntimeError):
        pass

    # Await HTTP + UDP discovery; bind failures (e.g. port 80 in use) surface here.
    try:
        await asyncio.gather(
            app.start_server(host="0.0.0.0", port=port),
            _run_udp_discovery_server(udp_holder),
            _periodic_wifi_driver_hello_loop(settings, udp_holder),
        )
        raise
    except OSError as e:
        if e.errno == errno.EADDRINUSE:
            print(
                f"[server] bind failed (address already in use): {e!s}\n"
                f"[server] HTTP is configured for port {port} (env PORT). "
                f"Stop the other process or use a free port, e.g. PORT=8080 pipenv run run"
            )
            raise
    finally:
        try:
            audio_detector.stop()
        except Exception:
            pass
        srv = getattr(app, "server", None)
        if srv is not None:
            try:
                srv.close()
                await srv.wait_closed()
            except Exception:
                pass
        try:
            app.server = None
        except Exception:
            pass
        if shutdown_handlers_registered:
            for sig in (signal.SIGINT, signal.SIGTERM):
                try:
                    loop.remove_signal_handler(sig)
                except (NotImplementedError, OSError, ValueError):
                    pass


if __name__ == "__main__":
    import os
@@ -38,6 +38,29 @@ def normalize_mac(mac):
    return None


def resolve_device_mac_for_select_routing(devices, name_key):
    """
    Map a v1 ``select`` map key to a device storage id (MAC).

    Matches the registry **name**, or ``led-<12hex>`` as a MAC hint (the default driver
    name form) so routing still works after the device is renamed in the registry.
    """
    k = str(name_key or "").strip()
    if not k:
        return None
    for did in devices.list():
        doc = devices.read(did) or {}
        if str(doc.get("name") or "").strip() == k:
            m = normalize_mac(did)
            if m:
                return m
    if k.startswith("led-"):
        m = normalize_mac(k[4:])
        if m and devices.read(m):
            return m
    return None
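The `led-<12hex>` fallback means a key like `led-aabbccddeeff` still routes after a registry rename, as long as that MAC exists. A standalone sketch of the lookup order (a plain `registry` dict stands in for the devices model; names and MACs here are made up):

```python
def resolve_mac(registry, name_key):
    """Match by registry name first; then treat "led-<12hex>" as a MAC hint."""
    k = str(name_key or "").strip()
    if not k:
        return None
    for mac, doc in registry.items():
        if doc.get("name", "").strip() == k:
            return mac
    if k.startswith("led-"):
        cand = k[4:].lower()
        if len(cand) == 12 and all(c in "0123456789abcdef" for c in cand) and cand in registry:
            return cand
    return None

registry = {"aabbccddeeff": {"name": "porch strip"}}
print(resolve_mac(registry, "porch strip"))       # matched by current name
print(resolve_mac(registry, "led-aabbccddeeff"))  # matched by MAC hint after a rename
print(resolve_mac(registry, "led-000000000000"))  # MAC not in registry: None
```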
def derive_device_mac(mac=None, address=None, transport="espnow"):
    """
    Resolve the device MAC used as storage id.
@@ -237,16 +260,19 @@ class Device(Model):
        """
        Register or update a Wi-Fi client by **MAC** (storage id). Updates **name**,
        **address** (peer IP), and optionally **type** from the client hello when valid.

        Returns ``(mac_hex | None, persisted)`` where **persisted** is True iff ``save()``
        ran (new row or field changes). Duplicate hellos with identical data are no-ops.
        """
        mac_hex = normalize_mac(mac)
        if not mac_hex:
            return None
            return None, False
        name = (device_name or "").strip()
        if not name:
            return None
            return None, False
        ip = normalize_address_for_transport(peer_ip, "wifi")
        if not ip:
            return None
            return None, False
        resolved_type = None
        if device_type is not None:
            try:
@@ -254,7 +280,8 @@ class Device(Model):
            except ValueError:
                resolved_type = None
        if mac_hex in self:
            merged = dict(self[mac_hex])
            prev = self[mac_hex]
            merged = dict(prev)
            merged["name"] = name
            if resolved_type is not None:
                merged["type"] = resolved_type
@@ -263,9 +290,11 @@ class Device(Model):
            merged["transport"] = "wifi"
            merged["address"] = ip
            merged["id"] = mac_hex
            if merged == prev:
                return mac_hex, False
            self[mac_hex] = merged
            self.save()
            return mac_hex
            return mac_hex, True
        self[mac_hex] = {
            "id": mac_hex,
            "name": name,
@@ -276,4 +305,4 @@ class Device(Model):
            "zones": [],
        }
        self.save()
        return mac_hex
        return mac_hex, True
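The `(id, persisted)` return shape lets callers skip logging and broadcasts for duplicate hellos: the write happens only when the merged record actually differs from the stored one. A minimal sketch of that pattern over a plain dict (hypothetical `upsert` helper, made-up field values):

```python
def upsert(store, key, fields):
    """Merge fields into store[key]; return (key, persisted), where persisted
    is True only when something changed and was written."""
    prev = store.get(key)
    merged = dict(prev) if prev else {}
    merged.update(fields)
    if merged == prev:
        return key, False  # duplicate hello: no write, nothing to announce
    store[key] = merged
    return key, True

store = {}
print(upsert(store, "aabbccddeeff", {"name": "porch", "address": "192.168.1.20"}))  # ('aabbccddeeff', True)
print(upsert(store, "aabbccddeeff", {"name": "porch", "address": "192.168.1.20"}))  # ('aabbccddeeff', False)
```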
@@ -1,14 +1,71 @@
from models.model import Model


class Group(Model):
    """Device groups (members + optional Wi-Fi driver defaults); also pattern fields for sequences.

    Omit ``profile_id`` (or set it null) for a **shared** group: every profile can attach it to
    zones and sequences. Set ``profile_id`` to a profile id to show the group only when that
    profile is active (still one global record in ``group.json``).
    """

    def __init__(self):
        super().__init__()

    def load(self):
        super().load()
        changed = False
        for gid, doc in list(self.items()):
            if not isinstance(doc, dict):
                continue
            if self._migrate_record(doc):
                changed = True
        if changed:
            self.save()

    def _migrate_record(self, doc):
        changed = False
        raw_dev = doc.get("devices")
        if raw_dev is None:
            doc["devices"] = []
            changed = True
        elif isinstance(raw_dev, list):
            norm = []
            for x in raw_dev:
                if x is None:
                    continue
                s = str(x).strip().lower().replace(":", "").replace("-", "")
                if len(s) == 12 and all(c in "0123456789abcdef" for c in s):
                    norm.append(s)
                else:
                    norm.append(str(x).strip())
            if norm != raw_dev:
                doc["devices"] = norm
                changed = True
        for key in (
            "wifi_driver_display_name",
            "wifi_driver_num_leds",
            "wifi_color_order",
            "wifi_startup_mode",
        ):
            if key not in doc:
                doc[key] = None
                changed = True
        if "output_brightness" not in doc:
            doc["output_brightness"] = 255
            changed = True
        return changed

    def create(self, name=""):
        next_id = self.get_next_id()
        self[next_id] = {
            "name": name,
            "devices": [],
            "wifi_driver_display_name": None,
            "wifi_driver_num_leds": None,
            "wifi_color_order": None,
            "wifi_startup_mode": None,
            "output_brightness": 255,
            "pattern": "on",
            "colors": ["000000", "FF0000"],
            "brightness": 100,
@@ -22,7 +79,7 @@ class Group(Model):
            "n5": 0,
            "n6": 0,
            "n7": 0,
            "n8": 0
            "n8": 0,
        }
        self.save()
        return next_id
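The member-list migration in `_migrate_record` canonicalises MAC-shaped entries (strip `:` and `-`, lowercase, require exactly 12 hex digits) while leaving anything else verbatim. That per-entry rule in isolation (`canon_member` is a hypothetical name for the loop body):

```python
def canon_member(x):
    """Lowercase and strip ":" / "-"; keep the value as-is if the result is not 12 hex digits."""
    s = str(x).strip().lower().replace(":", "").replace("-", "")
    if len(s) == 12 and all(c in "0123456789abcdef" for c in s):
        return s
    return str(x).strip()

print(canon_member("AA:BB:CC:DD:EE:FF"))  # MAC-shaped: canonicalised to aabbccddeeff
print(canon_member("kitchen-group"))      # 12 chars after stripping, but not hex: kept verbatim
```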
src/models/http_driver.py (new file, 125 lines)
@@ -0,0 +1,125 @@
"""Wi-Fi LED drivers over HTTP long-poll (same port as the web UI).

Drivers POST /driver/v1/poll; the controller responds with queued JSON lines.
Presence: a last poll within DRIVER_HTTP_SEEN_S counts as connected.
"""

import asyncio
import time

from models.wifi_peer import normalize_wifi_peer_ip

# Must exceed the max ``wait_s`` (60) on /driver/v1/poll so sessions are not pruned mid-wait.
DRIVER_HTTP_SEEN_S = 90.0
_QUEUE_MAX = 64

_queues: dict[str, asyncio.Queue] = {}
_last_poll: dict[str, float] = {}
_connected_flag: set[str] = set()
_status_broadcast = None


def set_wifi_driver_status_broadcaster(coro) -> None:
    global _status_broadcast
    _status_broadcast = coro


def _schedule_status(ip: str, connected: bool) -> None:
    fn = _status_broadcast
    if not fn:
        return
    try:
        loop = asyncio.get_running_loop()
    except RuntimeError:
        return
    try:
        loop.create_task(fn(ip, connected))
    except Exception:
        pass


def _get_queue(ip: str) -> asyncio.Queue:
    q = _queues.get(ip)
    if q is None:
        q = asyncio.Queue(maxsize=_QUEUE_MAX)
        _queues[ip] = q
    return q


def prune_stale_http_sessions() -> None:
    """Drop timed-out sessions, clear queues, broadcast disconnect."""
    now = time.monotonic()
    for ip in list(_last_poll.keys()):
        if now - _last_poll[ip] <= DRIVER_HTTP_SEEN_S:
            continue
        _last_poll.pop(ip, None)
        _queues.pop(ip, None)
        if ip in _connected_flag:
            _connected_flag.discard(ip)
            _schedule_status(ip, False)
        print(f"[HTTP driver] session timed out: {ip}")


def touch_http_session(ip: str) -> None:
    ip = normalize_wifi_peer_ip(ip)
    if not ip:
        return
    prune_stale_http_sessions()
    now = time.monotonic()
    _last_poll[ip] = now
    if ip not in _connected_flag:
        _connected_flag.add(ip)
        _schedule_status(ip, True)


def wifi_driver_connected(ip: str) -> bool:
    prune_stale_http_sessions()
    key = normalize_wifi_peer_ip(ip)
    return bool(key and key in _connected_flag)


def list_connected_driver_ips():
    prune_stale_http_sessions()
    return list(_connected_flag)


async def enqueue_json_line(ip: str, json_str: str) -> bool:
    ip = normalize_wifi_peer_ip(ip)
    if not ip:
        return False
    line = json_str[:-1] if json_str.endswith("\n") else json_str
    q = _get_queue(ip)
    while True:
        try:
            q.put_nowait(line)
            return True
        except asyncio.QueueFull:
            try:
                q.get_nowait()
            except asyncio.QueueEmpty:
                pass


async def send_json_line_to_ip(ip: str, json_str: str) -> bool:
    """Queue one JSON line for the driver to receive on the next long-poll."""
    return await enqueue_json_line(ip, json_str)


async def collect_lines_after_touch(ip: str, wait_s: float) -> list[str]:
    """Wait up to wait_s for the first line, then drain the rest (non-blocking)."""
    ip = normalize_wifi_peer_ip(ip)
    if not ip:
        return []
    q = _get_queue(ip)
    lines: list[str] = []
    try:
        first = await asyncio.wait_for(q.get(), timeout=wait_s)
        lines.append(first)
        while True:
            try:
                lines.append(q.get_nowait())
            except asyncio.QueueEmpty:
                break
    except asyncio.TimeoutError:
        pass
    return lines
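The bounded per-driver queue in `enqueue_json_line` drops the oldest line when full, so a driver that stalls and reconnects drains the newest commands rather than a stale backlog. A minimal standalone sketch of that drop-oldest put plus the drain step:

```python
import asyncio

def put_drop_oldest(q, item):
    # On overflow, evict the oldest entry and retry so the newest item always lands.
    while True:
        try:
            q.put_nowait(item)
            return
        except asyncio.QueueFull:
            try:
                q.get_nowait()
            except asyncio.QueueEmpty:
                pass

async def demo():
    q = asyncio.Queue(maxsize=2)
    for line in ('{"n":1}', '{"n":2}', '{"n":3}'):
        put_drop_oldest(q, line)
    # The first line was evicted to make room; drain what is left.
    drained = []
    while True:
        try:
            drained.append(q.get_nowait())
        except asyncio.QueueEmpty:
            break
    return drained

print(asyncio.run(demo()))  # ['{"n":2}', '{"n":3}']
```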
@@ -15,6 +15,9 @@ class Preset(Model):
            if default_profile_id is not None:
                preset_data["profile_id"] = str(default_profile_id)
                changed = True
            if isinstance(preset_data, dict) and "group_ids" in preset_data:
                preset_data.pop("group_ids", None)
                changed = True
            if changed:
                self.save()
        except Exception:
@@ -26,6 +29,7 @@ class Preset(Model):
            "name": "",
            "pattern": "",
            "colors": [],
            "background": "#000000",
            "brightness": 0,
            "delay": 0,
            "n1": 0,
@@ -36,6 +40,7 @@ class Preset(Model):
            "n6": 0,
            "n7": 0,
            "n8": 0,
            "manual_beat_n": 1,
            "profile_id": str(profile_id) if profile_id is not None else None,
        }
        self.save()