Second heatup PJ probe with tightened X_entry (T_c width 6K vs
baseline 14K) gives:
T=60s: 5710 sets in 101s — T_c envelope [281.05, 291.0] ✅
T=300s: 12932 sets in 206s — T_c envelope [281.05, 291.0] ✅
T_c envelope STABLE (identical at 60s and 300s) — the tube reached
steady shape and stopped growing. Low-T_avg trip (280) cleared at
lower bound 281.05, ~1K margin.
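The stability-and-margin check above can be sketched in a few lines. This is an illustrative Python sketch only (the actual run used Julia TMJets flowpipes); the envelope and trip values are the ones reported above:

```python
# Hypothetical sketch: check that the T_c envelope is fixed-point stable
# across horizons and clears the low-T_avg trip with margin.
TRIP_LOW = 280.0                 # low-T_avg trip threshold (from the log)

env_60s  = (281.05, 291.0)       # T_c envelope at T = 60 s
env_300s = (281.05, 291.0)       # T_c envelope at T = 300 s

# Stable tube: extending the horizon 5x did not widen the envelope.
assert env_60s == env_300s

lo, hi = env_300s
margin = lo - TRIP_LOW           # ~1 K clearance above the trip
assert margin > 0
print(f"trip cleared with {margin:.2f} K margin")
```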
**First sound nonlinear reach-avoid proof for any mode of this plant:**
for the tightened entry and T = 300s, every inv1_holds halfspace
holds along the tube. Sound w.r.t. the PJ dynamics (≤ 0.1% error vs
the full-state model).
The baseline wider-entry run was loose on T_c low bound (272.4),
confirming that the looseness was entry-box-width driven (14K too
wide for TMJets + orderQ=2) rather than intrinsic to the method.
Entry splitting / refinement is the path to the full baseline set.
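Entry splitting along one dimension is just equal bisection of the entry interval, one reach run per piece, envelopes unioned afterward. A sketch (entry bounds here are illustrative, not the actual baseline box):

```python
def split_interval(lo, hi, pieces):
    """Split an entry interval into equal sub-intervals,
    one per-piece TMJets run each; certified envelope = union."""
    w = (hi - lo) / pieces
    return [(lo + i * w, lo + (i + 1) * w) for i in range(pieces)]

# Baseline 14 K entry width split so each piece is narrower than the
# 6 K width that worked (bounds are illustrative).
pieces = split_interval(278.0, 292.0, 3)
assert all(abs((b - a) - 14.0 / 3) < 1e-9 for a, b in pieces)
```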
Also: LaTeX preamble now has the unicode-to-math literate map
attached to the listing STYLES themselves (not just global \lstset),
so terminal-output listings pasted from Julia with Δ, ≥, °, ✅ etc.
render correctly. Journal 34 pages, clean build.
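For reference, the mechanism is the listings package's `literate` option attached at style level, so any listing using the style inherits the glyph map. The style name and glyph list below are illustrative, not the actual preamble:

```latex
% Sketch: literate map attached to a listing style (names illustrative).
% \checkmark needs amssymb in the preamble.
\lstdefinestyle{terminal}{%
  basicstyle=\ttfamily\small,
  literate={Δ}{{$\Delta$}}1
           {≥}{{$\geq$}}1
           {°}{{$^{\circ}$}}1
           {✅}{{\checkmark}}1,
}
```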
OVERNIGHT_NOTES.md updated with tight-entry win.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
New journal/ directory: LaTeX-based, dated entries, callout boxes for
derivations / decisions / dead ends / limitations, plus an \apass{}
macro for inline markers where a later deep pass is needed.
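A minimal \apass{} definition consistent with that description might look like this (the actual macro isn't shown in the log; this is a sketch):

```latex
% Sketch: margin marker flagging a spot that needs a later deep pass.
\newcommand{\apass}[1]{%
  \marginpar{\footnotesize\textbf{A-pass:} #1}%
}
```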
Retroactive A-style entries for 2026-04-17 (controllers, linearization,
LQR, operation-mode linear reach, Lyapunov barrier) and 2026-04-20
(predicates restructure into deadbands+safety+invariants, OL-vs-CL
barrier analysis, mode-obligation taxonomy, heatup-rate-as-halfspace,
mode_boundaries, first Julia nonlinear reach attempt).
Both entries include derivations written out in math, dead ends I
hit, code snippets with commentary, figure embeds, and terminal
output where it changed what we did next. The goal is invention-log
depth — readable four years from now without the git history for
context.
journal/README.md documents the conventions. journal.tex aggregates
all entries into one PDF via latexmk.
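The aggregation pattern is plain \input of the dated entry files, built with `latexmk -pdf journal.tex` (filenames and document class here are illustrative):

```latex
% journal.tex sketch: one PDF from all dated entries.
\documentclass{article}
\begin{document}
\input{entries/2026-04-17}
\input{entries/2026-04-20}
\end{document}
```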
Kept claude_memory/ separate as per earlier agreement — those are
short AI-context notes, different audience.