improved the logtool, but not quite there yet?

This commit is contained in:
Thaddeus Hughes
2026-03-30 11:39:04 -05:00
parent 9eb283420a
commit 837ec18fad
24 changed files with 55223 additions and 57 deletions

BIN
logtool/16MAR2026_2223.bin Normal file

Binary file not shown.


@@ -0,0 +1,2 @@
Reading storage-16MAR2026-1728.bin ...
Parsed 52 entries

BIN
logtool/16MAR2026_2225.bin Normal file

Binary file not shown.


@@ -0,0 +1,58 @@
Reading storage-16MAR2026-1728.bin ...
Parsed 52 entries
Time State Bat(V) Drive(A) Jack(A) Aux(A) Counter Stable Raw DrHeat JkHeat AxHeat
----------------------- -------------------- ------------------- --------------------------------------- ----------------------------------- ------------------------- ------- ----------------- ---------------- ------------------------------------ -------------------------------------- ------
UNK(0xd4) — — — — — — — — — —
1970-01-01 00:00:15.894 IDLE -34926674051072.000 0.00 18141941858304.00 -0.00 15947 DRIVE+AUX2 - 0.0 0.0 0.0
UNK(0xc0) — — — — — — — — — —
UNK(0x3d) — — — — — — — — — —
UNK(0x3c) — — — — — — — — — —
UNK(0x40) — — — — — — — — — —
2458965396544290816 IDLE -0.000 0.00 -0.00 0.00 -13883 SAFETY+DRIVE+AUX2 SAFETY+JACK+AUX2 3.8 0.0 0.0
UNK(0x40) — — — — — — — — — —
UNK(0x9c) — — — — — — — — — —
PARSE_ERR 0.000 0.00 0.00 0.00 0 - - 0.0 0.0 0.0
UNK(0x30) — — — — — — — — — —
72339069014654230 IDLE -0.000 0.00 -0.00 0.00 0 - - 0.0 0.0 0.0
UNK(0x20) — — — — — — — — — —
15564440319108055738 IDLE 0.000 0.00 0.00 0.00 0 - - 0.0 0.0 0.0
UNK(0x91) — — — — — — — — — —
UNK(0x20) — — — — — — — — — —
UNK(0xcd) — — — — — — — — — —
UNK(0x40) — — — — — — — — — —
72339069014687196 IDLE 3.486 0.00 0.00 0.00 16147 SAFETY+AUX2 DRIVE+AUX2 0.0 0.0 0.0
PARSE_ERR 0.000 0.00 0.00 0.00 0 - - 0.0 0.0 0.0
UNK(0xea) — — — — — — — — — —
UNK(0x11) — — — — — — — — — —
UNK(0x41) — — — — — — — — — —
UNK(0xae) — — — — — — — — — —
UNK(0x40) — — — — — — — — — —
PARSE_ERR 0.000 0.00 0.00 0.00 0 - - 0.0 0.0 0.0
UNK(0x20) — — — — — — — — — —
11096869488759107671 CALIBRATE_JACK_DELAY 0.000 55944611772684566528.00 -0.00 0.00 0 - - 0.0 67650244627660800.0 0.0
UNK(0x80) — — — — — — — — — —
281474977709058 JACK_UP_START 0.000 0.00 -0.00 -295735836036957208576.00 15488 SAFETY+DRIVE SAFETY -0.0 0.0 0.0
UNK(0xd3) — — — — — — — — — —
6151020166072509758 MOVE_START_DELAY 0.000 0.00 11751184508542699927613527293952.00 0.00 -27648 AUX2 AUX2 184761406165897707520.0 912500379009562710927105661457137664.0 96.1
UNK(0xb8) — — — — — — — — — —
1972-01-28 13:05:11.952 IDLE 0.000 68671368807317504.00 0.00 12.83 -26752 JACK+AUX2 SAFETY+DRIVE -0.0 124029976523403327445139456.0 0.0
4430992851352526 IDLE 0.000 -76302429880697741276410658934489088.00 0.00 0.00 16718 - DRIVE+AUX2 1923670617821681863848652481495040.0 7180627444374621246193664.0 0.0
UNK(0x40) — — — — — — — — — —
PARSE_ERR 0.000 0.00 0.00 0.00 0 - - 0.0 0.0 0.0
UNK(0x11) — — — — — — — — — —
UNK(0x18) — — — — — — — — — —
UNK(0x40) — — — — — — — — — —
PARSE_ERR 0.000 0.00 0.00 0.00 0 - - 0.0 0.0 0.0
UNK(0xd4) — — — — — — — — — —
UNK(0x40) — — — — — — — — — —
UNK(0x35) — — — — — — — — — —
UNK(0x60) — — — — — — — — — —
UNK(0x5b) — — — — — — — — — —
UNK(0x7f) — — — — — — — — — —
UNK(0x0f) — — — — — — — — — —
UNK(0x40) — — — — — — — — — —
UNK(0x3e) — — — — — — — — — —
UNK(0x40) — — — — — — — — — —
UNK(0xff) — — — — — — — — — —
Entries : 52 total (15 FSM, 0 BAT, 0 CRASH, 0 BOOT, 0 TIME_SET)

BIN
logtool/17MAR2026_0815.bin Normal file

Binary file not shown.


@@ -0,0 +1,3 @@
Reading storage-16MAR2026-1728.bin ...
Log offsets: tail=16384 head=4329920
Parsed 54831 entries

BIN
logtool/17MAR2026_0816.bin Normal file

Binary file not shown.

54841
logtool/17MAR2026_0816.txt Normal file

File diff suppressed because it is too large.

BIN
logtool/17MAR2026_0819.bin Normal file

Binary file not shown.


@@ -0,0 +1,3 @@
Reading storage-16MAR2026-1728.bin ...
Log offsets: tail=16384 head=4329920
Parsed 54831 entries

BIN
logtool/17MAR2026_0821.bin Normal file

Binary file not shown.


@@ -0,0 +1,3 @@
Reading storage-16MAR2026-1728.bin ...
Log offsets: tail=16384 head=4329920
Parsed 54831 entries

BIN
logtool/17MAR2026_0859.bin Normal file

Binary file not shown.


@@ -0,0 +1,3 @@
Reading storage-16MAR2026-1728.bin ...
Log offsets: tail=16384 head=4329920
Parsed 70748 entries

29
logtool/debug-notes.md Normal file

@@ -0,0 +1,29 @@
# Logtool Debug Notes — 16 MAR 2026
## Problem
`storage-16MAR2026-1728.bin` not parsing correctly.
## Root Cause (SOLVED)
Two issues in `parser.py`:
### 1. Unrecognized file format
The file format is `[4B tail BE][4B head BE][raw log data]` — no JSON header.
- `head - tail == file_size - 8` (the raw log data is exactly the flash region from tail to head)
- The old HTTP format was: `[4B json_len BE][json][4B tail BE][4B head BE][raw log data]`
- The parser's autodetect only recognized the old HTTP format (and required json_len < 8192)
- **Fix:** Added detection for bare tail+head format in `autodetect_and_parse()`
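A minimal sketch of the bare-format check, assuming the size relation above (`looks_like_bare_tail_head` is a hypothetical helper name; the real detection lives in `autodetect_and_parse()`):

```python
import struct

def looks_like_bare_tail_head(blob: bytes) -> bool:
    """Bare format is [4B tail BE][4B head BE][raw log data]: the raw
    log region is exactly the flash span from tail to head, so
    head - tail must equal the file size minus the 8-byte header."""
    if len(blob) < 16:
        return False
    tail, head = struct.unpack_from('>II', blob, 0)
    return head > tail and (head - tail) == len(blob) - 8
```

Files in the old HTTP format fail this check (their first 4 bytes are a JSON length, not a flash offset), so the two detectors don't collide.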
### 2. Type-first vs type-last detection failure
The log entries use **type-first** format: `[len][type][payload]`
But `_try_detect_type_first()` returned False because:
- First entry had type=0x00 at both positions (ambiguous)
- Timestamp was near-zero (RTC not yet set), so timestamp sanity check failed
- Function gave up after only 1 entry (`break` at end of loop)
- **Fix:** Loop over multiple entries (up to 200), added voltage sanity check (0.5-60V)
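The widened per-entry sanity check can be sketched like this (`plausible_fsm_payload` is a hypothetical helper; the real code applies the same idea while sampling entries inside `_try_detect_type_first()`):

```python
import struct

def plausible_fsm_payload(payload: bytes) -> bool:
    """Accept a candidate FSM payload if either the timestamp falls in
    2020-2030 (ms since epoch) or the float at offset 8 looks like a
    sane battery voltage (0.5-60 V) -- so a near-zero RTC timestamp
    alone no longer makes detection give up."""
    if len(payload) < 12:
        return False
    ts_ms, = struct.unpack_from('<Q', payload, 0)
    if 1577836800000 < ts_ms < 1893456000000:  # 2020..2030 in ms
        return True
    bat_v, = struct.unpack_from('<f', payload, 8)
    return 0.5 < bat_v < 60.0
```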
## Verification
- 54,831 entries parsed successfully
- FSM states: IDLE (51169), JACK_UP (1078), DRIVE (994), etc.
- Voltages: 3.3V–13.3V (reasonable for 3S LiPo system)
- Timestamps: Jan 13 – Feb 6, 2026 (after RTC set)
- Old HTTP-format .bin files still parse correctly


@@ -114,7 +114,7 @@ def show_plots(entries: list, title: str = "SC-F001 Log"):
add_crash_lines(ax3)
ax3.grid(True, alpha=0.3)
ax3.xaxis.set_major_formatter(mdates.DateFormatter('%H:%M:%S'))
ax3.xaxis.set_major_formatter(mdates.AutoDateFormatter(ax3.xaxis.get_major_locator()))
fig.autofmt_xdate()
plt.tight_layout()
plt.show()
@@ -151,7 +151,7 @@ def live_plot(url: str, interval_s: float = 2.0):
}
axes[1].legend(fontsize=8, loc='upper right')
axes[3].legend(fontsize=8, loc='upper right')
axes[3].xaxis.set_major_formatter(mdates.DateFormatter('%H:%M:%S'))
axes[3].xaxis.set_major_formatter(mdates.AutoDateFormatter(axes[3].xaxis.get_major_locator()))
state = {'current_tail': 0, 'first': True}


@@ -108,10 +108,15 @@ def _ts_to_str(ts_ms: int) -> str:
def _unpack_fsm(payload: bytes, fsm_states: dict) -> dict:
if len(payload) < 39:
raise ValueError(f"FSM payload too short: {len(payload)} < 39")
ts_ms, bat_V, drive_A, jack_A, aux_A, counter, sensors, \
drive_heat, jack_heat, aux_heat = struct.unpack_from('<QffffhBfff', payload, 0)
if len(payload) < 27:
raise ValueError(f"FSM payload too short: {len(payload)} < 27")
ts_ms, bat_V, drive_A, jack_A, aux_A, counter, sensors = \
struct.unpack_from('<QffffhB', payload, 0)
drive_heat = jack_heat = aux_heat = 0.0
if len(payload) >= 31:
drive_heat, = struct.unpack_from('<f', payload, 27)
if len(payload) >= 39:
jack_heat, aux_heat = struct.unpack_from('<ff', payload, 31)
return {
'ts_ms': ts_ms,
'time_str': _ts_to_str(ts_ms),
@@ -187,10 +192,19 @@ def _unpack_time_set(payload: bytes) -> dict:
}
def parse_entries(data: bytes, fsm_states: dict = None) -> list:
def _is_valid_entry_type(t: int) -> bool:
return (0 <= t <= 12) or t in (LOG_TYPE_BAT, LOG_TYPE_CRASH, LOG_TYPE_BOOT, LOG_TYPE_TIME_SET)
def parse_entries(data: bytes, fsm_states: dict = None, type_first: bool = False) -> list:
"""
Parse a stream of raw binary log entries.
Returns list of dicts, each with 'entry_type' and type-specific fields.
Entry format depends on type_first:
False (current FW): [len u8][payload (len-1 bytes)][type u8]
True (old FW): [len u8][type u8][payload (len-1 bytes)]
In both cases total bytes consumed per entry = len + 1.
"""
if fsm_states is None:
fsm_states = _FALLBACK_FSM_STATES
@@ -202,25 +216,70 @@ def parse_entries(data: bytes, fsm_states: dict = None) -> list:
while i < n:
b = data[i]
# Erased flash or sector padding → done or skip sector
if b == 0xFF:
break
if b == 0x00:
# Sector padding: skip to next 4096-byte boundary
# Erased flash or sector padding → skip to next sector
if b == 0xFF or b == 0x00:
sector_size = 4096
next_sector = ((i // sector_size) + 1) * sector_size
i = next_sector
continue
# In type_first (old FW) format, sectors have a small zero-pad header
# that isn't full-sector padding. Only skip individual zero bytes.
if type_first and b == 0x00:
i += 1
continue
entry_len = b # stored len = payload_size + 1
payload_size = entry_len - 1
type_offset = i + 1 + payload_size # = i + entry_len
end_offset = i + entry_len # last byte of this entry's content
if type_offset >= n:
if end_offset >= n:
break # truncated
payload = data[i + 1 : i + 1 + payload_size]
entry_type = data[type_offset]
# Detect entry format: with type byte (total = len+1) or without (total = len).
# Check if data[end_offset] is the start of the next entry (no type byte)
# vs a type byte followed by the next entry at end_offset+1.
has_type_byte = True
if end_offset + 1 < n:
next_at_len = data[end_offset] # byte right after payload
next_at_len1 = data[end_offset + 1] # byte one further
# If the byte at end_offset looks like a valid next-entry len byte
# (matches current entry len or is another plausible len), and the
# byte at end_offset+1 does NOT, then there's no type byte.
next_ok = next_at_len not in (0x00, 0xFF) and next_at_len < 250
next1_ok = next_at_len1 not in (0x00, 0xFF) and next_at_len1 < 250
if next_ok and not _is_valid_entry_type(next_at_len):
# end_offset byte isn't a valid type, treat as next entry (no type)
has_type_byte = False
elif next_ok and next_at_len == entry_len and not next1_ok:
# Same len repeating at stride=len (not len+1) → no type byte
has_type_byte = False
if not has_type_byte:
# No type byte: [len][payload], total = len bytes, FSM type implied
payload = data[i + 1 : i + entry_len]
entry_type = 0 # default to IDLE / FSM
i = end_offset # advance by len (not len+1)
elif type_first:
entry_type = data[i + 1]
payload = data[i + 2 : i + 1 + entry_len]
# Fallback: if type-first gives invalid type, try type-last
if not _is_valid_entry_type(entry_type):
alt_type = data[end_offset]
if _is_valid_entry_type(alt_type):
entry_type = alt_type
payload = data[i + 1 : i + 1 + payload_size]
i = end_offset + 1
else:
payload = data[i + 1 : i + 1 + payload_size]
entry_type = data[end_offset]
# Fallback: if type-last gives invalid type, try type-first
if not _is_valid_entry_type(entry_type):
alt_type = data[i + 1]
if _is_valid_entry_type(alt_type):
entry_type = alt_type
payload = data[i + 2 : i + 1 + entry_len]
i = end_offset + 1
try:
if 0 <= entry_type <= 12:
@@ -258,7 +317,7 @@ def parse_entries(data: bytes, fsm_states: dict = None) -> list:
}
entries.append(e)
i = type_offset + 1 # advance past type byte
# i was already advanced in the format-detection block above
return entries
@@ -285,18 +344,118 @@ def parse_response(blob: bytes, fsm_states: dict = None) -> tuple:
return meta, tail, head, entries
def _detect_old_partition_dump(blob: bytes) -> int:
"""
Detect old firmware partition dump format.
Old format: 8-byte file header + 0x4000 bytes params + log entries
with type byte at the start of each entry's content region.
Returns the log data start offset, or 0 if not detected.
"""
if len(blob) < 0x4100:
return 0
# Check if offset 0x4000 looks like a log sector: leading zero-pad
# followed by a valid entry with a valid type byte at +1 (type-first format)
base = 0x4000
# Find first non-zero byte in the sector
first_nz = 0
while first_nz < 4096 and blob[base + first_nz] == 0x00:
first_nz += 1
if first_nz >= 4096:
return 0
entry_len = blob[base + first_nz]
if entry_len < 2 or base + first_nz + 1 + entry_len > len(blob):
return 0
# In old format, the type byte is the first byte after the len byte
entry_type = blob[base + first_nz + 1]
if _is_valid_entry_type(entry_type):
return base
return 0
def _try_detect_type_first(data: bytes) -> bool:
"""
Given raw log entry data, try to determine if entries use
type-first format (old FW) vs type-last format (current FW).
Samples multiple entries and checks which placement yields
valid entry types, plausible timestamps, or reasonable voltages.
"""
i = 0
n = len(data)
attempts = 0
max_attempts = 200
while i < n and attempts < max_attempts:
b = data[i]
if b == 0xFF:
break
if b == 0x00:
i = ((i // 4096) + 1) * 4096
continue
entry_len = b
end_offset = i + entry_len
if end_offset >= n:
break
# type-last (current): type is at end_offset
type_last = data[end_offset]
# type-first (old): type is at i+1
type_first_val = data[i + 1]
last_valid = _is_valid_entry_type(type_last)
first_valid = _is_valid_entry_type(type_first_val)
if first_valid and not last_valid:
return True
if last_valid and not first_valid:
return False
# Both valid or neither — try parsing the payload to disambiguate
if first_valid and last_valid:
payload_first = data[i + 2 : i + 1 + entry_len]
payload_last = data[i + 1 : i + 1 + entry_len - 1]
for payload, is_first in [(payload_first, True), (payload_last, False)]:
if len(payload) >= 12:
ts = struct.unpack_from('<Q', payload, 0)[0]
# Plausible if timestamp is 2020-2030 in ms
if 1577836800000 < ts < 1893456000000:
return is_first
# Also check if the float at offset 8 is a reasonable voltage (0-60V)
v = struct.unpack_from('<f', payload, 8)[0]
if 0.5 < v < 60.0:
return is_first
# Advance to next entry and keep trying
i = end_offset + 1
attempts += 1
return False
def autodetect_and_parse(blob: bytes, fsm_states: dict = None) -> tuple:
"""
Auto-detect whether blob is HTTP response format or raw flash binary.
Auto-detect whether blob is HTTP response format, old partition dump,
or raw flash binary.
Returns (json_meta_or_None, tail_or_None, head_or_None, entries).
"""
# HTTP format: first 4 bytes = BE uint32 json_len, byte 4 should be '{'
if len(blob) >= 5:
candidate_len = struct.unpack_from('>I', blob, 0)[0]
if candidate_len < 8192 and blob[4:5] == b'{':
if candidate_len < len(blob) and blob[4:5] == b'{':
meta, tail, head, entries = parse_response(blob, fsm_states)
return meta, tail, head, entries
# Raw binary
entries = parse_entries(blob, fsm_states)
# Bare tail+head format: [4B tail BE][4B head BE][raw log data]
# Detect by checking if head - tail == len(blob) - 8
if len(blob) >= 16:
tail_val, head_val = struct.unpack_from('>II', blob, 0)
if head_val > tail_val and (head_val - tail_val) == len(blob) - 8:
log_data = blob[8:]
type_first = _try_detect_type_first(log_data)
entries = parse_entries(log_data, fsm_states, type_first=type_first)
return None, tail_val, head_val, entries
# Old partition dump: 8-byte header + 0x4000 params + log entries (type-first)
log_offset = _detect_old_partition_dump(blob)
if log_offset > 0:
log_data = blob[log_offset:]
type_first = _try_detect_type_first(log_data)
entries = parse_entries(log_data, fsm_states, type_first=type_first)
return None, None, None, entries
# Raw binary — auto-detect type placement
type_first = _try_detect_type_first(blob)
entries = parse_entries(blob, fsm_states, type_first=type_first)
return None, None, None, entries