Decode compact non-direct map event rows

Jan Petykiewicz 2026-04-19 00:51:09 -07:00
commit 224a2d19e8
2 changed files with 459 additions and 85 deletions


@ -167,10 +167,14 @@ const PACKED_EVENT_RECORD_SYNTHETIC_MAGIC: &[u8; 4] = b"RPE1";
const PACKED_EVENT_RECORD_TEMPLATE_SYNTHETIC_MAGIC: &[u8; 4] = b"RPT1";
const PACKED_EVENT_REAL_CONDITION_MARKER: u16 = 0x526f;
const PACKED_EVENT_REAL_GROUPED_EFFECT_MARKER: u16 = 0x4eb8;
const PACKED_EVENT_REAL_RECORD_TERMINATOR_MARKER: u16 = 0x4eb9;
const PACKED_EVENT_REAL_CONDITION_ROW_LEN: usize = 0x1e;
const PACKED_EVENT_REAL_GROUPED_EFFECT_ROW_LEN: usize = 0x28;
const PACKED_EVENT_REAL_GROUP_COUNT: usize = 4;
const PACKED_EVENT_REAL_COMPACT_CONTROL_LEN: usize = 37;
const PACKED_EVENT_NONDIRECT_CONDITION_ROW_SERIALIZED_LEN: usize = 22;
const PACKED_EVENT_NONDIRECT_GROUPED_EFFECT_ROW_SERIALIZED_LEN: usize = 45;
const PACKED_EVENT_NONDIRECT_OPTIONAL_NAME_BLOCK_LEN: usize = 0x64;
const PACKED_EVENT_TEXT_BAND_LABELS: [&str; 6] = [
"primary_text_band",
"secondary_text_band_0",
@ -4781,7 +4785,7 @@ fn build_periodic_company_service_trace_report(
"the trigger-kind field itself is now bounded as an ordinary loaded per-event lane rather than a startup-only special class: restore-side loader 0x00433130 repopulates live event collection 0x0062be18 from packed chunk family 0x4e21/0x4e22, and the event-detail editor strip 0x004d90ba..0x004d91ed writes [event+0x7ef] across the full 0x00..0x0a range through controls 0x4e98..0x4ea2, including kind 8 at 0x004d91b3".to_string(),
"that keeps 0x00444d92 -> 0x00432f40(kind 8) on the ordinary loaded runtime-effect pipeline too: world bring-up is servicing pre-existing rows from 0x0062be18 rather than a one-off startup-only record class synthesized outside the collection".to_string(),
"the event-detail editor family now ties that trigger-kind field to the ordinary runtime-effect builders too: selected-event control family 0x004db02a / 0x004db1b8..0x004db309 mirrors current [event+0x7ef] back into controls 0x4e98..0x4ea2 under root control 0x4e84, while editor-side builder 0x004db9e5..0x004db9f1 allocates a runtime-effect row from compact payload into 0x0062be18 through 0x00432ea0 before rebinding the selected event id".to_string(),
"bundle-side inspection now grounds the ordinary startup collection further too: War Effort.gmp exposes a non-direct 0x4e99/0x4e9a/0x4e9b runtime-event collection at 0x74740c/0x7543f4/0x7554cf with 24 live rows, and those rows now segment cleanly as compact 0x526f-delimited bodies with repeated 0x4eb8 grouped-effect markers plus optional 0x4eb9 terminators rather than disappearing behind a missing bundle probe".to_string(),
"bundle-side inspection now grounds the ordinary startup collection further too: the non-direct 0x4e99/0x4e9a/0x4e9b runtime-event collection decodes as a compact serializer family recovered from 0x00433060/0x00430d70/0x00433130 rather than an opaque raw blob, and sampled maps such as War Effort/British Isles/Germany/Texas Tea now decode their compact rows into actual condition/grouped summaries instead of signature-only parity".to_string(),
],
blockers: vec![
"current atlas evidence now grounds one tuple-backed owner path too: loader tuple field [+0x0c] reaches [site+0x276] through 0x0046f073 / 0x004707ff -> 0x0040ef10, but the classified 0x004707ff caller belongs to multiplayer transport selector-0x13 rather than ordinary save-load restore, so a non-transport persisted source family is still needed for shellless acquisition".to_string(),
@ -4789,7 +4793,7 @@ fn build_periodic_company_service_trace_report(
"the paired collection-side triplet serializer 0x00413440 is ruled down too, so the missing ordinary restored-row owner seam likely sits outside the currently bounded direct allocator/finalize/store families and the tagged 0x36b1/0x36b2/0x36b3 load-save strip".to_string(),
"the load-side stream owner 0x00413280 is ruled down to cached-source/candidate replay through vtable slot +0x40 and 0x0040ce60, so the missing ordinary restored-row owner seam still sits beyond the current stream-load bridge too".to_string(),
"the checked ordinary restore ordering is ruled down too: 0x00413280 stream load, 0x00481210 dynamic side-buffer refresh, and 0x004133b0 local-runtime replay all sit on the bring-up strip without re-entering 0x004134d0 / 0x0040f6d0 / 0x0040ef10 for already-restored rows".to_string(),
"the grouped opcode dispatcher 0x00431b20 is still not a tagged restore owner, but the remaining uncertainty is narrower now than 'is kind 8 synthetic' or 'does kind 8 live on a separate editor/build class': restore-side 0x00433130 reloads ordinary live event rows into 0x0062be18, the event-detail editor exposes [event+0x7ef] across 0x00..0x0a including kind 8, the same editor family reaches ordinary runtime-effect allocator 0x00432ea0, and War Effort.gmp now proves that the bundle-side collection carries 24 compact non-direct 0x526f/0x4eb8(/0x4eb9) row bodies, so the open question is the field mapping inside that compact row family and which loaded kind-8 rows can actually reach the placed-structure mutation opcodes under 0x00431b20".to_string(),
"the grouped opcode dispatcher 0x00431b20 is still not a tagged restore owner, but the remaining uncertainty is now narrower than compact row framing too: restore-side 0x00433130 reloads ordinary live event rows into 0x0062be18, 0x00433060/0x00430d70 serialize the compact non-direct bundle rows, the event-detail editor exposes [event+0x7ef] across 0x00..0x0a including kind 8, and sampled map bundles now decode into concrete grouped descriptors, so the open question is which serialized/live rows correlate to trigger kind 8 and which of those loaded rows can actually reach the placed-structure mutation opcodes under 0x00431b20".to_string(),
],
},
SmpServiceConsumerHypothesis {
@ -9426,7 +9430,8 @@ fn try_parse_nondirect_event_runtime_record_summaries(
records_payload_offset: usize,
live_entry_ids: &[u32],
) -> Option<Vec<SmpLoadedPackedEventRecordSummary>> {
let marker_offsets = find_u16_le_offsets(records_payload, PACKED_EVENT_REAL_CONDITION_MARKER);
let marker_offsets =
find_u32_le_offsets(records_payload, PACKED_EVENT_REAL_CONDITION_MARKER as u32);
if marker_offsets.len() != live_entry_ids.len() || marker_offsets.first().copied() != Some(0) {
return None;
}
@ -9439,83 +9444,374 @@ fn try_parse_nondirect_event_runtime_record_summaries(
let start = *record_offsets.get(record_index)?;
let end = *record_offsets.get(record_index + 1)?;
let record_body = records_payload.get(start..end)?;
let grouped_marker_relative_offset =
find_u16_le_offsets(record_body, PACKED_EVENT_REAL_GROUPED_EFFECT_MARKER)
.into_iter()
.next();
let end_marker_relative_offset =
find_u16_le_offsets(record_body, 0x4eb9).into_iter().next();
let head_signature_words = read_u16_window(record_body, 0, 18);
let post_group_signature_words = grouped_marker_relative_offset
.map(|offset| offset + 2)
.map(|offset| read_u16_window(record_body, offset, 12))
.unwrap_or_default();
let ascii_preview_before_grouped_marker = grouped_marker_relative_offset
.and_then(|offset| record_body.get(..offset).map(ascii_preview));
let mut notes = vec![
"decoded from non-direct 0x4e99/0x4e9a/0x4e9b map-bundle row segmentation using 0x526f-delimited slices".to_string(),
format!(
"compact signature family = {}",
compact_nondirect_signature_family(
grouped_marker_relative_offset,
&head_signature_words,
&post_group_signature_words,
)
),
format!(
"head signature u16 words = {}",
format_u16_word_signature(&head_signature_words)
),
];
if let Some(offset) = grouped_marker_relative_offset {
notes.push(format!(
"grouped-effect marker 0x4eb8 at relative offset +0x{offset:x}"
));
if !post_group_signature_words.is_empty() {
notes.push(format!(
"post-group signature u16 words = {}",
format_u16_word_signature(&post_group_signature_words)
));
}
}
if let Some(offset) = end_marker_relative_offset {
notes.push(format!(
"row terminator marker 0x4eb9 at relative offset +0x{offset:x}"
));
}
if let Some(preview) = ascii_preview_before_grouped_marker {
notes.push(format!("ascii preview before grouped marker = {preview}"));
}
records.push(SmpLoadedPackedEventRecordSummary {
let record = parse_nondirect_event_runtime_record_summary(
record_body,
records_payload_offset + start,
record_index,
live_entry_id,
payload_offset: Some(records_payload_offset + start),
payload_len: Some(end.saturating_sub(start)),
decode_status: "compact_nondirect_parity_only".to_string(),
payload_family: "real_packed_nondirect_compact_v1".to_string(),
trigger_kind: None,
active: None,
marks_collection_dirty: None,
one_shot: None,
compact_control: None,
text_bands: Vec::new(),
standalone_condition_row_count: 0,
standalone_condition_rows: Vec::new(),
negative_sentinel_scope: None,
grouped_effect_row_counts: vec![0, 0, 0, 0],
grouped_effect_rows: Vec::new(),
decoded_conditions: Vec::new(),
decoded_actions: Vec::new(),
executable_import_ready: false,
notes,
});
)
.or_else(|| {
build_nondirect_event_runtime_record_summary_from_signatures(
record_body,
records_payload_offset + start,
record_index,
live_entry_id,
)
})?;
records.push(record);
}
Some(records)
}
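The segmentation above hinges on `find_u32_le_offsets`, whose implementation sits outside this diff. A minimal self-contained sketch of that primitive, scanning a byte slice for every offset holding the marker as a little-endian u32:

```rust
// Sketch of the marker-offset scan the record segmentation relies on:
// return every byte offset where `payload` holds `marker` as little-endian u32.
// The real helper may differ; this is illustrative only.
fn find_u32_le_offsets(payload: &[u8], marker: u32) -> Vec<usize> {
    let needle = marker.to_le_bytes();
    payload
        .windows(4)
        .enumerate()
        .filter(|(_, window)| *window == needle.as_slice())
        .map(|(offset, _)| offset)
        .collect()
}
```

Widening the 0x526f scan from u16 to u32 (the change at the top of this hunk) also shrinks the false-positive surface, since a two-byte pattern matches far more often inside unrelated payload than the full four-byte little-endian marker word.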
fn parse_nondirect_event_runtime_record_summary(
record_body: &[u8],
payload_offset: usize,
record_index: usize,
live_entry_id: u32,
) -> Option<SmpLoadedPackedEventRecordSummary> {
let mut cursor = 0usize;
if read_u32_at(record_body, cursor)? != PACKED_EVENT_REAL_CONDITION_MARKER as u32 {
return None;
}
cursor += 4;
let standalone_condition_row_count = usize::try_from(read_u32_at(record_body, cursor)?).ok()?;
cursor += 4;
let mut standalone_condition_rows = Vec::with_capacity(standalone_condition_row_count);
for row_index in 0..standalone_condition_row_count {
let remaining_minimum = standalone_condition_row_count
.checked_sub(row_index + 1)?
.checked_mul(PACKED_EVENT_NONDIRECT_CONDITION_ROW_SERIALIZED_LEN)?
.checked_add(4)?
.checked_add(PACKED_EVENT_REAL_GROUP_COUNT.checked_mul(4)?)?
.checked_add(4)?;
let (row, consumed_len) = parse_nondirect_condition_row_summary(
record_body.get(cursor..)?,
row_index,
remaining_minimum,
)?;
standalone_condition_rows.push(row);
cursor += consumed_len;
}
let grouped_marker_relative_offset = cursor;
if read_u32_at(record_body, cursor)? != PACKED_EVENT_REAL_GROUPED_EFFECT_MARKER as u32 {
return None;
}
cursor += 4;
let mut grouped_effect_row_counts = Vec::with_capacity(PACKED_EVENT_REAL_GROUP_COUNT);
let mut grouped_effect_rows = Vec::new();
for group_index in 0..PACKED_EVENT_REAL_GROUP_COUNT {
let group_row_count = usize::try_from(read_u32_at(record_body, cursor)?).ok()?;
cursor += 4;
grouped_effect_row_counts.push(group_row_count);
for row_index in 0..group_row_count {
let remaining_groups_minimum = (group_row_count - row_index - 1)
.checked_mul(PACKED_EVENT_NONDIRECT_GROUPED_EFFECT_ROW_SERIALIZED_LEN)?
.checked_add((PACKED_EVENT_REAL_GROUP_COUNT - group_index - 1).checked_mul(4)?)?
.checked_add(4)?;
let (row, consumed_len) = parse_nondirect_grouped_effect_row_summary(
record_body.get(cursor..)?,
group_index,
row_index,
remaining_groups_minimum,
)?;
grouped_effect_rows.push(row);
cursor += consumed_len;
}
}
let end_marker_relative_offset = cursor;
if read_u32_at(record_body, cursor)? != PACKED_EVENT_REAL_RECORD_TERMINATOR_MARKER as u32 {
return None;
}
cursor += 4;
if cursor != record_body.len() {
return None;
}
let head_signature_words = read_u16_window(record_body, 0, 18);
let post_group_signature_words =
read_u16_window(record_body, grouped_marker_relative_offset + 4, 12);
let ascii_preview_before_grouped_marker = record_body
.get(..grouped_marker_relative_offset)
.map(ascii_preview);
let mut notes = vec![
"decoded from compact non-direct 0x4e99/0x4e9a/0x4e9b map-bundle row framing recovered from the 0x430d70..0x431101 writer strip".to_string(),
format!(
"compact signature family = {}",
compact_nondirect_signature_family(
Some(grouped_marker_relative_offset),
&head_signature_words,
&post_group_signature_words,
)
),
format!(
"head signature u16 words = {}",
format_u16_word_signature(&head_signature_words)
),
format!(
"grouped-effect marker 0x4eb8 at relative offset +0x{grouped_marker_relative_offset:x}"
),
format!(
"row terminator marker 0x4eb9 at relative offset +0x{end_marker_relative_offset:x}"
),
];
if !post_group_signature_words.is_empty() {
notes.push(format!(
"post-group signature u16 words = {}",
format_u16_word_signature(&post_group_signature_words)
));
}
if let Some(preview) = ascii_preview_before_grouped_marker {
notes.push(format!("ascii preview before grouped marker = {preview}"));
}
notes.push(format!(
"compact non-direct grouped row counts by group = {:?}",
grouped_effect_row_counts
));
let decoded_conditions = decode_real_condition_rows(&standalone_condition_rows, None);
Some(SmpLoadedPackedEventRecordSummary {
record_index,
live_entry_id,
payload_offset: Some(payload_offset),
payload_len: Some(cursor),
decode_status: "parity_only".to_string(),
payload_family: "real_packed_nondirect_compact_v1".to_string(),
trigger_kind: None,
active: None,
marks_collection_dirty: None,
one_shot: None,
compact_control: None,
text_bands: Vec::new(),
standalone_condition_row_count,
standalone_condition_rows,
negative_sentinel_scope: None,
grouped_effect_row_counts,
grouped_effect_rows,
decoded_conditions,
decoded_actions: Vec::new(),
executable_import_ready: false,
notes,
})
}
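The parser above follows a strict cursor-walk discipline: every read goes through an `Option`-returning bounds-checked helper, each framing marker is verified before advancing, and any trailing bytes reject the whole record. A reduced sketch of that pattern over a hypothetical `marker, count, values, terminator` layout (not the real record format):

```rust
// Bounds-checked little-endian u32 read; None instead of panicking on short input.
fn read_u32_at(body: &[u8], offset: usize) -> Option<u32> {
    let bytes = body.get(offset..offset.checked_add(4)?)?;
    Some(u32::from_le_bytes(bytes.try_into().ok()?))
}

/// Parse `marker, count, count * u32 values, terminator`; None on any framing error.
fn parse_counted_block(body: &[u8], marker: u32, terminator: u32) -> Option<Vec<u32>> {
    let mut cursor = 0usize;
    if read_u32_at(body, cursor)? != marker {
        return None;
    }
    cursor += 4;
    let count = usize::try_from(read_u32_at(body, cursor)?).ok()?;
    cursor += 4;
    let mut values = Vec::with_capacity(count.min(1024));
    for _ in 0..count {
        values.push(read_u32_at(body, cursor)?);
        cursor += 4;
    }
    if read_u32_at(body, cursor)? != terminator {
        return None;
    }
    cursor += 4;
    // Trailing bytes mean the framing guess was wrong; reject the record so the
    // caller can fall back to the signature-only path.
    if cursor != body.len() {
        return None;
    }
    Some(values)
}
```

The exact-consumption check at the end is what lets the strict decoder fail safely into `build_nondirect_event_runtime_record_summary_from_signatures` above instead of silently mis-framing a row.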
fn build_nondirect_event_runtime_record_summary_from_signatures(
record_body: &[u8],
payload_offset: usize,
record_index: usize,
live_entry_id: u32,
) -> Option<SmpLoadedPackedEventRecordSummary> {
let grouped_marker_relative_offset =
find_u32_le_offsets(record_body, PACKED_EVENT_REAL_GROUPED_EFFECT_MARKER as u32)
.into_iter()
.next();
let end_marker_relative_offset = find_u32_le_offsets(
record_body,
PACKED_EVENT_REAL_RECORD_TERMINATOR_MARKER as u32,
)
.into_iter()
.next();
let head_signature_words = read_u16_window(record_body, 0, 18);
let post_group_signature_words = grouped_marker_relative_offset
.map(|offset| offset + 4)
.map(|offset| read_u16_window(record_body, offset, 12))
.unwrap_or_default();
let ascii_preview_before_grouped_marker = grouped_marker_relative_offset
.and_then(|offset| record_body.get(..offset).map(ascii_preview));
let mut notes = vec![
"decoded from non-direct 0x4e99/0x4e9a/0x4e9b map-bundle row segmentation using 0x526f-delimited slices".to_string(),
format!(
"compact signature family = {}",
compact_nondirect_signature_family(
grouped_marker_relative_offset,
&head_signature_words,
&post_group_signature_words,
)
),
format!(
"head signature u16 words = {}",
format_u16_word_signature(&head_signature_words)
),
];
if let Some(offset) = grouped_marker_relative_offset {
notes.push(format!(
"grouped-effect marker 0x4eb8 at relative offset +0x{offset:x}"
));
if !post_group_signature_words.is_empty() {
notes.push(format!(
"post-group signature u16 words = {}",
format_u16_word_signature(&post_group_signature_words)
));
}
}
if let Some(offset) = end_marker_relative_offset {
notes.push(format!(
"row terminator marker 0x4eb9 at relative offset +0x{offset:x}"
));
}
if let Some(preview) = ascii_preview_before_grouped_marker {
notes.push(format!("ascii preview before grouped marker = {preview}"));
}
Some(SmpLoadedPackedEventRecordSummary {
record_index,
live_entry_id,
payload_offset: Some(payload_offset),
payload_len: Some(record_body.len()),
decode_status: "compact_nondirect_parity_only".to_string(),
payload_family: "real_packed_nondirect_compact_v1".to_string(),
trigger_kind: None,
active: None,
marks_collection_dirty: None,
one_shot: None,
compact_control: None,
text_bands: Vec::new(),
standalone_condition_row_count: 0,
standalone_condition_rows: Vec::new(),
negative_sentinel_scope: None,
grouped_effect_row_counts: vec![0, 0, 0, 0],
grouped_effect_rows: Vec::new(),
decoded_conditions: Vec::new(),
decoded_actions: Vec::new(),
executable_import_ready: false,
notes,
})
}
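The `compact signature family = nondirect-...` keys that both the strict and fallback paths emit are what make compact rows comparable across maps. The real `compact_nondirect_signature_family` is not shown in this diff; one plausible, purely illustrative shape for such a key derivation from the grouped-marker offset plus a few salient word lanes:

```rust
// Purely illustrative sketch (the real `compact_nondirect_signature_family`
// logic is outside this diff): build a stable, comparable family key from the
// grouped-marker offset plus leading head/post-group u16 lanes.
fn signature_family_key(
    grouped_marker_offset: Option<usize>,
    head_words: &[u16],
    post_group_words: &[u16],
) -> String {
    let mut key = String::from("nondirect");
    match grouped_marker_offset {
        Some(offset) => key.push_str(&format!("-g{offset:x}")),
        None => key.push_str("-gnone"),
    }
    for word in head_words.iter().take(4) {
        key.push_str(&format!("-h{word:04x}"));
    }
    for word in post_group_words.iter().take(2) {
        key.push_str(&format!("-p{word:04x}"));
    }
    key
}
```

Keeping the key derived only from structural positions and a bounded number of word lanes is what lets repeated families match across maps without scraping full raw signatures.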
fn parse_nondirect_condition_row_summary(
record_body: &[u8],
row_index: usize,
remaining_minimum: usize,
) -> Option<(SmpLoadedPackedEventConditionRowSummary, usize)> {
let mut cursor = 0usize;
let mut row_bytes = vec![0u8; PACKED_EVENT_REAL_CONDITION_ROW_LEN];
row_bytes
.get_mut(0..4)?
.copy_from_slice(record_body.get(cursor..cursor + 4)?);
cursor += 4;
row_bytes[4] = read_u8_at(record_body, cursor)?;
cursor += 1;
row_bytes
.get_mut(5..9)?
.copy_from_slice(record_body.get(cursor..cursor + 4)?);
cursor += 4;
row_bytes
.get_mut(9..13)?
.copy_from_slice(record_body.get(cursor..cursor + 4)?);
cursor += 4;
row_bytes[0x0d] = read_u8_at(record_body, cursor)?;
cursor += 1;
row_bytes
.get_mut(0x0e..0x12)?
.copy_from_slice(record_body.get(cursor..cursor + 4)?);
cursor += 4;
row_bytes
.get_mut(0x12..0x16)?
.copy_from_slice(record_body.get(cursor..cursor + 4)?);
cursor += 4;
let candidate_name =
maybe_parse_nondirect_optional_name_block(record_body, &mut cursor, remaining_minimum)?;
let mut row = parse_real_condition_row_summary(&row_bytes, row_index, candidate_name)?;
row.notes.push(
"condition row reconstructed from the compact non-direct serializer fields under 0x430e80"
.to_string(),
);
Some((row, cursor))
}
fn parse_nondirect_grouped_effect_row_summary(
record_body: &[u8],
group_index: usize,
row_index: usize,
remaining_minimum: usize,
) -> Option<(SmpLoadedPackedEventGroupedEffectRowSummary, usize)> {
let mut cursor = 0usize;
let mut row_bytes = vec![0u8; PACKED_EVENT_REAL_GROUPED_EFFECT_ROW_LEN];
row_bytes
.get_mut(0..4)?
.copy_from_slice(record_body.get(cursor..cursor + 4)?);
cursor += 4;
row_bytes
.get_mut(4..8)?
.copy_from_slice(record_body.get(cursor..cursor + 4)?);
cursor += 4;
row_bytes[8] = read_u8_at(record_body, cursor)?;
cursor += 1;
row_bytes
.get_mut(9..13)?
.copy_from_slice(record_body.get(cursor..cursor + 4)?);
cursor += 4;
row_bytes
.get_mut(0x0d..0x11)?
.copy_from_slice(record_body.get(cursor..cursor + 4)?);
cursor += 4;
row_bytes
.get_mut(0x11..0x15)?
.copy_from_slice(record_body.get(cursor..cursor + 4)?);
cursor += 4;
row_bytes
.get_mut(0x12..0x16)?
.copy_from_slice(record_body.get(cursor..cursor + 4)?);
cursor += 4;
row_bytes
.get_mut(0x14..0x18)?
.copy_from_slice(record_body.get(cursor..cursor + 4)?);
cursor += 4;
row_bytes
.get_mut(0x16..0x1a)?
.copy_from_slice(record_body.get(cursor..cursor + 4)?);
cursor += 4;
row_bytes
.get_mut(0x18..0x24)?
.copy_from_slice(record_body.get(cursor..cursor + 12)?);
cursor += 12;
let locomotive_name =
maybe_parse_nondirect_optional_name_block(record_body, &mut cursor, remaining_minimum)?;
let mut row =
parse_real_grouped_effect_row_summary(&row_bytes, group_index, row_index, locomotive_name)?;
row.notes.push(
"grouped effect row reconstructed from the compact non-direct serializer fields under 0x430f68"
.to_string(),
);
Some((row, cursor))
}
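Both row parsers share one design choice: rather than writing a second field-by-field decoder, they rebuild a direct-layout row buffer from the compact stream and hand it to the existing `parse_real_condition_row_summary` / `parse_real_grouped_effect_row_summary` functions. A stripped-down sketch of that rebuild trick (the offsets and field widths here are hypothetical, not the real row layout):

```rust
// Sketch of the "rebuild then reuse" trick above: widen compact serialized
// fields back into a fixed-size direct-layout buffer so the existing
// direct-row parser can be reused unchanged. Offsets are hypothetical.
const DIRECT_ROW_LEN: usize = 0x1e;

fn rebuild_direct_row(compact: &[u8]) -> Option<(Vec<u8>, usize)> {
    let mut row = vec![0u8; DIRECT_ROW_LEN];
    let mut cursor = 0usize;
    // u32 id at direct offsets 0..4.
    row.get_mut(0..4)?
        .copy_from_slice(compact.get(cursor..cursor + 4)?);
    cursor += 4;
    // Single-byte subtype at direct offset 4.
    row[4] = *compact.get(cursor)?;
    cursor += 1;
    // u32 operand at direct offsets 5..9.
    row.get_mut(5..9)?
        .copy_from_slice(compact.get(cursor..cursor + 4)?);
    cursor += 4;
    Some((row, cursor))
}
```

The payoff is a single source of truth for field semantics: the compact decoder can only drift from the direct decoder in its byte mapping, never in how the fields are interpreted.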
fn maybe_parse_nondirect_optional_name_block(
record_body: &[u8],
cursor: &mut usize,
remaining_minimum: usize,
) -> Option<Option<String>> {
if record_body.len() < *cursor + PACKED_EVENT_NONDIRECT_OPTIONAL_NAME_BLOCK_LEN {
return Some(None);
}
if record_body.len()
< *cursor + PACKED_EVENT_NONDIRECT_OPTIONAL_NAME_BLOCK_LEN + remaining_minimum
{
return Some(None);
}
let block =
record_body.get(*cursor..*cursor + PACKED_EVENT_NONDIRECT_OPTIONAL_NAME_BLOCK_LEN)?;
let name = read_ascii_c_string_at(block, 0, PACKED_EVENT_NONDIRECT_OPTIONAL_NAME_BLOCK_LEN);
let Some(name) = name.filter(|name| {
!name.is_empty()
&& block
.iter()
.copied()
.all(|byte| byte == 0 || is_ascii_preview_byte(byte))
}) else {
return Some(None);
};
*cursor += PACKED_EVENT_NONDIRECT_OPTIONAL_NAME_BLOCK_LEN;
Some(Some(name))
}
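The optional-name probe is the only speculative step in the cursor walk: it consumes the fixed 0x64-byte block only when enough payload remains for the mandatory fields that must follow it (`remaining_minimum`), and only when the block actually looks like a NUL-padded ASCII string. A simplified sketch of the same heuristic (collapsing the original's `Option<Option<String>>` into `Option<String>`, where `None` just means no name was consumed and the cursor is untouched):

```rust
// Hypothetical simplification of the optional-name probe above.
const NAME_BLOCK_LEN: usize = 0x64;

fn maybe_take_name_block(
    body: &[u8],
    cursor: &mut usize,
    remaining_minimum: usize,
) -> Option<String> {
    let start = *cursor;
    // Only claim the block if the mandatory tail can still fit after it.
    let needed = NAME_BLOCK_LEN.checked_add(remaining_minimum)?;
    let block = body.get(start..start.checked_add(NAME_BLOCK_LEN)?)?;
    if body.len() - start < needed {
        return None;
    }
    // Accept only a non-empty printable-ASCII name followed by NUL padding.
    let name_len = block.iter().position(|&b| b == 0)?;
    let printable = block[..name_len]
        .iter()
        .all(|b| b.is_ascii_graphic() || *b == b' ');
    if name_len == 0 || !printable || block[name_len..].iter().any(|&b| b != 0) {
        return None;
    }
    *cursor += NAME_BLOCK_LEN;
    Some(String::from_utf8_lossy(&block[..name_len]).into_owned())
}
```

Leaving the cursor untouched on rejection is load-bearing: a mandatory field that merely happens to contain printable bytes must still be parsed as a field, not swallowed as a name.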
fn decode_live_entry_ids_from_tombstone_bitset(
bitset: &[u8],
live_id_bound: u32,
@ -21726,6 +22022,81 @@ mod tests {
assert_eq!(summary.records[0].decode_status, "unsupported_framing");
}
#[test]
fn parses_nondirect_compact_event_runtime_record_rows() {
let mut bytes = Vec::new();
bytes.extend_from_slice(&(EVENT_RUNTIME_COLLECTION_METADATA_TAG as u32).to_le_bytes());
bytes.extend_from_slice(&EVENT_RUNTIME_COLLECTION_PACKED_STATE_VERSION.to_le_bytes());
let header_words = [
0u32, 6, 10, 20, 30, 1, 0, 1, 0, 0, 0, 0, 0, 1, 1, 23, 0, 0, 0,
];
for word in header_words {
bytes.extend_from_slice(&word.to_le_bytes());
}
bytes.extend_from_slice(&[0u8; 18]);
bytes.extend_from_slice(&(EVENT_RUNTIME_COLLECTION_RECORDS_TAG as u32).to_le_bytes());
bytes.extend_from_slice(&(PACKED_EVENT_REAL_CONDITION_MARKER as u32).to_le_bytes());
bytes.extend_from_slice(&1u32.to_le_bytes());
bytes.extend_from_slice(&u32::MAX.to_le_bytes());
bytes.push(4);
bytes.extend_from_slice(&0u32.to_le_bytes());
bytes.extend_from_slice(&0u32.to_le_bytes());
bytes.push(2);
bytes.extend_from_slice(&0u32.to_le_bytes());
bytes.extend_from_slice(&0u32.to_le_bytes());
bytes.extend_from_slice(&(PACKED_EVENT_REAL_GROUPED_EFFECT_MARKER as u32).to_le_bytes());
bytes.extend_from_slice(&1u32.to_le_bytes());
bytes.extend_from_slice(&43u32.to_le_bytes());
bytes.extend_from_slice(&1u32.to_le_bytes());
bytes.push(4);
bytes.extend_from_slice(&u32::MAX.to_le_bytes());
bytes.extend_from_slice(&u32::MAX.to_le_bytes());
bytes.extend_from_slice(&0u32.to_le_bytes());
bytes.extend_from_slice(&0u32.to_le_bytes());
bytes.extend_from_slice(&0u32.to_le_bytes());
bytes.extend_from_slice(&0u32.to_le_bytes());
bytes.extend_from_slice(&[0u8; 12]);
bytes.extend_from_slice(&0u32.to_le_bytes());
bytes.extend_from_slice(&0u32.to_le_bytes());
bytes.extend_from_slice(&0u32.to_le_bytes());
bytes.extend_from_slice(&(PACKED_EVENT_REAL_RECORD_TERMINATOR_MARKER as u32).to_le_bytes());
bytes.extend_from_slice(&(EVENT_RUNTIME_COLLECTION_CLOSE_TAG as u32).to_le_bytes());
let report = inspect_smp_bytes(&bytes);
let summary = report
.event_runtime_collection_summary
.as_ref()
.expect("non-direct event runtime collection summary should parse");
let record = summary
.records
.first()
.expect("first compact non-direct record");
assert_eq!(record.decode_status, "parity_only");
assert_eq!(record.payload_family, "real_packed_nondirect_compact_v1");
assert_eq!(record.standalone_condition_row_count, 1);
assert_eq!(record.standalone_condition_rows.len(), 1);
assert_eq!(record.standalone_condition_rows[0].raw_condition_id, -1);
assert_eq!(record.standalone_condition_rows[0].subtype, 4);
assert_eq!(record.grouped_effect_row_counts, vec![1, 0, 0, 0]);
assert_eq!(record.grouped_effect_rows.len(), 1);
assert_eq!(record.grouped_effect_rows[0].descriptor_id, 43);
assert_eq!(record.grouped_effect_rows[0].raw_scalar_value, 1);
assert_eq!(record.grouped_effect_rows[0].opcode, 4);
assert!(record.notes.iter().any(|line| {
line.contains("compact non-direct 0x4e99/0x4e9a/0x4e9b map-bundle row framing")
}));
assert!(
record
.notes
.iter()
.any(|line| { line.contains("compact signature family = nondirect-") })
);
}
fn encode_len_prefixed_string(text: &str) -> Vec<u8> {
let mut bytes = Vec::with_capacity(1 + text.len());
bytes.push(text.len() as u8);
@ -29096,11 +29467,10 @@ mod tests {
.iter()
.any(|line| line.contains("0x00431b20")
&& line.contains("0x00433130")
&& line.contains("0x00433060/0x00430d70")
&& line.contains("0x0062be18")
&& line.contains("[event+0x7ef]")
&& line.contains("kind 8")
&& line.contains("separate editor/build class")
&& line.contains("0x00432ea0"))
&& line.contains("kind 8"))
);
assert!(
trace.near_city_acquisition_projection_hypotheses[0]


@ -242,18 +242,22 @@ Working rule:
longer whether kind `8` lives on a separate editor/build class either, but which loaded
kind-`8` rows actually carry the mutation-capable compact payloads
- bundle-side inspection now grounds the startup collection itself:
`War Effort.gmp` exposes a non-direct `0x4e99/0x4e9a/0x4e9b` runtime-event collection at
`0x74740c / 0x7543f4 / 0x7554cf` with `24` live rows, and those rows now segment cleanly as
compact `0x526f`-delimited bodies with repeated `0x4eb8` grouped-effect markers plus optional
`0x4eb9` terminators
- those non-direct rows now carry stable structural family ids in the inspection notes too:
the row probe emits `compact signature family = nondirect-...` keys derived from grouped-marker
offset plus salient head/post-group word lanes, so repeated compact families can be compared
across maps without scraping full raw signatures
sampled maps such as `War Effort.gmp`, `British Isles.gmp`, `Germany.gmp`, and
`Texas Tea.gmp` expose non-direct `0x4e99/0x4e9a/0x4e9b` runtime-event collections, and the
compact `0x526f/0x4eb8/0x4eb9` row family is now decoded into actual condition/grouped row
summaries rather than opaque slices
- the concrete owner strip above that bundle is grounded now too:
`0x00433060` is the direct non-direct serializer loop that writes `0x4e99/0x4e9a/0x4e9b`,
calls `0x00430d70` per live collection row, and sits beside the sibling `0x00433130` size/load
family rather than behind an unknown blob writer
- those non-direct rows still carry stable structural family ids in the inspection notes:
the row probe emits `compact signature family = nondirect-...` keys alongside decoded grouped
descriptors, so repeated compact families can still be compared across maps without scraping
raw bytes
- that moves the startup compact-effect blocker again:
the remaining question is no longer collection existence, but field mapping inside that
compact non-direct row family and whether its observed signatures correspond to loaded
`kind 8` rows that can reach the placed-structure mutation opcodes under `0x00431b20`
the remaining question is no longer compact row framing, but which serialized/live rows in this
now-decoded non-direct bundle correlate to loaded trigger-kind `8` rows and which of those can
reach the placed-structure mutation opcodes under `0x00431b20`
- the `[site+0x27a]` companion lane is grounded now too:
it is a live signed scalar accumulator rather than a second owner-identity seam, with zero-init
at `0x0042125d` and `0x0040f793`, accumulation at `0x0040dfec` and `0x00426ad8`, direct set on