
[Bug]: output.messages not updated after compression — context never shrinks #491

@cybersea54

Bug Description

Since commit `cab3c459`, the `experimental.chat.messages.transform` hook no longer modifies `output.messages`. Compression appears to succeed (notifications show token removal, summaries are created), but the LLM context window never shrinks. Eventually OpenCode triggers its built-in compaction because the real token count keeps growing.

Root Cause

In `lib/hooks.ts:107`, `filterProcessableMessages(output.messages)` returns a new array (via `Array.filter()`). All subsequent mutations — `prune()`, `stripStaleMetadata()`, `injectMessageIds()`, etc. — operate on this copy. The original `output.messages` is never updated.
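The copy-vs-original behavior can be demonstrated in isolation. This is an illustrative sketch only — `Msg`, `filterProcessable`, and `pruneInPlace` are simplified stand-ins for the real types and helpers:

```ts
// Minimal reproduction of the copy bug: filter() returns a new array,
// so in-place mutations on the filtered result never reach the original.
type Msg = { id: number; pruned?: boolean };

function filterProcessable(messages: Msg[]): Msg[] {
  return messages.filter((m) => m.id >= 0); // new array, same element refs
}

function pruneInPlace(messages: Msg[]): void {
  const kept = messages.filter((m) => !m.pruned);
  messages.length = 0;     // truncate the array we were given...
  messages.push(...kept);  // ...and refill it — but it's only the copy
}

const output = { messages: [{ id: 1 }, { id: 2, pruned: true }, { id: 3 }] };
const copy = filterProcessable(output.messages);
pruneInPlace(copy);

console.log(copy.length);            // 2 — the copy shrank
console.log(output.messages.length); // 3 — the original never changed
```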

Before cab3c459 (worked)

```ts
return async (input: {}, output: { messages: WithParts[] }) => {
  await checkSession(client, state, logger, output.messages, ...)
  // ...
  prune(state, logger, config, output.messages)  // mutates original
  stripStaleMetadata(output.messages)            // mutates original
}
```

After cab3c459 (broken)

```ts
return async (input: {}, output: { messages: WithParts[] }) => {
  const messages = filterProcessableMessages(output.messages); // NEW array
  // ...
  prune(state, logger, config, messages); // mutates copy
  stripStaleMetadata(messages); // mutates copy
  // output.messages is never synced back
};
```

`filterCompressedRanges()` in `prune.js` does `messages.length = 0; messages.push(...result)` — a correct in-place mutation, but on the wrong array.
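The truncate-and-refill pattern itself is sound: it preserves array identity, so the caller sees the change — as long as the caller passes the original reference. A minimal sketch (with `filterCompressedRanges` reduced to an illustrative stand-in):

```ts
// Stand-in for filterCompressedRanges(): removes flagged messages in place.
type Msg = { text: string; compressed?: boolean };

function filterCompressedRanges(messages: Msg[]): void {
  const result = messages.filter((m) => !m.compressed);
  messages.length = 0;      // truncate in place — keeps the array identity
  messages.push(...result); // refill with the surviving messages
}

const output = {
  messages: [{ text: "a" }, { text: "b", compressed: true }, { text: "c" }],
};

// Passing the original reference: the mutation is visible to the caller.
filterCompressedRanges(output.messages);
console.log(output.messages.map((m) => m.text)); // ["a", "c"]
```

This is exactly why the pre-`cab3c459` code worked: the helpers received `output.messages` itself, not a filtered copy.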

Steps to Reproduce

  1. Start any session in opencode web or TUI
  2. Work until DCP triggers auto-compression (or use /dcp compress)
  3. Observe notification: ▣ DCP | -XXK removed, +YK summary
  4. Check actual context token counter in OpenCode — it does NOT decrease
  5. Continue working — context keeps growing, compaction eventually fires

Expected Behavior

After DCP compression runs, the `output.messages` array in the `experimental.chat.messages.transform` hook should reflect the pruned state: compressed messages removed, synthetic summary messages injected at anchor points. The OpenCode context token counter should decrease proportionally to the removed content, and subsequent LLM calls should receive only the pruned message set.

Instead, `output.messages` is never modified — the LLM receives the full original message list on every turn, context grows monotonically, and OpenCode's built-in compaction eventually fires despite DCP compression appearing to succeed.

Debug Context Logs

Debug logging was disabled (`"debug": false` in `dcp.jsonc`). No context log files exist at `~/.config/opencode/logs/dcp/context/`.

However, the bug is fully reproducible via source code inspection:

1. `lib/hooks.ts:107` — `filterProcessableMessages(output.messages)` calls `Array.filter()`, which returns a new array per the ECMAScript spec
2. `lib/messages/prune.js:177-178` — `filterCompressedRanges()` mutates via `messages.length = 0; messages.push(...result)` — correct mutation, but on the copy
3. The handler returns at line 153 without ever assigning back to `output.messages`

The breaking change was introduced in commit `cab3c459`, where all `output.messages` references were replaced with the local `messages` variable (the filtered copy), but no sync-back step was added.

Suggested Fix

Sync the processed messages array back to `output.messages` before returning from the hook. Add after `stripStaleMetadata(messages)` (line 149):

```ts
output.messages.length = 0;
output.messages.push(...messages);
```
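A sketch of the fixed hook shape (with the real filter/prune pipeline reduced to simplified stand-ins). The key point is mutating the original array rather than reassigning `output.messages`, since the host may already hold a reference to it:

```ts
type Msg = { id: number; keep: boolean };

// Simplified stand-ins for the real filter/prune pipeline.
const filterProcessableMessages = (msgs: Msg[]) => msgs.filter((m) => m.id >= 0);
const prune = (msgs: Msg[]) => {
  const kept = msgs.filter((m) => m.keep);
  msgs.length = 0;
  msgs.push(...kept);
};

const hook = async (_input: {}, output: { messages: Msg[] }) => {
  const messages = filterProcessableMessages(output.messages); // copy
  prune(messages); // mutates the copy

  // Suggested fix: sync the processed copy back into the original array.
  // length = 0 + push keeps the same array object, so any reference the
  // host kept to output.messages sees the pruned result.
  output.messages.length = 0;
  output.messages.push(...messages);
};

const output = { messages: [{ id: 1, keep: true }, { id: 2, keep: false }] };
const before = output.messages; // reference the host might hold
// No awaits inside this simplified body, so it completes synchronously.
void hook({}, output);

console.log(output.messages.length);     // 1 — context actually shrinks
console.log(before === output.messages); // true — array identity preserved
```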

Environment

  • Plugin version: 3.1.8
  • OpenCode: 1.3.13
  • Mode: opencode web (also affects TUI — the bug is in shared code path)
  • OS: Linux

Breaking Commit

`cab3c459e750f58fc1bd4895fb244497690d2557` — introduced `filterProcessableMessages()` to prevent crashes on malformed messages, but forgot to write the filtered-and-pruned result back to `output.messages`.


Tool Call Details

No response

DCP Version

3.1.8

Opencode Version

1.3.13

Model

Claude Sonnet 4

Additional Context

No response

Labels

bug