drop JSON cast cache in text-mode adapter#1182

Merged
bgentry merged 1 commit into master from bg/json-text-mode-drop-cast-cache on Mar 24, 2026

Conversation


bgentry commented Mar 24, 2026

Summary

Follow-up to the JSON text-mode parameter adaptation work from #1155.

jsonPlaceholderCasts currently caches parsed cast placeholders in a
process-wide map keyed by the full rendered SQL string. Because adaptation runs
after template schema replacement, varying schemas produce distinct cache keys
for otherwise-identical queries. In long-lived processes with many schema
values, this can grow monotonically over time.

This change drops the cast-placeholder cache and parses casts per call instead,
which keeps behavior unchanged while removing schema-driven unbounded cache
growth risk.

Why this is safe

  • The functional adaptation logic is unchanged.
  • The additional parsing work only happens on text-mode adaptation paths.
  • Existing adapter tests continue to validate behavior.

Testing

  • go test ./riverdriver/riverpgxv5 ./riverdriver/riverdrivertest

The JSON placeholder cast cache was keyed by rendered SQL text after schema
template replacement. In setups that vary schema names heavily, that creates
a monotonic key space and unbounded process-lifetime cache growth.

Drop the cache and parse casts per call instead. Behavior stays the same, and
this keeps the follow-up fix minimal while removing schema-driven growth.
bgentry requested a review from brandur on March 24, 2026 01:25
bgentry merged commit bcbcb6d into master on Mar 24, 2026
15 checks passed
bgentry deleted the bg/json-text-mode-drop-cast-cache branch on March 24, 2026 01:31