
#17375: fix(session): don't carry stale model info into reset welcome message

by BinHPdev · open · 2026-02-15 18:00
channel: mattermost gateway · size: XS
## Summary

After `/new` or `/reset`, the welcome message displayed a stale model name from the previous session's cached `model` and `contextTokens` fields, confusing users with wrong model info. Fixed by no longer carrying over `model` and `contextTokens` from the old session during resets.

Fixes #17003

## Validation

- [x] `pnpm build` — passed
- [x] `pnpm check` (format + tsgo + lint) — passed
- [x] `pnpm test` — 1073 tests passed (105 test files)

## Scope

- Single focused fix — one file changed (`src/gateway/server-methods/sessions.ts`)

## AI-Assistance

- AI-assisted (Claude Code) for implementation
- Human-reviewed: confirmed understanding of changes
- Testing: full local validation suite passed

### Greptile Summary

Fixes a bug where `/new` or `/reset` would carry over stale cached `model` and `contextTokens` values from the previous session into the newly created session entry. The `contextTokens` field was the primary culprit: `buildStatusMessage` in `src/auto-reply/status.ts` reads `entry?.contextTokens` with highest priority, so a stale value would cause the welcome message to display an incorrect context window size. The cached `model` field was also carried over unnecessarily, though it had less direct impact on the welcome display.

- Removed `model: entry?.model` from the reset `nextEntry` construction so it defaults to `undefined`
- Changed `contextTokens: entry?.contextTokens` to `contextTokens: undefined` with a clear explanatory comment
- These cached fields are re-populated naturally on the next LLM run via `session-usage.ts`, so clearing them on reset is safe

### Confidence Score: 5/5

- This PR is safe to merge: a minimal, well-scoped bug fix with no risk of regressions.
- The change is a two-line removal/modification in a single file. It stops carrying over cached display-only fields (`model`, `contextTokens`) that are re-populated on the next LLM run.
The fix aligns with how other cached fields (like `modelProvider`) were already handled: they were never carried over in the reset path. No new logic is introduced, and the existing test suite (1073 tests) passes.

- No files require special attention.

<sub>Last reviewed commit: c04344f</sub>
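To make the two-line change concrete, here is a minimal sketch of the reset path before and after the fix. The `SessionEntry` shape and the `resetSession` helper are hypothetical simplifications for illustration; only the field names (`model`, `contextTokens`, `modelProvider`, `sessionId`) follow the PR description, and the real code lives in `src/gateway/server-methods/sessions.ts`.

```typescript
// Hypothetical, simplified session entry; field names follow the PR
// description, everything else is illustrative.
interface SessionEntry {
  sessionId: string;
  model?: string;         // cached display-only field
  contextTokens?: number; // cached display-only field
  modelProvider?: string; // was never carried over in the reset path
}

// Sketch of the reset path AFTER the fix: cached display-only fields are
// not copied from the old entry, so they default to undefined and get
// re-populated on the next LLM run (via session-usage.ts in the real code).
function resetSession(
  entry: SessionEntry | undefined, // previous session entry, if any
  newSessionId: string,
): SessionEntry {
  // Before the fix this returned { ..., model: entry?.model,
  // contextTokens: entry?.contextTokens }, which let stale values leak
  // into the welcome message built from the new entry.
  void entry; // intentionally unused: nothing cached is carried over
  return {
    sessionId: newSessionId,
    // model intentionally omitted -> undefined
    // contextTokens cleared so buildStatusMessage (which reads
    // entry?.contextTokens with highest priority) cannot show a stale
    // context window size
    contextTokens: undefined,
  };
}
```

With this shape, a reset from an entry like `{ sessionId: "old", model: "m1", contextTokens: 200000 }` yields an entry whose `model` and `contextTokens` are both `undefined`, matching the behavior the Greptile summary describes.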
