A token counter for SpicyWriter conversations that warns when to start a new thread to avoid context degradation.
Adds a simple token-counter overlay to conversations on SpicyWriter.com that suggests when to start a new thread to avoid context degradation, also called "context rot" (the tendency of LLMs to produce noticeably worse output once the context window grows too large).
The counter turns red at 32k tokens, a conservative threshold past which many LLMs start to degrade.
This userscript uses the o200k_base encoding from gpt-tokenizer, "the fastest, smallest and lowest footprint GPT tokenizer available for all JavaScript environments". The displayed count is increased by 16% to approximate the number of tokens Claude's tokenizer would produce for the same text.
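The scaling and warning logic can be sketched as follows. This is a minimal illustration, not the script's actual source: `estimateClaudeTokens` and `isOverLimit` are hypothetical helper names, and `rawCount` is assumed to be the token count produced by gpt-tokenizer's o200k_base `encode` on the conversation text.

```javascript
// Convert a raw o200k_base token count into the displayed estimate.
// The 16% uplift approximates Claude's tokenizer, which tends to
// produce more tokens than o200k_base for the same text.
const CLAUDE_FACTOR = 1.16;   // +16% adjustment (assumption from the description)
const WARN_LIMIT = 32000;     // counter turns red past 32k tokens

function estimateClaudeTokens(rawCount) {
  // Round to a whole token for display.
  return Math.round(rawCount * CLAUDE_FACTOR);
}

function isOverLimit(rawCount) {
  // True when the adjusted estimate crosses the warning threshold.
  return estimateClaudeTokens(rawCount) >= WARN_LIMIT;
}
```

In the real script the raw count would be recomputed as the conversation grows, and `isOverLimit` would drive the overlay's red styling.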