With the release of vLLM 0.11, the RunAI model streamer breaks again: https://github.com/vllm-project/vllm/issues/26600
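
For context, this is the load path in question: a minimal sketch, assuming vLLM's documented `runai_streamer` load format (installed via the `vllm[runai]` extra); the model name and prompt below are illustrative, not taken from the issue.

```python
# Minimal sketch of loading weights through the RunAI model streamer in vLLM.
# Assumes: pip install "vllm[runai]"; model name is illustrative only.
from vllm import LLM

llm = LLM(
    model="meta-llama/Llama-3.1-8B-Instruct",  # illustrative; any supported weights
    load_format="runai_streamer",              # route weight loading through the RunAI streamer
)

# If the regression reported in issue #26600 is present, the failure shows up
# during engine initialization / weight loading above, before generation runs.
print(llm.generate(["Hello"])[0].outputs[0].text)
```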