
[Bug Report]: o tars search cannot render correctly #1094

@ulivz

Description

Version

latest

Issue Type

  • Select an issue type 👇
  • Agent TARS Web UI (@agent-tars/web-ui)
  • Agent TARS CLI (@agent-tars/cli)
  • Agent TARS Server (@agent-tars/server)
  • Agent TARS (@agent-tars/core)
  • MCP Agent (@tarko/mcp-agent)
  • Agent Kernel (@tarko/agent)
  • Other (please specify in description)

Model Provider

  • Select a model provider 👇
  • Volcengine
  • Anthropic
  • OpenAI
  • Azure OpenAI
  • Other (please specify in description)

Problem Description

Snapshot

Handled by multimodal/tarko/agent-web-ui/src/common/state/actions/eventProcessor.ts
Rendered by multimodal/tarko/agent-web-ui/src/standalone/workspace/renderers/generic/GenericResultRenderer.tsx

Issue:

  1. Unable to scroll to the right (a workaround sketch follows this list)
  2. Poor visual experience
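
A minimal workaround sketch for the scrolling problem, assuming a React/TSX renderer. The component and prop names below are illustrative only and do not reflect the actual GenericResultRenderer.tsx code:

        import React from 'react';

        // Hypothetical fallback wrapper (not the real GenericResultRenderer):
        // giving the raw tool-result text an `overflow-x: auto` container and
        // preserving whitespace restores horizontal scrolling for long,
        // unwrapped JSON lines until a dedicated search renderer exists.
        export function ScrollableRawResult({ text }: { text: string }) {
          return (
            <div style={{ overflowX: 'auto', maxWidth: '100%' }}>
              <pre style={{ whiteSpace: 'pre', margin: 0 }}>{text}</pre>
            </div>
          );
        }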

Event Stream

        {
            "id": "9e3ae951-bbc2-43b7-90c7-10bea6b7d34d",
            "type": "tool_result",
            "timestamp": 1754990161734,
            "toolCallId": "call_1754990160194_mq5g41jhu",
            "name": "Search",
            "content": {
                "content": [
                    {
                        "type": "text",
                        "text": "{\n  \"searchParameters\": {\n    \"q\": \"UI TARS\",\n    \"gl\": \"us\",\n    \"hl\": \"en\",\n    \"type\": \"search\",\n    \"engine\": \"google\"\n  },\n  \"organic\": [\n    {\n      \"title\": \"bytedance/UI-TARS - GitHub\",\n      \"link\": \"https://github.com/bytedance/UI-TARS\",\n      \"snippet\": \"An open-source multimodal agent built upon a powerful vision-language model. It is capable of effectively performing diverse tasks within virtual worlds.\",\n      \"sitelinks\": [\n        {\n          \"title\": \"UI-TARS-desktop\",\n          \"link\": \"https://github.com/bytedance/UI-TARS-desktop\"\n        },\n        {\n          \"title\": \"Pull requests 7\",\n          \"link\": \"https://github.com/bytedance/UI-TARS/pulls\"\n        },\n        {\n          \"title\": \"Actions\",\n          \"link\": \"https://github.com/bytedance/UI-TARS/actions\"\n        },\n        {\n          \"title\": \"Security\",\n          \"link\": \"https://github.com/bytedance/UI-TARS/security\"\n        }\n      ],\n      \"position\": 1\n    },\n    {\n      \"title\": \"UI-TARS: Pioneering Automated GUI Interaction with Native Agents\",\n      \"link\": \"https://arxiv.org/abs/2501.12326\",\n      \"snippet\": \"This paper introduces UI-TARS, a native GUI agent model that solely perceives the screenshots as input and performs human-like interactions.\",\n      \"date\": \"Jan 21, 2025\",\n      \"position\": 2\n    },\n    {\n      \"title\": \"bytedance/UI-TARS-desktop: The Open-sourced Multimodal AI ...\",\n      \"link\": \"https://github.com/bytedance/UI-TARS-desktop\",\n      \"snippet\": \"UI-TARS Desktop is a native GUI agent driven by UI-TARS and Seed-1.5-VL/1.6 series models, available on your local computer and remote VM sandbox on cloud.\",\n      \"sitelinks\": [\n        {\n          \"title\": \"Releases 28\",\n          \"link\": \"https://github.com/bytedance/UI-TARS-desktop/releases\"\n        },\n        {\n          \"title\": \"Issues\",\n          \"link\": \"https://github.com/bytedance/UI-TARS-desktop/issues\"\n        },\n        {\n          \"title\": \"Pull requests 11\",\n          \"link\": \"https://github.com/bytedance/UI-TARS-desktop/pulls\"\n        },\n        {\n          \"title\": \"Support DeepSeek Provider\",\n          \"link\": \"https://github.com/bytedance/UI-TARS-desktop/issues/283\"\n        }\n      ],\n      \"position\": 3\n    },\n    {\n      \"title\": \"Bytedance UI-TARS AI Desktop: AI Agent for Computer Control\",\n      \"link\": \"https://ui-tarsai.com/\",\n      \"snippet\": \"UI-TARS is an AI model for computer automation, offering both browser and desktop agents. Learn about its capabilities, installation, and usage.\",\n      \"position\": 4\n    },\n    {\n      \"title\": \"ByteDance-Seed/UI-TARS-1.5-7B - Hugging Face\",\n      \"link\": \"https://huggingface.co/ByteDance-Seed/UI-TARS-1.5-7B\",\n      \"snippet\": \"UI-TARS-1.5, an open-source multimodal agent built upon a powerful vision-language model. 
It is capable of effectively performing diverse tasks within virtual ...\",\n      \"position\": 5\n    },\n    {\n      \"title\": \"Exploring UI-TARS : r/LocalLLaMA - Reddit\",\n      \"link\": \"https://www.reddit.com/r/LocalLLaMA/comments/1iafeo3/exploring_uitars/\",\n      \"snippet\": \"I've been exploring UI-TARS and the UI-TARS-Desktop agent (Note: I compiled my own version of it) and like a lot of early stage AI things, it's impressive.\",\n      \"date\": \"Jan 26, 2025\",\n      \"sitelinks\": [\n        {\n          \"title\": \"UI-TARS : r/ollama - Reddit\",\n          \"link\": \"https://www.reddit.com/r/ollama/comments/1i7as24/uitars/\"\n        },\n        {\n          \"title\": \"Anyone try UI-TARS-1.5-7B new model from ByteDance - Reddit\",\n          \"link\": \"https://www.reddit.com/r/LocalLLaMA/comments/1k665cg/anyone_try_uitars157b_new_model_from_bytedance/\"\n        }\n      ],\n      \"position\": 6\n    },\n    {\n      \"title\": \"UI-TARS:Next-generation native GUI agent model designed to ...\",\n      \"link\": \"https://seed-tars.com/\",\n      \"snippet\": \"Introducing UI-TARS-1.5✦⬧. Next-generation native GUI agent model designed to interact seamlessly with GUIs using human-like perception.\",\n      \"position\": 7\n    },\n    {\n      \"title\": \"ByteDance UI TARS : Best GUI Agent model to run computers\",\n      \"link\": \"https://medium.com/data-science-in-your-pocket/bytedance-ui-tars-best-gui-agent-model-to-run-computers-f087a029e932\",\n      \"snippet\": \"UI-TARS, which stands for User Interface — Task Automation and Reasoning System, is an innovative native GUI agent model created by ByteDance ...\",\n      \"date\": \"Jan 22, 2025\",\n      \"position\": 8\n    },\n    {\n      \"title\": \"ByteDance's UI-TARS can take over your computer, outperforms ...\",\n      \"link\": \"https://venturebeat.com/ai/bytedances-ui-tars-can-take-over-your-computer-outperforms-gpt-4o-and-claude/\",\n      \"snippet\": \"ByteDance's new UI-TARS understands graphical user interfaces (GUIs), applies reasoning and takes autonomous, step-by-step action.\",\n      \"date\": \"Jan 22, 2025\",\n      \"position\": 9\n    },\n    {\n      \"title\": \"ByteDance-Seed/UI-TARS-7B-SFT - Hugging Face\",\n      \"link\": \"https://huggingface.co/ByteDance-Seed/UI-TARS-7B-SFT\",\n      \"snippet\": \"UI-TARS is a next-generation native GUI agent model designed to interact seamlessly with graphical user interfaces (GUIs) using human-like perception, ...\",\n      \"position\": 10\n    }\n  ],\n  \"relatedSearches\": [\n    {\n      \"query\": \"ui-tars desktop\"\n    },\n    {\n      \"query\": \"ui-tars huggingface\"\n    },\n    {\n      \"query\": \"ui-tars-2b\"\n    },\n    {\n      \"query\": \"ui-tars ollama\"\n    },\n    {\n      \"query\": \"ui-tars api\"\n    },\n    {\n      \"query\": \"ui-tars-7b\"\n    },\n    {\n      \"query\": \"ui-tars mobile\"\n    },\n    {\n      \"query\": \"ui-tars paper\"\n    }\n  ],\n  \"credits\": 1\n}"
                    }
                ]
            },
            "elapsedMs": 1536
        },
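
For the visual-experience problem, note that the search result arrives as a JSON string nested inside content[0].text, so a richer renderer first has to parse it. Below is a hedged TypeScript sketch of that parsing step; the interfaces and the helper name are assumptions for illustration, not the existing eventProcessor.ts API:

        // Shape of the search payload as observed in the event stream above.
        interface Sitelink {
          title: string;
          link: string;
        }

        interface OrganicResult {
          title: string;
          link: string;
          snippet?: string;
          date?: string;
          position: number;
          sitelinks?: Sitelink[];
        }

        interface SearchPayload {
          searchParameters: { q: string; engine?: string };
          organic: OrganicResult[];
          relatedSearches?: { query: string }[];
        }

        // Hypothetical helper (not part of the current codebase): the tool
        // result nests a JSON string inside content[0].text, so it must be
        // parsed before a structured renderer can show titles, links, and
        // snippets instead of one long unwrapped string.
        export function parseSearchToolResult(toolContent: {
          content: { type: string; text: string }[];
        }): SearchPayload | null {
          const textPart = toolContent.content.find((p) => p.type === 'text');
          if (!textPart) {
            return null;
          }
          try {
            return JSON.parse(textPart.text) as SearchPayload;
          } catch {
            return null; // fall back to raw rendering if the text is not JSON
          }
        }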

Error Logs

No response
