
Ingestion of documents with Ollama is incredibly slow #1691

@Zirgite

Description

I upgraded to the latest version of privateGPT and ingestion is much slower than in previous versions, to the point of being unusable. I am using the recommended Ollama option. After more than an hour the document still has not finished ingesting, even though I have a 3090 and an 18-core CPU and am using the very small Mistral model. The document is a 105 kB PDF with 37 pages of text.
Later I switched to the less recommended 'llms-llama-cpp' option in privateGPT and the problem went away. Still, is there any way to get fast ingestion with Ollama?
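To check whether the per-chunk embedding calls are what make ingestion slow, a minimal timing sketch like the one below can help. It assumes Ollama is running locally on its default port 11434 and that the embedding model named in the script (`nomic-embed-text`, used here only as a stand-in; substitute whatever model your profile is configured with) has already been pulled.

```python
# Minimal sketch: time individual embedding requests against a local Ollama server
# to see how long each document chunk would take during ingestion.
import time
import requests

OLLAMA_URL = "http://localhost:11434/api/embeddings"  # default Ollama embeddings endpoint
MODEL = "nomic-embed-text"  # assumed embedding model; replace with the one your setup uses

# Stand-in for the text chunks produced when a PDF is split during ingestion.
chunks = ["some chunk of document text, roughly a paragraph long"] * 20

start = time.time()
for chunk in chunks:
    resp = requests.post(OLLAMA_URL, json={"model": MODEL, "prompt": chunk}, timeout=120)
    resp.raise_for_status()
    _ = resp.json()["embedding"]  # the endpoint returns one embedding per request
elapsed = time.time() - start

print(f"{len(chunks)} embeddings in {elapsed:.1f}s ({elapsed / len(chunks):.2f}s per chunk)")
```

If each request takes several seconds, the slowdown is on the Ollama embedding side (requests are made one chunk at a time over HTTP) rather than in the PDF parsing itself, which would explain why switching away from the Ollama option made ingestion fast again.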
