Pool_factor not changing embedding number #363
Unanswered
John42506176Linux asked this question in Q&A
Replies: 0 comments
Hi, ColBERT community :),
I'm currently testing out ColBERT, and I was curious how I could test multiple degrees of token pooling, similar to this post: https://www.answer.ai/posts/colbert-pooling.html. However, when changing pool_factor, I haven't seen any change in the number of embeddings.
Here is the code being used (colbert-ai version 0.2.20):
from colbert.modeling.checkpoint import Checkpoint
from colbert.infra import ColBERTConfig

answer_ai = Checkpoint("answerdotai/answerai-colbert-small-v1", colbert_config=ColBERTConfig())
# documents is a list of strings defined elsewhere
vectors = answer_ai.docFromText(documents, bsize=2, pool_factor=3, showprogress=True)[0]
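For reference, the effect pool_factor is expected to have can be sketched in plain Python. This is only a conceptual illustration of hierarchical token pooling as described in the linked post, not ColBERT's actual implementation; the helper names and toy vectors below are made up. Merging the most similar token pair until roughly n // pool_factor clusters remain should visibly shrink the embedding count:

```python
import math

def cosine(u, v):
    # Cosine similarity between two vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return dot / (nu * nv)

def mean(vectors):
    # Element-wise mean of a list of vectors.
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

def pool_tokens(embeddings, pool_factor):
    # Greedy agglomerative pooling: repeatedly merge the two most
    # similar clusters until len(embeddings) // pool_factor remain.
    target = max(1, len(embeddings) // pool_factor)
    clusters = [[v] for v in embeddings]
    while len(clusters) > target:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                sim = cosine(mean(clusters[i]), mean(clusters[j]))
                if best is None or sim > best[0]:
                    best = (sim, i, j)
        _, i, j = best
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    # One pooled embedding (the centroid) per remaining cluster.
    return [mean(c) for c in clusters]

# Toy 2-d "token embeddings" for one document.
tokens = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9], [0.5, 0.5], [-1.0, 0.0]]
for pf in (1, 2, 3):
    pooled = pool_tokens(tokens, pf)
    print(f"pool_factor={pf}: {len(tokens)} tokens -> {len(pooled)} embeddings")
# pool_factor=1: 6 tokens -> 6 embeddings
# pool_factor=2: 6 tokens -> 3 embeddings
# pool_factor=3: 6 tokens -> 2 embeddings
```

If docFromText returns the same count for every pool_factor value, it suggests the pooling code path is not being reached on this call, rather than pooling having no effect.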