
Conversation

@0xlakshan
Contributor

@0xlakshan 0xlakshan commented Oct 13, 2025

Background

There is currently no middleware support for embedding models, which would be useful for e.g. setting defaults (see #7947).

Summary

This PR adds wrapEmbeddingModel, making it easier to customize and extend embedding model behavior from a unified, central place.

It also adds a defaultEmbeddingSettingsMiddleware, letting users manage settings such as dimensions and provider options from a single, central place.

Example

// configure embedding model with settings
const centralSpace = customProvider({
  textEmbeddingModels: {
    'my-embedding-model': wrapEmbeddingModel({
      model: google.textEmbedding('gemini-embedding-001'),
      middleware: defaultEmbeddingSettingsMiddleware({
        settings: {
          providerOptions: {
            google: {
              outputDimensionality: 256,
              taskType: 'CLASSIFICATION',
            },
          },
        },
      }),
    }),
  },
});

// usage with embedMany
const embedManyResponse = await embedMany({
  model: centralSpace.textEmbeddingModel('my-embedding-model'),
  values: [
    'sunny day at the beach',
    'rainy afternoon in the city',
    'snowy night in the mountains',
  ],
});
console.log(embedManyResponse.embeddings);

// usage with embed
const response = await embed({
  model: centralSpace.textEmbeddingModel('my-embedding-model'),
  value: 'rainy afternoon in the city',
});
console.log(response.embedding);
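To make the defaulting behavior concrete, here is a rough, self-contained sketch of the merge semantics such a defaults middleware implies: configured per-provider defaults are applied first, and values supplied at the call site take precedence. `EmbedSettings` and `mergeEmbedDefaults` are hypothetical names used for illustration only, not the SDK's actual implementation.

```typescript
// Illustrative sketch only -- not the ai package source.
type EmbedSettings = {
  providerOptions?: Record<string, Record<string, unknown>>;
};

function mergeEmbedDefaults(
  defaults: EmbedSettings,
  callParams: EmbedSettings,
): EmbedSettings {
  // Collect every provider key mentioned in either the defaults
  // or the per-call params.
  const providers = new Set([
    ...Object.keys(defaults.providerOptions ?? {}),
    ...Object.keys(callParams.providerOptions ?? {}),
  ]);
  const providerOptions: Record<string, Record<string, unknown>> = {};
  for (const provider of providers) {
    providerOptions[provider] = {
      ...defaults.providerOptions?.[provider], // configured defaults
      ...callParams.providerOptions?.[provider], // call-site values win
    };
  }
  return { providerOptions };
}
```

Under this sketch, a call that only overrides `taskType` would still inherit the configured `outputDimensionality` default, while its own `taskType` replaces the default one.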

Manual Verification

  • run examples/ai-core/src/middleware/embedding/google-default-settings.ts

Checklist

  • Tests have been added / updated (for bug fixes / features)
  • Documentation has been added / updated (for bug fixes / features)
  • A patch changeset for relevant packages has been added (for bug fixes / features - run pnpm changeset in the project root)
  • Formatting issues have been fixed (run pnpm prettier-fix in the project root)
  • I have reviewed this pull request (self-review)

Related Issues

Fixes #7947

@0xlakshan
Contributor Author

Hi @gr2m @lgrammel , could you take a quick look at this PR when you get a chance? I’d really appreciate any feedback you have. Thanks :)

@zirkelc

zirkelc commented Oct 16, 2025

Is there a reason this is implemented for EmbeddingModelV3 and not the current EmbeddingModelV2?

@0xlakshan
Contributor Author

Hi @zirkelc, you might want to update the ai SDK package; EmbeddingModelV2 was replaced with EmbeddingModelV3 (about 3 weeks ago).

Related PR - 0c4822d

@0xlakshan
Contributor Author

0xlakshan commented Oct 18, 2025

Hi @lgrammel, do I need to make this middleware support EmbeddingModelV2 as well, for backward compatibility?

@lgrammel lgrammel self-assigned this Oct 28, 2025
@lgrammel
Collaborator

Thanks @0xlakshan, I'll take it from here and will make a few changes. V2 support is not desired, since it would mean changing the provider package for V2.

@lgrammel lgrammel changed the title feat(middleware): Introduce wrapEmbeddingModel middleware feat: embedding model middleware Oct 28, 2025
@lgrammel lgrammel merged commit 37c58a0 into vercel:main Oct 28, 2025
17 of 18 checks passed
