Add support for INT8 quantization of fusion_gru op #27330

@wojtuss

Description

Please enable INT8 quantization of the fusion_gru op using a oneDNN-based INT8 kernel.
