Merged
16 changes: 14 additions & 2 deletions noir_stdlib/src/hash/mod.nr
@@ -30,11 +30,23 @@ pub fn blake2s<let N: u32>(input: [u8; N]) -> [u8; 32]
 // docs:end:blake2s
 {}
 
-#[foreign(blake3)]
 // docs:start:blake3
 pub fn blake3<let N: u32>(input: [u8; N]) -> [u8; 32]
 // docs:end:blake3
-{}
+{
+    if !crate::runtime::is_unconstrained() {
+        // Temporary measure while Barretenberg is the main proving system.
+        // Please open an issue if you're working on another proving system and running into problems due to this.
+        crate::static_assert(
+            N <= 1024,
+            "Barretenberg cannot prove blake3 hashes with inputs larger than 1024 bytes",
+        );
+    }
+    __blake3(input)
+}
+
+#[foreign(blake3)]
+fn __blake3<let N: u32>(input: [u8; N]) -> [u8; 32] {}
 
 // docs:start:pedersen_commitment
 pub fn pedersen_commitment<let N: u32>(input: [Field; N]) -> EmbeddedCurvePoint {
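The pattern in the hunk above — refuse over-long inputs before calling the foreign hash — can be sketched in Python. This is an illustration only: Python's standard library has no blake3, so `hashlib.blake2s` (also a 32-byte digest) stands in for `__blake3`, the guard is a runtime check rather than Noir's compile-time `static_assert`, and the function and constant names are hypothetical. The 1024-byte bound is the Barretenberg limit quoted in the assert message.

```python
import hashlib

# Barretenberg bound quoted in the static_assert above.
BLAKE3_MAX_INPUT_SIZE = 1024

def guarded_blake3(data: bytes) -> bytes:
    # Runtime analogue of the compile-time check on N: reject inputs
    # the proving system could not handle.
    if len(data) > BLAKE3_MAX_INPUT_SIZE:
        raise ValueError(
            "Barretenberg cannot prove blake3 hashes with inputs larger than 1024 bytes"
        )
    # Stand-in for the foreign __blake3; blake2s also returns 32 bytes.
    return hashlib.blake2s(data).digest()

digest = guarded_blake3(b"\x00" * 1024)  # exactly at the bound: accepted
print(len(digest))  # 32
```

An input of 1025 bytes raises, mirroring how a constrained caller with `N > 1024` now fails at compile time.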
@@ -172,6 +172,6 @@ unconstrained fn main(kernel_data: DataToHash) -> pub [Field; NUM_FIELDS_PER_SHA
         }
     }
 
-    let blake3_digest = std::hash::blake3(hash_input_flattened);
-    U256::from_bytes32(blake3_digest).to_u128_limbs()
+    let blake2_digest = std::hash::blake2s(hash_input_flattened);
+    U256::from_bytes32(blake2_digest).to_u128_limbs()
 }
@@ -20,9 +20,9 @@ pub fn field_from_bytes_32_trunc(bytes32: [u8; 32]) -> Field {
     low + high * v
 }
 
-pub fn blake3_to_field<let N: u32>(bytes_to_hash: [u8; N]) -> Field {
-    let blake3_hashed = std::hash::blake3(bytes_to_hash);
-    let hash_in_a_field = field_from_bytes_32_trunc(blake3_hashed);
+pub fn blake2s_to_field<let N: u32>(bytes_to_hash: [u8; N]) -> Field {
+    let blake2s_hashed = std::hash::blake2s(bytes_to_hash);
+    let hash_in_a_field = field_from_bytes_32_trunc(blake2s_hashed);
 
     hash_in_a_field
 }
@@ -36,6 +36,6 @@ fn main(tx_effects_hash_input: [Field; TX_EFFECTS_HASH_INPUT_FIELDS]) -> pub Fie
         }
     }
 
-    let blake3_digest = blake3_to_field(hash_input_flattened);
-    blake3_digest
+    let blake2s_digest = blake2s_to_field(hash_input_flattened);
+    blake2s_digest
 }
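For intuition on `blake2s_to_field` above: hash to a 32-byte digest, then truncate so the value always fits in the ~254-bit BN254 scalar field. A minimal sketch, assuming the truncation drops the most significant byte and a big-endian interpretation — the exact byte handling of `field_from_bytes_32_trunc` may differ:

```python
import hashlib

# BN254 scalar field modulus (the field Barretenberg circuits work over).
BN254_R = 21888242871839275222246405745257275088548364400416034343698204186575808495617

def blake2s_to_field_sketch(data: bytes) -> int:
    digest = hashlib.blake2s(data).digest()  # 32 bytes
    # Drop one byte so the remaining 31 bytes (248 bits) are always < r.
    return int.from_bytes(digest[1:], "big")

value = blake2s_to_field_sketch(b"tx effects")
print(value < BN254_R)  # True
```

Truncating to 248 bits guarantees the result is below the modulus, so the conversion never wraps, at the cost of discarding one byte of the digest.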
