```solidity
// TX FEE
tx_effects_hash_input[offset] = transaction_fee;
// TODO(Miranda): how many bytes do we expect tx fee to be? Using 29 for now
```
@ TMNT (or any) team: is there a restriction anywhere on the size of the tx fee?
Don't think there really is a limit; there is probably some sensible value, but as base fees might go up a lot in congestion, for now I would just keep it as is. You can essentially see it as just a more domain-specific public state update.
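To make the "29 bytes" in the TODO above concrete, here is a minimal sketch (illustrative only, not the actual rollup code; the function name is hypothetical) of packing a fee into a fixed-width 29-byte big-endian slot. 29 bytes (232 bits) comfortably exceeds any realistic fee while staying below the ~254-bit BN254 field modulus, so the value always fits in a single field element of the tx effects hash input.

```typescript
// Hypothetical helper: pack a tx fee into a fixed 29-byte big-endian buffer.
// 232 bits < BN254 field size, so the result fits in one field element.
function feeToBytes29(fee: bigint): Uint8Array {
  if (fee < 0n || fee >= 1n << 232n) throw new Error("fee out of range for 29 bytes");
  const out = new Uint8Array(29);
  let v = fee;
  // fill from the least-significant end so the value is left-padded with zeros
  for (let i = 28; i >= 0; i--) {
    out[i] = Number(v & 0xffn);
    v >>= 8n;
  }
  return out;
}
```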
Amazing!
I think this is mergeable. @LeilaWang, @LHerskind what do you think?
The circuits look good (with well documented todos) (and we'll be updating them for batching of blobs soon anyway).
The smart contracts look good (with well documented todos) (and we know they're not optimised, because we'll be moving to batching of blobs soon anyway).
The typescript is passing tests.
The "boxes" tests have been failing on master for a while, so no need to fix those here.
Ship?
LHerskind left a comment:
There is a bit of naming that should probably be updated around single or multiple blobs, e.g., `blobHash` vs `blobsHash`. For the cases where there are multiple, I would prefer `blobsHash` to be used consistently to make it simpler to follow.
The `TxsDecoder` should also be pretty much unused after this, so maybe it can be deleted as well.
```solidity
// Testing only. This should be removed eventually.
uint256 private assumeProvenThroughBlockNumber;

// @note Always true, exists to override to false for testing only
```
Is there an issue for getting rid of this again?
```solidity
 * |        |      | ContentCommitment {
 * | 0x0024 | 0x20 |   numTxs
 * | 0x0044 | 0x20 |   txsEffectsHash
 * | 0x0044 | 0x20 |   blobHash
```
Thanks.
Quick question: should this not be `blobsHash` (multiple blobs)?
l1-contracts/src/core/Rollup.sol (outdated)
```solidity
bytes32[7] calldata _args,
bytes32[] calldata _fees,
bytes calldata _aggregationObject,
bytes calldata _blobPublicInputsAndAggregationObject, // having separate inputs here caused stack too deep
```
It is possible to get around this by using something similar to the ProposeArgs. Might be worth considering for the sake of your sanity 🫡
l1-contracts/src/core/Rollup.sol (outdated)
```solidity
}

for (uint256 i = 0; i < _epochSize; i++) {
  // free up gas (hopefully)
```
Unless you are submitting the proof in the same transaction as the one where you originally set the values, you will be paying extra to do this. Clearing storage here is essentially what gas tokens did back in the day, before the refund pricing was changed.
Would remove.
```solidity
// Which we are checking in the `_validateHeader` call below.
bytes32 txsEffectsHash = TxsDecoder.decode(_body);
// Since an invalid blob hash here would fail the consensus checks of
// the header, the `blobInput` is implicitly accepted by consensus as well.
```
l1-contracts/src/core/Rollup.sol (outdated)
```solidity
bytes32[7] calldata _args,
bytes32[] calldata _fees,
bytes calldata _aggregationObject
bytes calldata _blobPublicInputsAndAggregationObject // having separate inputs here caused stack too deep
```
As earlier, a struct can be used to work around this while keeping the readability.
```solidity
  bytes32(_blobPublicInputsAndAggregationObject[blobOffset:blobOffset += 32])
);
// To fit into 2 fields, the commitment is split into 31 and 17 byte numbers
// TODO: The below left pads, possibly inefficiently
```
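The 31/17-byte split mentioned in the snippet above exists because a 48-byte BLS12-381 KZG commitment cannot fit in one ~254-bit BN254 field element, so it is cut into a 31-byte and a 17-byte limb, each of which does fit. A minimal sketch (illustrative names, not the contract's actual helpers):

```typescript
// Split a 48-byte KZG commitment into two field-sized big-endian limbs:
// the first 31 bytes and the remaining 17 bytes.
function splitCommitment(commitment: Uint8Array): [bigint, bigint] {
  if (commitment.length !== 48) throw new Error("expected 48-byte commitment");
  const toBigInt = (bytes: Uint8Array): bigint =>
    bytes.reduce((acc, b) => (acc << 8n) | BigInt(b), 0n);
  const hi = toBigInt(commitment.slice(0, 31)); // first 31 bytes
  const lo = toBigInt(commitment.slice(31));    // remaining 17 bytes
  return [hi, lo];
}
```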
l1-contracts/src/core/Rollup.sol (outdated)
```solidity
_signatures: _signatures,
_digest: digest,
_currentTime: Timestamp.wrap(block.timestamp),
_blobHash: blobsHash,
```
The `_blobHash` should probably also be `blobsHash` (multiple).
l1-contracts/src/core/Rollup.sol (outdated)
```solidity
bytes32 _digest,
Timestamp _currentTime,
bytes32 _txEffectsHash,
bytes32 _blobHash,
```
Same down here; there is a bunch of `blobHash` that should probably be `blobsHash`.
LHerskind left a comment:
Let's get it merged before the fees changes go in.
```solidity
 * @param _args - Array of public inputs to the proof (previousArchive, endArchive, previousBlockHash, endBlockHash, endTimestamp, outHash, proverId)
 * @param _fees - Array of recipient-value pairs with fees to be distributed for the epoch
 * @param _blobPublicInputsAndAggregationObject - The aggregation object and blob PIs for the proof
 * @param _submitArgs - Struct for constructing PIs which has:
```
What is PIs? Public Inputs? Personal Information?
```solidity
 * @param _digest - The digest to validate
 * @param _currentTime - The current time
 * @param _blobHash - The EVM blob hash for this block
 * @param _blobsHash - The EVM blob hash for this block
```
"The EVM blob hash for this block" is not correct anymore.
```solidity
 * @param _args - Array of public inputs to the proof (previousArchive, endArchive, previousBlockHash, endBlockHash, endTimestamp, outHash, proverId)
 * @param _fees - Array of recipient-value pairs with fees to be distributed for the epoch
 * @param _blobPublicInputsAndAggregationObject - The aggregation object and blob PIs for the proof
 * @param _submitArgs - Struct for constructing PIs which has:
```
Another PIs; I think it is public inputs, but it would be easier to follow if written out.
```solidity
uint256 endBlockNumber = previousBlockNumber + _epochSize;

uint256 endBlockNumber = previousBlockNumber + _submitArgs.epochSize;
bytes32[7] memory args = _submitArgs.args;
```
This reverts commit 03b7e0e.
🤖 I have created a release *beep* *boop*

<details><summary>aztec-package: 0.65.0</summary>

## [0.65.0](aztec-package-v0.64.0...aztec-package-v0.65.0) (2024-11-26)

### Features

* **avm:** New public inputs witgen ([#10179](#10179)) ([ac8f13e](ac8f13e))
</details>

<details><summary>barretenberg.js: 0.65.0</summary>

## [0.65.0](barretenberg.js-v0.64.0...barretenberg.js-v0.65.0) (2024-11-26)

### Bug Fixes

* **bb.js:** Don't minify bb.js - webpack config ([#10170](#10170)) ([6e7fae7](6e7fae7))
</details>

<details><summary>aztec-packages: 0.65.0</summary>

## [0.65.0](aztec-packages-v0.64.0...aztec-packages-v0.65.0) (2024-11-26)

### ⚠ BREAKING CHANGES

* remove SharedImmutable ([#10183](#10183))
* rename sharedimmutable methods ([#10164](#10164))

### Features

* **avm:** New public inputs witgen ([#10179](#10179)) ([ac8f13e](ac8f13e))
* Blobs. ([#9302](#9302)) ([03b7e0e](03b7e0e))
* One liner for nodes to join rough-rhino ([#10168](#10168)) ([3a425e9](3a425e9))
* Origin tags implemented in biggroup ([#10002](#10002)) ([c8696b1](c8696b1))
* Remove SharedImmutable ([#10183](#10183)) ([a9f3b5f](a9f3b5f))
* Rename sharedimmutable methods ([#10164](#10164)) ([ef7cd86](ef7cd86))
* UltraRollupRecursiveFlavor ([#10088](#10088)) ([4418ef2](4418ef2))

### Bug Fixes

* Aztec-nargo curl in the earthfile also ([#10199](#10199)) ([985a678](985a678))
* **bb.js:** Don't minify bb.js - webpack config ([#10170](#10170)) ([6e7fae7](6e7fae7))
* Docker compose aztec up fix ([#10197](#10197)) ([d7ae959](d7ae959))
* Increase test timeouts ([#10205](#10205)) ([195aa3d](195aa3d))
* Release l1-contracts ([#10095](#10095)) ([29f0d7a](29f0d7a))
* Revert "feat: blobs. ([#9302](#9302))" ([#10187](#10187)) ([a415f65](a415f65))

### Miscellaneous

* Added ref to env variables ([#10193](#10193)) ([b51fc43](b51fc43))
* **avm:** Operands reordering ([#10182](#10182)) ([69bdf4f](69bdf4f)), closes [#10136](#10136)
* Fix devbox ([#10201](#10201)) ([323eaee](323eaee))
* Misc cleanup ([#10194](#10194)) ([dd01417](dd01417))
* Reinstate docs-preview, fix doc publish ([#10213](#10213)) ([ed9a0e3](ed9a0e3))
* Replace relative paths to noir-protocol-circuits ([1650446](1650446))
</details>

<details><summary>barretenberg: 0.65.0</summary>

## [0.65.0](barretenberg-v0.64.0...barretenberg-v0.65.0) (2024-11-26)

### Features

* **avm:** New public inputs witgen ([#10179](#10179)) ([ac8f13e](ac8f13e))
* Origin tags implemented in biggroup ([#10002](#10002)) ([c8696b1](c8696b1))
* UltraRollupRecursiveFlavor ([#10088](#10088)) ([4418ef2](4418ef2))

### Miscellaneous

* **avm:** Operands reordering ([#10182](#10182)) ([69bdf4f](69bdf4f)), closes [#10136](#10136)
</details>

This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please).
Electric boogaloo (cont. of #9302)

This reverts commit a415f65.

Co-authored-by: Tom French <[email protected]>
The Blobbening
It's happening and I can only apologise.

Follows #8955.
Intro
More detailed stuff below, but the major changes are:
* Tx effects are now accumulated in a blob sponge (`SpongeBlob`) instead of the old `sha256` `txs_effects_hash`

Major Issues
Things that we should resolve before merging:
Run times massively increased:
* The `nr` code for blobs is written with the BigNum lib, which uses a lot of unconstrained code then a small amount of constrained code to verify results. Unfortunately this means we cannot simulate circuits containing blobs (currently `block-root`) using wasm, or set `nr` tests to `unconstrained`, because that causes a `stack overflow` in brillig.
* …`unconstrained` (meaning `rollup-lib` tests take 10mins or so to run), and I've forced circuit simulation to run in native ACVM rather than wasm (adding around 1min to any tests that simulate `block-root`).
* …`nr` code would only cause more runtime issues.

Data retrieval
The below will be done in feat(blobs): Integrate beacon chain client/web2 blob getter #9101, and for now we use calldata just to keep the archiver working:
* …`data-retrieval` to use.

Blob verification precompile gas
Batching blob KZG proofs is being thought about (see Epic: Blobs #8955 for progression):
* …`nr`, so we can call the precompile once per epoch rather than 3 times per block.

General TODOs
Things I'm working on:
Description
The general maths in `nr` and replicated across `foundation/blob` is described here.

Old DA Flow
From the base rollup to L1, the previous flow for publishing DA was:
Nr:
* In each `base` rollup, take in all tx effects we wish to publish and `sha256` hash them to a single value: `tx_effects_hash`
* Each `tx_effects_hash` is passed up to the next `merge` (or `block-root`) circuit
* Each `merge` or `block-root` circuit simply `sha256` hashes each 'child' `tx_effects_hash` from its left and right inputs
* By the `block-root`, we have one value: `txs_effects_hash`, which becomes part of the header's content commitment
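The pairwise accumulation just described can be sketched as follows. This is an illustrative TypeScript model, not the circuit code, and it assumes a power-of-two number of leaves; the real "wonky tree" handles arbitrary leaf counts.

```typescript
import { createHash } from "node:crypto";

// sha256 helper over raw bytes
const sha256 = (data: Uint8Array): Uint8Array =>
  new Uint8Array(createHash("sha256").update(data).digest());

// Hash child tx_effects_hash values pairwise, level by level, until a single
// txs_effects_hash root remains (parent = sha256(left ‖ right)).
function accumulateTxEffectsHashes(leaves: Uint8Array[]): Uint8Array {
  let level = leaves;
  while (level.length > 1) {
    const next: Uint8Array[] = [];
    for (let i = 0; i < level.length; i += 2) {
      const pair = new Uint8Array(64);
      pair.set(level[i], 0);
      pair.set(level[i + 1], 32);
      next.push(sha256(pair));
    }
    level = next;
  }
  return level[0];
}
```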
* The `txs_effects_hash` is checked and propagated through the orchestrator and becomes part of the ts class `L2Block` in the header
* The tx effects themselves form the `L2Block`'s `.body`
* The `publisher` sends the serialised block `body` and `header` to the L1 block `propose` function
* In `propose`, we decode the block `body` and `header`
* The `body` is deconstructed per tx into its tx effects, which are then hashed using `sha256`, until we have N `tx_effects_hash`es (mimicking the calculation in the `base` rollup)
* Each `tx_effects_hash` is then input as a leaf to a wonky tree and hashed up to the root (mimicking the calculation from `base` to `block-root`), forming the final `txs_effects_hash`

*NB: With batch rollups, I've lost touch with what currently happens at verification and how we ensure the `txs_effects_hash` matches the one calculated in the rollup, so this might not be accurate.

New DA Flow
The new flow for publishing DA is:
Nr:
* In each `base` rollup, we treat tx effects as we treat `PartialStateReference`s - injecting a hint to the `start` and `end` state we expect from processing this `base`'s transactions. We take the tx effects and `absorb` them into the given `start` `SpongeBlob` state. We then check the result is the same as the given `end` state
* As with `PartialStateReference`s, each `merge` or `block-root` checks that the left input's `end` blob state is equal to the right input's `start` blob state
* In the `block-root`, we check the above and that the left's `start` blob state was empty. Now we have a sponge which has absorbed, as a flat array, all the tx effects in the block we wish to publish
* We squeeze this sponge to a single hash of all the effects (as would be computed in `base`)
* We compute the challenge `z` by hashing this ^ hash with the blob commitment
* We evaluate the blob at `z` using the flat array of effects in the barycentric formula (more details on the engineering design link above), to return `y`
* The `block-root` adds this triple (`z`, `y`, and commitment `C`) to a new array of `BlobPublicInputs`
* Like `fees`, each `block-merge` and `root` merges the left and right input arrays, so we end up with an array of each block's blob info

*NB: this will likely change to accumulating to a single set of values, rather than one per block, and is being worked on by Mike. The above also describes what happens for one blob per block for simplicity (it will actually be 3).
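The barycentric evaluation mentioned above lets the circuit compute `y = p(z)` directly from the blob's evaluations `d_i` at the roots of unity `ω^i`, without recovering coefficients: `p(z) = (z^N − 1)/N · Σ_i d_i·ω^i/(z − ω^i)`. A toy sketch, assuming a small prime field (p = 257, N = 4) purely for illustration — the real circuits work over the BLS12-381 scalar field with 4096-point blobs and the BigNum lib:

```typescript
// Toy parameters (assumptions for illustration only).
const P = 257n;     // toy prime field
const N = 4n;       // number of evaluation points
const OMEGA = 16n;  // primitive 4th root of unity mod 257 (16^2 = -1 mod 257)

const mod = (a: bigint): bigint => ((a % P) + P) % P;
const modpow = (b: bigint, e: bigint): bigint => {
  let r = 1n;
  b = mod(b);
  for (; e > 0n; e >>= 1n, b = mod(b * b)) if (e & 1n) r = mod(r * b);
  return r;
};
const inv = (a: bigint): bigint => modpow(a, P - 2n); // Fermat inverse

// p(z) = (z^N - 1)/N * sum_i d_i * w^i / (z - w^i), for z not a root of unity.
function barycentricEval(evals: bigint[], z: bigint): bigint {
  let sum = 0n;
  for (let i = 0n; i < N; i++) {
    const wi = modpow(OMEGA, i);
    sum = mod(sum + mod(evals[Number(i)] * wi) * inv(mod(z - wi)));
  }
  return mod(mod(modpow(z, N) - 1n) * inv(N) * sum);
}
```

The design choice here mirrors why blobs use evaluation form: the formula needs only the flat array of effects (the evaluations), one exponentiation, and N inversions, rather than an interpolation back to coefficients.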
Ts:
* The `BlobPublicInputs` are checked against the ts calculated blob for each block in the orchestrator
* These form the `blobInput` (plus the expected L1 `blobHash` and a ts generated KZG proof) sent to L1 to the `propose` function
* The `propose` transaction is now a special 'blob transaction' where all the tx effects (the same flat array as dealt with in the rollup) are sent as a sidecar
* We still send the serialised `body`, so the archiver can still read the data back until feat(blobs): Integrate beacon chain client/web2 blob getter #9101

*NB: this will change once we can read the blobs themselves from the beacon chain/some web2 client.
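For reference, the expected L1 `blobHash` can be derived on the ts side from the KZG commitment alone: per EIP-4844, a blob's versioned hash is `sha256(commitment)` with the first byte replaced by the version tag `0x01`. A minimal sketch (the function name is illustrative):

```typescript
import { createHash } from "node:crypto";

// EIP-4844 versioned hash: sha256(48-byte KZG commitment) with byte 0
// overwritten by VERSIONED_HASH_VERSION_KZG (0x01).
function computeVersionedBlobHash(commitment: Uint8Array): Uint8Array {
  if (commitment.length !== 48) throw new Error("expected 48-byte KZG commitment");
  const hash = new Uint8Array(createHash("sha256").update(commitment).digest());
  hash[0] = 0x01; // VERSIONED_HASH_VERSION_KZG
  return hash;
}
```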
Sol:
* In `propose`, instead of recalculating the `txs_effects_hash`, we send the `blobInput` to a new `validateBlob` function. This function:
  * Gets the `blobHash` from the EVM and checks it against the one in `blobInput`
  * Calls the point evaluation precompile to check that `z`, `y`, and `C` indeed correspond to the blob we claim
* We have now verified the `blobInput`, but still need to link this to our rollup circuit:
  * Each block's `BlobPublicInputs` is extracted from the bytes array and stored against its block number
  * When the `root` proof is verified, we reconstruct the array of `BlobPublicInputs` from the above stored values and use them in proof verification
  * If any of the `BlobPublicInputs` are incorrect (equivalently, if any of the published blobs were incorrect), the proof verification will fail
* The `blobHash` is added to `BlockLog` once it has been verified by the precompile

*NB: As above, we will eventually call the precompile just once for many blobs with one set of `BlobPublicInputs`. This will still be used in verifying the `root` proof to ensure the tx effects match those from each `base`.
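The point evaluation precompile called above lives at address `0x0a` and, per EIP-4844, takes a fixed 192-byte input: `versioned_hash ‖ z ‖ y ‖ commitment ‖ proof`. A sketch of assembling that input on the ts side (illustrative helper, not the contract's actual encoding code):

```typescript
// Build the 192-byte EIP-4844 point evaluation precompile input:
// versioned_hash (32) ‖ z (32) ‖ y (32) ‖ commitment (48) ‖ proof (48).
// The precompile verifies the KZG proof that the blob behind `commitment`
// evaluates to y at z.
function encodePointEvaluationInput(
  versionedHash: Uint8Array,
  z: Uint8Array,
  y: Uint8Array,
  commitment: Uint8Array,
  proof: Uint8Array,
): Uint8Array {
  const parts = [versionedHash, z, y, commitment, proof];
  const lengths = [32, 32, 32, 48, 48];
  const input = new Uint8Array(192);
  let offset = 0;
  parts.forEach((part, i) => {
    if (part.length !== lengths[i]) throw new Error(`part ${i}: expected ${lengths[i]} bytes`);
    input.set(part, offset);
    offset += part.length;
  });
  return input;
}
```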