DO NOT MERGE: Output validator plus across swap (for updated types only) #1649
Conversation
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
…racts into output-validator-lf-15125
…across-swap-facet
Walkthrough
This PR introduces comprehensive support for Across Protocol v4 swap bridging, including a new AcrossV4SwapFacet with multi-path routing (SpokePool, SpokePoolPeriphery, SponsoredOFT, SponsoredCCTP), an OutputValidator periphery contract, deployment infrastructure across networks, extensive test coverage, and updated backend signature handling for NEARIntentsFacet.
Changes
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~50 minutes
Possibly related PRs
Suggested labels
🚥 Pre-merge checks | ❌ Failed checks (3 warnings)
// If this is the positive-slippage path, adjust the amounts after decoding but before validation.
if (_preSwapAmount != 0) {
    // Derive output/input ratio from the quote calldata itself (includes fees + decimal differences).
    uint256 outputAmountMultiplier = (params.outputAmount *
Check warning
Code scanning / Olympix Integrated Security
Performing integer division before multiplication can lead to unnecessary loss of precision. Please note: this detector flags the division, the associated multiplication may be remote! Check function and variable chains to confirm. For more information, visit: http://detectors.olympixdevsectools.com/article/web3-vulnerability/faulty-div Medium
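The flagged pattern is easy to reproduce with hypothetical bigint amounts (the names and values below are illustrative, not taken from AcrossV4SwapFacet): dividing before multiplying truncates the ratio before the multiplication can recover it.

```typescript
// Illustrative values only; these do not come from the facet under review.
const outputAmount = 997n // quote output
const inputAmount = 1000n // quote input
const preSwapAmount = 123456n

// Divide first: 997n / 1000n truncates to 0n, so the result collapses to 0.
const divideFirst = (outputAmount / inputAmount) * preSwapAmount

// Multiply first: full precision is kept until the single final division.
const multiplyFirst = (outputAmount * preSwapAmount) / inputAmount

console.log(divideFirst, multiplyFirst) // 0n 123085n
```

Keeping the division as the last operation is the usual remedy the detector is pointing at.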
// If this is the positive-slippage path, adjust the amounts after decoding but before validation.
if (_preSwapAmount != 0) {
    uint256 outputAmountMultiplier = (swapAndDepositData
Check warning
Code scanning / Olympix Integrated Security
Performing integer division before multiplication can lead to unnecessary loss of precision. Please note: this detector flags the division, the associated multiplication may be remote! Check function and variable chains to confirm. For more information, visit: http://detectors.olympixdevsectools.com/article/web3-vulnerability/faulty-div Medium
/// @param _bridgeData The core information needed for bridging
/// @param _acrossV4SwapFacetData Data specific to Across V4 Swap API (enum + calldata)
/// @param _preSwapAmount If non-zero, indicates the pre-swap amount for positive-slippage handling (0 disables it)
function _startBridge(
Check notice
Code scanning / Olympix Integrated Security
Reentrant functions which emit events after making an external call may lead to out-of-order events. For more information, visit: http://detectors.olympixdevsectools.com/article/web3-vulnerability/reentrancy-events Low
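The out-of-order-events finding can be sketched with a toy model (plain TypeScript, not the actual facet): when an "event" is recorded only after an external call, a reentrant callee's event lands in the log before the outer one.

```typescript
// Toy model of the reentrancy-events finding; names are illustrative.
const log: string[] = []
let depth = 0

// Emits its event only AFTER making the "external call".
const bridge = (onExternalCall: () => void) => {
  depth += 1
  const id = depth
  onExternalCall() // external call happens first...
  log.push(`BridgeStarted#${id}`) // ...event is emitted afterwards
}

// A reentrant callee starts a nested bridge before the outer event fires.
bridge(() => bridge(() => {}))

console.log(log) // ['BridgeStarted#2', 'BridgeStarted#1'] — inner-first
```

Off-chain indexers that assume emission order matches call order can be confused by exactly this inversion, which is why the detector flags events emitted after external calls.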
bool isNative = LibAsset.isNativeAsset(_bridgeData.sendingAssetId);
if (!isNative) {
    address decodedInputToken = address(
        uint160(uint256(params.inputToken))
Check notice
Code scanning / Olympix Integrated Security
Performing a narrowing downcast may result in silent overflow due to bit truncation. For more information, visit: http://detectors.olympixdevsectools.com/article/web3-vulnerability/unsafe-downcast Low
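The downcast warning can be mimicked with bigint masking (the value below is illustrative): `address(uint160(uint256(x)))` keeps only the low 20 bytes, so a bytes32 with non-zero upper bytes decodes to a plausible-looking address instead of failing.

```typescript
// A bytes32 whose upper 12 bytes are NOT zero padding (illustrative value).
const inputToken =
  (0xdeadbeefn << 160n) | 0x1111111111111111111111111111111111111111n

const UINT160_MASK = (1n << 160n) - 1n
// Mimics Solidity's address(uint160(uint256(x))) truncation.
const decoded = inputToken & UINT160_MASK

// The 0xdeadbeef prefix is silently discarded.
console.log(decoded === inputToken) // false
```

Validating that the upper 12 bytes are zero before the cast is the usual way to make such truncation loud instead of silent.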
_validateAmount(quote.amount, _bridgeData.minAmount);

// Validate burnToken matches bridgeData asset (native is already excluded by msg.value check above)
address burnToken = address(uint160(uint256(quote.burnToken)));
Check notice
Code scanning / Olympix Integrated Security
Performing a narrowing downcast may result in silent overflow due to bit truncation. For more information, visit: http://detectors.olympixdevsectools.com/article/web3-vulnerability/unsafe-downcast Low
Test Coverage Report: Line Coverage: 87.44% (3168 / 3623 lines)
Actionable comments posted: 6
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
deployments/arbitrum.diamond.staging.json (1)
208-223: ⚠️ Potential issue | 🟡 Minor — Duplicate facet key detected.
The address `0xC31f575546A12Bc9dEf107CE19a242cd351BaDe2` appears twice in the Facets map (lines 208-211 and 220-223), both with `AcrossV4SwapFacet`. JSON parsers will silently keep only the last occurrence, which could lead to unexpected behavior in deployment tooling. Remove one of the duplicate entries.
🔧 Suggested fix
Remove lines 220-223:
   "0xBdE21104479e1373a20F487D3f6ffDED62574FA7": {
     "Name": "PolymerCCTPFacet",
     "Version": "2.0.0"
-  },
-  "0xC31f575546A12Bc9dEf107CE19a242cd351BaDe2": {
-    "Name": "AcrossV4SwapFacet",
-    "Version": "1.0.0"
   }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@deployments/arbitrum.diamond.staging.json` around lines 208 - 223, The Facets map contains a duplicate key "0xC31f575546A12Bc9dEf107CE19a242cd351BaDe2" for AcrossV4SwapFacet; remove the redundant object for that address so the Facets map only contains a single entry for "0xC31f575546A12Bc9dEf107CE19a242cd351BaDe2" (keep one AcrossV4SwapFacet record and delete the other) to avoid silent overrides by the JSON parser.
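The parser behavior this comment warns about is easy to reproduce (the differing `Version` values below are hypothetical, purely to make the override visible): `JSON.parse` keeps only the last occurrence of a duplicate key.

```typescript
// Duplicate key: JSON.parse silently keeps the LAST occurrence.
const raw = `{
  "0xC31f575546A12Bc9dEf107CE19a242cd351BaDe2": { "Name": "AcrossV4SwapFacet", "Version": "1.0.0" },
  "0xC31f575546A12Bc9dEf107CE19a242cd351BaDe2": { "Name": "AcrossV4SwapFacet", "Version": "9.9.9" }
}`
const facets = JSON.parse(raw)

console.log(Object.keys(facets).length) // 1
console.log(facets['0xC31f575546A12Bc9dEf107CE19a242cd351BaDe2'].Version) // "9.9.9"
```

Because no error is raised, the duplicate would survive code review and CI unless a linter checks for it — hence the suggestion to delete one entry at the source.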
🧹 Nitpick comments (6)
script/deploy/zksync/utils/ScriptBase.sol (1)
98-98: Keep new parameter naming consistent with repo convention.
Line 98 should use an underscore-prefixed parameter name to match the Solidity naming rule used across scripts/contracts.
As per coding guidelines: "Function parameters must use leading underscore prefix (e.g., `_amount`)."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@script/deploy/zksync/utils/ScriptBase.sol` at line 98, The parameter name allowZeroAddress violates the repo's leading-underscore convention; rename the parameter to _allowZeroAddress (update the function signature where allowZeroAddress appears in ScriptBase.sol) and replace all references inside the function body and any callers to use _allowZeroAddress so naming is consistent with the rule used across contracts.
docs/AcrossV4SwapFacet.md (1)
66-72: Minor documentation correction: SwapData is an array.
The text says "a `SwapData _swapData` parameter" but the actual signature shows `LibSwap.SwapData[] calldata _swapData` (an array).
Suggested fix
 ## Swap Data
-Some methods accept a `SwapData _swapData` parameter.
+Some methods accept a `SwapData[] _swapData` parameter.
-Swapping is performed by a swap-specific library that expects an array of calldata that can be run on various DEXs (i.e. Uniswap) to make one or multiple swaps before performing another action.
+Swapping is performed by a swap-specific library that expects an array of swap instructions that can be run on various DEXs (i.e. Uniswap) to make one or multiple swaps before performing another action.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@docs/AcrossV4SwapFacet.md` around lines 66 - 72, Update the docs to correctly state that SwapData is passed as an array: change references that say "a `SwapData _swapData` parameter" to indicate the parameter is `LibSwap.SwapData[] calldata _swapData` (an array of SwapData) and adjust surrounding wording to reflect multiple swap entries where applicable (refer to SwapData and LibSwap.SwapData[] calldata _swapData in the function signatures).
src/Interfaces/ISpokePoolPeriphery.sol (1)
4-7: Add NatSpec reference to the upstream Across Protocol contract.As per coding guidelines for interfaces that shadow external protocols: add a short NatSpec note with a link or identifier referencing the original Across Protocol SpokePoolPeriphery contract.
Suggested addition
 /// @title ISpokePoolPeriphery
 /// @notice Interface for interacting with Across Protocol SpokePoolPeriphery
+/// @dev Based on Across Protocol's SpokePoolPeriphery contract
+///      https://github.com/across-protocol/contracts/blob/master/contracts/SpokePoolPeriphery.sol
 /// @author LI.FI (https://li.fi)
 /// @custom:version 1.0.0
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/Interfaces/ISpokePoolPeriphery.sol` around lines 4 - 7, Add a NatSpec reference to the upstream Across Protocol SpokePoolPeriphery contract at the top of the ISpokePoolPeriphery interface: update the contract header comments in ISpokePoolPeriphery.sol to include a short @notice/@dev or @custom:see tag with the original contract name and a URL or identifier pointing to the upstream Across Protocol SpokePoolPeriphery (e.g., repository or Etherscan address), so readers can trace this interface back to the source implementation.
script/deploy/facets/DeployOutputValidator.s.sol (1)
1-39: Move this deploy script out of `script/deploy/facets/`.
`OutputValidator` is not a facet, so moving its deploy script to the non-facet deploy path will keep structure consistent and easier to navigate.
As per coding guidelines: "Place deployment scripts in `script/deploy/facets/` for facet deployments or `script/deploy/` for other contracts."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@script/deploy/facets/DeployOutputValidator.s.sol` around lines 1 - 39, Move this deploy script out of the facets folder: relocate the DeployScript contract file (containing DeployScript, run(), getConstructorArgs(), and the OutputValidator deployment logic) from script/deploy/facets/ to script/deploy/ so it lives with non-facet deployment scripts; update any import or relative root path usage if necessary and ensure the contract name DeployScript and its getConstructorArgs() still compile and reference the correct config path and OutputValidator type after moving.
test/solidity/Facets/Across/SwapAPI/AcrossV4SwapFacet.SponsoredRefunds.t.sol (1)
38-38: Prefer `TestBaseFacet` for this facet suite.
This is a facet test, but it inherits from `TestBase`, so it bypasses the repo's standard facet-test hooks/overrides. Using `TestBaseFacet` keeps the setup aligned with the other facet suites.
As per coding guidelines: "For facet tests, inherit from `TestBaseFacet` rather than `TestBase`, since standard facet functions need to be overridden".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/solidity/Facets/Across/SwapAPI/AcrossV4SwapFacet.SponsoredRefunds.t.sol` at line 38, The test contract AcrossV4SwapFacetSponsoredRefundsTest currently inherits from TestBase but should inherit from TestBaseFacet to ensure facet-specific hooks/overrides run; update the contract declaration to extend TestBaseFacet (replace TestBase with TestBaseFacet) and ensure any required facet setup/overrides provided by TestBaseFacet are implemented/adjusted in this contract (preserve the contract name AcrossV4SwapFacetSponsoredRefundsTest and its existing setup logic).
test/solidity/Periphery/OutputValidator.t.sol (1)
30-34: Unused event declaration.
The `TokensWithdrawn` event is declared but never emitted in any test. If it's not needed, consider removing it to reduce noise. If it's intended for future use or mirroring the contract's event, adding a comment would clarify.
Proposed fix
- event TokensWithdrawn(
-     address assetId,
-     address payable receiver,
-     uint256 amount
- );
+ // Note: TokensWithdrawn event from OutputValidator can be checked with vm.expectEmit if needed
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/solidity/Periphery/OutputValidator.t.sol` around lines 30 - 34, The declared event TokensWithdrawn is never emitted or asserted in the tests; either remove the TokensWithdrawn event declaration to eliminate dead code, or if it’s meant to mirror the production contract or be used later, add a one-line comment above TokensWithdrawn explaining its purpose (e.g., "kept for parity with contract / future tests") or update the relevant test(s) to emit/assert the event (use TokensWithdrawn in the test assertions). Ensure the chosen change is limited to the test file where TokensWithdrawn is declared.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@script/demoScripts/demoAcrossV4Swap.ts`:
- Around line 149-153: The demo forces every bytes32 finalRecipient through
bytes32ToEvmAddress which rejects non-EVM destinations; change the logic to stop
unconditionally decoding bytes32 into an EVM address: remove or guard uses of
bytes32ToEvmAddress when constructing finalRecipient (and the similar spots
referenced around the other uses) and instead pass the raw bytes32 value through
for non-EVM paths or only decode when you know the recipient is an EVM address;
update any validation to accept a 0x-prefixed 66-byte bytes32 as-is and only
call getAddress/bytes32ToEvmAddress when a clear EVM target is required. Ensure
bytes32ToEvmAddress remains available for explicit EVM-only conversions but do
not use it as the default for sponsored finalRecipient handling.
- Around line 683-691: The send path funds and approves the wrong assets for
swapAndStart: when ENTRYPOINT === 'swapAndStart' use the swapData array entries
(ISwapData) to determine deposits and msg.value for swaps with
requiresDeposit=true instead of always using
bridgeData.sendingAssetId/minAmount; update the logic around swapData parsing
and the send/deposit calculations (references: ENTRYPOINT, swapData, ISwapData,
requiresDeposit, bridgeData.sendingAssetId, bridgeData.minAmount, msgValueWei)
so that for each swap-and-bridge flow you (1) add required native ETH value
equal to the swap input amount when the swap input is native, (2) approve and
fund ERC20 inputs based on swapData.fromAsset/ fromAmount for entries with
requiresDeposit=true, and (3) preserve existing bridge funding logic for bridge
output tokens (bridgeData.isNative refers to bridge output type) so
native-output bridges still receive msgValue for the bridge while swap inputs
are separately funded from swapData.
In `@script/helperFunctions.sh`:
- Around line 4956-4994: The merge step can overwrite existing diamond sections
with empty results if a background collector failed; before building MERGED (the
jq merge using DIAMOND_NAME, FACETS_TMP, PERIPHERY_TMP), validate the temp files
(FACETS_TMP and PERIPHERY_TMP) are present and contain valid, non-empty JSON
(use jq -e '.' and check content is not an empty object/marker) and if either
check fails, abort the refresh: log a warning/error, return non-zero and do not
write to DIAMOND_FILE (avoid falling back to creating empty Facets/Periphery);
update the block that creates MERGED and the failure branch that currently falls
back to jq -n so it instead bails out when temp validation fails.
In `@src/Interfaces/ISponsoredCCTPSrcPeriphery.sol`:
- Around line 11-38: The SponsoredCCTPQuote struct in this file is
ABI-incompatible with upstream; update the struct definition (used by
depositForBurn) to include the two missing fields in the exact order and types
used upstream by inserting uint32 destinationDex and uint8 accountCreationMode
immediately after finalToken (i.e., before executionMode and actionData) so the
field ordering matches Across' SponsoredCCTPInterface.SponsoredCCTPQuote
exactly; ensure any local code that constructs or reads this struct uses the new
layout.
In `@test/solidity/Facets/Across/SwapAPI/AcrossV4SwapFacet.t.sol`:
- Around line 1178-1185: The test wrongly uses
vm.expectRevert(InvalidCallData.selector) before a low-level (bool success, ) =
address(standaloneFacet).call(encoded); because vm.expectRevert cannot catch
reverts from low-level .call(); either remove the misleading
vm.expectRevert(InvalidCallData.selector) line and keep the assertFalse(success,
...) check, or replace the low-level .call with a direct/ABI-invoked call to the
facet method so vm.expectRevert(InvalidCallData.selector) can intercept the
revert; adjust references to encoded, standaloneFacet.call, vm.expectRevert, and
assertFalse accordingly.
In `@test/solidity/Periphery/OutputValidator.t.sol`:
- Around line 158-190: The test function
test_validateOutputERC20LessThanExpected currently claims "should revert" but
asserts behavior for a no-revert path; update it to match intended behavior: if
OutputValidator.validateERC20Output is supposed to silently do nothing when
actual < expected, rename the test to something like
test_validateOutputERC20LessThanExpected_NoExcessTransferred, remove the "should
revert" comment and keep the current assertions; otherwise, if it should revert,
rename to a revert-style name (e.g.,
testRevert_validateOutputERC20LessThanExpected), add vm.expectRevert before
calling OutputValidator.validateERC20Output and adjust assertions accordingly;
ensure references to the test name and the call to validateERC20Output are
updated to keep semantics clear.
---
Outside diff comments:
In `@deployments/arbitrum.diamond.staging.json`:
- Around line 208-223: The Facets map contains a duplicate key
"0xC31f575546A12Bc9dEf107CE19a242cd351BaDe2" for AcrossV4SwapFacet; remove the
redundant object for that address so the Facets map only contains a single entry
for "0xC31f575546A12Bc9dEf107CE19a242cd351BaDe2" (keep one AcrossV4SwapFacet
record and delete the other) to avoid silent overrides by the JSON parser.
---
Nitpick comments:
In `@docs/AcrossV4SwapFacet.md`:
- Around line 66-72: Update the docs to correctly state that SwapData is passed
as an array: change references that say "a `SwapData _swapData` parameter" to
indicate the parameter is `LibSwap.SwapData[] calldata _swapData` (an array of
SwapData) and adjust surrounding wording to reflect multiple swap entries where
applicable (refer to SwapData and LibSwap.SwapData[] calldata _swapData in the
function signatures).
In `@script/deploy/facets/DeployOutputValidator.s.sol`:
- Around line 1-39: Move this deploy script out of the facets folder: relocate
the DeployScript contract file (containing DeployScript, run(),
getConstructorArgs(), and the OutputValidator deployment logic) from
script/deploy/facets/ to script/deploy/ so it lives with non-facet deployment
scripts; update any import or relative root path usage if necessary and ensure
the contract name DeployScript and its getConstructorArgs() still compile and
reference the correct config path and OutputValidator type after moving.
In `@script/deploy/zksync/utils/ScriptBase.sol`:
- Line 98: The parameter name allowZeroAddress violates the repo's
leading-underscore convention; rename the parameter to _allowZeroAddress (update
the function signature where allowZeroAddress appears in ScriptBase.sol) and
replace all references inside the function body and any callers to use
_allowZeroAddress so naming is consistent with the rule used across contracts.
In `@src/Interfaces/ISpokePoolPeriphery.sol`:
- Around line 4-7: Add a NatSpec reference to the upstream Across Protocol
SpokePoolPeriphery contract at the top of the ISpokePoolPeriphery interface:
update the contract header comments in ISpokePoolPeriphery.sol to include a
short @notice/@dev or @custom:see tag with the original contract name and a URL
or identifier pointing to the upstream Across Protocol SpokePoolPeriphery (e.g.,
repository or Etherscan address), so readers can trace this interface back to
the source implementation.
In
`@test/solidity/Facets/Across/SwapAPI/AcrossV4SwapFacet.SponsoredRefunds.t.sol`:
- Line 38: The test contract AcrossV4SwapFacetSponsoredRefundsTest currently
inherits from TestBase but should inherit from TestBaseFacet to ensure
facet-specific hooks/overrides run; update the contract declaration to extend
TestBaseFacet (replace TestBase with TestBaseFacet) and ensure any required
facet setup/overrides provided by TestBaseFacet are implemented/adjusted in this
contract (preserve the contract name AcrossV4SwapFacetSponsoredRefundsTest and
its existing setup logic).
In `@test/solidity/Periphery/OutputValidator.t.sol`:
- Around line 30-34: The declared event TokensWithdrawn is never emitted or
asserted in the tests; either remove the TokensWithdrawn event declaration to
eliminate dead code, or if it’s meant to mirror the production contract or be
used later, add a one-line comment above TokensWithdrawn explaining its purpose
(e.g., "kept for parity with contract / future tests") or update the relevant
test(s) to emit/assert the event (use TokensWithdrawn in the test assertions).
Ensure the chosen change is limited to the test file where TokensWithdrawn is
declared.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository UI
Review profile: CHILL
Plan: Pro
Run ID: 62ea22d5-1126-4d14-92fd-030225b649ef
📒 Files selected for processing (41)
- .cursor/rules/100-solidity-basics.mdc
- .cursor/rules/102-facets.mdc
- .gitignore
- config/acrossV4Swap.json
- config/global.json
- config/nearintents.json
- deployments/_deployments_log_file.json
- deployments/arbitrum.diamond.staging.json
- deployments/arbitrum.staging.json
- deployments/optimism.diamond.staging.json
- deployments/optimism.staging.json
- docs/AcrossV4SwapFacet.md
- docs/OutputValidator.md
- script/demoScripts/demoAcrossV4Swap.ts
- script/deploy/deploySingleContract.sh
- script/deploy/facets/DeployAcrossV4SwapFacet.s.sol
- script/deploy/facets/DeployNEARIntentsFacet.s.sol
- script/deploy/facets/DeployOutputValidator.s.sol
- script/deploy/facets/UpdateAcrossV4SwapFacet.s.sol
- script/deploy/resources/deployRequirements.json
- script/deploy/zksync/DeployAcrossV4SwapFacet.zksync.s.sol
- script/deploy/zksync/DeployOutputValidator.zksync.s.sol
- script/deploy/zksync/UpdateAcrossV4SwapFacet.zksync.s.sol
- script/deploy/zksync/utils/ScriptBase.sol
- script/helperFunctions.sh
- script/tasks/temp/checkPauserWalletPerNetwork.ts
- src/Errors/GenericErrors.sol
- src/Facets/AcrossV4SwapFacet.sol
- src/Interfaces/IAcrossSpokePoolV4.sol
- src/Interfaces/ISpokePoolPeriphery.sol
- src/Interfaces/ISponsoredCCTPSrcPeriphery.sol
- src/Interfaces/ISponsoredOFTSrcPeriphery.sol
- src/Periphery/OutputValidator.sol
- test/solidity/Facets/Across/SwapAPI/AcrossV4SwapFacet.Sponsored.t.sol
- test/solidity/Facets/Across/SwapAPI/AcrossV4SwapFacet.SponsoredRefunds.t.sol
- test/solidity/Facets/Across/SwapAPI/AcrossV4SwapFacet.t.sol
- test/solidity/Facets/NEARIntentsFacet.t.sol
- test/solidity/Periphery/OutputValidator.t.sol
- test/solidity/utils/TestAcrossV4SwapBackendSig.sol
- test/solidity/utils/TestEIP712.sol
- test/solidity/utils/TestNearIntentsBackendSig.sol
💤 Files with no reviewable changes (2)
- config/nearintents.json
- .gitignore
const bytes32ToEvmAddress = (b32: Hex): Address => {
  // bytes32(uint256(uint160(addr))) => addr in last 20 bytes
  if (b32.length !== 66) throw new Error('Invalid bytes32 length')
  return getAddress(`0x${b32.slice(26)}`) as Address
}
Don't force sponsored finalRecipient through an EVM-address decoder.
These checks assume every bytes32 recipient is a zero-padded EVM address. That makes the demo reject valid sponsored quotes targeting non-EVM destinations, which is one of the new Across V4 paths this PR is adding support for.
Based on learnings "Solana bytes32 addresses should use solanaAddressToBytes32(deriveSolanaAddress(privateKey)) for contract/struct fields expecting bytes32."
Also applies to: 423-425, 568-576, 623-627
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@script/demoScripts/demoAcrossV4Swap.ts` around lines 149 - 153, The demo
forces every bytes32 finalRecipient through bytes32ToEvmAddress which rejects
non-EVM destinations; change the logic to stop unconditionally decoding bytes32
into an EVM address: remove or guard uses of bytes32ToEvmAddress when
constructing finalRecipient (and the similar spots referenced around the other
uses) and instead pass the raw bytes32 value through for non-EVM paths or only
decode when you know the recipient is an EVM address; update any validation to
accept a 0x-prefixed 66-byte bytes32 as-is and only call
getAddress/bytes32ToEvmAddress when a clear EVM target is required. Ensure
bytes32ToEvmAddress remains available for explicit EVM-only conversions but do
not use it as the default for sponsored finalRecipient handling.
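One possible shape for the guard this prompt describes — decode to an EVM address only when the upper 12 bytes are zero padding, otherwise pass the raw bytes32 through. `decodeRecipient` is a hypothetical helper, and checksumming via viem's `getAddress` is omitted to keep the sketch self-contained:

```typescript
type Hex = `0x${string}`

// Hypothetical helper: only treat the bytes32 as an EVM address when it is
// zero-padded; non-EVM recipients (e.g. Solana) keep the raw bytes32.
const decodeRecipient = (b32: Hex): { isEvm: boolean; value: string } => {
  if (b32.length !== 66) throw new Error('Invalid bytes32 length')
  const isEvmPadded = /^0x0{24}/i.test(b32)
  return isEvmPadded
    ? { isEvm: true, value: `0x${b32.slice(26)}` } // last 20 bytes
    : { isEvm: false, value: b32 } // pass through untouched
}
```

With this shape, an EVM-padded value decodes to the 20-byte address while a fully populated bytes32 (a non-EVM destination) is returned unchanged, so sponsored quotes targeting non-EVM chains are no longer rejected.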
let swapData: ISwapData[] = []
if (ENTRYPOINT === 'swapAndStart') {
  const swapDataJson = arg('--swapDataJson')
  if (!swapDataJson)
    throw new Error('swapAndStart requires --swapDataJson <path>')
  const raw = await readFile(swapDataJson, 'utf8')
  // Expect direct array of SwapData-compatible objects.
  swapData = JSON.parse(raw) as ISwapData[]
}
The send path is funding and approving the wrong assets for swapAndStart.
Lines 717-742 always use bridgeData.sendingAssetId/minAmount, but swapAndStart deposits come from the swapData entries with requiresDeposit=true. Lines 761-782 also forward only msgValueWei, so native direct bridges and native-input swaps are underfunded.
🐛 Suggested shape for the fix
+ const requiredDeposits =
+ ENTRYPOINT === 'swapAndStart'
+ ? swapData.filter((swap) => swap.requiresDeposit)
+ : [
+ {
+ sendingAssetId: bridgeData.sendingAssetId,
+ fromAmount: bridgeData.minAmount,
+ },
+ ]
+
+ const txValue =
+ msgValueWei +
+ requiredDeposits.reduce(
+ (sum, deposit) =>
+ getAddress(deposit.sendingAssetId) === zeroAddress
+ ? sum + deposit.fromAmount
+ : sum,
+ 0n
+ )
+
- consola.info(`value=${msgValueWei.toString()}`)
+ consola.info(`value=${txValue.toString()}`)
- const isNativeAsset = getAddress(bridgeData.sendingAssetId) === zeroAddress
-
- const tokenContract = isNativeAsset
- ? null
- : getContract({
- address: bridgeData.sendingAssetId,
- abi: erc20Abi,
- client: { public: publicClient, wallet: walletClient },
- })
-
- await ensureBalance(
- isNativeAsset ? zeroAddress : tokenContract,
- walletAddress,
- bridgeData.minAmount,
- publicClient
- )
-
- if (!isNativeAsset && tokenContract) {
- await ensureAllowance(
- tokenContract,
- walletAddress,
- lifiDiamondAddress as string,
- bridgeData.minAmount,
- publicClient
- )
- }
+ for (const deposit of requiredDeposits) {
+ const isNativeAsset = getAddress(deposit.sendingAssetId) === zeroAddress
+ const tokenContract = isNativeAsset
+ ? null
+ : getContract({
+ address: deposit.sendingAssetId,
+ abi: erc20Abi,
+ client: { public: publicClient, wallet: walletClient },
+ })
+
+ await ensureBalance(
+ isNativeAsset ? zeroAddress : tokenContract,
+ walletAddress,
+ deposit.fromAmount,
+ publicClient
+ )
+
+ if (!isNativeAsset && tokenContract) {
+ await ensureAllowance(
+ tokenContract,
+ walletAddress,
+ lifiDiamondAddress as string,
+ deposit.fromAmount,
+ publicClient
+ )
+ }
+ }
- { value: msgValueWei }
+ { value: txValue }
- { value: msgValueWei }
+ { value: txValue }
Also applies to: 706-710, 717-742, 761-782
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@script/demoScripts/demoAcrossV4Swap.ts` around lines 683 - 691, The send path
funds and approves the wrong assets for swapAndStart: when ENTRYPOINT ===
'swapAndStart' use the swapData array entries (ISwapData) to determine deposits
and msg.value for swaps with requiresDeposit=true instead of always using
bridgeData.sendingAssetId/minAmount; update the logic around swapData parsing
and the send/deposit calculations (references: ENTRYPOINT, swapData, ISwapData,
requiresDeposit, bridgeData.sendingAssetId, bridgeData.minAmount, msgValueWei)
so that for each swap-and-bridge flow you (1) add required native ETH value
equal to the swap input amount when the swap input is native, (2) approve and
fund ERC20 inputs based on swapData.fromAsset/ fromAmount for entries with
requiresDeposit=true, and (3) preserve existing bridge funding logic for bridge
output tokens (bridgeData.isNative refers to bridge output type) so
native-output bridges still receive msgValue for the bridge while swap inputs
are separately funded from swapData.
# ensure diamond file exists and is valid JSON
if [[ ! -e $DIAMOND_FILE ]]; then
  echo "{}" >"$DIAMOND_FILE"
else
  # validate existing file is valid JSON, reset if corrupted
  if ! jq -e . "$DIAMOND_FILE" >/dev/null 2>&1; then
    warning "[$NETWORK] Existing diamond file is corrupted, resetting to empty object"
    echo "{}" >"$DIAMOND_FILE"
  fi
fi

# merge facets and periphery into diamond file atomically
local MERGED
MERGED=$(jq --arg diamond_name "$DIAMOND_NAME" --slurpfile facets "$FACETS_TMP" --slurpfile periphery "$PERIPHERY_TMP" '
  .[$diamond_name] = (.[$diamond_name] // {}) |
  .[$diamond_name].Facets = $facets[0] |
  .[$diamond_name].Periphery = $periphery[0]
' "$DIAMOND_FILE" 2>/dev/null)

# if merge failed, create fresh structure from temp files
if [[ $? -ne 0 || -z "$MERGED" ]]; then
  warning "[$NETWORK] Merge failed, creating fresh diamond structure"
  MERGED=$(jq -n --arg diamond_name "$DIAMOND_NAME" --slurpfile facets "$FACETS_TMP" --slurpfile periphery "$PERIPHERY_TMP" '
    {
      ($diamond_name): {
        Facets: $facets[0],
        Periphery: $periphery[0]
      }
    }
  ' 2>/dev/null)
fi

# write merged JSON with proper formatting
if [[ -n "$MERGED" ]]; then
  echo "$MERGED" | jq . >"$DIAMOND_FILE"
else
  error "[$NETWORK] Failed to generate valid diamond JSON"
  return 1
fi
Abort the refresh instead of writing empty sections after a worker failure.
This block persists whatever ended up in the temp files, and the surrounding flow normalizes missing facet/periphery output to {}. If either background collector fails, a transient RPC/Mongo issue will erase that section of the existing diamond log instead of preserving it.
🐛 Suggested guard
- if [[ -n "$PID_PERIPHERY" ]]; then wait "$PID_PERIPHERY"; fi
- if [[ -n "$PID_FACETS" ]]; then wait "$PID_FACETS"; fi
+ local PERIPHERY_STATUS=0
+ local FACETS_STATUS=0
+ if [[ -n "$PID_PERIPHERY" ]]; then
+ wait "$PID_PERIPHERY" || PERIPHERY_STATUS=$?
+ fi
+ if [[ -n "$PID_FACETS" ]]; then
+ wait "$PID_FACETS" || FACETS_STATUS=$?
+ fi
+ if [[ $PERIPHERY_STATUS -ne 0 || $FACETS_STATUS -ne 0 ]]; then
+ error "[$NETWORK] Failed to collect current diamond state; refusing to overwrite $DIAMOND_FILE"
+ rm -rf "$TEMP_DIR"
+ return 1
+ fi🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@script/helperFunctions.sh` around lines 4956 - 4994, The merge step can
overwrite existing diamond sections with empty results if a background collector
failed; before building MERGED (the jq merge using DIAMOND_NAME, FACETS_TMP,
PERIPHERY_TMP), validate the temp files (FACETS_TMP and PERIPHERY_TMP) are
present and contain valid, non-empty JSON (use jq -e '.' and check content is
not an empty object/marker) and if either check fails, abort the refresh: log a
warning/error, return non-zero and do not write to DIAMOND_FILE (avoid falling
back to creating empty Facets/Periphery); update the block that creates MERGED
and the failure branch that currently falls back to jq -n so it instead bails
out when temp validation fails.
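The abort-on-invalid-temp-file guard the prompt asks for can be sketched in Python (a hypothetical helper; the real shell script would do the equivalent with `jq -e`):

```python
import json


def load_section(path: str) -> dict:
    """Parse a collector temp file, refusing empty or invalid output.

    Mirrors the guard the review suggests: if a background collector
    produced nothing usable, raise instead of letting the merge step
    overwrite an existing diamond section with {}.
    """
    try:
        with open(path) as fh:
            data = json.load(fh)
    except (OSError, json.JSONDecodeError) as exc:
        raise ValueError(f"collector output unusable: {path}") from exc
    if not isinstance(data, dict) or not data:
        raise ValueError(f"collector output empty: {path}")
    return data
```

Only after both sections load successfully would the merged JSON be written to `DIAMOND_FILE`; any failure aborts the refresh and preserves the existing file.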
/// @notice Params used to create a sponsored CCTP quote and deposit for burn
/// @dev Mirrors `SponsoredCCTPInterface.SponsoredCCTPQuote` (Across)
struct SponsoredCCTPQuote {
    uint32 sourceDomain;
    uint32 destinationDomain;
    bytes32 mintRecipient;
    uint256 amount;
    bytes32 burnToken;
    bytes32 destinationCaller;
    uint256 maxFee;
    uint32 minFinalityThreshold;
    bytes32 nonce;
    uint256 deadline;
    uint256 maxBpsToSponsor;
    uint256 maxUserSlippageBps;
    bytes32 finalRecipient;
    bytes32 finalToken;
    uint8 executionMode;
    bytes actionData;
}

/// @notice Deposits tokens for burn via CCTP
/// @param quote The quote that contains the data for the deposit
/// @param signature The signature of the quote
function depositForBurn(
    SponsoredCCTPQuote calldata quote,
    bytes calldata signature
) external;
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
set -euo pipefail
LOCAL_FILE="$(fd 'ISponsoredCCTPSrcPeriphery.sol' | head -n 1)"
echo "== Local interface signature summary =="
sed -n '1,240p' "$LOCAL_FILE" | rg -n \
'struct SponsoredCCTPQuote|function depositForBurn|sourceDomain|destinationDomain|mintRecipient|amount|burnToken|destinationCaller|maxFee|minFinalityThreshold|nonce|deadline|maxBpsToSponsor|maxUserSlippageBps|finalRecipient|finalToken|executionMode|actionData'
echo
echo "== Upstream Across SponsoredCCTPInterface summary =="
curl -fsSL \
'https://raw.githubusercontent.com/across-protocol/contracts/master/contracts/periphery/mintburn/sponsored-cctp/SponsoredCCTPInterface.sol' \
| rg -n \
'struct SponsoredCCTPQuote|function depositForBurn|sourceDomain|destinationDomain|mintRecipient|amount|burnToken|destinationCaller|maxFee|minFinalityThreshold|nonce|deadline|maxBpsToSponsor|maxUserSlippageBps|finalRecipient|finalToken|executionMode|actionData'
Repository: lifinance/contracts
Length of output: 811
🌐 Web query:
Across Protocol SponsoredCCTPInterface contract GitHub repository
💡 Result:
The SponsoredCCTPInterface contract is in the Across Protocol contracts repository (across-protocol/contracts), at:
contracts/interfaces/SponsoredCCTPInterface.sol (github.com)
🏁 Script executed:
# Try alternative GitHub paths for Across SponsoredCCTPInterface
echo "=== Trying alternative Across repository paths ==="
# Attempt 1: Try develop branch instead of master
curl -fsSL \
'https://raw.githubusercontent.com/across-protocol/contracts/develop/contracts/periphery/mintburn/sponsored-cctp/SponsoredCCTPInterface.sol' \
-w "\n%{http_code}\n" 2>&1 | head -20
echo "---"
# Attempt 2: Check if repo structure differs
curl -fsSL \
'https://api.github.com/repos/across-protocol/contracts/contents/contracts/periphery/mintburn/sponsored-cctp' \
  2>&1 | head -30
Repository: lifinance/contracts
Length of output: 2604
🏁 Script executed:
#!/bin/bash
set -euo pipefail
echo "=== Fetching correct upstream Across SponsoredCCTPInterface ==="
curl -fsSL \
'https://raw.githubusercontent.com/across-protocol/contracts/master/contracts/interfaces/SponsoredCCTPInterface.sol' \
> /tmp/upstream_interface.sol
echo "File fetched successfully. Extracting struct and function signature..."
echo
echo "=== Upstream SponsoredCCTPInterface struct and function ==="
sed -n '1,300p' /tmp/upstream_interface.sol | grep -A 50 'struct SponsoredCCTPQuote' | head -30
echo
echo "=== Upstream depositForBurn function signature ==="
sed -n '1,300p' /tmp/upstream_interface.sol | grep -A 3 'function depositForBurn'
echo
echo "=== Full local interface for comparison ==="
fd 'ISponsoredCCTPSrcPeriphery.sol' -x cat {} | sed -n '1,100p'
Repository: lifinance/contracts
Length of output: 1888
🏁 Script executed:
#!/bin/bash
set -euo pipefail
echo "=== Complete upstream SponsoredCCTPInterface ==="
curl -fsSL 'https://raw.githubusercontent.com/across-protocol/contracts/master/contracts/interfaces/SponsoredCCTPInterface.sol' | cat -n
echo
echo "=== Local ISponsoredCCTPSrcPeriphery.sol ==="
fd 'ISponsoredCCTPSrcPeriphery.sol' -x cat -n {}
Repository: lifinance/contracts
Length of output: 5909
ABI mismatch with upstream Across: struct is missing two fields.
The local SponsoredCCTPQuote struct diverges from the upstream Across SponsoredCCTPInterface.SponsoredCCTPQuote:
Missing fields (after finalToken):
- uint32 destinationDex
- uint8 accountCreationMode
Local struct ends with executionMode and actionData, but upstream requires destinationDex and accountCreationMode before those two fields. This struct layout mismatch will break ABI compatibility; encoded calldata from upstream tools or contracts will fail to decode correctly.
Update the struct to match upstream field order and types exactly.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/Interfaces/ISponsoredCCTPSrcPeriphery.sol` around lines 11 - 38, The
SponsoredCCTPQuote struct in this file is ABI-incompatible with upstream; update
the struct definition (used by depositForBurn) to include the two missing fields
in the exact order and types used upstream by inserting uint32 destinationDex
and uint8 accountCreationMode immediately after finalToken (i.e., before
executionMode and actionData) so the field ordering matches Across'
SponsoredCCTPInterface.SponsoredCCTPQuote exactly; ensure any local code that
constructs or reads this struct uses the new layout.
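The field-order divergence described above can be checked mechanically. A Python sketch (field lists transcribed from the review; this models the layout comparison, not ABI encoding itself):

```python
# (type, name) pairs as given in the review. Upstream inserts two
# fields after finalToken, so a positional diff pinpoints where
# decoding of upstream calldata would start to misalign.
LOCAL = [
    ("uint32", "sourceDomain"), ("uint32", "destinationDomain"),
    ("bytes32", "mintRecipient"), ("uint256", "amount"),
    ("bytes32", "burnToken"), ("bytes32", "destinationCaller"),
    ("uint256", "maxFee"), ("uint32", "minFinalityThreshold"),
    ("bytes32", "nonce"), ("uint256", "deadline"),
    ("uint256", "maxBpsToSponsor"), ("uint256", "maxUserSlippageBps"),
    ("bytes32", "finalRecipient"), ("bytes32", "finalToken"),
    ("uint8", "executionMode"), ("bytes", "actionData"),
]
UPSTREAM = LOCAL[:14] + [
    ("uint32", "destinationDex"), ("uint8", "accountCreationMode"),
] + LOCAL[14:]


def first_divergence(a, b):
    """Index of the first position where two field layouts differ."""
    for i, (x, y) in enumerate(zip(a, b)):
        if x != y:
            return i
    return None if len(a) == len(b) else min(len(a), len(b))
```

Here the layouts diverge at index 14, exactly where upstream's `destinationDex` displaces the local `executionMode`; every field from that point on decodes from the wrong slot.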
);

vm.expectRevert(InvalidCallData.selector);
(bool success, ) = address(standaloneFacet).call(encoded);
assertFalse(success, "expected InvalidCallData revert");

vm.stopPrank();
}
vm.expectRevert does not catch low-level .call() reverts.
The vm.expectRevert(InvalidCallData.selector) on line 1180 won't intercept the revert from the low-level .call() on line 1181. The assertion assertFalse(success, ...) on line 1182 is the actual check. The vm.expectRevert here has no effect and is misleading.
Proposed fix
- vm.expectRevert(InvalidCallData.selector);
(bool success, ) = address(standaloneFacet).call(encoded);
assertFalse(success, "expected InvalidCallData revert");
+ // Note: vm.expectRevert doesn't work with low-level .call();
+ // revert reason is checked manually above in the conditional block🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@test/solidity/Facets/Across/SwapAPI/AcrossV4SwapFacet.t.sol` around lines
1178 - 1185, The test wrongly uses vm.expectRevert(InvalidCallData.selector)
before a low-level (bool success, ) = address(standaloneFacet).call(encoded);
because vm.expectRevert cannot catch reverts from low-level .call(); either
remove the misleading vm.expectRevert(InvalidCallData.selector) line and keep
the assertFalse(success, ...) check, or replace the low-level .call with a
direct/ABI-invoked call to the facet method so
vm.expectRevert(InvalidCallData.selector) can intercept the revert; adjust
references to encoded, standaloneFacet.call, vm.expectRevert, and assertFalse
accordingly.
function test_validateOutputERC20LessThanExpected() public {
    vm.startPrank(address(diamond));

    // Arrange - test case where actual amount is less than expected (should revert)
    uint256 expectedAmount = 1000 ether;
    uint256 actualAmount = 500 ether; // Less than expected

    // Fund diamond with tokens
    deal(address(dai), address(diamond), actualAmount);

    // Approve OutputValidator to spend tokens from diamond
    dai.approve(address(outputValidator), actualAmount);

    outputValidator.validateERC20Output(
        address(dai),
        expectedAmount,
        validationWallet
    );

    // Assert
    assertEq(
        dai.balanceOf(validationWallet),
        0,
        "No excess tokens should be transferred"
    );
    assertEq(
        dai.balanceOf(address(diamond)),
        actualAmount,
        "Should keep actual amount"
    );

    vm.stopPrank();
}
Test name/comment mismatch with actual behavior.
The test is named test_validateOutputERC20LessThanExpected and the comment on line 161 says "should revert", but the test does not use vm.expectRevert. Instead, it asserts that no excess is transferred and the diamond keeps the actual amount.
If the OutputValidator is designed to not revert when actual < expected (just do nothing), the test name and comment are misleading. Consider:
- Renaming to test_validateOutputERC20LessThanExpected_NoExcessTransferred
- Removing the "should revert" comment
- Or, if it should revert, prefix with testRevert_ and add vm.expectRevert
Proposed fix (if no-revert is intended)
- function test_validateOutputERC20LessThanExpected() public {
+ function test_validateOutputERC20LessThanExpected_NoExcessTransferred() public {
vm.startPrank(address(diamond));
- // Arrange - test case where actual amount is less than expected (should revert)
+ // Arrange - test case where actual amount is less than expected (no excess to transfer)
uint256 expectedAmount = 1000 ether;
uint256 actualAmount = 500 ether; // Less than expected🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@test/solidity/Periphery/OutputValidator.t.sol` around lines 158 - 190, The
test function test_validateOutputERC20LessThanExpected currently claims "should
revert" but asserts behavior for a no-revert path; update it to match intended
behavior: if OutputValidator.validateERC20Output is supposed to silently do
nothing when actual < expected, rename the test to something like
test_validateOutputERC20LessThanExpected_NoExcessTransferred, remove the "should
revert" comment and keep the current assertions; otherwise, if it should revert,
rename to a revert-style name (e.g.,
testRevert_validateOutputERC20LessThanExpected), add vm.expectRevert before
calling OutputValidator.validateERC20Output and adjust assertions accordingly;
ensure references to the test name and the call to validateERC20Output are
updated to keep semantics clear.
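If the no-revert interpretation is correct, the validator's sweep logic reduces to a max-with-zero, which a Python model makes explicit (an illustrative model of the behavior the test asserts, not the contract's code):

```python
def excess_to_sweep(actual_balance: int, expected_amount: int) -> int:
    """Amount forwarded to the validation wallet.

    When the actual balance is below the expected amount, nothing is
    transferred and nothing reverts; only a positive surplus is swept.
    """
    if actual_balance > expected_amount:
        return actual_balance - expected_amount
    return 0
```

Under this model the test's scenario (actual 500, expected 1000) sweeps nothing, matching its assertions, which is why the "should revert" wording is the part that needs fixing.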
Which Jira task belongs to this PR?
Why did I implement it this way?
Checklist before requesting a review
Checklist for reviewer (DO NOT DEPLOY any contracts BEFORE CHECKING THIS!!!)