StableSurge — Idea To Product

Balancer recently launched StableSurge, the first production hook on V3 — an innovative directional fee mechanism that dynamically adjusts swap fees to help protect stable-asset pegs during volatility.

This article explores how Balancer V3 was leveraged to bring StableSurge to life, introduces the tech stack that powers it, and highlights how collaboration across multiple service providers transformed this novel concept into a fully deployed production feature.

Idea To Contract Made Easy

The BLabs Smart Contracts team implemented the code — the StableSurgeHook itself and the associated factory.

The Hooks architecture enables developers to focus on their core functionality without worrying about the complexities of the Balancer Vault, Pools, and other internals — these components simply “just work.”

To support developers, Balancer provides documentation, example hooks, and reusable test utilities.

Beyond development, audits can be faster and more cost-effective since the bounded scope reduces the risk of unintended issues. For example, StableSurge was fully audited in just one week.

The result? A shorter development cycle and faster time to market.

The final step for the SC team after audits are complete is the production deployment to all networks supported by Balancer. This kicks off the final integration of the off-chain components.

Operational Data

Balancer’s data teams focus on two key roles: operations and analysis. Operationally, on-chain data must be accessible in a way that enables consumers, such as the front-end, to utilize it effectively. Balancer achieves this through the Subgraph and its open-source API, run on in-house infrastructure.

Metadata

The Balancer Metadata repo serves as a central repository for storing critical information about pools, hooks, tokens, and more, all of which are utilized by the Balancer front-end. For example, the entry for StableSurge includes a description and deployment addresses, ensuring that the front-end can retrieve and display the correct details.
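To make this concrete, here is a sketch of what such an entry might contain (the shape and field names are illustrative assumptions, not the actual metadata schema, and the addresses are placeholders):

// Illustrative shape of a metadata entry; not the actual Balancer schema.
const stableSurgeEntry = {
  id: "hooks/stable-surge", // hypothetical identifier
  name: "StableSurge",
  description:
    "Directional fee hook that raises swap fees to help defend the peg during volatility.",
  // Deployment addresses keyed by chain id (placeholders)
  addresses: {
    "1": "0x...",
    "8453": "0x...",
  },
};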

Subgraph

The Subgraph is a lightweight data layer for V3 built by indexing onchain events. To add support for a new Hook/Pool the relevant config, addresses, ABIs, etc. must be added (see the StableSurge PR for reference). Any new, important parameters must also be identified and tracked, e.g. for StableSurge the following params were included: amp, maxSurgeFeePercentage, surgeThresholdPercentage.
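For illustration, the tracked StableSurge parameters amount to a small typed record along these lines (a sketch; the field types are assumptions rather than the actual subgraph schema):

// Sketch of the hook parameters tracked for StableSurge pools.
interface StableSurgeParams {
  amp: bigint;                      // stable-pool amplification parameter
  maxSurgeFeePercentage: bigint;    // upper bound on the dynamic swap fee
  surgeThresholdPercentage: bigint; // imbalance level at which surging starts
}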

API

The Balancer API builds on top of the Subgraph, transforming and augmenting data into a more usable format for other layers of the stack, including the SDK and front-end. To support a new Hook or Pool, the hook address must be added to the config, along with any new parameters. Some additional custom work may also be required, such as APR tracking or other specific calculations.

Integrations

Building and deploying the code is just the first step — adoption is what makes it valuable. The Integrations team ensures that a new product developed on Balancer’s platform is usable, accessible, and widely adopted. Packages are provided to make it easier to interact with the smart contracts and to replicate the core maths offchain. New hooks/pools are integrated into the swap router, and the team works closely with external aggregators to drive deeper ecosystem integration.

Balancer Maths

The Balancer maths repo contains reference mathematical implementations, in Javascript and Python, for supported Balancer pools and hooks. When we want to support a new hook type we add an implementation that should match the smart contract code 100% (and similarly for a new pool type). You can see an example PR for adding the Python implementation of StableSurge here. The final step is to publish the updated NPM package which will be used in the SOR and aggregator integrations.
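To give a flavour of what these reference implementations look like, here is a minimal TypeScript sketch of a linear surge fee ramp of the kind StableSurge applies, written with 18-decimal fixed-point bigints to mirror the on-chain maths (treat the exact formula and rounding as assumptions to be checked against the repo):

const WAD = 10n ** 18n; // 1.0 in 18-decimal fixed point

// Fee ramps linearly from the static fee up to maxSurgeFee as the pool's
// imbalance grows beyond the surge threshold.
function computeSurgeFee(
  staticFee: bigint,
  maxSurgeFee: bigint,
  imbalance: bigint, // pool imbalance after the swap, as a fraction of WAD
  threshold: bigint
): bigint {
  if (imbalance <= threshold) return staticFee;
  const excess = imbalance - threshold;
  return staticFee + ((maxSurgeFee - staticFee) * excess) / (WAD - threshold);
}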

Smart Order Router

The Balancer Smart Order Router (SOR) identifies optimal swap paths for a given token pair and is accessible through the Balancer API. When a new pool/hook is created it must be integrated into the SOR. Swap results are calculated using the Balancer Maths package, so that package must be updated and any hook-specific parameters must be passed through appropriately.

SDK

The Balancer SDK is a Typescript/Javascript library for interfacing with the Balancer protocol. This includes common contract interactions such as adding and removing liquidity. The SDK leverages the API and the SC query functionality, which means no changes are needed to support add/remove/swaps for new pool types or hooks (provided they work with the Balancer Routers).

To support pool creation, a small update is required whenever a new factory is developed. As StableSurge uses a dedicated factory, a PR was made to add this.

Pool Creation

A Pool Creation UI is provided by Balancer to make the creation of new pools easy. As with the SDK, an update is required when a new factory is developed. Otherwise a pool can be configured to use a hook as detailed in the docs.

Aggregators

V3 Aggregator Routing

To maximize volume, we aim to expose Balancer V3 liquidity to as many aggregators and solvers as possible. We collaborate closely with these teams, each of whom has their own unique approach. Our efforts include:

  • Providing aggregator-specific docs
  • Creating detailed pool/hook-specific docs with all necessary information, such as the StableSurge reference as an example.
  • Notifying teams of new launches and offering direct support.
  • Contributing directly where possible, such as the Paraswap PR adding StableSurge support.

Bringing StableSurge to the Front End

StableSurge UI

The design and front-end teams play a crucial role in integrating StableSurge into the user experience, ensuring that all relevant hook information is accessible and intuitive.

By building on the foundational work of the backend teams, the front-end and design teams ensure that StableSurge is not only functional but also user-friendly and informative.

Partnerships and Launch

In parallel with the technical development, the BizDev team has been actively identifying and collaborating with partners to prepare for launch. Partners who benefit from fee surging are naturally interested in improving pool performance for their liquidity providers while enhancing peg stability, making the value proposition clear.

The launch plan centered around key partners with an appetite for innovation and an interest in this particular product, including Aave, Treehouse, USDX, and emerging LST projects like Inception’s inwstETH and Loop’s slpETH. From an operations perspective, the Balancer Maxis team supported partners in creating and seeding pools, ensuring a smooth onboarding process.

A particularly strong collaboration emerged with Aave, where integrating GHO into a StableSurge pool with boosted features provided a comprehensive liquidity solution. Shortly after launch, the GHO/USDC Base pool quickly scaled to over $5 million TVL.

With the launch ongoing, data is being collected to fine-tune optimal surge thresholds, max fee settings, and other parameters like base fees and amplification. The surging mechanism enables a high-efficiency zone near the peg, while also acting as a backstop during volatility.
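To make the mechanics concrete, here is a small worked example using illustrative parameter values (not the deployed settings):

// Illustrative parameters only; real pools are tuned per asset pair.
const staticFee = 0.0004; // 0.04%
const maxSurgeFee = 0.03; // 3%
const threshold = 0.3;    // surging starts at 30% imbalance
const imbalance = 0.4;    // pool imbalance after a hypothetical swap

// Fee ramps linearly from the static fee to the max surge fee past the threshold.
const fee =
  imbalance <= threshold
    ? staticFee
    : staticFee + ((maxSurgeFee - staticFee) * (imbalance - threshold)) / (1 - threshold);

console.log(fee); // ~0.0046: swaps that worsen the imbalance pay ~0.46% instead of 0.04%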

Next steps include:

  • Further optimizing hook settings based on real-world data.
  • Onboarding more stablecoins and ETH-based liquidity.
  • Expanding to BTC-correlated pairs while integrating boosting via rehypothecation.

Data Analysis

V3 Hooks Dashboard

The data team has developed a V3 Hooks dashboard to showcase curated hooks, featuring tailored visuals and key metrics that highlight the unique aspects of each hook. Meanwhile, other Balancer dashboards track overall key metrics and volume across the ecosystem.

Come Build With Us!

Building on Balancer V3 comes with a wide range of benefits, making contract development the primary focus for builders. The data layer, integrations, and front-end support are largely handled by Balancer’s infrastructure, reducing the overhead of building a complete ecosystem around a new feature.

With deep integrations into aggregators to drive volume, robust data tooling, and a well-supported front-end, developers can spend less time on infrastructure and more time innovating. Whether designing new hooks, optimizing swap mechanics, or experimenting with novel liquidity strategies, Balancer V3 provides a powerful, streamlined foundation to bring ideas to life and we’d love to help.

Unlocking the Power of Balancer V3: Hook Development Made Simple

Using Balancer V3 peripheral contracts to simplify the development and testing of custom Hooks

In my previous post, I discussed the basics of Balancer Hooks and demonstrated how simple it is to create a custom hook with a decaying exit fee. The ease of development on Balancer V3 is greatly aided by the peripheral smart contracts crafted by the Ballerinas (the team behind Balancer’s smart contracts). These contracts serve as helpful tools, simplifying the testing workflow and enhancing the safety and efficiency of projects. In this article, we will explore these in greater detail, showing how they can make developers’ lives easier.

The Hook That Holds: Enabling Peg Stability through Fee-Based Solutions

In Balancer’s stable pools, maintaining a healthy peg is crucial for yield-bearing assets like stablecoins and staking derivatives. However, as market dynamics take over, one token may become significantly overweight, leading to inefficiencies in trading. The mighty ZenDragon has proposed a possible hook design that defends the peg while allowing passive liquidity providers to benefit from this de-pegging behavior (details to be covered in an upcoming post). One possible implementation of this can be seen in this StableSurge hook example, which also serves as a good showcase for the simplified development process.

Peripheral Power

The main hook contract inherits three useful building blocks and uses the custom FixedPoint maths library:

contract StableSurgeHookExample is BaseHooks, VaultGuard, Ownable {
  using FixedPoint for uint256;
  ...
}

BaseHooks

BaseHooks.sol is provided as an abstract contract, with a minimal implementation of a hooks contract. At a high level this contract includes:

  • Base implementation: A complete implementation of the IHooks.sol interface, with each implemented function returning false.
  • Configuration: A virtual function getHookFlags that must be implemented by your hooks contract, defining which hooks your contract supports.

By inheriting this contract a hooks developer can concentrate on implementing the subset of callbacks they are interested in and remain confident the rest of the interface requirement is covered. In the StableSurgeHookExample we override three functions:

getHookFlags

function getHookFlags() public pure override returns (HookFlags memory hookFlags) {
  hookFlags.shouldCallComputeDynamicSwapFee = true;
}

This is the only mandatory hook and can effectively be thought of as defining the hook config. When a pool is registered, the Vault calls this function to store the configuration. In this example, the shouldCallComputeDynamicSwapFee flag is set to true, indicating that the contract is configured to calculate the dynamic swap fee.

onRegister

function onRegister(address factory, address pool, TokenConfig[] memory, LiquidityManagement calldata) public override onlyVault returns (bool) {
  return factory == _allowedFactory && IBasePoolFactory(factory).isPoolFromFactory(pool);
}

The onRegister function enables developers to implement custom validation logic to ensure the registration is valid. When a new pool is registered, a hook address can be provided to “link” the pool and the hook. At this stage, the onRegister function is invoked by the Vault, and it must return true for the registration to be successful. If the validation fails, the function should return false, preventing the registration from being completed.

In this example we validate that the factory param forwarded from the Vault matches the _allowedFactory set during the hook deployment, and that the pool was deployed by that factory.

onComputeDynamicSwapFeePercentage

The Vault calls onComputeDynamicSwapFeePercentage to retrieve the swap fee value. This is where the big brain logic for the hook is implemented. The actual code is fairly long but the pseudo-code looks like:

function onComputeDynamicSwapFeePercentage(
    PoolSwapParams calldata params,
    address pool,
    uint256 staticSwapFeePercentage
) public view override onlyVault returns (bool, uint256) {

  uint256 amountCalculatedScaled18 = StableMath.computeSwapResult(...swapParams);
  uint256 weightAfterSwap = getWeightAfterSwap(balancesAfter);
  if (weightAfterSwap > thresholdBoundary) {
    return (true, getSurgeFee(weightAfterSwap, thresholdBoundary, staticSwapFeePercentage, _surgeCoefficient));
  } else {
    return (true, staticSwapFeePercentage);
  }
}

Essentially, the virtual weights of the tokens in the pool after the swap are calculated. If these are above a user-defined threshold boundary, a fee proportional to the weight’s distance from the threshold is returned. If not, the normal static swap fee is used. The reader is encouraged to read the full code and theory to appreciate the implementation 🤓.

VaultGuard

The VaultGuard is a simple contract that provides the onlyVault modifier. This ensures a function can only be called when the sender is the vault.

modifier onlyVault() {
  _ensureOnlyVault();
  _;
}

function _ensureOnlyVault() private view {
  if (msg.sender != address(_vault)) {
    revert IVaultErrors.SenderIsNotVault(msg.sender);
  }
}

While it might seem overly cautious, especially for stateless hooks, it serves a crucial purpose. This restriction maintains predictable behavior and simplifies the reasoning about your contract’s state. It’s like having a bouncer at an exclusive club — sure, letting a few extra people in might not hurt, but it’s easier to manage when you stick to the guest list. This approach aligns with the standard lifecycle of Balancer pools, keeping the contract’s behavior consistent and secure. Of course, if the hook has state, permissioned functions, or any functions other than hook overrides, a more relaxed access policy can be appropriate.

Ownable

Ownable is actually an OpenZeppelin contract which “provides a basic access control mechanism, where there is an account (an owner) that can be granted exclusive access to specific functions.”

Here we are leveraging onlyOwner to restrict the use of the setThreshold and setSurgeCoefficient functions to the owner of the contract. Ownership is set in the constructor to be the contract deployer:

constructor(
  IVault vault,
  address allowedFactory,
  uint256 threshold,
  uint256 surgeCoefficient
) VaultGuard(vault) Ownable(msg.sender) {
  ...
}

function setThreshold(uint64 newThreshold) external onlyOwner {
  _threshold = newThreshold;
}

function setSurgeCoefficient(uint64 newSurgeCoefficient) external onlyOwner {
  _surgeCoefficient = newSurgeCoefficient;
}

FixedPoint

FixedPoint is a very useful library that supports 18-decimal fixed-point arithmetic. All Vault calculations use this for high and uniform precision. In this example we use it to calculate the swap fee. Some of the commonly used functions are mulDown, mulUp, divDown, and divUp, which multiply or divide two fixed-point values while rounding the result down or up.
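As a rough illustration of the rounding behaviour, here is a TypeScript sketch of mulDown using bigints (the Solidity library also guards against overflow, omitted here):

const ONE = 10n ** 18n; // 1.0 in 18-decimal fixed point

// Multiply two fixed-point values, rounding down (integer division truncates).
function mulDown(a: bigint, b: bigint): bigint {
  return (a * b) / ONE;
}

// Example: a 1% fee on 50 tokens is 0.5 tokens.
console.log(mulDown(50n * ONE, ONE / 100n)); // 500000000000000000n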

Testing

Testing in production is tempting but risky! The Balancer V3 mono-repo contains extensive tests and exposes a particularly useful BaseVaultTest contract that external teams are already leveraging during their own development. A few of the high level benefits include:

  • A default setup (that is also customizable) that handles all V3 related deployments including Vault, Router, Authorizer, etc and all associated approvals
  • Deployment of test tokens and initial seeding of balances for test accounts (test tokens take decimals as an argument, so you can construct them with different decimals if needed)
  • Easily handle deployment of your custom pools and hooks along with initial pool seeding and LP interactions (including all required approvals for common actions)
  • Helpers to get account balances (including pool balances), vault balances and hook balances

The detail of BaseVaultTest could probably be a post in itself, so instead we will look at some specific examples of how I leveraged its functionality in my tests for the hook, StableSurgeExample.t.sol.

Test Pool And Hook Setup

As mentioned previously, the StableSurge hook is configured to only work with a user-configured pool factory. In this test, because the hook is expected to be used with Balancer StablePools, I want to make sure we use the StablePoolFactory. To achieve this we can override the createHook function, which is called during the initial BaseVaultTest setup:

function createHook() internal override returns (address) {
  stablePoolFactory = new StablePoolFactory(IVault(address(vault)), 365 days, "Factory v1", "Pool v1");
  // LP will be the owner of the hook.
  vm.prank(lp);
  address stableSurgeHook = address(
    new StableSurgeHookExample(IVault(address(vault)), address(stablePoolFactory), THRESHOLD, SURGE_COEFFICIENT)
  );
  vm.label(stableSurgeHook, "Stable Surge Hook");
  return stableSurgeHook;
}

Fairly simple to follow, it deploys the StablePoolFactory and uses that address as part of the constructor input when deploying the StableSurgeHookExample. The address of the stableSurgeHook is returned at the end of the function, and BaseVaultTest exposes this via the poolHooksContract variable so it can be used later. Also interesting to note here is that the hook is deployed by the lp account, which will become the hook owner.

Next to override is the _createPool function which handles the actual pool deployment:

function _createPool(address[] memory tokens, string memory label) internal override returns (address) {
  PoolRoleAccounts memory roleAccounts;

  address newPool = address(
    stablePoolFactory.create(
      "Stable Pool Test",
      "STABLE-TEST",
      vault.buildTokenConfig(tokens.asIERC20()),
      AMP_FACTOR,
      roleAccounts,
      MIN_SWAP_FEE,
      poolHooksContract,
      false, // Does not allow donations
      false, // Do not disable unbalanced add/remove liquidity
      ZERO_BYTES32
    )
  );
  vm.label(newPool, label);

  authorizer.grantRole(vault.getActionId(IVaultAdmin.setStaticSwapFeePercentage.selector), admin);
  vm.prank(admin);
  vault.setStaticSwapFeePercentage(newPool, SWAP_FEE_PERCENTAGE);

  return newPool;
}

The StableSurge hook is expected to be used with Balancer Stable Pools so unlike some other hook tests I need to make sure I’m not testing with the default MockPool. I use the stablePoolFactory to create a new pool that is configured to use our previously deployed hook, poolHooksContract. The last part of this process is to use the authorizer to set the pool static swap fee. This will be the expected fee when the pool is not “surging”.

And that’s it! Now whenever we run our test (using: $ forge test --match-path test/foundry/StableSurgeExample.t.sol) the setUp function will be called and everything is deployed, seeded and ready for tests.

Testing Balances

The final helper we’ll check out is getBalances which can be found here. This function extracts and returns a collection of essential balances, encompassing test user pool and token balances, hook balances, and vault balances. It’s an invaluable tool for validating correct balance adjustments following operations, streamlining the testing process considerably:

function testSwapBelowThreshold() public {
  BaseVaultTest.Balances memory balancesBefore = getBalances(lp);

  // Calculate the expected amount out (amount out without fees)
  uint256 poolInvariant = StableMath.computeInvariant(
    AMP_FACTOR * StableMath.AMP_PRECISION,
    balancesBefore.poolTokens
  );
  uint256 expectedAmountOut = StableMath.computeOutGivenExactIn(
    AMP_FACTOR * StableMath.AMP_PRECISION,
    balancesBefore.poolTokens,
    daiIdx,
    usdcIdx,
    amountInBelowThreshold,
    poolInvariant
  );

  // Swap with amount that should keep within threshold
  vm.prank(bob);
  router.swapSingleTokenExactIn(pool, dai, usdc, amountInBelowThreshold, 0, MAX_UINT256, false, bytes(""));

  BaseVaultTest.Balances memory balancesAfter = getBalances(lp);

  // Check Bob's balances (Bob deposited DAI to receive USDC)
  assertEq(
    balancesBefore.bobTokens[daiIdx] - balancesAfter.bobTokens[daiIdx],
    amountInBelowThreshold,
    "Bob DAI balance is wrong"
  );

...

There is also an alternative implementation that allows balances to be tracked across user defined tokens/pools.

Conclusion

Hopefully this has given a helpful intro to some of the options available to improve the experience and efficiency while developing on top of Balancer V3. It really is easy and quick to get going so take some time and hack around and please reach out anytime if you have any questions or suggestions!

Unlocking the Power of Balancer V3: Exploring Hooks and Custom Routers

Intro

Balancer’s latest V3 design offers lots of exciting features such as transient accounting, native support for yield-bearing tokens and boosted pools (see the intro and docs for more details). Lately I’ve been diving into one of the other new features, hooks, which provide a way for developers to easily extend existing pool types at various key points throughout the pool’s lifecycle.

Hooks are standalone contracts with their own logic and state that can be linked to a pool during registration. Depending on the hook configuration, custom logic can be implemented at specific stages like before/after swapping, adding or removing liquidity (see Hook docs for details).

To explore hooks in more detail, I created a hook that charges a decaying fee for LPs removing liquidity. This is a particularly interesting example as it requires implementing a custom router (which doubles as the hook). This setup enables using NFTs to track both liquidity positions and associated metadata. When users add liquidity, they receive an NFT tied to the LP position’s start time and size. As the owner of this NFT, they can exit their LP position at any time, incurring a fee that diminishes over time since the start.

Are You A Hook Or Are You A Router?

A router is the primary entry point for a user interacting with the Vault, and routers can contain custom logic. Balancer has developed and deployed official routers that would normally be used for add and remove operations. In this example we need custom logic for adds and removes.

Because a hook is just a standalone contract that must implement the IHooks interface, we can add the custom logic for adding and removing liquidity to the same contract, giving us a single contract that is both a hook and a router.

Note – in both the official Balancer routers and this implementation, Permit2 is used to handle approvals and transfer tokens directly and safely from the user.

Show Me Some Code

For the full code see here. Let’s take a look at some of the highlights.

Adding Liquidity

Normally when a user adds liquidity via the Balancer router the BPT minted by the Vault is sent to the msg.sender. In this example the BPT is minted to the router instead. The router mints the msg.sender an NFT with a particular `tokenId`. This tokenId is also used to keep track of the pool address, the BPT amount and the time the liquidity is added, which are all used when removing liquidity.

We can see the router `addLiquidityProportional` function a user would call:

function addLiquidityProportional(
    address pool,
    uint256[] memory maxAmountsIn,
    uint256 exactBptAmountOut,
    bool wethIsEth,
    bytes memory userData
) external payable saveSender returns (uint256[] memory amountsIn) {
    // Do addLiquidity operation - BPT is minted to this contract.
    amountsIn = _addLiquidityProportional(
        pool,
        msg.sender,
        address(this),
        maxAmountsIn,
        exactBptAmountOut,
        wethIsEth,
        userData
    );

    uint256 tokenId = _nextTokenId++;
    // Store the initial liquidity amount associated with the NFT.
    bptAmount[tokenId] = exactBptAmountOut;
    // Store the initial start time associated with the NFT.
    startTime[tokenId] = block.timestamp;
    // Store the pool/bpt address associated with the NFT.
    nftPool[tokenId] = pool;
    // Mint the associated NFT to sender.
    _safeMint(msg.sender, tokenId);

    emit LiquidityPositionNftMinted(msg.sender, pool, tokenId);
}

The first step is to call _addLiquidityProportional which is inherited from MinimalRouter. This code is very similar to the official Balancer Router with the subtle change that allows it to specify a receiver of the BPT instead of automatically setting it to the sender. In this example it’s set to the router itself, address(this). The Router will call the Vault addLiquidity function:

(amountsIn, , ) = abi.decode(
    _vault.unlock(
        abi.encodeWithSelector(
            MinimalRouter.addLiquidityHook.selector,
            ExtendedAddLiquidityHookParams({
                sender: sender,
                receiver: receiver, // Note: unlike the Balancer Router, the receiver can be set here
                pool: pool,
                maxAmountsIn: maxAmountsIn,
                minBptAmountOut: exactBptAmountOut,
                kind: AddLiquidityKind.PROPORTIONAL,
                wethIsEth: wethIsEth,
                userData: userData
            })
        )
    ),
    (uint256[], uint256, bytes)
);

During an addLiquidity operation the Vault checks if the associated hook has set `shouldCallBeforeAddLiquidity` and if true the `onBeforeAddLiquidity` hook function is called:

function onBeforeAddLiquidity(
    address router,
    address,
    AddLiquidityKind,
    uint256[] memory,
    uint256,
    uint256[] memory,
    bytes memory
) public view override onlySelfRouter(router) returns (bool) {
    // We only allow addLiquidity via the Router/Hook itself (as it must custody BPT).
    return true;
}

In this example it has the simple job of making sure that liquidity can only be added via the router itself. If a user tries to add via another router `onlySelfRouter` will fail and block the operation. This is done to ensure the decaying exit fee is always taken into account for any pool using the hook.

The final part of the addLiquidity code is where the NFT and metadata magic happens:

uint256 tokenId = _nextTokenId++;
// Store the initial liquidity amount associated with the NFT.
bptAmount[tokenId] = exactBptAmountOut;
// Store the initial start time associated with the NFT.
startTime[tokenId] = block.timestamp;
// Store the pool/bpt address associated with the NFT.
nftPool[tokenId] = pool;
// Mint the associated NFT to sender.
_safeMint(msg.sender, tokenId);

A new `tokenId` is associated with this particular liquidity position. This is then used to store the particular pool, BPT amount and time associated with the position, and finally a new NFT with the tokenId is minted to the user.

Removing Liquidity

To remove liquidity a user must own an NFT linked to a liquidity position. `removeLiquidityProportional` is called using the `tokenId` of the NFT. During the remove operation the fee to be applied is calculated using the previously stored time. Once the operation is complete the NFT is burned and the exit tokens are sent to the user.

We can see `removeLiquidityProportional` called by the user looks like:

function removeLiquidityProportional(
    uint256 tokenId,
    uint256[] memory minAmountsOut,
    bool wethIsEth
) external payable saveSender returns (uint256[] memory amountsOut) {
    // Ensure the user owns the NFT.
    address nftOwner = ownerOf(tokenId);

    if (nftOwner != msg.sender) {
        revert WithdrawalByNonOwner(msg.sender, nftOwner, tokenId);
    }

    address pool = nftPool[tokenId];

    // Do removeLiquidity operation - tokens sent to msg.sender.
    amountsOut = _removeLiquidityProportional(
        pool,
        address(this),
        msg.sender,
        bptAmount[tokenId],
        minAmountsOut,
        wethIsEth,
        abi.encode(tokenId) // tokenId is passed to index fee data in hook
    );

    // Set all associated NFT data to 0.
    bptAmount[tokenId] = 0;
    startTime[tokenId] = 0;
    nftPool[tokenId] = address(0);
    // Burn the NFT
    _burn(tokenId);

    emit LiquidityPositionNftBurned(msg.sender, pool, tokenId);
}

NFT ownership is checked and if the `msg.sender` is not the owner of the NFT with `tokenId` the transaction will revert:

address nftOwner = ownerOf(tokenId);

if (nftOwner != msg.sender) {
    revert WithdrawalByNonOwner(msg.sender, nftOwner, tokenId);
}

The pool address associated with the NFT liquidity position is retrieved from the mapping:

address pool = nftPool[tokenId];

which is then used in `_removeLiquidityProportional`. Similar to the add operation, this is a slightly changed version of the Balancer Router function that allows us to set the sender as the router (address(this)) and the receiver as the user (msg.sender). It’s also interesting to note here that we pass the encoded tokenId as `userData`. This eventually gets forwarded to the after hook and is used to retrieve the mapped data required to calculate the fee.

During the removeLiquidity operation the Vault checks if the associated hook has set `shouldCallAfterRemoveLiquidity` and if true the `onAfterRemoveLiquidity` hook function is called:

function onAfterRemoveLiquidity(
    address router,
    address pool,
    RemoveLiquidityKind,
    uint256,
    uint256[] memory,
    uint256[] memory amountsOutRaw,
    uint256[] memory,
    bytes memory userData
) public override onlySelfRouter(router) returns (bool, uint256[] memory hookAdjustedAmountsOutRaw) {
    // We only allow removeLiquidity via the Router/Hook itself so that fee is applied correctly.
    uint256 tokenId = abi.decode(userData, (uint256));
    hookAdjustedAmountsOutRaw = amountsOutRaw;
    uint256 currentFee = getCurrentFeePercentage(tokenId);
    if (currentFee > 0) {
        hookAdjustedAmountsOutRaw = _takeFee(IRouterCommon(router).getSender(), pool, amountsOutRaw, currentFee);
    }
    return (true, hookAdjustedAmountsOutRaw);
}

This is the main meat of our hook. It first checks `onlySelfRouter(router)` which ensures that remove can only be done via the router itself (for the same reason given above for add). It then retrieves the `tokenId` passed via the userData. This is used to retrieve the fee using `getCurrentFeePercentage`:

function getCurrentFeePercentage(uint256 tokenId) public view returns (uint256 feePercentage) {
    // Calculate the number of days that have passed since startTime
    uint256 daysPassed = (block.timestamp - startTime[tokenId]) / 1 days;
    if (daysPassed < DECAY_PERIOD_DAYS) {
        // decreasing fee by 1% per day
        feePercentage = INITIAL_FEE_PERCENTAGE - ONE_PERCENT * daysPassed;
    }
}

which we can see uses the stored start time to calculate the fee at the current time. The fee is then taken by `_takeFee`:

function _takeFee(
    address nftHolder,
    address pool,
    uint256[] memory amountsOutRaw,
    uint256 currentFee
) private returns (uint256[] memory hookAdjustedAmountsOutRaw) {
    hookAdjustedAmountsOutRaw = amountsOutRaw;
    IERC20[] memory tokens = _vault.getPoolTokens(pool);
    uint256[] memory accruedFees = new uint256[](tokens.length);
    // Charge fees proportional to the `amountOut` of each token.
    for (uint256 i = 0; i < amountsOutRaw.length; i++) {
        uint256 exitFee = amountsOutRaw[i].mulDown(currentFee);
        accruedFees[i] = exitFee;
        hookAdjustedAmountsOutRaw[i] -= exitFee;
        // Fees don't need to be transferred to the hook, because donation will redeposit them in the vault.
        // In effect, we will transfer a reduced amount of tokensOut to the caller, and leave the remainder
        // in the pool balance.

        emit ExitFeeCharged(nftHolder, pool, tokens[i], exitFee);
    }

    // Donates accrued fees back to LPs.
    _vault.addLiquidity(
        AddLiquidityParams({
            pool: pool,
            to: msg.sender, // It would mint BPTs to router, but it's a donation so no BPT is minted
            maxAmountsIn: accruedFees, // Donate all accrued fees back to the pool (i.e. to the LPs)
            minBptAmountOut: 0, // Donation does not return BPTs, any number above 0 will revert
            kind: AddLiquidityKind.DONATION,
            userData: bytes("") // User data is not used by donation, so we can set it to an empty string
        })
    );
}

This function uses the computed fee to deduct an exitFee from each output token. This is donated back to the pool using the special `DONATION` AddLiquidity kind. The updated amounts out are returned to the Vault via the returned `hookAdjustedAmountsOutRaw`, which ensures all the Vault accounting passes.
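To make the fee flow concrete, here is a small worked example of the decay schedule (the constants are illustrative; the real values are fixed in the hook):

// Assume a 10% initial fee decaying by 1% per day over a 10-day period.
const ONE_PERCENT = 10n ** 16n;        // 1% in 18-decimal fixed point
const INITIAL_FEE = 10n * ONE_PERCENT; // 10%
const DECAY_PERIOD_DAYS = 10n;

function feeAfterDays(daysPassed: bigint): bigint {
  return daysPassed < DECAY_PERIOD_DAYS ? INITIAL_FEE - ONE_PERCENT * daysPassed : 0n;
}

// Removing liquidity on day 3 with 100 USDC out: the fee is 7%, so 7 USDC is
// donated back to the pool's LPs and the exiting user receives 93 USDC.
console.log(feeAfterDays(3n)); // 70000000000000000n (7%)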

Conclusion

This is a fairly involved example but it covers a lot including some interesting functionality:

  • How a custom router can be used to apply custom logic to pool operations
  • Enforcing pool operations to a specific router
  • Using an NFT to represent a liquidity position which can also be used to track associated meta data

This really shows the flexibility and power that hooks and custom routers can have. Combining these functionalities opens a really wide design space to explore!

SITIR 14

Nice blog about stacking Sats ₿

🧘‍♂️ “If you’re on a great journey at work, doing something meaningful, and the team appreciates you — maybe lean in there” – thought this post had some really good advice

I’ve been enjoying Cory Doctorow’s blog recently and I really liked the subtle little trick described in Against Lore 🦉

🏫 Systematic problem solving – Consume information (with a wide scope) -> Start writing -> Share the idea and get feedback -> Goto writing step and repeat

I always find WebAssembly interesting and this post shares a viewpoint on where it should be used – “where you want to compose work from separate teams, the boundaries imposed by WebAssembly can be a useful tool” 🤝

🔀 The title says it all, Everything You Need To Know About Order Flow Auctions

🧘‍♀️ Really like the section from Nadine Stair in this one – “I would, perhaps, have more actual troubles but fewer imaginary ones” and “If I could do it again, I would travel lighter than I have”.

Maestro, an OS written in Rust because it’s a memory-safe language that avoids the issues described 💻

🛞 I’m useless at NFT stuff but this description of the wheel strategy was interesting

I liked this!

Ethereum Log Confusion in Polygon’s Heimdall, catchy title 😂 but big brain post.

Boeing’s deliberately defective fleet of flying sky-wreckage – terrifying! 😱

👀 I’ve enjoyed some of the content from The Block Print and this was good: An Honest Review of Ethena

₿ Worth keeping in mind BTC Investing vs. BTC Standard

I’ve been following the TinyPilot blog since it got started, was cool to see how things unfolded and nice to see a successful ending 👏

Another from Pluralistic – 🦞 Red Lobster was killed by private equity, not Endless Shrimp

🐍 Bloom filters are cool and this blog post covers how they can be used in Ethereum to detect event logs

Axiom – Testing With Custom Foundry Cheat Codes

Intro

Axiom is a really exciting new protocol that harnesses ZK technology to allow smart contracts to trustlessly compute over the history of Ethereum. I believe it’s a novel new primitive for others to build with. The docs provide a lot of info about the protocol itself and include a helpful tutorial that can be followed to build an Autonomous Airdrop. An SDK is provided to improve the integration experience for developers and includes a CLI, React client, and Typescript and Smart Contract libraries.

One of the SC libraries provides an extension to the standard Foundry test library and has a pretty interesting setup and implementation of custom cheat codes. I thought it would be interesting to investigate this a bit further using the test from the Autonomous Airdrop example as a reference, specifically looking at AxiomTest in some more detail.

System Overview

To appreciate why the cheat codes are beneficial it’s useful to have a high-level overview of the Axiom system. Following the flow of the Airdrop example:

  1. Query Initialisation
  • A query is sent to the AxiomV2Query contract sendQuery function. In the Airdrop example this is sent by the user from the UI
  • The query format spec can be found here
  • Will use the compute proof from an Axiom client Circuit
  • The query arguments can be created in a number of ways using the SDKs, e.g. CLI, NodeJS, React
  • Here the AxiomV2Callback is specified. This is what runs after query fulfillment
  2. Query Verification
  • Offchain, Axiom indexes the query
  • Computes the result and generates a ZK proof of validity
  3. Query Fulfillment
  • Axiom calls fulfillQuery on the AxiomV2Query contract.
  • Onchain: verifies the ZK proof, checks hashes, and confirms it matches the original query
  • Calls the callback specified by the AxiomV2Callback in step 1
  4. Callback runs
  • This allows a custom contract to make use of the results of the query and run custom logic
  • In the Airdrop example the AutonomousAirdrop.sol contract validates the relevant airdrop requirements and issues the token if met
When testing locally the query fulfillment in step 3 will not be possible, which would block testing of the custom logic implemented in the callback used in step 4. That’s where the AxiomTest library can be used.

Step By Step Testing

Following AutonomousAirdrop.t.sol can show us step by step how to use AxiomTest and allows us to investigate what is going on.

Importing

AxiomTest follows the same convention as a usual Foundry Test but instead we import AxiomTest.sol and inherit from AxiomTest in the test contract:

import { AxiomTest, AxiomVm } from "@axiom-crypto/v2-periphery/test/AxiomTest.sol";

contract AutonomousAirdropTest is AxiomTest { ...

Setup

setUp() is also the same as Foundry, an optional function invoked before each test case is run. Here there’s a bit more going on:

function setUp() public {
    _createSelectForkAndSetupAxiom("sepolia", 5_103_100);
    
    inputPath = "app/axiom/data/inputs.json";
    querySchema = axiomVm.compile("app/axiom/swapEvent.circuit.ts", inputPath);

    autonomousAirdrop = new AutonomousAirdrop(axiomV2QueryAddress, uint64(block.chainid), querySchema);
    uselessToken = new UselessToken(address(autonomousAirdrop));
    autonomousAirdrop.updateAirdropToken(address(uselessToken));
}

_createSelectForkAndSetupAxiom is found in the AxiomTest.sol contract. It basically initialises everything Axiom related on a local fork so the tests can be run locally.

  1. Setup and run a new local fork using vm.createSelectFork(urlOrAlias, forkBlock) (docs)
  2. Using the provided chainId, find the addresses for axiomV2Core and axiomV2Query from the local AxiomV2Addresses. These are actual deployments and currently only exist on mainnet/sepolia.
  3. Initialise the core and query contracts using the addresses and interfaces:

axiomV2Core = IAxiomV2Core(axiomV2CoreAddress);
axiomV2Query = IAxiomV2Query(axiomV2QueryAddress);

  4. Initialise axiomVm:

axiomVm = new AxiomVm(axiomV2QueryAddress, urlOrAlias, true);

AxiomVm.sol implements the cheatcode functionality as well as providing utility functions for compiling, proving, parsing args, etc.

Following initialisation of the fork, the axiomVm compile function is used to compile the local circuit and retrieve the querySchema associated with the circuit. The querySchema provides a unique identifier for a callback function to distinguish the type of compute query used to generate the query results passed to the callback. This is used as a constructor argument when creating a new AutonomousAirdrop contract.

Behind the scenes compile is using Foundry FFI to run the Axiom CLI compile command:

npx axiom circuit compile _circuitPath --provider vm.rpcUrl(urlOrAlias) --inputs inputPath --outputs COMPILED_PATH --function circuit --mock

This outputs a JSON file which contains the querySchema. This value is parsed from the file and returned.

Testing SendQuery

The test test_axiomSendQuery covers step 1 in the System Overview above.

function test_axiomSendQuery() public {
    AxiomVm.AxiomSendQueryArgs memory args =
        axiomVm.sendQueryArgs(inputPath, address(autonomousAirdrop), callbackExtraData, feeData);

    axiomV2Query.sendQuery{ value: args.value }(
        args.sourceChainId,
        args.dataQueryHash,
        args.computeQuery,
        args.callback,
        args.feeData,
        args.userSalt,
        args.refundee,
        args.dataQuery
    );
}

Looking at AxiomVm sendQueryArgs we see it is again using Axiom CLI. This time via the functions _prove and _queryParams.

_prove runs the prove command:

npx axiom circuit prove circuitPath --mock --sourceChainId vm.toString(block.chainid) --compiled COMPILED_PATH --provider vm.rpcUrl(urlOrAlias) --inputs inputPath --outputs OUTPUT_PATH --function circuit

This will prove the previously compiled circuit and generate a JSON output file with the interface:

{
    sourceChainId: string,
    computeResults: string[], // bytes32[]
    computeQuery: AxiomV2ComputeQuery,
    dataQuery: DataSubquery[],
}

_queryParams then runs the query-params command:

npx axiom circuit query-params vm.toString(callbackTarget) --sourceChainId vm.toString(block.chainid) --refundAddress vm.toString(msg.sender) --callbackExtraData vm.toString(callbackExtraData) --maxFeePerGas vm.toString(feeData.maxFeePerGas) --callbackGasLimit vm.toString(feeData.callbackGasLimit) --provider vm.rpcUrl(urlOrAlias) --proven OUTPUT_PATH --outputs QUERY_PATH --args-map

This uses the output generated by the prove step (at OUTPUT_PATH) and generates the sendQuery arguments to a JSON file in the format:

{
    value: bigint,
    queryId: bigint,
    calldata: string,
    args,
}

This file is read and the args are returned as a string, which is parsed in _parseSendQueryArgs and returned as an AxiomSendQueryArgs struct.

Finally sendQuery itself is called on the axiomV2Query contract initialised during setup using the parsed args.

Testing Callback

The test test_axiomCallback mocks step 3 in the System Overview and allows the callback to be tested.

function test_axiomCallback() public {
    AxiomVm.AxiomFulfillCallbackArgs memory args =
        axiomVm.fulfillCallbackArgs(inputPath, address(autonomousAirdrop), callbackExtraData, feeData, SWAP_SENDER_ADDR);
    
    axiomVm.prankCallback(args);
}

Similar to the previous test, fulfillCallbackArgs uses the Axiom CLI to prove and queryParams to generate the required args for AxiomFulfillCallbackArgs. These are used in prankCallback to call the axiomV2Callback function on the AutonomousAirdrop contract (args.callbackTarget is the address) with the relevant spoofed Axiom results:

IAxiomV2Client(args.callbackTarget).axiomV2Callback{gas: args.gasLimit}(
    args.sourceChainId,
    args.caller,
    args.querySchema,
    args.queryId,
    args.axiomResults,
    args.callbackExtraData
);

The axiomV2Callback function is inherited from the AxiomV2Client and this function in turn calls _validateAxiomV2Call and _axiomV2Callback.

Conclusion

Following through these tests and libraries really helps to understand the moving parts in the Axiom system, and hopefully this post helps others. It’s exciting to see what gets built with Axiom as it becomes another core primitive!

Photo by David Travis on Unsplash

SITIR 13

🚕 An educational side project – this was inspiring. The dedication is probably the number 1 take away!

🏃 Pretty nice visualisation of the NYC marathon – brings back painful memories 😭!

🎥 A Brief Disagreement by Steve Cutts, super cool and very relevant

Interesting account of building S3 🤓

🧮 Computing Percentages Easier: x% of y is always equal to y% of x (I know I’ll forget it!)

🍎 Enjoyed this true history of Newton’s job as Warden of the Mint, The Lumber Room

This is crazy creative – a context-to-image camera 📷

The sky is falling! Interesting thoughts on the fear of AI (and everything 😆)

🧓🏻 OG account of the DAO hack

🪙 The rise and fall of the gold standard, probably the best overview I’ve read on this

The Power of Free Time 🪷

💰 31 Lessons I’ve Learned About Money – this has some genuine insight

10 Thoughts From the Fourth Trimester, gold, this should be given to all first time parents (fetus that everyone can see 🤣)

What happens when the real Young Lady’s Illustrated Primer lands in China? The idea of local LLMs is kind of mind blowing 🤯

🥲 “You gave an old woman a little moment of joy,” – a reminder it’s nice to be nice

End-of-life Dreams and Visions – just interesting 🕊️

4 Phases of a Bull market – pretty solid 🐂

🥚 The Egg – what the heck!?

Learnings from the KZG Ceremony, I thought the KZG ceremony was really cool and a big success. This is a great post covering the full stack. Rust/WASM/React fun 🎑

👀 The Swivel-Eyed Loons Have a Point, elegantly puts into words the things that I think most people feel. Makes the blood boil!

Liquidations in Decentralized Finance – a comprehensive explainer 📈

🧻 Roll up, roll up! Pretty good high level explainer here. I also enjoyed a personal explainer from fucory.eth in London.

📖 I’m really getting to like the writing of John P Weiss, it resonates a lot:
Life is so terrible and beautiful at the same time – I’ve seen this but luckily haven’t experienced it yet
We must choose between being an anvil or a hammer – really thought provoking
The corrosive element in our lives – How do we not waste our life? It’s never too late to change course. 

Photo by Jamie Street on Unsplash

SITIR 12

(Some Interesting Thing I Read)

🖥️ Thread explaining single-use address that can broadcast a transaction without the need for a private key

Keep It Going – Pretty pertinent in these times! Could also be applied to other things like work, working out, maybe even a good general thesis to live by 🏋️

⚖️ BIP-19 was a bit of a special move by SolarCurve and this thread is a great explainer.

💰 Keep Your Personal Burn Rate Low To Maximize Your Options – the title says it all really but a nice reminder to see in black and white.

Bit of a mindfuck 😂 – A potential escape from Roko’s basilisk

A medical thought experiment 🔬

🧑‍🏫 Compounding Interest Rates explained with Solidity examples

Pretty funny boyo 🙈

🧙‍♂️ The Merge was such an impressive achievement!

💣 “One day the rule of law protected your assets, the next day they were frozen or forcibly sold.” – For the War by Arthur Hayes

“it’s not going to get easier. It’s going to get harder. So make yourself a person that ‘handles hard well’” 💪 good advice in Handle Hard Well

🤦 Thinking you’re the smartest person in the room never ends well from what I’ve observed! Nice thread on Sam Trabucco/Alameda

🍿 More FTX/Alameda, this time from Arthur

🧘‍♂️ Having enough – another good reminder by Morgan Housel

🪂 Just imagine falling 15000 feet, crazy story!

The Surfer Mentality is a good metaphor to remember 🏄‍♀️

🦉 “You aren’t handed anything. You earn everything every single day, over and over again. You have to prove it.” – Some good home truths in this one

👀 How to Spot Cult Leader Personalities, super useful, especially in the world of Crypto!

Vitalik – thought this was interesting considering the hype around EigenLayer. And also first time I’ve heard of a concerning issue around Eth L1 🤔

Photo by Juan Rumimpunu on Unsplash

A Muddy DeFi World

Introduction

This is a write up for my entry to the EthGlobal Autonomous Worlds hackathon, the imaginatively titled: MUD Powered Balancer Swaps. (Github)

Unfortunately I have no game development skills, so the idea was to see how MUD could be used with an existing DeFi protocol, in this case by creating a new Balancer Relayer integrated with MUD, along with a front end to show swap data.

From the MUD docs: “MUD is a framework for ambitious Ethereum applications. It compresses the complexity of building EVM apps with a tightly integrated software stack.” The standout for me is:


No indexers or subgraphs needed, and your frontend is magically synchronized!


Since my early days at Balancer the Subgraph has been one of the main pain points I’ve come across. I’ve always thought there’s a clear need/opportunity for a better way of doing things. When I first saw the DevCon videos showing how the MUD framework worked it reminded me of the early days of the Meteor framework, which seemed like magical frontend/backend sync technology when I first saw it. With MUD we also get the whole decentralised/composability aspect too. It really seems like this could be a challenger, and the hackathon’s a perfect way to get some experience hacking on it!

Solution Overview

Balancer Relayers are contracts that make calls to the Balancer Vault on behalf of users. They can use the sender’s ERC20 vault allowance, internal balance, and BPTs. As I’ve written before, multiple actions such as exit/join pools, swaps, etc. can be chained together, improving the UX.

It’s important to note that because the Relayers have permissions over user funds they have to be authorized by the protocol. This authorisation is handled by Balancer Governance and you can see a past governance proposal and authorisation PR here and here.

The MUD Store is the onchain database that can be written and read from similar to a normal database. The MUD framework handles all the complexity and makes developing with the Store super smooth.

By developing a new MUD enabled Relayer we can use a well established, battle tested Balancer protocol (Balancer 80/20 pools in particular could be interesting as liquidity for gaming assets) combined with all the benefits the MUD framework offers.

The How

Mainnet Forking

By using a local node forked from mainnet we can use all the deployed Balancer info including the real pools, assets and governance setup. To build this into the dev setup based off the MUD template project I added a .env with a mainnet archive node URL from Alchemy and edited the root package.json node script like so:

"node": "anvil -b 1 --block-base-fee-per-gas 0 --chain-id 31337 --fork-block-number 17295542 -f $(. ./.env && echo $ALCHEMY_URL)"

Now when the pnpm dev command is run it spins up a forked version of mainnet (with a chainId of 31337 which makes everything else keep working) and all the associated MUD contracts used during the normal dev process will be deployed there for use.

Relayer With MUD

The most recent Balancer Relayer V5 code can be found here. In the Hackathon spirit I decided to develop a very simple (and unsafe) version (I initially tried replicating the Relayer/Library/Multicall approach used by Balancer but had issues with proxy permissions on the store that I didn’t have time to solve). It allows a user to execute a singleSwap. The complete code is shown below:

import { System } from "@latticexyz/world/src/System.sol";
import { Swap } from "../codegen/Tables.sol";
import { IVault } from "@balancer-labs/v2-interfaces/contracts/vault/IVault.sol";
import "@balancer-labs/v2-interfaces/contracts/standalone-utils/IBalancerRelayer.sol";

contract RelayerSystem is System {
  IVault private immutable _vault;

  constructor() {
    _vault = IVault(address(0xBA12222222228d8Ba445958a75a0704d566BF2C8));
  }

  function getVault() public view returns (IVault) {
    return _vault;
  }

  function swap(
        IVault.SingleSwap memory singleSwap,
        IVault.FundManagement calldata funds,
        uint256 limit,
        uint256 deadline,
        uint256 value
    ) external payable returns (uint256) {
        require(funds.sender == msg.sender || funds.sender == address(this), "Incorrect sender");
        uint256 result = getVault().swap{ value: value }(singleSwap, funds, limit, deadline);
        bytes32 key = bytes32(abi.encodePacked(block.number, msg.sender, gasleft()));
        Swap.set(key, address(singleSwap.assetIn), address(singleSwap.assetOut), singleSwap.amount, result);
        return result;
  }
}

I think the simplicity of the code snippet really demonstrates the ease of development using MUD. By simply inheriting from the MUD System I can read and write to the MUD Store. In this case I want to write the assetIn, assetOut, amount and result for the trade being executed into the Swap table in the store where it can be consumed by whoever (see the Front End section below to see how). I do this in:

Swap.set(key, address(singleSwap.assetIn), address(singleSwap.assetOut), singleSwap.amount, result);

To set up the Swap table all I have to do is edit the mud.config.ts file to look like:

// Import the config helper from the MUD template
import { mudConfig } from "@latticexyz/world/register";

export default mudConfig({
  tables: {
    Swap: {
      schema: {
        assetIn: "address",
        assetOut: "address",
        amount: "uint256",
        amountReturned: "uint256"
      }
    }
  },
});

The rest (including deployment, etc) is all taken care of by the framework 👏

Permissions

Before I can execute swaps, etc. there is some housekeeping to take care of. Any Balancer Relayer must be granted permission via Governance before it can be used with the Vault. In practice this means that the Authoriser grantRoles(roles, relayer) function must be called from a Governance address. By checking out previous governance actions we can see the DAO Multisig has previously been used to grant roles to relayers. Using hardhat_impersonateAccount on our fork we can send the transaction as if it was from the DAO and grant the required roles to our Relayer. In our case the World calls the Relayer by proxy so we grant the role to the world address (not safe in the real world :P).

async function grantRelayerRoles(account: string) {
    const rpcUrl = `http://127.0.0.1:8545`;
    const provider = new JsonRpcProvider(rpcUrl);
    // These are the join/exit/swap roles for Vault
    const roles = ["0x1282ab709b2b70070f829c46bc36f76b32ad4989fecb2fcb09a1b3ce00bbfc30", "0xc149e88b59429ded7f601ab52ecd62331cac006ae07c16543439ed138dcb8d34", "0x78ad1b68d148c070372f8643c4648efbb63c6a8a338f3c24714868e791367653", "0xeba777d811cd36c06d540d7ff2ed18ed042fd67bbf7c9afcf88c818c7ee6b498", "0x0014a06d322ff07fcc02b12f93eb77bb76e28cdee4fc0670b9dec98d24bbfec8", "0x7b8a1d293670124924a0f532213753b89db10bde737249d4540e9a03657d1aff"];
    // We impersonate the Balancer Governance Safe address as it is authorised to grant roles
    await provider.send('hardhat_impersonateAccount', [governanceSafeAddr]);
    const signer = provider.getSigner(governanceSafeAddr);

    const authoriser = new Contract(authoriserAddr, authoriserAbi, signer);

    const canPerformBefore = await authoriser.callStatic.canPerform(roles[0], account, balancerVaultAddr);

    // Grants the set roles for the account to perform on behalf of users
    const tx = await authoriser.grantRoles(roles, account);
    await tx.wait();
    const canPerformAfter = await authoriser.callStatic.canPerform(roles[0], account, balancerVaultAddr);
    console.log(canPerformBefore, canPerformAfter);
}

The World address is updated each time a change is made to contracts, etc., so it’s useful to use a helper:

import worldsJson from "../../contracts/worlds.json";

export function getWorldAddress(): string {
    const worlds = worldsJson as Partial<Record<string, { address: string; blockNumber?: number }>>;
    const world = worlds['31337'];
    if(!world) throw Error('No World Address');
    return world.address;
}

The Relayer must also be approved by the user who is executing the swap. In this case I select a user account that I know already has some funds and approvals for Balancer Vault. That account must call setRelayerApproval(account, relayer, true) on the Balancer Vault.

async function approveRelayer(account: string, relayer: string) {
    const rpcUrl = `http://127.0.0.1:8545`;
    const provider = new JsonRpcProvider(rpcUrl);
    await provider.send('hardhat_impersonateAccount', [account]);
    const signer = provider.getSigner(account);
    const vault = new Contract(balancerVaultAddr, vaultAbi, signer);
    const tx = await vault.setRelayerApproval(account, relayer, true);
    await tx.wait();
    const relayerApproved = await vault.callStatic.hasApprovedRelayer(account, relayer);
    console.log(`relayerApproved: `, relayerApproved);
}

In packages/helpers/src/balancerAuth.ts there’s a helper script that can be run using pnpm auth which handles all this and it should be run each time a new World is deployed.

Front End

Disclaimer – my front-end UI is ugly and some of the code is hacky, but it works! The idea here was to just show a super simple UI that updates anytime a swap is made through our relayer.

To trigger a swap via the UI I’ve got a simple button wired up to a systemCall:

const worldSwap = async (poolId: string, assetIn: string, assetOut: string, amount: string) => {
    const rpcUrl = `http://127.0.0.1:8545`;
    const provider = new JsonRpcProvider(rpcUrl);
    // Impersonates testAccount which we know has balances for swapping
    await provider.send('hardhat_impersonateAccount', [testAccount]);
    const signer = provider.getSigner(testAccount);
    // kind '0' = GivenIn (amount specifies the input token amount)
    const singleSwap = {
        poolId,
        kind: '0',
        assetIn,
        assetOut,
        amount,
        userData: '0x'
    };
    const funds = {
        sender: testAccount,
        fromInternalBalance: false,
        recipient: testAccount,
        toInternalBalance: false
    };
    const limit = '0';
    const deadline = '999999999999999999';
    console.log(`Sending swap...`);
    const tx = await worldContract.connect(signer).swap(singleSwap, funds, limit, deadline, '0');
    await tx.wait();
    console.log(`Swap confirmed`);
};

I took the approach of impersonating the test account that we previously set up the Relayer permission for, to avoid the UX of approving, etc. via the UI. We just submit the swap data via the worldContract, which proxies the call to the Relayer.

To display the swap data from the Store I use the storeCache, which is typed and reactive. A simplified snippet shows how:

import { useRows } from "@latticexyz/react";
import { useMUD } from "./MUDContext";

export const App = () => {
  const {
    systemCalls: { worldSwap },
    network: { storeCache },
  } = useMUD();

  const swaps = useRows(storeCache, { table: "Swap" });
  
  return (
    <>
      ...
      <div>Swaps:</div>
      <ul>
        {swaps.map(({ value }, index) => (
          <li key={index}>
            Amount In: {value.amount.toString()} Amount Out: {value.amountReturned.toString()}
          </li>
        ))}
      </ul>
    </>
  );
};

(One other hack I had to make to get it working: in packages/client/src/mud/getNetworkConfig.ts I had to update the initialBlockNumber to 17295542.)

To demonstrate the reactive nature I also added another helper script that executes a swap with a random amount (see: packages/helpers/src/worldSwap.ts). This can be run using `pnpm swap`, and it's awesome to see the UI update automatically. I also really like the MUD Dev Tools, which show the Store updating.
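The helper amounts to roughly the following sketch (the account, pool and token constants are placeholders and the World ABI location is a hypothetical path, not the real script values):

// Sketch of packages/helpers/src/worldSwap.ts — illustrative only.
import { Contract } from 'ethers';
import { JsonRpcProvider } from '@ethersproject/providers';
import { parseUnits } from '@ethersproject/units';
import { getWorldAddress } from './getWorldAddress';
import worldAbi from './abi/world.json'; // hypothetical ABI location

const testAccount = '0x0000000000000000000000000000000000000001'; // placeholder
const poolId = '0x...'; // placeholder pool id
const [assetIn, assetOut] = ['0x...', '0x...']; // placeholder token addresses

async function randomSwap() {
    const provider = new JsonRpcProvider('http://127.0.0.1:8545');
    await provider.send('hardhat_impersonateAccount', [testAccount]);
    const signer = provider.getSigner(testAccount);
    const world = new Contract(getWorldAddress(), worldAbi, signer);
    // Random amount between 1 and 10 tokens (18 decimals)
    const amount = parseUnits((1 + Math.random() * 9).toFixed(4), 18);
    const singleSwap = { poolId, kind: '0', assetIn, assetOut, amount: amount.toString(), userData: '0x' };
    const funds = { sender: testAccount, fromInternalBalance: false, recipient: testAccount, toInternalBalance: false };
    const tx = await world.swap(singleSwap, funds, '0', '999999999999999999', '0');
    await tx.wait();
    console.log(`Random swap of ${amount.toString()} submitted`);
}

randomSwap();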

Composability

I think one of the most exciting and original aspects of Autonomous Worlds is the opportunity for composability. With the standardisation of data formats in the MUD Store, experimentation is made easier. As an extremely basic demonstration of this I thought it was cool to show how the swap data could be used in another, non-defi related app like a game. In this case I implemented the famous Google Dino hopper game, where a cactus is encountered whenever a swap is made. We can import the swap data as before and trigger a cactus whenever a new swap record is added. (See packages/client/src/dino for the implementation.)
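A sketch of how such a trigger can work (the spawnCactus helper and component wiring are hypothetical simplifications of what lives in packages/client/src/dino): watch the Swap table and spawn a cactus whenever the row count grows.

// Sketch: react to new Swap records (spawnCactus is a hypothetical game helper)
import { useEffect, useRef } from 'react';
import { useRows } from '@latticexyz/react';
import { useMUD } from './MUDContext';
import { spawnCactus } from './dino/game'; // hypothetical

export const DinoTrigger = () => {
  const { network: { storeCache } } = useMUD();
  const swaps = useRows(storeCache, { table: 'Swap' });
  const seen = useRef(0);

  useEffect(() => {
    // A new swap record was added — throw a cactus at the dino
    if (swaps.length > seen.current) spawnCactus();
    seen.current = swaps.length;
  }, [swaps.length]);

  return null;
};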

Although basic, hacky and ugly, it demonstrates how an Autonomous World of composable games, defi and data can start to develop. The really cool thing is that nobody knows how it will take shape! MUD is a super cool tool and I'm excited to see it develop.

Building an SDK v1.0.1-beta.13 – Typechain

Intro

The TypeChain project provides developers with a tool to generate TypeScript typings for the smart contracts they are interacting with. This gives all the usual benefits of typing – for example, flagging an error if you try to call a function that doesn't exist on the smart contract.

In the SDK we were using the @balancer-labs/typechain package, which aims to provide TypeChain bindings for the most commonly used Balancer contracts, but we decided it would be better to remove this dependency and generate the bindings as needed. This enables us to stay up to date with new contracts (e.g. Relayers) without waiting for package support.

Making The Changes

TypeChain is really pretty easy to use, but we had to make a few additional changes to the SDK.

ABIs
To generate the typed wrappers TypeChain uses the smart contract ABIs. These were added in src/lib/abi. They can be found in the balancer-v2-monorepo, or on Etherscan if the contract is already deployed/verified.

Targets
TypeChain will generate appropriate code for a given web3 library. In the SDK we use ethers.js, so we need to make sure the @typechain/ethers-v5 package is added to our dev dependencies. (See the other available targets here.)

CLI Command
To actually generate the files we need to run the typechain command and specify the correct target, path to ABIs, and out path. For example:

typechain --target ethers-v5 --out-dir src/contracts './src/lib/abi/Vault.json'

This will target ethers and use the Vault ABI to generate the bindings in the src/contracts dir. You can see the full CLI docs here.

It's recommended that the generated files are not committed to the codebase, so we add src/contracts/ to .gitignore. In package.json a helper is added to scripts:

"typechain:generate": "npx typechain --target ethers-v5 --out-dir src/contracts './src/lib/abi/Vault.json' './src/lib/abi/WeightedPoolFactory.json' './src/lib/abi/BalancerHelpers.json' './src/lib/abi/LidoRelayer.json' './src/lib/abi/WeightedPool.json'"

and the CI is updated to call this command post-install.

Updating the code
The last change was removing the old package and replacing any references to it. This is almost a direct replacement and just requires updating imports to use the new contracts path, e.g.:

// Old
import { BalancerHelpers__factory } from "@balancer-labs/typechain";
// New
import { BalancerHelpers__factory } from '@/contracts/factories/BalancerHelpers__factory';

// Example of use
this.balancerHelpers = BalancerHelpers__factory.connect(
  this.contractAddresses.balancerHelpers,
  provider
);

Example Of The Benefits

During the updates, one of the benefits was highlighted. A previous example was incorrectly calling the queryExit function on the BalancerHelpers contract. Although this function is used like a view, it is actually a special case that must be made with an eth_call (see here for more info). This led to a type warning when trying to access the response. After correctly updating to use callStatic, the response typing matched what was expected.

// Incorrect version
const response = await contracts.balancerHelpers.queryExit(...);
expect(response.amountsIn)....
// Shows: Property 'amountsIn' does not exist on type 'ContractTransaction'.

// Correct version
const response = await contracts.balancerHelpers.callStatic.queryExit(...);
expect(response.amountsIn)....
/*
Shows:
const response: [BigNumber, BigNumber[]] & {
    bptOut: BigNumber;
    amountsIn: BigNumber[];
}
*/


Building an SDK v0.1.30 – Swaps With Pool Joins & Exits

In the Balancer Smart Order Router (SOR) we try to find the best “path” to trade from one token to another. Until recently we only considered paths consisting of swaps, but the Relayer allows us to combine swaps with other actions like pool joins and exits, and this opens up new paths to consider.

Pools, Actions and BPTs

Let's take a look at the humble 80/20 BAL/WETH weighted Balancer pool and see some of the associated actions.

A token holder can join a Balancer pool by depositing tokens into it using the joinPool function on the vault. In return they receive a Balancer Pool Token (BPT) that represents their share in this pool. A user can join with a single token or a combination of tokens, as long as the tokens used already exist in the pool.

A BPT holder can exit the pool at any time by providing the BPT back to the Vault using the exitPool function. They can exit to one token or a combination of the pool tokens.
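To make the join concrete, here's a rough sketch of a joinPool call (ethers v5). The userData encoding follows the weighted pool EXACT_TOKENS_IN_FOR_BPT_OUT join kind; the vault instance, signer, addresses and amounts are illustrative assumptions:

// Sketch of a joinPool call — illustrative values, not production code.
import { defaultAbiCoder } from '@ethersproject/abi';
import { parseUnits } from '@ethersproject/units';

const assets = [BAL, WETH]; // pool tokens, ordered as registered in the pool
const maxAmountsIn = [parseUnits('80', 18), parseUnits('1', 18)];
// JoinKind 1 = EXACT_TOKENS_IN_FOR_BPT_OUT; last arg is minimum BPT out (0 for demo only)
const userData = defaultAbiCoder.encode(
    ['uint256', 'uint256[]', 'uint256'],
    [1, maxAmountsIn, 0]
);
await vault.joinPool(poolId, sender, recipient, {
    assets,
    maxAmountsIn,
    userData,
    fromInternalBalance: false,
});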

In the Balancer veSystem, users lock the BPT of the 80/20 BAL/WETH weighted Balancer pool. This is cool because it ensures that even if a large portion of BAL tokens is locked, there is deep liquidity that can be used for swaps.

A “normal” token swap against the 80/20 pool would usually just involve swapping tokens that exist in the pool, e.g. swapping BAL to WETH. This can be achieved by calling the `swap` function on the Balancer Vault.

We also have multihop swaps that chain together swaps across different pools, which in Balancer's case is super efficient because of the Vault architecture. This can be achieved by calling the `batchSwap` function on the Vault.
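As a hedged sketch, a two-hop batchSwap looks roughly like this (pool ids, token addresses, vault instance and sender are illustrative assumptions):

// Sketch of a two-hop batchSwap on the Vault (ethers v5) — illustrative only.
import { parseUnits } from '@ethersproject/units';

const assets = [BAL, WETH, USDC]; // token addresses, indexed by the steps below
const swaps = [
    // hop 1: 100 BAL -> WETH (amount in, since this is a GIVEN_IN swap)
    { poolId: poolAId, assetInIndex: 0, assetOutIndex: 1, amount: parseUnits('100', 18), userData: '0x' },
    // hop 2: WETH -> USDC; amount 0 means "use the output of the previous hop"
    { poolId: poolBId, assetInIndex: 1, assetOutIndex: 2, amount: '0', userData: '0x' },
];
const funds = { sender, fromInternalBalance: false, recipient: sender, toInternalBalance: false };
// Positive limit = max token in, negative = min token out
const limits = [parseUnits('100', 18), 0, parseUnits('-90', 6)];
const deadline = Math.floor(Date.now() / 1000) + 600;

await vault.batchSwap(0 /* SwapKind GIVEN_IN */, swaps, assets, funds, limits, deadline);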

BPT is actually an ERC20 compatible token, which means it has the same approve, transfer and balance functionality as any other ERC20. This means it can itself be a token within another Balancer pool. This opens up a whole world of interesting use cases, like Boosted Pools. Another example is the auraBal stable pool.

Aura

There’s lots of detailed info in the veBal and Aura docs but as a quick summary:

veBAL (vote-escrow BAL) is a vesting and yield system based on Curve's veCRV system. Users lock the 80/20 BPT and gain voting power and protocol rewards.

Aura Finance is a protocol built on top of the Balancer system to provide maximum incentives to Balancer liquidity providers and BAL stakers.

auraBAL is tokenised veBAL, and the stable pool consists of auraBal and the 80/20 BPT. Now if a user wants to trade auraBal to WETH they can do a multihop swap like: auraBAL → 80/20 BPT (via the stable pool) → WETH (via the BPT/WETH pool).

For larger trades this requires deep liquidity in the BPT/WETH pool, which in the Aura case hasn't always been available. But there is another potential path, using a pool exit, that can make use of the deep liquidity locked in the 80/20 pool: swap auraBAL to the 80/20 BPT via the stable pool, then exit the BPT to WETH.

With the similar join path also being available in the opposite direction: join the 80/20 pool with WETH to receive BPT, then swap the BPT to auraBAL via the stable pool.

Updating The Code

So we can see that adding support for these additional paths is definitely useful but it requires some changes to the existing code.

SOR Path Discovery

First we need to adapt the SOR so it considers joins/exits as part of a viable path. An elegant and relatively easy to implement solution was suggested by Fernando. Some pools have pre-minted (or phantom) BPT, which basically means the pool contains its own BPT in its tokens list. This means a swap can be used to trade to or from a pool token to join or exit, respectively. We can make the SOR consider non pre-minted pools in the same way by artificially adding the BPT to the pool token list.

if (useBpts) {
    for (const pool of pools) {
        if (
            pool.poolType === 'Weighted' ||
            pool.poolType === 'Investment'
        ) {
            const BptAsToken: SubgraphToken = {
                address: pool.address,
                balance: pool.totalShares,
                decimals: 18,
                priceRate: '1',
                weight: '0',
            };
            pool.tokens.push(BptAsToken);
            pool.tokensList.push(pool.address);
        }
    }
}

We also have to make sure that each pool has the relevant maths for BPT<>token swaps. Once these are added, the SOR can create the relevant paths and will use the existing algorithm to determine the best price.
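For example, the weighted pool's token→BPT "swap" (a join) follows the standard single-token join formula; below is a minimal sketch, ignoring swap fees, and not the exact fixed-point implementation used onchain:

// Sketch of the single-token join maths for a weighted pool (no swap fees).
function bptOutGivenExactTokenIn(
    balance: number,        // pool balance of the token joined with
    weight: number,         // normalized weight of that token (e.g. 0.8)
    amountIn: number,       // amount of the token being joined
    bptTotalSupply: number  // current BPT total supply
): number {
    // bptOut = S * ((1 + amountIn / balance)^weight - 1)
    const balanceRatio = 1 + amountIn / balance;
    return bptTotalSupply * (Math.pow(balanceRatio, weight) - 1);
}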

Call Construction

Paths containing only swaps can be submitted directly to the Vault batchSwap function. A combination of swaps with joins/exits cannot – they have to be submitted via the Relayer multicall function. We wanted to keep the SOR focused on path finding, so we added some helper functions to the SDK.

The first function, `someJoinExit`, checks whether the paths returned from the SOR need to be submitted via the Vault (e.g. swaps only) or the Relayer (swaps and joins/exits). We can do this by checking if any of the hops involve a weighted pool with one of the tokens being the pool BPT. This works on the assumption that the weighted pools are not pre-minted.

// Use SOR to get swap information
const swapInfo = await sor.getSwaps(tokenIn, tokenOut, ...);
// Checks if path contains join/exit action
const useRelayer = someJoinExit(pools, swapInfo.swaps, swapInfo.tokenAddresses)

The second, buildRelayerCalls, formats the path data into a set of calls that can be submitted to the Relayer multicall function.

First it creates an action for each part of the path – swap, join or exit – using getActions:

  // For each 'swap' create a swap/join/exit action
  const actions = getActions(
    swapInfo.tokenIn,
    swapInfo.tokenOut,
    swapInfo.swaps,
    swapInfo.tokenAddresses,
    slippage,
    pools,
    user,
    relayerAddress
  );

which uses the isJoin and isExit functions:

// Finds if a swap returned by SOR is a join by checking if tokenOut === poolAddress
export function isJoin(swap: SwapV2, assets: string[]): boolean {  
  // token[join]bpt
  const tokenOut = assets[swap.assetOutIndex];
  const poolAddress = getPoolAddress(swap.poolId);
  return tokenOut.toLowerCase() === poolAddress.toLowerCase();
}

// Finds if a swap returned by SOR is an exit by checking if tokenIn === poolAddress
export function isExit(swap: SwapV2, assets: string[]): boolean {
  // bpt[exit]token
  const tokenIn = assets[swap.assetInIndex];
  const poolAddress = getPoolAddress(swap.poolId);
  return tokenIn.toLowerCase() === poolAddress.toLowerCase();
}

Then these actions are ordered and grouped. The first step is to categorize each action as Join, Middle or Exit, as this determines the order in which the actions can be done:

export function categorizeActions(actions: Actions[]): Actions[] {
  const enterActions: Actions[] = [];
  const exitActions: Actions[] = [];
  const middleActions: Actions[] = [];
  for (const a of actions) {
    if (a.type === ActionType.Exit || a.type === ActionType.Join) {
      // joins/exits with tokenIn can always be done first
      if (a.hasTokenIn) enterActions.push(a);
      // joins/exits with tokenOut (and not tokenIn) can always be done last
      else if (a.hasTokenOut) exitActions.push(a);
      else middleActions.push(a);
    }
    // All other actions will be chained in between
    else middleActions.push(a);
  }
  const allActions: Actions[] = [
    ...enterActions,
    ...middleActions,
    ...exitActions,
  ];
  return allActions;
}

The second step is to batch all sequential swaps together. This should minimise gas cost by making use of the batchSwap function. We use the batchSwapActions function to do this:

const orderedActions = batchSwapActions(categorizedActions, assets);

and it essentially checks whether subsequent swaps have the same source/destination – if they do, they can be batched together, and the relevant assets and limits arrays are updated.
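A simplified sketch of that merge check (illustrative types, not the exact SDK implementation):

// Sketch: merge consecutive swap actions that share source and destination.
type SwapStep = { poolId: string; assetInIndex: number; assetOutIndex: number; amount: string };
type SwapAction = { type: 'swap'; source: string; destination: string; swaps: SwapStep[] };
type OtherAction = { type: 'join' | 'exit' };
type Action = SwapAction | OtherAction;

function batchSwapActionsSketch(actions: Action[]): Action[] {
    const out: Action[] = [];
    for (const action of actions) {
        const prev = out[out.length - 1];
        if (
            prev !== undefined &&
            prev.type === 'swap' &&
            action.type === 'swap' &&
            prev.source === action.source &&
            prev.destination === action.destination
        ) {
            // Same source/destination: extend the existing batchSwap
            prev.swaps.push(...action.swaps);
        } else {
            out.push(action);
        }
    }
    return out;
}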

Each of the ordered actions is encoded to its relevant call data. And finally the Relayer multicall is encoded.

  const callData = balancerRelayerInterface.encodeFunctionData('multicall', [
    calls,
  ]);

And here’s a full example showing how the new functions can be used:

/**
 * Example showing how to find a swap for a pair using SOR directly
 * - Path only uses swaps: use queryBatchSwap on Vault to see result
 * - Path uses join/exit: use SDK functions to build calls to submit tx via Relayer
 */
import dotenv from 'dotenv';
import { BigNumber, parseFixed } from '@ethersproject/bignumber';
import { Wallet } from '@ethersproject/wallet';
import { AddressZero } from '@ethersproject/constants';
import {
  BalancerSDK,
  Network,
  SwapTypes,
  someJoinExit,
  buildRelayerCalls,
  canUseJoinExit,
} from '../src/index';
import { ADDRESSES } from '../src/test/lib/constants';

dotenv.config();

async function getAndProcessSwaps(
  balancer: BalancerSDK,
  tokenIn: string,
  tokenOut: string,
  swapType: SwapTypes,
  amount: BigNumber,
  useJoinExitPaths: boolean
) {
  const swapInfo = await balancer.swaps.sor.getSwaps(
    tokenIn,
    tokenOut,
    swapType,
    amount,
    undefined,
    useJoinExitPaths
  );
  if (swapInfo.returnAmount.isZero()) {
    console.log('No Swap');
    return;
  }
  // console.log(swapInfo.swaps);
  // console.log(swapInfo.tokenAddresses);
  console.log(`Return amount: `, swapInfo.returnAmount.toString());
  const pools = balancer.swaps.sor.getPools();
  // someJoinExit will check if swaps use joinExit paths which needs additional formatting
  if (
    useJoinExitPaths &&
    someJoinExit(pools, swapInfo.swaps, swapInfo.tokenAddresses)
  ) {
    console.log(`Swaps with join/exit paths. Must submit via Relayer.`);
    const key: any = process.env.TRADER_KEY;
    const wallet = new Wallet(key, balancer.sor.provider);
    const slippage = '50'; // 50 bsp = 0.5%
    try {
      const relayerCallData = buildRelayerCalls(
        swapInfo,
        pools,
        wallet.address,
        balancer.contracts.relayerV3!.address,
        balancer.networkConfig.addresses.tokens.wrappedNativeAsset,
        slippage,
        undefined
      );
      // Static calling Relayer doesn't return any useful values but will allow confirmation tx is ok
      // relayerCallData.data can be used to simulate tx on Tenderly to see token balance change, etc
      // console.log(wallet.address);
      // console.log(await balancer.sor.provider.getBlockNumber());
      // console.log(relayerCallData.data);
      const result = await balancer.contracts.relayerV3
        ?.connect(wallet)
        .callStatic.multicall(relayerCallData.rawCalls);
      console.log(result);
    } catch (err: any) {
      // If error we can reprocess without join/exit paths
      console.log(`Error Using Join/Exit Paths`, err.reason);
      await getAndProcessSwaps(
        balancer,
        tokenIn!,
        tokenOut!,
        swapType,
        amount,
        false
      );
    }
  } else {
    console.log(`Swaps via Vault.`);
    const userAddress = AddressZero;
    const deadline = BigNumber.from(`${Math.ceil(Date.now() / 1000) + 60}`); // 60 seconds from now
    const maxSlippage = 50; // 50 bsp = 0.5%
    const transactionAttributes = balancer.swaps.buildSwap({
      userAddress,
      swapInfo,
      kind: 0,
      deadline,
      maxSlippage,
    });
    const { attributes } = transactionAttributes;
    try {
      // Simulates a call to `batchSwap`, returning an array of Vault asset deltas.
      const deltas = await balancer.contracts.vault.callStatic.queryBatchSwap(
        swapType,
        swapInfo.swaps,
        swapInfo.tokenAddresses,
        attributes.funds
      );
      console.log(deltas.toString());
    } catch (err) {
      console.log(err);
    }
  }
}

async function swapExample() {
  const network = Network.MAINNET;
  const rpcUrl = `https://mainnet.infura.io/v3/${process.env.INFURA}`;
  const tokenIn = ADDRESSES[network].WETH.address;
  const tokenOut = ADDRESSES[network].auraBal?.address;
  const swapType = SwapTypes.SwapExactIn;
  const amount = parseFixed('18', 18);
  // Currently Relayer only suitable for ExactIn and non-eth swaps
  const canUseJoinExitPaths = canUseJoinExit(swapType, tokenIn!, tokenOut!);
  const balancer = new BalancerSDK({
    network,
    rpcUrl,
  });
  await balancer.swaps.sor.fetchPools();
  await getAndProcessSwaps(
    balancer,
    tokenIn!,
    tokenOut!,
    swapType,
    amount,
    canUseJoinExitPaths
  );
}

// yarn examples:run ./examples/swapSor.ts
swapExample();
