A Sub-Validator Protocol For Network Efficiency

Not really certain how to tag things in this forum, but here goes nothing! :sweat_smile:

One issue we’ve been seeing is that most of the network relies on a handful of RPC providers: Slingshot, Neobase, and Chandra Station, to name a few. This results in suboptimal network performance, since the ideal scenario would be that everyone runs their own node. In an effort to encourage users to run their own nodes and simplify the process, I wrote a simple bash script; however, most people’s response to this has been “what’s in it for me?”, which I interpret as “is there an incentive mechanism behind running a node?”

In short, since users expect some sort of incentive to run the software, I think it would be in our best interest to design a sub-validator protocol wherein a single primary validator can accept arrays of transactions, so to speak, gathered from full-node operators. In exchange, this primary validator agrees to share the proceeds of its network rewards with the full-node operators that submit the transaction arrays. The share of proceeds is an arbitrary value set by the validator.

For example, say I am a validator who receives 100 CANTO in rewards, accrued with the help of two sub-validators. Of that 100 CANTO, I distribute 50 to my sub-validator network, resulting in 25 CANTO going to sub-validator A and 25 CANTO going to sub-validator B.
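To make the arithmetic concrete, here’s a minimal sketch of that payout logic. The function name, the `share_pct` parameter, and the equal split among sub-validators are my own assumptions for illustration, not part of any spec:

```python
def split_rewards(total_reward: float, share_pct: float,
                  sub_validators: list[str]) -> dict[str, float]:
    """Split a validator's reward with its sub-validators.

    `share_pct` is the (arbitrary, validator-chosen) fraction of rewards
    forwarded to sub-validators; the remainder stays with the validator.
    """
    pool = total_reward * share_pct
    per_sub = pool / len(sub_validators)
    payouts = {sv: per_sub for sv in sub_validators}
    payouts["validator"] = total_reward - pool
    return payouts

# Example from the post: 100 CANTO, 50% shared, two sub-validators.
print(split_rewards(100, 0.5, ["A", "B"]))
# → {'A': 25.0, 'B': 25.0, 'validator': 50.0}
```

A real design would presumably weight payouts by each sub-validator’s contribution rather than splitting evenly.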

In order to ensure the legitimacy of this state transition between respective parties, the array of transactions is wrapped in a simple checksum.
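One way the “simple checksum” could look, as a sketch. SHA-256 over a canonical JSON serialization is my assumption here; the post doesn’t specify a hash function or encoding:

```python
import hashlib
import json

def wrap_with_checksum(txs: list[dict]) -> dict:
    """Wrap a transaction array with a checksum over its canonical form."""
    # Canonical serialization so both parties compute the same digest.
    payload = json.dumps(txs, sort_keys=True, separators=(",", ":")).encode()
    return {"txs": txs, "checksum": hashlib.sha256(payload).hexdigest()}

def verify(batch: dict) -> bool:
    """Recompute the digest and compare against the attached checksum."""
    payload = json.dumps(batch["txs"], sort_keys=True,
                         separators=(",", ":")).encode()
    return hashlib.sha256(payload).hexdigest() == batch["checksum"]
```

Note that a bare checksum only detects accidental corruption; authenticating *who* submitted the array would additionally require a signature from the sub-validator.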

To expand on this, the array of arrays could be considered something of a rudimentary mempool, which would then allow for the construction of more efficient blocks and also potentially allow for the introduction of MEV in some form or another.

I’d also like to close this out by noting that this is not a significant issue for the time being, and perhaps this solution would be more appropriate for a more mature EVM-based ecosystem. This is the basic thought I’ve been turning over in my head for the past couple of months, and I’d like to get feedback and opinions from the community. Has there been any prior work in this design space?



First thing that came to mind is to borrow Sei’s Access DAG parallel processing (sei-chain/Sei_Whitepaper.pdf at master · sei-protocol/sei-chain · GitHub) and distribute potential parallel EVM calls to sub-validators.

But it would require transactions to declare which state storage keys they’re accessing. Failure to declare could be penalised.


Txs hit sub-validators first:

If a tx doesn’t declare its storage access keys, it’s simulated by the sub-validator first to determine those keys, and the caller is penalised for not declaring them. The tx message, now annotated with storage keys, is then forwarded to the validator.

If a tx has declared its storage keys, the message is simply passed through to the validator.

The validator forms the Access DAG, composes an execution plan, then distributes work among the sub-validators. Sub-validators execute and return state diffs to the validator. The validator applies the state transition and creates a new block.
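The flow above could be sketched roughly as follows. This is a toy model under my own assumptions: all names are invented, and the greedy “leveling” heuristic below is a stand-in for Sei’s actual Access DAG construction, which is more sophisticated:

```python
def route_tx(tx: dict, simulate) -> dict:
    """Sub-validator ingress: ensure every tx carries its storage access keys.

    If the caller didn't declare keys, run `simulate` to discover them and
    flag the caller for a penalty, then forward the annotated tx.
    """
    if not tx.get("storage_keys"):
        tx = {**tx, "storage_keys": simulate(tx), "penalize_caller": True}
    return tx

def build_execution_plan(txs: list[dict]) -> list[list[dict]]:
    """Group txs into batches that can run in parallel.

    Txs touching disjoint storage keys share a batch; a tx conflicting with
    an earlier batch is placed after the last batch it conflicts with, so
    submission order is preserved across conflicts.
    """
    batches: list[list[dict]] = []
    for tx in txs:
        keys = set(tx["storage_keys"])
        level = 0
        for i, batch in enumerate(batches):
            if any(keys & set(other["storage_keys"]) for other in batch):
                level = i + 1
        if level == len(batches):
            batches.append([])
        batches[level].append(tx)
    return batches
```

In this sketch, each batch is a set of calls the validator could farm out to sub-validators in parallel, collecting state diffs before moving to the next batch.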


That is an interesting approach, as more nodes would better secure Canto. However, why would a validator be willing to participate in this protocol? What would be in it for validators to join the sub-validator protocol? Without their participation it will not work. In addition, if someone doesn’t have an incentive to run a node for Canto in the current circumstances, in which in many cases you need your own node to execute transactions, I really don’t know what would be an incentive. Also, in Tendermint there is a cap on the number of validators that can validate the chain (I think 200?). How would the sub-validator protocol impact block validation for Canto in terms of the number of validators?


hmmm i think in this context, sub-validators wouldn’t be considered validators. Validator rewards would be split between the validator & its “sub-validators”; i suppose that’s the incentive.


A validator would be willing to participate in this protocol because the blocks they build would be heavier, meaning more optimally filled, which would then result in a higher amount of rewards.

On another note, I asked a friend for feedback on this post and he said:

Sub-validators are called “builders” and “searchers” in Ethereum-land and the vast majority of the value (MEV) accrues to validators. The reason is that builders and searchers compete in a race to zero profits.

Admittedly, I’m a bit behind on the terminology because I’ve been down the DeFi rabbit hole for so long that I feel I’m now missing the forest for the trees, but this brings up the issue of MEV and whether or not MEV is a symptom of an optimal market. Do we want MEV on Canto? Are we using Ethereum as a model of a healthy market? If not, what do we want to change? If so, what components are we trying to build upon?


You have raised some important points here:
MEV - that is a billion-dollar question. While MEV plays an important part in Ethereum, it has also created a fair amount of friction and negative sentiment. On the other hand, Cosmos chains like Osmosis are planning to make mempool transactions private so that searchers will not be able to gain an advantage, while others are advocating for protocol/chain-level MEV whose benefits would be shared with the community instead of with searchers, block proposers, and validators. Currently in Cosmos chains, blocks are built based on the order in which transactions are received, but I presume there is some way to gain an advantage, though likely not as big as in Ethereum. In addition, MEV could be risky from a regulatory perspective, as front-running is not allowed in many financial markets/jurisdictions. Increased censorship of transactions by block builders has also been noticed in Ethereum.

Sub-validators - while there may be an incentive for validators to participate in the sub-validator protocol, that will depend on whether the 5% bonus a proposer would get for a “heavy block” is sufficient to compensate both the sub-validators and the validators. I believe an important point here, and the main differentiator from Ethereum apart from MEV, is that in Ethereum the proposer is selected randomly. In Canto the proposing validator is selected based on the amount of votes/staked tokens, which will naturally lead sub-validators to join the big Canto validators, resulting in more centralization. Currently 80% of staked CANTO is with the top-10 validators, so without a design that incentivizes wider distribution of delegated tokens, this could lead to even more blocks being proposed by a very small number of validators.


The reward amount for a sub-validator would be a variable set by the validator itself. Since this could be 10% or 100%, it creates a new game, so to speak, between validators: those with lower amounts of staked assets would be more incentivized to offer a much higher reward percentage. A validator with a small amount of staked tokens might give away 100% of its total reward until it accrues some degree of critical mass, at which point it could lower that reward threshold to something more reasonable. The inverse would apply to validators with higher amounts, and so on.
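That stake-dependent game could be captured with a toy heuristic like the one below. Everything here is my own assumption for illustration: the function name, the choice of the network median as the reference point, and the linear interpolation between a floor share and 100%:

```python
def reward_share(own_stake: float, median_stake: float,
                 floor: float = 0.10) -> float:
    """Fraction of rewards a validator offers its sub-validators.

    The less stake a validator has relative to the network median, the
    larger the share it offers, interpolating linearly from 100% (no
    stake) down to `floor` (at or above the median).
    """
    if own_stake <= 0:
        return 1.0
    ratio = min(own_stake / median_stake, 1.0)
    return max(floor, 1.0 - ratio * (1.0 - floor))
```

So a brand-new validator offers everything, one at 10% of the median offers 91%, and an established validator settles at the 10% floor. An actual market would let validators set this value freely, as described above; this just shows the shape of the incentive.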

In regard to generalized MEV, I think that’s still an aside and more of a philosophical question, but my gut instinct tells me that once a network reaches a certain size, MEV is a necessary function of a more efficient market.


yes, this is correct.


Yea i agree with your last point; give a market enough volume and MEV will start to appear in some form.


I suppose the validator would also have the ability to slash sub-validator rewards if one fails to perform. Would validation be limited to the EVM layer only, or would it consider the Tendermint layer as well?


when we’re talking about slashing, we mean penalizing a faulty node for relaying incorrect data, which is necessary to achieve equilibrium within a well-performing system.

this would require something of a reputation system, so i would lean on the Ethereum Attestation Service protocol stack to fill in the blanks there.

i’ve only been thinking about this in the context of EVM layer. for the tendermint components i would defer to someone who knows more about it than i do.


Good summary of the different approaches to MEV and how they apply in Ethereum and Cosmos chains. Good base for discussion imo. Reverie

In addition, when I saw the first post from Zak I thought the idea was to encourage people to run nodes to secure Canto, but it transpires that it was for block builders/MEV all along :slight_smile: .


Interesting read! Thank you for sharing.

It seems like it actually serves both purposes.

The initial idea was introduced solely and specifically to increase network security and robustness, but in order to do so efficiently, it would appear that we may require a more complex incentive mechanism, which, as a byproduct, may also enable the implementation of some ad-hoc, albeit rudimentary, MEV methodology.

Could it be that some degree of MEV is required in order to attain a healthy balance within an optimal network?

Robert Miller brought this article from Phil Daian to my attention and I think it sums up the MEV issue quite nicely.

The TL;DR is as follows:

MEV “is a critical metric for network security in any distributed system secured by economic incentives”

“To keep our economic assumptions strong, we must therefore keep MEV extraction efficient and democratic.”

This leads me to think about how we can ensure fair ordering and transparency within our validator processes, since it would appear that in Cosmos, MEV accrues at the validator layer. So we have come full circle: the sub-validator protocol as presented in this thread may actually offer a viable solution.


I really like this idea and what everyone has contributed so far. From my understanding, one of the concerns with RPCs is a big entity like Infura becoming the default or go-to, which opens up the possibility of collecting and selling user data, as well as potential censorship. A blockchain is only as decentralized as its weakest link. What about doing a hybrid of @zscole’s original idea and combining it with an RPC aggregator with smart routing? There was a GitHub repo, now fairly outdated, that was going for something similar, called BetaRPC, created by mevalphaleak, although I believe it was disbanded because they got bored, according to a tweet of theirs.

Some info below:

BetaRPC is the most feature-rich RPC endpoint for DeFi wallets. By aggregating majority of public+private RPC endpoints as well as MEV relays(like FlashBots) and leveraging smart routing between them it provides numerous benefits to its users:
1. Mitigates negative externalities of MEV(like front-running or sandwiches)
2. Protects against failed transactions via MEV bundles when necessary(don't pay gas fee for failed NFT mints)
3. Back-runs all eligible transactions and provides full rebate to its users
4. And you receive all benefits above for free without sacrificing speed of transaction confirmation when MEV protection isn't necessary like simple ETH or token transfers.

Why BetaRPC?
Using betarpc.io provides advantages on following fronts:
(I) Speed of transaction confirmation
(II) Mitigating payment for failed transactions
(III) Uncle insurance fund[EXPERIMENTAL]
(IV) No extra fees and minimal logging

Learn More at: https://github.com/mevalphaleak/BetaRPC-setup

idk i feel like this is a cool idea, but not something that needs to be done at the protocol layer. wouldn’t this be more appropriate at the application layer with a load balancer?


If I understand correctly, the sub-validator protocol would be something similar to a relay node network connected to validators? I found a post that may be outdated but is quite interesting imo: Setup Cosmos Validator Relay Network | by Sophie Huang | Medium.


Regarding the concept of a rudimentary mempool built from the array of arrays: it presents an interesting opportunity for optimizing block construction and potentially addressing MEV.
However, it would be important to carefully consider the implications and potential risks of introducing MEV. Striking a balance between efficiency and fairness in transaction processing would be crucial, IMO.