EigenLayer’s Sreeram Kannan on the Hot (and Risky) Ethereum Trend of ‘Restaking’
Sreeram Kannan was a professor at the University of Washington, Seattle, when he started working for its blockchain research lab in 2017.
It was there that he founded his company, EigenLabs, the organization behind EigenLayer – a blockchain protocol considered a pioneer in a just-now-arriving trend in the Ethereum ecosystem known as restaking. The idea is to repurpose ETH tokens staked on the Ethereum blockchain for double duty, using them to provide security for other applications.
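For readers who want something concrete, the sketch below is a rough, purely illustrative model of that "double duty": one staked position earns the base Ethereum yield, opts in to extra services for additional rewards, and is exposed to slashing by each of them. The names and numbers (Service, RestakedPosition, the reward and slashing rates) are hypothetical placeholders, not EigenLayer's actual contracts or parameters.

```python
# Purely illustrative model of "restaking": one staked ETH position
# backing Ethereum plus any number of additional services.
# All names and rates here are hypothetical, not EigenLayer's actual API.
from dataclasses import dataclass, field


@dataclass
class Service:
    name: str
    extra_reward_rate: float  # additional yield paid to restakers (hypothetical)
    slash_fraction: float     # share of stake lost if this service's rules are broken


@dataclass
class RestakedPosition:
    staked_eth: float                      # ETH already staked on Ethereum
    services: list[Service] = field(default_factory=list)

    def opt_in(self, service: Service) -> None:
        # The same stake now also backs this service's security.
        self.services.append(service)

    def annual_reward(self, base_rate: float = 0.04) -> float:
        # Base Ethereum staking yield plus whatever each opted-in service pays on top.
        total_rate = base_rate + sum(s.extra_reward_rate for s in self.services)
        return self.staked_eth * total_rate

    def slash(self, service_name: str) -> float:
        # Misbehaving on one opted-in service burns part of the shared stake.
        for s in self.services:
            if s.name == service_name:
                penalty = self.staked_eth * s.slash_fraction
                self.staked_eth -= penalty
                return penalty
        return 0.0


if __name__ == "__main__":
    pos = RestakedPosition(staked_eth=32.0)
    pos.opt_in(Service("oracle-network", extra_reward_rate=0.01, slash_fraction=0.5))
    pos.opt_in(Service("data-availability", extra_reward_rate=0.02, slash_fraction=0.5))
    print(f"Projected annual reward: {pos.annual_reward():.2f} ETH")
    print(f"Slashed for an oracle fault: {pos.slash('oracle-network'):.2f} ETH")
```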
EigenLayer has been in the news lately, as the restaking ecosystem starts to take shape. Some top Ethereum figures have also raised flags about the risks; Ethereum co-founder Vitalik Buterin at one point cautioned that the new feature could ultimately pose systemic risks to the main blockchain’s stability.
We spoke with Kannan last week about EigenLayer and restaking. Some highlights include:
On Kannan’s response to Buterin’s concerns about restaking: “Anything that restaking can do, already liquid staking can do.”
On Ethereum’s plan to slow the rate of new validators coming online, via the EIP-7514 proposal: “This is a super important thing for Ethereum to be conservative and not have an overflow.”
On whether Ethereum will ever reach its maximum capacity for shared security: “There’s absolutely a limit.”
Q: I saw Vitalik’s blog post about overloading the consensus layer, and how restaking, in his view, could pose systemic risks to Ethereum. I’m curious to hear your take on his take?

Kannan: One of the things I think he wants to lay out is, “Hey, don’t externalize, and don’t create something assuming that, if the protocol goes wrong, Ethereum is going to fork around it.”
I think that is a pretty reasonable position from Ethereum, that you build protocols and the protocols have to internalize social consensus rather than externalize it to Ethereum.
So I read it as: do not overload Ethereum social consensus, which is used only for forking the chain. And don’t assume that you can build a protocol that, because it’s too big to fail, Ethereum will fork around. So that’s how I read it.
And I think it’s a pretty obvious statement in our view. But I think it has to be said, somebody has to say it, so it’s good that Vitalik went out and said it.
Because what we don’t want is for people to deploy code that is not properly audited, doesn’t have internal security controls, and then the Ethereum community has to work hard to figure out how to retrieve it.
I think, after reading the article, a lot of people have been talking about restaking risks.
I want to make it super clear: anything that restaking can do, already liquid staking can do, so I view restaking as a lesser risk than liquid staking.
Q: Can you expand on that?

Kannan: Basically, you can take a liquid staking token and then deposit it into complex DeFi protocols, or you could just deposit it into validating a new layer 2, or a new oracle or any of these things.
So anything that restaking can do, liquid staking can already do. Because you know, you have the LSD [short for liquid staking derivative] token, and you can do anything with it. And one particular thing you could do with that is, of course, go and validate another network.
So I view restaking as just one particular use case of liquid staking, but actually reducing the risk of that one particular use case.
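As a quick editorial illustration of that claim, the toy function below routes the same liquid staking token either into ordinary DeFi uses or into backing a brand-new network – the one use case Kannan says restaking formalizes. The strategy names and the function itself are hypothetical, not any real protocol's interface.

```python
# Illustrative only: the same liquid staking token (LST) can already be routed
# into ordinary DeFi or into validating a new network, which is the use case
# restaking singles out. All names here are hypothetical placeholders.

def use_liquid_staking_token(lst_amount: float, strategy: str) -> str:
    if strategy == "lend":
        return f"Deposited {lst_amount} LST into a lending market"
    if strategy == "liquidity":
        return f"Provided {lst_amount} LST to an AMM pool"
    if strategy == "restake":
        # The use case restaking formalizes: backing a new L2, oracle, DA layer, etc.
        return f"Delegated {lst_amount} LST to secure a new oracle, L2 or DA service"
    raise ValueError(f"unknown strategy: {strategy}")


print(use_liquid_staking_token(10.0, "restake"))
```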
Q: Why do you think restaking is having a moment in the news?

Kannan: I don’t know. I’m glad people are talking about it. Of course, anything that adds new rewards to stakers is something interesting.
I said anything that could be done with EigenLayer could be done with LSTs, but people didn’t know what to do with these LSTs.
They were doing exactly the same thing that people are doing with ether, which is lending, borrowing, the same set of DeFi primitives.
I think one thing EigenLayer did, by creating this new category of validation, is let me borrow the Ethereum trust network to do new things: I can build a new layer 1, a new oracle network, a new data availability system – any system on top of the Ethereum trust network. So it internalizes, or aggregates, all the innovation back into Ethereum, rather than each innovation requiring a whole new system.
So I think that narrative is quite attractive.
Q: I was just reading the news about EIP-7514, which is a short-term measure to address validator overcrowding by limiting how quickly new validators can enter. How does that affect EigenLayer?

Kannan: I think mostly, it means the same thing for EigenLayer that it means for liquid staking protocols: there is going to be a smaller rate at which new validators can enter.
There’s a long entry queue right now, and people don’t want to wait that long.
And making it slower is going to just make the new growth of LSTs slower. But I understand fully that this is a super important thing for Ethereum to be conservative and not have an overflow of validators that the consensus layer may not be able to handle.
But in the long term, if the total staking of Ethereum cannot grow, one of the things that happens is that the total yield or return stakers are getting is bounded by Ethereum staking rewards, whereas in the presence of restaking there is a possibility for them to get some of these additional rewards. Other than that, it’s pretty similar.
Q: You were making the point that EigenDA is just like an in-house AVS (actively validated service) – explain what it is.

Kannan: What we decided is, in order to keep this system of shared security and keep EigenLayer as decentralized as possible, we want to make sure that there is a highly scalable data system at its backbone. And that’s what EigenDA is: a highly scalable data availability system, built on the same ideas that underpin the Ethereum roadmap, particularly what is called danksharding.
Our view is that building an Ethereum-adjacent data availability layer requires first-principles thinking, whereas Celestia and Avail are built to be chains by themselves.
If you’re building a data availability system adjacent to Ethereum, you’d want Ethereum validators to participate. So that’s just one part of the story. Of course, EigenLayer enables that.
But then you go beyond that, and you see, “Oh, it’s not just that you want to get the Ethereum nodes to participate.”

Ethereum already has consensus built in, and Ethereum gives you the ordering of the various transactions. So you should build a data availability system which doesn’t need its own ordering.

Whereas all the other existing protocols, like Celestia and Avail, are basically chains that have to do their own ordering, we built a system which doesn’t have internal ordering; all ordering is done on Ethereum.
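To illustrate that ordering point, here is a minimal sketch – with hypothetical names, not EigenDA's actual design or API – of a data availability layer that stores blob data itself but inherits all ordering from the position of each blob's commitment on Ethereum.

```python
# Illustrative sketch: a DA layer that keeps blob data available but never
# orders anything itself; order comes from where each blob's commitment lands
# on Ethereum. All names are hypothetical, not EigenDA's actual interface.
import hashlib
from dataclasses import dataclass, field


@dataclass
class EthereumStub:
    """Stand-in for the L1: an append-only log of blob commitments."""
    commitments: list[bytes] = field(default_factory=list)

    def record(self, commitment: bytes) -> int:
        self.commitments.append(commitment)
        return len(self.commitments) - 1   # position on L1 defines the order


@dataclass
class DataAvailabilityLayer:
    """Stores blob data keyed by commitment; has no notion of sequence."""
    l1: EthereumStub
    blobs: dict[bytes, bytes] = field(default_factory=dict)

    def publish(self, blob: bytes) -> int:
        commitment = hashlib.sha256(blob).digest()
        self.blobs[commitment] = blob       # availability: keep the data
        return self.l1.record(commitment)   # ordering: delegated to Ethereum

    def read_in_order(self) -> list[bytes]:
        # Reconstruct the canonical sequence purely from L1 ordering.
        return [self.blobs[c] for c in self.l1.commitments if c in self.blobs]


if __name__ == "__main__":
    da = DataAvailabilityLayer(l1=EthereumStub())
    da.publish(b"rollup batch #1")
    da.publish(b"rollup batch #2")
    print([b.decode() for b in da.read_in_order()])
```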
Q: Liquid restaking tokens – once your liquid staking tokens are locked on EigenLayer, they become illiquid?

Kannan: That’s correct. The problem that liquid restaking tokens are trying to solve is: can I have a restaked position and still keep it liquid? So you can take that receipt token of liquid restaking and then transfer it.

We are not building this kind of liquid restaking ourselves, but other people are building liquid restaking on top of EigenLayer.
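As a rough sketch of what such a receipt token does – hypothetical names and logic, not any specific project's contract – the restaked position stays locked while a 1:1 receipt balance remains freely transferable:

```python
# Illustrative sketch of a liquid restaking token: the underlying restaked ETH
# is locked, but depositors receive a transferable receipt balance.
# All names are hypothetical; EigenLayer itself does not issue such a token.
from dataclasses import dataclass, field


@dataclass
class LiquidRestakingToken:
    balances: dict[str, float] = field(default_factory=dict)
    locked_restaked_eth: float = 0.0

    def deposit(self, user: str, restaked_eth: float) -> None:
        # The underlying restaked position is locked...
        self.locked_restaked_eth += restaked_eth
        # ...but the depositor receives a transferable receipt, 1:1.
        self.balances[user] = self.balances.get(user, 0.0) + restaked_eth

    def transfer(self, sender: str, recipient: str, amount: float) -> None:
        # The receipt can move freely even though the restaked ETH stays locked.
        if self.balances.get(sender, 0.0) < amount:
            raise ValueError("insufficient receipt balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0.0) + amount


if __name__ == "__main__":
    lrt = LiquidRestakingToken()
    lrt.deposit("alice", 5.0)
    lrt.transfer("alice", "bob", 2.0)
    print(lrt.balances, "locked:", lrt.locked_restaked_eth)
```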
Q: I think your comment was, you want to use the Ethereum shared security for as many things as possible. I’m curious, now that there are also people building on the back of what y’all are doing, is there a natural limit to how much Ethereum can support?

Kannan: This is a similar kind of question that one could already ask at the application layer of Ethereum: how many applications can be built on Ethereum as smart contracts, and how many smart contracts can be built on top of Ethereum?
So it’s the same thing with EigenLayer, because people staking and running new applications – now much more flexibly and programmably, with these AVSs on top of EigenLayer – all contribute back to Ethereum. Their ETH staking rewards increase, and ETH itself potentially increases in value because of all these additional use cases.
So over time, this can start to accommodate more and more.
But there’s absolutely a limit.