A Brief Intro to Coprocessors

Towards unlocking a new class of applications. Not the compute we need, but the compute we deserve.

Decentralized apps face limitations in performing complex on-chain computations due to Ethereum's restricted processing capabilities. As a result, we've seen many DeFi protocols move components like order books and risk systems off-chain. This points to a need for customized computational environments tailored to specific tasks.

We’ve seen a slow but steady shift of many DeFi apps deployed on-chain toward managing parts of their systems off-chain. dYdX v4 is going to keep its order book off-chain and possibly its margining system as well. Blur keeps parts of its exchange off-chain for a smooth UX. Aevo, an options exchange, keeps its order book and risk engine off-chain. The simplest reason for this is the difficulty of maintaining these performance-centric systems on-chain efficiently and at scale.

The migration of components off-chain points to a broader need: customized (and performant) computational environments tailored to specific tasks. But that is not all. Under the status quo, when a protocol runs an off-chain system, it is ultimately opaque to you, the user; you have to trust that the off-chain system works the way it claims to. Verifiable computation does away with that trust assumption, allowing protocols to do off-chain computation without introducing new trust factors. This is the promise of coprocessors for Ethereum. Before discussing the coprocessor model in Ethereum, let’s briefly recap where this idea stems from.

The concept of coprocessors originated in computer architecture as a technique to enhance performance. Traditional computers rely on a single central processing unit (CPU) to handle all computations. However, the CPU became overloaded as workloads grew more complex.

Coprocessors were introduced to help: specialized processors dedicated to particular tasks. For example, graphics processing units (GPUs) handle the immense parallel computations needed for 3D rendering, allowing the main CPU to focus on general-purpose processing. Other common coprocessors include cryptographic accelerators for encryption/decryption, signal processors for multimedia, and math coprocessors for scientific computations. Each coprocessor has a streamlined architecture to perform its niche workload efficiently. (Although you could say much of this has since been subsumed by parallel programming, à la GPUs.)

This division of labor between CPU and coprocessors led to orders-of-magnitude improvements in performance. The coprocessor model enabled computers to take on increasingly sophisticated workloads not feasible with a lone generalist CPU.

Ethereum can likewise be thought of as a generalist CPU: a VM that is not equipped for heavy computation, simply because of the prohibitive costs one would have to pay to run it on-chain. This has constrained the deployment of a variety of protocols, even forcing designers to come up with new mechanisms within the constraints of the EVM. Put simply, costs are too restrictive for complex applications. It has also led various protocols to keep parts of their systems off-chain, and every off-chain model thus deployed has brought with it a certain notion of risk: a risk of centralization, and a risk simply of trust; you trust the protocol not to be malicious, which is somewhat against the ethos of decentralized apps.

In this article, I look at a few of these solutions and offer a glimpse into the kinds of applications that could be unlocked by this infrastructure. I also look into alternative ways of offloading computation, which is sure to become a cornerstone of applications in the crypto space.


ZK-coprocessors

Coprocessors like those offered by Axiom and RISC Zero have recently opened up a new dimension of applications possible on-chain by allowing smart contracts to offload heavy computation. These systems provide proofs that the code was executed in a way anyone can verify.

Bonsai and Axiom are similar solutions in that they allow arbitrary computation with access to on-chain state to be run off-chain, providing "receipts" attesting that the computation was performed.

Axiom

Axiom enables Ethereum smart contracts to access more historical on-chain data and perform complex computations while maintaining the decentralization and security of the network. Currently, contracts have access to very limited data from the current block, which restricts the types of applications that can be built. At the same time, allowing contracts to access the full historical archive data would require all network nodes to store the full archive, which is infeasible due to storage costs and would negatively impact decentralization.

To solve this problem, Axiom is developing a "ZK co-processor" system. It allows contracts to query historical blockchain data and perform computations off-chain via the Axiom network. Axiom nodes access the requested on-chain data and perform the specified computation. The key is generating a zero-knowledge proof that the result was computed correctly from valid on-chain data. This proof is verified on-chain, ensuring the result can be trusted by contracts.
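To make the flow concrete, here is a purely illustrative model of the query, prove, and verify steps. Every type and function name below is hypothetical; this is not Axiom's actual interface, just a sketch of the trust structure: results are accepted only if their proof checks out against chain history the contract already trusts.

```rust
// Purely illustrative model of a ZK co-processor's query -> prove ->
// verify flow. All names are hypothetical, NOT Axiom's actual API.
struct Query {
    from_block: u64,
    to_block: u64,
    // ... which accounts, storage slots, or receipts to read
}

struct ProvenResult {
    value: u64,     // e.g. an aggregate computed over historical data
    proof: Vec<u8>, // ZK proof tying `value` to committed chain history
}

// Off-chain: a co-processor node reads archive data, runs the
// computation, and produces the proof (stubbed out here).
fn prove_query(_q: &Query) -> ProvenResult {
    unimplemented!("performed by the co-processor network")
}

// On-chain: the contract accepts the value only if the proof verifies
// against a commitment to block history it already trusts.
fn accept(r: &ProvenResult, verify_proof: impl Fn(&[u8]) -> bool) -> Option<u64> {
    verify_proof(&r.proof).then_some(r.value)
}
```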

This approach allows contracts access to far more data from chain history and the ability to perform complex computations on it without burdening the base layer nodes. Axiom believes this will enable new categories of applications that rely on provable, objective analysis of historical blockchain activity. They have already launched mainnet functionality for basic data reads and plan to expand to full archive data access and ZK verification of contract view functions in the near future. Their longer-term vision is even more advanced ZK computations beyond EVM capabilities.

By generating proofs of correct off-chain execution, Axiom unlocks new categories of blockchain applications.

RISC Zero Bonsai

RISC Zero has developed a general-purpose zero-knowledge virtual machine (zkVM) that allows proving arbitrary programs written in languages like Rust, C/C++, and Go in zero knowledge.

The zkVM allows developers to prove arbitrary Rust code in zero knowledge without needing to design custom circuits. The goal is to make zero-knowledge application development more accessible. The zkVM generates a proof receipt that attests the program was executed correctly, without revealing private inputs or logic. This allows intensive computations to happen off-chain, with the proof receipts validating correct execution on-chain. Rust crates generally work in this zkVM, though there are some limitations around system calls.

A feature called continuations allows splitting large computations into segments that can be proven independently. This enables parallel proving, removing limits on computation size, and allows pausing and resuming zkVM programs. Continuations have enabled new use cases like fully homomorphic encryption, the EVM, and WASM inside the zkVM.
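To make this concrete, here is a minimal sketch of a zkVM guest program, modeled on RISC Zero's published examples (exact entry macros and module paths vary across SDK versions). It reads an input from the host, runs an ordinary Rust function, and commits the result to the journal, making it part of the verifiable receipt:

```rust
#![no_main]
// Minimal zkVM guest sketch, modeled on RISC Zero's examples; exact
// entry macros and module paths differ between SDK versions.
use risc0_zkvm::guest::env;

risc0_zkvm::guest::entry!(main);

fn main() {
    // Read the (possibly private) input provided by the host.
    let n: u64 = env::read();

    // Ordinary Rust does the heavy lifting inside the zkVM.
    let result = fibonacci(n);

    // Commit the public output to the journal; it becomes part of the
    // receipt that anyone can verify without re-running the program.
    env::commit(&result);
}

fn fibonacci(n: u64) -> u64 {
    let (mut a, mut b) = (0u64, 1u64);
    for _ in 0..n {
        let next = a + b;
        a = b;
        b = next;
    }
    a
}
```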

Bonsai is an off-chain zero-knowledge proving service developed by RISC Zero to enable the use of their general-purpose zkVM for Ethereum and blockchain applications. It provides a bridge between on-chain smart contracts and off-chain computations in the zkVM.

The workflow enabled by Bonsai is as follows:

  • The developer writes a smart contract that calls out to Bonsai's relay contract to request an off-chain computation.

  • Bonsai watches for these on-chain requests and executes the corresponding zkVM program written in Rust.

  • The zkVM program runs in Bonsai's infrastructure, performing the intensive or private computation off-chain, and then generates a proof that it was executed correctly.

  • The proof results, called “receipts,” are posted back on-chain by Bonsai via the relay contract.

  • The developer's smart contract receives the results in a callback function.

This allows computationally intensive or sensitive logic to happen off-chain while still validating correct execution via zk proofs on-chain. The smart contract only needs to handle requesting computations and consuming the results.
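Bonsai runs this loop as a managed service, but the underlying prove-and-verify cycle it automates looks roughly like the host-side sketch below, modeled on RISC Zero's published host API (names and signatures vary across SDK versions; FIB_ELF and FIB_ID stand in for the compiled guest binary and its image ID, normally generated by the build tooling):

```rust
// Host-side sketch of the prove -> verify flow that Bonsai automates.
// Modeled on RISC Zero's published host API; exact names vary by SDK
// version. FIB_ELF / FIB_ID stand in for the compiled guest binary
// and its image ID, normally generated by risc0's build step.
use risc0_zkvm::{default_prover, ExecutorEnv};

fn main() -> anyhow::Result<()> {
    // Supply the guest program's (possibly private) input.
    let env = ExecutorEnv::builder().write(&10u64)?.build()?;

    // Execute the guest in the zkVM, producing a receipt:
    // a proof plus a journal of committed public outputs.
    let receipt = default_prover().prove(env, FIB_ELF)?.receipt;

    // Anyone holding the image ID can check the receipt without
    // re-running the computation.
    receipt.verify(FIB_ID)?;

    // Read the committed public output back out of the journal.
    let result: u64 = receipt.journal.decode()?;
    println!("verified result: {result}");
    Ok(())
}
```

With Bonsai, the proving step runs in its hosted infrastructure and the receipt flows back to the contract through the relay, so the application never has to run this loop itself.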

Bonsai abstracts away the complexity of compiling Rust code to zkVM bytecode, uploading programs, executing in the VM, and returning proofs. Developers can focus on writing their program logic. This infrastructure thus enables running general-purpose computations off-chain while keeping sensitive data and logic private.

Simply put, Bonsai enables developers to integrate off-chain computations easily, without expertise in the underlying zkVM cryptography and infrastructure.

Alternative Solutions

Is a ZK-coprocessor the only way to achieve verifiable off-chain computation? What other approaches exist to offload computation in a trustless and secure way? While opinions about their security properties, efficiency, and implementation differ, such alternatives are being explored in various corners of crypto and will slowly come to the forefront.

Alternatives like MPC and TEEs provide other approaches to verifiable off-chain computation. MPC allows joint computation over sensitive data, while TEEs offer hardware-based secure enclaves. Both come with tradeoffs, but they can serve as alternatives to ZK-coprocessors.

MPC (Multi-Party Computation)

Secure multi-party computation (MPC) allows multiple parties to jointly compute a function over their inputs while keeping those inputs private. It enables collaboration on sensitive data while maintaining privacy for all participants. However, achieving fairness in MPC, where either all parties learn the output or none do, is impossible if a majority of the parties are dishonest, and privacy and integrity guarantees disappear entirely when all nodes are corrupted. Blockchain technology can help make MPC protocols fairer.

Imagine three friends who want to know the average of their salaries without revealing their salaries to each other. They could use secure MPC to accomplish this.

Assume the friends are Alice, Bob, and Eve:

  1. Alice adds a random number, known only to her, to her salary and tells the result to Bob.

  2. Bob adds his salary to the number he received from Alice, then tells the result to Eve.

  3. Eve adds her salary to the number from Bob and passes the running total back to Alice.

  4. Alice subtracts her random number, divides by three, and announces the average.

The final number is the average of their salaries, and no one has learned anything about the others' individual salaries. One nuance to pay attention to: although nobody learns anyone's exact salary, the output itself leaks some information. If the average is lower than Eve's salary, Eve can infer that at least one of the other two earns less than she does.
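To check the arithmetic, here is the same round-trip simulated in a single process. In a real deployment each intermediate value would be a message between parties over secure channels, and the mask would be drawn from a wide random range; the numbers here are made up:

```rust
// Single-process simulation of the three-party average above. In real
// MPC each running total would be a message between parties.
fn main() {
    let (alice, bob, eve) = (95_000i64, 60_000i64, 70_000i64);

    // Step 1: Alice hides her salary behind a mask only she knows.
    // (In practice, sampled uniformly from a wide range.)
    let mask: i64 = 123_456_789;
    let to_bob = alice + mask;

    // Step 2: Bob adds his salary and forwards the running total.
    let to_eve = to_bob + bob;

    // Step 3: Eve adds her salary and returns the total to Alice.
    let to_alice = to_eve + eve;

    // Step 4: Alice removes her mask and announces the average.
    let average = (to_alice - mask) / 3;
    println!("average salary: {average}"); // 75000

    // No intermediate message reveals an individual salary: every
    // running total is blinded by Alice's mask.
}
```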

The blockchain provides a tamper-proof public ledger that allows parties to post information. By using witness encryption, parties can encrypt the output of an unfair MPC protocol. They post tokens to the ledger that allow extracting a decryption key. Since the ledger is public, all parties can access the decryption key at the same time. This enables a fair decryption protocol where either all parties decrypt the output, or none do.

In “MPC as a Blockchain Confidentiality Layer,” Andrew Miller talks about how MPC could help with computations on private data. Publicly auditable MPC uses zero-knowledge proofs to retain integrity despite total node corruption: clients commit to their inputs and prove validity, the MPC nodes generate proofs of correct computation, and verifiers check that inputs, outputs, and proofs all match. This auditing incurs minimal overhead beyond standard MPC costs, and the proofs are succinct, using SNARKs with a universal setup. However, questions remain about performance optimizations, programming models, and real-world deployment.

Secure Enclaves / TEEs

Sensitive data like personal information and financial records must be protected not only when stored or in transit but also while it is being used and computed on. Traditional encryption methods protect data at rest and in transit, but not data in active use. This is a problem because data being processed is often in unencrypted form, leaving it vulnerable to attack.

Trusted execution environments (TEEs), or secure enclaves, are isolated environments where data remains protected yet computations can still be performed on it. The key idea is to isolate the data and computation so that even privileged system processes can't access them. TEEs are secure areas within a processor that provide hardware-based security features to protect sensitive data and code, isolating specific software from the rest of the system and ensuring that data within the TEE cannot be tampered with, even by the operating system or other software running on the device.

TEEs allow sensitive data to remain protected even while it is being used. This enables applications like privacy-preserving social networks, financial services, healthcare, etc. There are some limitations around efficiency and trust assumptions, but enclaves are a powerful technology with many potential uses, especially when combined with blockchain networks to build robust, uncensorable systems. The tradeoffs around trust may be acceptable for many commercial and non-profit applications where strong data privacy is required.

Trusted execution environments (TEEs) allow you to outsource computations to an untrusted third-party cloud provider while keeping your data confidential and operations tamper-proof. This is hugely useful for decentralized apps and organizations that want the convenience and cost savings of the cloud without sacrificing privacy or control. But TEEs don't magically solve all problems; there are still practical challenges to work through before most developers can easily use them.

For example, independently verifying that a TEE is running the code you expect is hard. Subtle differences between builds mean that reproducing the exact same binary is tricky, which makes auditing difficult. Persistent storage is another issue: TEEs are isolated environments without permanent data storage, but real apps need to preserve state across reboots. This requires careful design to securely communicate data between the trusted TEE and the untrusted regular system.

TEEs are a powerful building block, but they still need thoughtful systems research to address the limitations above, along with vendor centralization, scaling, and fault tolerance.

Implementations like Intel SGX and AWS Nitro Enclaves provide such isolated environments for running sensitive computations and storing confidential data, ensuring that even privileged system processes cannot access or tamper with the code and data inside the TEE. This lets decentralized apps and organizations outsource computations to untrusted third-party cloud hosts without worrying about privacy or integrity.

Solving these challenges will greatly expand the applicability of TEEs for decentralized apps needing strong integrity, confidentiality, and censorship resistance while outsourcing computation and storage to untrusted clouds.


A brief comparison

When evaluating coprocessors, an important consideration is the security model and level of assurance needed for different types of computations. Certain sensitive calculations, like matching orders, require maximal security and minimal trust assumptions. For these, ZK-coprocessors provide strong guarantees, as results can be verified without trusting the operator.

However, zk-coprocessors might have downsides in efficiency and flexibility. Other approaches like MPC or trusted hardware may be acceptable tradeoffs for less sensitive computations like analytics or risk modeling. While providing weaker assurances, they enable a wider array of computations more efficiently. The level of security needed depends on the risk tolerance of applications. Teams should analyze the value at stake and evaluate if unverified but efficient coprocessors are a reasonable engineering compromise for certain non-critical computations.

Overall, coprocessors span a spectrum of security models, and teams should match solutions to the security requirements of specific tasks. The ecosystem is still nascent, so further advances in scalable verifiable computation will broaden the possibilities.


Applications

Dynamic Control for Lending Protocols

In the blog post “Feedback Control as a New Primitive for DeFi,” the authors suggest that control mechanisms for DeFi protocols might gradually progress from simple rules toward reinforcement learning (RL) and deep RL as computation and storage become abundant. While RL might still be difficult, machine learning applications become possible thanks to verifiable computation.

Lending protocols have come under scrutiny in the past year because of the possibility of bad debt arising from aggressive parameters for lent tokens in an otherwise liquidity-scarce bear market. Models with access to on-chain liquidity could build a liquidity profile for each asset and dynamically adjust parameters accordingly.

For example, lending protocols could benefit from dynamically controlling interest rates based on real-time on-chain data. Rather than relying on preset interest rate models, a feedback control system could adjust rates algorithmically based on current utilization and liquidity factors.

When demand for borrowing an asset is high, pushing utilization rates up, the controller could rapidly increase interest rates to incentivize supply and stabilize utilization. Conversely, when utilization is low, rates could be reduced to incentivize borrowing. The controller parameters could be tuned to optimize for objectives like maximizing protocol revenue or minimizing volatility.
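As a toy illustration (not any live protocol's model; the target, gain, and bounds below are invented for the example), a simple proportional controller would nudge the borrow rate toward a target utilization like this:

```rust
/// Toy proportional controller for a lending market's borrow rate.
/// All parameters are illustrative, not drawn from any live protocol.
struct RateController {
    target_utilization: f64, // e.g. 0.80 = 80% of supply borrowed
    gain: f64,               // how aggressively rates respond to error
    min_rate: f64,
    max_rate: f64,
}

impl RateController {
    /// One control step: move the rate proportionally to the
    /// utilization error, clamped to sane bounds.
    fn update(&self, current_rate: f64, borrowed: f64, supplied: f64) -> f64 {
        let utilization = borrowed / supplied;
        let error = utilization - self.target_utilization;
        (current_rate + self.gain * error).clamp(self.min_rate, self.max_rate)
    }
}

fn main() {
    let ctl = RateController {
        target_utilization: 0.80,
        gain: 0.05,
        min_rate: 0.01,
        max_rate: 0.50,
    };
    // Utilization at 90%, above the 80% target, so the rate rises.
    let new_rate = ctl.update(0.04, 900.0, 1000.0);
    println!("new borrow rate: {new_rate:.4}"); // 0.0450
}
```

A coprocessor would run this logic (or a far richer model) off-chain over live utilization data and post the new rate along with a proof of correct execution.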

The protocol would need access to real-time on-chain data like total borrowed amounts, available liquidity, and other utilization metrics to implement this. The controller logic then processes this data to compute optimal interest rate adjustments. The rate updates could be governed on-chain via a DAO or computed off-chain with proof verification. However, recent work (“Attacks on Dynamic DeFi Interest Rate Curves” by Chitra et al.) has shown that dynamic lending models can result in more MEV, so the design of these protocols needs careful consideration.

ML applications

Abundant access to blockchain data leads naturally to machine learning applications. While proving computation for machine learning is somewhat harder, verifiable ML computation is a huge market in its own right. It could also be put to work in on-chain applications, especially for security.

Blockchain data contains valuable signals that machine learning models could use to detect suspicious activity or power risk management systems. However, running ML on-chain is currently infeasible due to gas costs and privacy concerns. Off-chain ML with on-chain verification could power monitoring systems for smart contracts, wallets, or portfolio managers, detecting suspicious withdrawals or transfers. A vast amount of profiling data is available for extracting security signals around rug pulls, hacks, and other malicious activity. The same data can serve DeFi applications, for instance assessing creditworthiness and risk profiles of lenders and borrowers from their on-chain history.
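As a deliberately simplified sketch of the kind of feature-based scoring such a monitor might compute off-chain (the features, weights, and threshold below are invented; a real system would use a trained model and prove its execution inside a coprocessor):

```rust
/// Toy anomaly score over a wallet's recent activity. Features and
/// weights are illustrative only.
struct Activity {
    withdrawals_last_hour: u32,
    avg_withdrawal_usd: f64,
    largest_withdrawal_usd: f64,
    account_age_days: u32,
}

fn anomaly_score(a: &Activity) -> f64 {
    // Burst of withdrawals relative to a nominal rate of 10/hour.
    let burst = (a.withdrawals_last_hour as f64 / 10.0).min(1.0);
    // A single withdrawal far above the historical average.
    let size_spike =
        (a.largest_withdrawal_usd / (a.avg_withdrawal_usd * 20.0)).min(1.0);
    // Fresh accounts are riskier.
    let new_account = if a.account_age_days < 7 { 1.0 } else { 0.0 };
    // Weighted sum in [0, 1]; higher means more suspicious.
    0.4 * burst + 0.4 * size_spike + 0.2 * new_account
}

fn main() {
    let activity = Activity {
        withdrawals_last_hour: 12,
        avg_withdrawal_usd: 500.0,
        largest_withdrawal_usd: 50_000.0,
        account_age_days: 3,
    };
    let score = anomaly_score(&activity);
    if score > 0.7 {
        println!("flag for review: score {score:.2}");
    }
}
```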

Challenges include data quality, concept drift, and performance limitations of proof systems. But by combining ML with verifiable off-chain computation, coprocessors open up many new opportunities for blockchain analytics and risk management.

Perpetual swaps and Options

Margin systems for derivatives like perpetual swaps and options have traditionally been opaque black boxes, hidden from users on centralized and even decentralized exchanges.

Coprocessors present an opportunity to implement transparent and verifiable margining logic for decentralized trading. Implementing margining and auto-deleveraging systems in a verifiable way offers users a higher degree of trust and immediately differentiates decentralized venues from their centralized counterparts.

The margining system could monitor indexed price feeds and position values for perpetual swaps, liquidating positions before their margin balance goes negative. All risk parameters like maintenance margin ratios, funding rates, and liquidation penalties could be governed on-chain.

However, the intensive computations for calculating margin balances, unrealized PnL, liquidation amounts, etc., can be offloaded to a coprocessor. It would execute the margin engine logic in a confidential environment and generate proofs attesting to correct computation.
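A stripped-down sketch of the core margin check such an engine would run per position (parameters invented for illustration; a real engine would also handle funding, fees, and partial liquidation):

```rust
/// Toy margin check for a perpetual-swap position. A real engine
/// would run this per position inside a coprocessor and post a proof
/// of the liquidation decision on-chain.
struct Position {
    size: f64,       // contracts; positive = long, negative = short
    entry_price: f64,
    collateral: f64, // posted margin, in quote currency
}

/// Maintenance margin as a fraction of position notional (illustrative).
const MAINTENANCE_MARGIN_RATIO: f64 = 0.05;

fn should_liquidate(p: &Position, index_price: f64) -> bool {
    let unrealized_pnl = p.size * (index_price - p.entry_price);
    let equity = p.collateral + unrealized_pnl;
    let notional = p.size.abs() * index_price;
    equity < MAINTENANCE_MARGIN_RATIO * notional
}

fn main() {
    let long = Position { size: 10.0, entry_price: 2000.0, collateral: 2500.0 };
    // Price falls to 1790: equity = 2500 + 10 * (1790 - 2000) = 400,
    // maintenance = 0.05 * 10 * 1790 = 895, so the position is liquidated.
    assert!(should_liquidate(&long, 1790.0));
    println!("position below maintenance margin: liquidate");
}
```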

The benefits of the coprocessor approach include transparency, verifiability, and privacy. Margin engine logic is not a proprietary black box anymore. Computations happen off-chain, but users can trust proofs of correct execution. The same could be achieved for options as well.

Challenges include efficiently generating proofs for intensive margin calculations. But overall, coprocessors unlock new potential for decentralized derivatives platforms by combining privacy with verifiability.


Conclusion

Coprocessors greatly expand the possibilities for blockchain applications without compromising decentralization. As cutting-edge projects continue innovating in this space, the future looks bright for verifiable off-chain computation on Ethereum and beyond.

In a future article, I will dive into these solutions' security considerations, comparisons with rollups, how they fit into the broader Ethereum application landscape, and whether they are a panacea for scaling problems.
