High-level walk through

In this section, we will go through the basic principles behind the Coinweb protocol design and explain the rationale behind them. You will learn how Coinweb is changing blockchain technology and how you can get in early in the next paradigm shift in decentralisation!

Coinweb - Cross-chain computation platform or Super-Rollup?

Coinweb stands apart in the blockchain world, not as a new blockchain or a modification of existing ones, but as a revolutionary protocol. With Coinweb, existing and new blockchains can be used in very different ways to how they are used now. By unifying diverse, often incompatible blockchains, Coinweb doesn’t just merge their distinct advantages; it amplifies their capabilities.

The Coinweb protocol dramatically scales and extends fundamental drivers of value such as trustless computation and interchain data availability, giving decentralised applications the power they need to further close in on and replace centralised counterparts. Importantly, Coinweb does not rely on additional consensus layers, allowing data and computations to retain the native security from L1.

About this document

We will use both text and visual representations to explain the concepts described in this text. Many of the same components can be seen in the different diagrams as they occur within different contexts.

To make the diagrams easier to understand, here is a list of the most common components and their meanings:

[diagrams: common diagram components and their meanings]

Unblocking value flows

The diagram below shows the basic value proposition of Coinweb. The principles behind it, and more detailed insight, will be provided in the coming sections.

[diagram]

Where did we start?

The key to achieving fundamental improvements is to start with careful analysis of the core mechanisms and principles enabling and driving blockchain technology, identifying unresolved bottlenecks in current approaches. We then started to devise new solutions, including identifying and integrating more suitable technological building blocks that preserve or strengthen the valuable properties of blockchain systems, while also increasing capacity to a level more similar to that of centralised systems. Although the Coinweb protocol resolves current limitations on many different levels, the most significant improvements come from changing the way information is proven and verified in blockchains and blockchain-based applications.

It turns out that distributed consensus is often also used to prove information derived from deterministic processes. This is highly inefficient and also unnecessary.

Distributed consensus is an extremely powerful tool that blockchains need in order to determine their state from arbitrary input. However, it is very expensive and slow, and should not be used where it is not absolutely necessary. To prove the validity of deterministically derived information, such as the output from smart contracts, other proof mechanisms such as RDoC (Refereed Delegation of Computation) can be used with much higher efficiency.

The implications of using a more efficient proof mechanism are very significant. It opens up many entirely new use cases and value streams, as well as improving existing use cases.

By improving on the very fundamentals of blockchain technology, Coinweb allows dApps to scale orders of magnitude not only in computational capacity and data availability, but more importantly in the potential value that blockchain technology can generate.

What makes Coinweb different?

Let's consider the following observation - "Common flawed design patterns in blockchain tech are limiting progress"

What does this mean?

Which common design patterns are flawed, and how are they limiting progress?

The short answer, as we have seen above, is that distributed consensus (blockchain's main component) is being used where it should not be, specifically to verify computation and to provide blockchain interoperability. It is this flawed design pattern that limits progress. To find out why this is the case, and to see how the solution to the problem will super-charge blockchains and dApps, let's take a look at what makes blockchains valuable in the first place, and then identify where unnecessary use of distributed consensus is blocking blockchain technology from growing to the next level.

Please take a look at the following statement. On a general level, this is true not only for blockchains but is also valid for most automated systems where users interact.

"Blockchains generate value by producing an output from a given input, where the output has more value than the sum of the provided input and the blockchain fees"

How do blockchains generate value?

[diagram]

The added value most users are looking for in blockchain outputs is the strong security the blockchain adds to their transactions. Once something is written to a blockchain, it is very difficult to remove or change it. This property is called immutability. Blockchains also make the information in a transaction generally available. This data availability means that the information in the transaction can be verified, and that it can be used as input in other transactions. Blockchains are very good at adding immutability and data availability properties to information because they use distributed consensus, which is a very powerful tool.

Using distributed consensus, blockchains can enable many value-generating use cases. The most common one is likely payments, where a user can pay another user anywhere in the world with relatively low fees and short transaction times. Another well-known example is decentralised exchanges, where users are willing to pay the DEX fees to exchange one cryptocurrency for another.

[diagram]

The above diagrams show how a blockchain creates value for its users. However, the value growth of the blockchain itself, and of the functions it provides, is determined by the network effects it inherits from its users.

Network effects

Network effects emerge from existing users bringing in new users. As long as they find the blockchain valuable to use, this benefits both the old and the new users. Metcalfe's law states that the financial value or influence of a network is proportional to the square of the number of connected users of the system.

This means that as more users join a blockchain system, each additional user increases the number of users the existing users can interact with, so existing users start to interact more with the system. In addition, the new users add interactions of their own. Since each interaction generates value, the increased number of interactions from both existing and new users increases the value of the whole system.
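As a rough illustration of Metcalfe's law as used above, the sketch below approximates network value as the number of possible user pairs, which grows with the square of the user count. The `value_per_link` parameter is an illustrative assumption, not a quantity defined by the protocol:

```python
# Illustrative sketch of Metcalfe's law: network value grows roughly
# with the square of the number of connected users (value ~ n^2).

def network_value(users: int, value_per_link: float = 1.0) -> float:
    """Approximate network value as the number of distinct user pairs."""
    # n * (n - 1) / 2 distinct pairs of users who could interact.
    return value_per_link * users * (users - 1) / 2

print(network_value(100))  # 4950.0
print(network_value(200))  # 19900.0
```

Doubling the user base roughly quadruples the potential value, which is why each new user benefits all existing users.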

[diagram]

So, this looks great! No wonder blockchains are popular: not only can users make cheap and fast payments to other users all around the world, but blockchains also enable many other new and interesting transactions and interactions where users can make both profits and savings. But even with the obvious benefits and new functionality, most transactions and valuable interactions in the world still happen outside of blockchains. Let's take a closer look at one element from the diagrams above; it is the first place we often find distributed consensus applied in the wrong way:

What is limiting the network effects?

[diagrams]

Computational bottlenecks

As we can see in the diagrams, computation is a key component of the value generation flow in blockchain. This is because it is computation that generates the blockchain's output - the state changes of the blockchain. Every time someone transfers tokens, uses a DEX, buys an NFT, runs any smart contract, deploys a dApp or performs any other action that writes something to the blockchain, a computation is executed and the result becomes part of the blockchain's new state. If Bob sends 2 tokens to Alice, a (simplified) computation would calculate:

[diagram]

In reality, many more calculations are required, even for simple transactions like this.
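The simplified computation described above can be sketched like this; the `transfer` function and the balances are hypothetical examples, not part of any actual blockchain implementation:

```python
# Minimal sketch of the (simplified) balance computation described above.
# Names and values are illustrative, not Coinweb specifics.

def transfer(balances: dict, sender: str, receiver: str, amount: int) -> dict:
    """Apply a token transfer and return the new state."""
    if balances.get(sender, 0) < amount:
        raise ValueError("insufficient balance")
    new_state = dict(balances)
    new_state[sender] -= amount
    new_state[receiver] = new_state.get(receiver, 0) + amount
    return new_state

# Bob sends 2 tokens to Alice; the result becomes the new state.
state = {"Bob": 10, "Alice": 5}
state = transfer(state, "Bob", "Alice", 2)
print(state)  # {'Bob': 8, 'Alice': 7}
```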

Transactions involving smart contracts typically require much more computation. However, because smart contracts can be more complex, they can enable more use cases, and therefore more value can be created. The more complex the computations that smart contracts (dApps) are able to run, the more use cases can be developed. As long as the increased value of the output is worth more than the fees, users will profit from using the dApp. If the fees are higher than the value created by the output, the dApp will not be useful.

We are now closing in on the first bottleneck limiting the growth and progress of blockchains.

Blockchains are not good at doing computations.

This is because when nodes verify the transactions in a block, each node replicates the same computations as all the other nodes in the network. Imagine if your laptop or phone worked like that. Every time you used any application, hundreds or thousands of other devices would have to replicate what you did. And worse, every time someone else did something, you would have to replicate what they did! No laptop or phone available today would be able to do this. Applications would be very limited in what they could do. Surely, if you were forced to replicate everything that everyone else was doing, you would not let someone clog up your computer. In that case, you would probably rather disconnect from the network altogether.

This is exactly how blockchains are limited by computation, and it is a well-known problem.

If the amount of computation that the nodes have to perform becomes too large, most nodes disconnect from the network because it becomes too expensive to participate. If there are too few nodes to verify and agree on the state of the network, then the blockchain becomes centralised and no longer safe. When transactions are no longer secured, there is not much value left for the users, and the value grown by network effects will disappear. This is the reason why blockchains limit the amount of computation available per block (block gas limit), and also why transactions become expensive on popular blockchains (high gas price).
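The replication cost described above can be put in back-of-the-envelope terms; the node and transaction counts below are arbitrary examples:

```python
# Why replicated execution is expensive: every node re-executes every
# transaction, so total work grows with nodes * transactions, while the
# useful throughput stays at just `transactions`.

def total_executions(nodes: int, transactions: int) -> int:
    """Total work performed network-wide for one block."""
    return nodes * transactions

# 10,000 nodes each replaying 1,000 transactions in a block:
work = total_executions(10_000, 1_000)
print(work)  # 10000000 executions for only 1,000 transactions of throughput
```

Raising the block gas limit raises `transactions`, but every node's workload rises with it, which is exactly why nodes drop out when computation grows too large.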

[diagram]

As we can see from the diagram, the computational bottleneck is a severe limitation for blockchain progress. When the hard limit on computation is reached, the blockchain becomes saturated with transactions. Only the transactions with the highest fees will be processed, other transactions will be left unprocessed or processed with large delays. This makes the dApps less useful for users and further growth is stopped.

Existing computational scaling solutions

Fortunately, there are ways to reduce the impact of this issue. There are many L2 solutions such as rollups that move most of the smart contract execution out of the L1 blockchain and let the L1 blockchain do what it is good at, providing security and availability for transactions. These L2 blockchains are helping to reduce the load on popular L1 blockchains and they have become very popular.

[diagram]

What these L2 solutions do is decouple the execution of complex smart contract computation from the expensive and slow L1 consensus mechanism. Instead, they let the L1 blockchain do the simpler task of storing and verifying the input to the smart contracts, leaving the actual execution to happen in L2. By doing this, they unlock more computation, making more of this important ingredient for value generation available. This trick is possible because once the L1 has stored the input transactions, the execution of the smart contracts in L2 becomes deterministic. It is relatively easy to prove whether a deterministic computation is valid or not. There is no need to use slow and expensive distributed consensus for verifying deterministic computations.
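Why deterministic L2 execution is easy to check can be sketched as follows: once the ordered inputs are fixed by L1, any verifier can replay them and compare a commitment to the resulting state. The transaction format and commitment scheme below are illustrative assumptions, not actual rollup machinery:

```python
import hashlib
import json

# Once the ordered inputs are immutable, execution is deterministic:
# any verifier replaying the same inputs reaches the same state, so a
# simple hash commitment over the state is enough to compare results.

def apply_tx(state: dict, tx: dict) -> dict:
    """Apply one (illustrative) transaction to the state."""
    new_state = dict(state)
    new_state[tx["to"]] = new_state.get(tx["to"], 0) + tx["amount"]
    return new_state

def state_commitment(state: dict) -> str:
    """Hash commitment over a canonical serialisation of the state."""
    blob = json.dumps(state, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def execute(txs: list) -> str:
    state = {}
    for tx in txs:
        state = apply_tx(state, tx)
    return state_commitment(state)

txs = [{"to": "alice", "amount": 2}, {"to": "bob", "amount": 3}]

# Two independent verifiers replaying the same fixed inputs agree:
assert execute(txs) == execute(txs)
```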

From the popularity of L2 solutions like rollups, we know that resolving issues where distributed consensus is used in the wrong place is very useful and unlocks a lot of value. But even though L2s are useful and popular, they have problems and limitations of their own, and they only address a certain part of the scaling issues of individual blockchains. If we again take a look at how blockchains and dApps generate value, we will see that the current L2 solutions only scratch the surface of what is possible by moving functionality out of distributed consensus mechanisms. There is another crucial ingredient in the blockchain and dApp value generation recipe that is equally reduced by the unnecessary use of distributed consensus. This crucial ingredient for blockchain value generation is:

Data Availability

We have already seen how limited access to computation can stop value growth in blockchains and how L2 solutions unlock further growth by making more computation available. Let's take a look at data availability, another ingredient that is in short supply. When users interact with blockchains, the blockchain or dApp combines the input that the users provide with data that is already stored on the blockchain. The users know that the output will be secure, since the input they provide will become verifiable and immutable, and the input from the data stored on the blockchain already has these properties.

The availability of more and more verifiable and immutable data to be used as input for transactions is how blockchains generate network effects. When many users have their balances or other data available to make transactions, the probability that new or existing users can find someone to transact with increases.

The same principle applies to dApps and the blockchain itself. The more each user interacts with the blockchain the more data they make available as input for dApps, thus making the dApp more interesting for other users to interact with.

[diagram]

Lack of data availability makes dApps less useful

In the same way as the computational bottleneck, lack of data availability is a known problem.

However, unlike the computational bottleneck, the current attempts to increase data availability are mostly centralised or introduce additional layers of distributed consensus. The data made available with such implementations loses the inherited security from L1 consensus and thus becomes less valuable. In addition to this, most current interoperability solutions are passive and support only limited, user-initiated transactions.

The data made available by current solutions is also often limited to that of specific smart contracts, which further reduces the scope of data that dApps can consume. All of these are bottlenecks for network effects and value generation.

The data availability problem does not only apply to data exchange between independent blockchains but is also relevant for computational scaling solutions like rollups, where L1 data access for L2 smart contracts is largely constrained to the data flow between the rollup-specific L1 contracts.

Let's take a look at how current solutions are implemented, and how the unnecessary use of distributed consensus reduces the value of the bridged data and also limits data access scope.

[diagram]

From the sections above, we can now create a simple formula for the potential of dApps to generate value:

[diagram]

A limitation on any of the three components in the formula will constrain the dApp's further growth:

  • If computation is limited, the dApp will not be able to process all the user inputs, or the fees to use the dApp will be too high for users to gain value from it.

  • If access to secure blockchain data is limited, there are fewer inputs available for the dApp to combine and act on, which makes it less useful and weakens its network effects.
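One hypothetical way to render the formula above in code; treating the dApp's value potential as capped by its scarcest ingredient is an illustrative assumption of ours, and the units are arbitrary:

```python
# Toy rendering of the value-potential formula: a dApp's potential is
# throttled by whichever ingredient is scarcest. The three factors and
# the `min` combination are illustrative assumptions, not the protocol's
# actual formula.

def dapp_value_potential(computation: float,
                         data_availability: float,
                         user_interactions: float) -> float:
    """Potential value generation, capped by the scarcest resource."""
    return min(computation, data_availability, user_interactions)

# Abundant computation cannot compensate for scarce data availability:
print(dapp_value_potential(100.0, 5.0, 50.0))  # 5.0
```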

Opening the floodgates

The Coinweb protocol removes the technical constraints currently limiting dApps' access to both computation capacity and secure blockchain data, in a way and to a degree unlike any other protocol. The colorful diagram below illustrates Coinweb dApps' near-unconstrained access to blockchain data and computation power.

[diagram]

We have seen how increasing access to available data and processing power removes the limitations to dApp value generation and to the growth of network effects. The above diagram provides a visual illustration of how Coinweb facilitates this. Now let's take a closer look at the specific solutions within the Coinweb protocol that facilitate this:

Parallel computation environment

In the previous sections, we have seen how the use of distributed consensus to verify complex computations creates a severe bottleneck since every node in the blockchain network has to replicate all instructions from all of the transactions that are executed on the blockchain. This is clearly a flawed design that makes smart contract execution expensive and limited in scope.

But having all the nodes replicate every computation is not the only thing that is wrong!

Another commonly observed flawed design is the use of a sequential execution model. Sequential execution means that each transaction, and even each individual step within each transaction, has to be executed one by one, one step at a time! In contrast, a typical laptop computer might have around 10 CPU cores; these cores can execute computations in parallel, which allows horizontal scaling.

Most blockchains do not scale computation horizontally, and execution is limited to a single virtual machine instance that can only use one CPU core! Imagine all smart contract executions for a whole blockchain having to pass through a single core of your laptop. This is similar to how most blockchains are built today!

Since horizontal scaling is not possible, such a system can only be scaled by using very expensive single-core solutions, and even then only to a very limited degree.

The solution to this problem is obviously to allow smart contracts to be executed in parallel. Then multiple computers can work together in a cluster, and if more computational resources are required, additional computers can be added to the cluster. This horizontal scaling is only possible with a parallel computation model. Coinweb adds a parallel execution environment on top of every connected blockchain, where multiple virtual machines are spawned according to the number of transactions being processed. It is interesting to note that since Coinweb shards run on top of each connected blockchain, Coinweb runs many parallel execution environments in parallel.
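A toy sketch of horizontal scaling through parallel execution: transactions touching independent state (here, one "shard" per connected chain) are executed concurrently, and adding workers adds capacity. The sharding scheme and the trivial per-shard workload are illustrative, not Coinweb's actual execution model:

```python
from concurrent.futures import ThreadPoolExecutor

# Transactions on independent shards share no state, so they can be
# executed concurrently; more workers means more capacity.

def execute_shard(txs: list) -> int:
    """Execute one shard's transactions in isolation (a sum stands in
    for real smart contract execution)."""
    return sum(txs)

# One illustrative shard per connected chain:
shards = {"chain_a": [1, 2, 3], "chain_b": [10, 20], "chain_c": [100]}

with ThreadPoolExecutor() as pool:
    results = dict(zip(shards, pool.map(execute_shard, shards.values())))

print(results)  # {'chain_a': 6, 'chain_b': 30, 'chain_c': 100}
```

Because the shards are independent, the parallel result is identical to executing them one by one, only faster when real work is involved.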

You can find more detailed information about Coinweb's parallel execution model here

L2 interoperability

In the section about computational bottlenecks, we saw how L2 rollups scale specific L1 chains by moving computation out of the L1 consensus mechanism. While this is certainly a good idea, rollups and similar L2 approaches use L1 smart contracts for dispute resolution. Their inputs are also tied to specific smart contracts on their parent L1 chains, which severely limits the data availability for L2 dApps on these chains. What if we took the L2 idea a few steps further?

Imagine an L2 rollup that was not tied to a specific chain, but instead could scale many chains simultaneously!

Such a rollup would be able to scale both transactional and computational volume immensely - it could combine the transactional capacity of all the connected chains, and run horizontally scaling parallel computation on top of each chain. And it would not stop there.

This rollup would provide full L1 and L2 data availability from all the underlying blockchains to its dApps!!

These dApps would surely have superpowers! We call Coinweb a cross-chain computation platform.

We could also have called Coinweb a "blockchain-agnostic interoperable rollup with full L1 and L2 data availability" because that is exactly what Coinweb is.

Two main innovations make it possible to create this "super-rollup":

1. Moving more verification out of the L1 consensus

In much the same manner as we move the computation layer out of the L1 consensus layer, we also move the verification mechanism that rollups implement as L1 smart contracts out of the L1 consensus layer. This means that our super-rollup is no longer tied to a specific L1 consensus, and is thus free to add data from any L1 blockchain as input. The key to this improvement is again that we use an off-chain proof mechanism to verify data and resolve disputes. Instead of implementing the dispute mechanism as expensive and limited L1 smart contracts, we move this mechanism to the edge, and let the thin clients (such as wallets) determine the correct state for themselves. Coinweb thin clients can do this* by using Refereed Delegation of Computation (RDoC), a type of interactive proof mechanism that is especially suitable for the verification of large-scale computations. This is a much more effective approach than current implementations. You can read more about RDoC (Refereed Delegation of Computation) here.

* A way of looking at this is that each thin client has a "mini Coinweb" instance that it can use to check things for itself.
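To make the idea behind refereed delegation concrete, here is a toy model (not the actual Coinweb or RDoC protocol): when two computation providers disagree, a lightweight referee bisects their execution traces to the first step where they diverge, and then only needs to re-execute that single step itself:

```python
# Toy model of the bisection idea behind refereed delegation: the
# referee never replays the whole computation, only the one disputed
# step. All functions here are illustrative.

def run_trace(step_fn, initial, steps):
    """Full execution trace: the state after every step."""
    trace, state = [initial], initial
    for i in range(steps):
        state = step_fn(state, i)
        trace.append(state)
    return trace

def first_divergence(trace_a, trace_b):
    """Binary search for the first trace entry where the providers
    disagree (assumes they agree at the start and differ at the end)."""
    lo, hi = 0, len(trace_a) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if trace_a[mid] == trace_b[mid]:
            lo = mid + 1
        else:
            hi = mid
    return lo

def honest_step(state, i):   # correct step function
    return state + i

def faulty_step(state, i):   # injects an error at step 7
    return state + i + (1 if i == 7 else 0)

trace_a = run_trace(honest_step, 0, 10)
trace_b = run_trace(faulty_step, 0, 10)
print(first_divergence(trace_a, trace_b))  # 8 -> first divergent entry,
                                           # i.e. the dispute is step 7
```

This is why the referee's work grows logarithmically with the length of the computation rather than linearly, which is what makes verifying large-scale deterministic computations cheap.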

2. Making things reactive and autonomous

Most current blockchain systems are user-driven. Execution of smart contracts is limited to one block and needs user-initiated transactions to start execution.

Coinweb smart contracts can be both user-driven and data-driven. Like many of the processes running on regular computers, Coinweb smart contracts can self-initiate and run autonomously, across multiple chains and blocks. They can monitor specific combinations of data and events and execute on their own based on programmable criteria. This not only gives Coinweb dApps unprecedented data availability but also enables new sets of powerful primitives that open up entirely new use cases. The reactive framework requires a truly blockchain-agnostic platform token like CWEB to operate.

Reactive smart contracts

Within the current smart contract paradigm, smart contract executions are one-off events, in the sense that each execution is triggered by off-chain invocations embedded in the same block as the result of the execution. There is no mechanism to propagate an active process over multiple blocks without additional off-chain actions. This is very similar to how things worked in the early days of computers. Back then, a computer would run a single program at a time, and each execution was explicitly started by the user or by a trigger external to the system. This is in contrast to modern computers, which run multiple autonomous processes that perform all kinds of useful tasks in the background, making the computer and other programs more useful. Autonomous reactive smart contracts, especially blockchain-agnostic ones, will enable a huge number of new use cases for blockchains. We have already added some of these use cases, such as token-shield, optimistic bridges, and optimistic DEX, to this developer portal, with many more to come!
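A toy model of the reactive pattern: instead of waiting for a user-initiated transaction, a contract registers a condition over observed cross-chain data and executes automatically whenever the condition holds. All class and field names here are hypothetical, not the Coinweb API:

```python
# Toy model of a reactive smart contract: the contract self-initiates
# whenever incoming block data matches its programmable criteria.

class ReactiveContract:
    def __init__(self, condition, action):
        self.condition = condition  # predicate over observed chain data
        self.action = action        # executed when the predicate holds
        self.log = []

    def on_block(self, block_data: dict):
        """Invoked for every new block on every connected chain."""
        if self.condition(block_data):
            self.log.append(self.action(block_data))

# Self-triggers whenever a price observed on any chain drops below 50:
contract = ReactiveContract(
    condition=lambda d: d.get("price", 100) < 50,
    action=lambda d: f"rebalanced at {d['price']} on {d['chain']}",
)

for block in [{"chain": "a", "price": 60},
              {"chain": "b", "price": 42},
              {"chain": "a", "price": 55}]:
    contract.on_block(block)

print(contract.log)  # ['rebalanced at 42 on b']
```

No user transaction started the execution; the contract ran itself when its criteria were met, across blocks and chains.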

You can find more details about Coinweb's reactive smart contracts here.

RDoC proof protocol

Most smart contract blockchains, blockchain bridges, interoperability protocols, etc. use a very expensive mechanism (distributed consensus)* to prove information where they instead could use a different mechanism that is much cheaper and more effective.

This design flaw is a fundamental one; it is the reason behind high gas fees, bridge hacks, fragmented liquidity, poor interoperability and more. In fact, it is the single most important problem preventing blockchain technology from reaching its full potential.

This does not mean that distributed consensus is not useful; it is an extremely powerful tool that can make nodes agree on arbitrary values and inputs, and it is the foundation of blockchain technology[*]. That does not, however, make it suitable as a general-purpose building block. Once distributed consensus has done its job and created its output, there are other, much more suitable and effective mechanisms for creating and verifying proofs. The reason for this is that computations on the output of distributed consensus systems such as blockchains become deterministic. Since all the input to the computation is immutable, the output will be the same every time the computation is run. The results of deterministic computations are therefore much easier to prove than the results of computations where the input varies between executions. Nevertheless, most blockchain systems tend to overuse distributed consensus, leading to the bottlenecks we can observe. One of Coinweb's most important contributions to the blockchain technology space is to apply RDoC (Refereed Delegation of Computation), a proof protocol that is more suitable for deterministic computations, to prove deterministic computations from inputs generated by multiple independent distributed consensus systems.

You can find more information about RDoC here.

[*] With Bitcoin as the best example, distributed consensus is the right mechanism to make the actual data on the blockchain verifiable. However, once the data is under consensus, adding additional consensus systems (validators, oracle networks, in-between blockchains) is most often not necessary. Proof protocols such as RDoC (Refereed Delegation of Computation) can prove deterministic operations such as smart contract execution and cross-chain operations much more efficiently and securely.

Inchain architecture

The structural differences between Coinweb and other blockchain platforms are radical at the core level. These structural differences are devised specifically to improve and extend the fundamental drivers of blockchain's utility and value. The principles behind these radical structural changes are defined in the Inchain architecture. The Coinweb protocol is an instance (the first) of the Inchain Architecture. To get a better understanding of the Coinweb protocol, it is helpful to go through the defining characteristics of the Inchain architecture:

    1. An Inchain architecture constitutes a layer of DLT infrastructure in which the execution layer is separated from the consensus and data availability systems.
    2. It enables computations over multiple different consensus and availability systems, where the outputs of the computations are executed deterministically from available consensus-bound data retrieved from commonly available DLT-ledgers.
    3. Both the inputs and the outputs of computations are available.

To achieve these characteristics, an inchain architecture must implement mechanisms to meet the following conditions:

    1. Inputs to computations can be retrieved from multiple different consensus and availability systems, but the outputs must be consistent and adhere to the 3 defining characteristics.
    2. The outputs from the computations must be verifiable and useful for off-chain participants.

Blockchain agnostic platform token - CWEB

The native coins of L1 blockchains serve as keys for combining and appending data in their secure data availability layer. To unify data availability across independent blockchains, a similar uniform key has to be available to access all the connected chains. The Coinweb protocol works by providing both a uniform format for embedding information inside different blockchains and a uniform execution environment on top of each connected blockchain. This is what constitutes the Coinweb protocol's blockchain-agnostic layer. Since Coinweb's platform token, CWEB, is blockchain-agnostic, it can be used to provide direct access to, and the ability to combine, data from all the connected chains, in much the same way as native coins do on single chains.