Introducing our NFT Dapp Starter Kit for Developers

TL;DR: We’re providing developers with an open-source starter kit that can be used to kickstart dapp development for NFT minting sites. Happy building!

By JM Cho, software engineer for Coinbase Wallet

Last year, the NFT market experienced its best year yet — generating over $23 billion in trading volume with a floor market cap of $16.7 billion for the Top 100 NFT collections. With demand growing, NFT minting sites quickly became one of the most popular types of dapps built in the crypto space. Compared with projects launched on marketplaces like OpenSea, custom minting sites offered projects the ability to provide their community with a customized minting experience and retain full control over their distribution timeline and royalties.

For the teams behind these projects, an NFT minting site is an opportunity to showcase their engineering capabilities, share important details about the project, and more deeply engage their community. However, the resulting demand for web3 developers who can ship end-to-end NFT minting experiences has far outpaced the supply of resources that teach how to build these dapps from scratch.

Enter: Coinbase’s NFT Dapp Starter Kit.


The NFT Dapp Starter Kit reflects Coinbase’s commitment to empowering developers by making it easier and faster to build for web3. This open-source kit embraces the collaborative nature of web3 by enabling developers of any level to easily launch new NFT projects. Compatible with both the Ethereum Mainnet and EVM-compatible chains, this resource will be particularly helpful for those who want to create a full-stack NFT project but lack experience writing Solidity or integrating a smart contract with a front-end.

Features included in this starter kit:

  • Frontend web app integrated with a gas-efficient, audited ERC-721 smart contract
  • Built-in multi-wallet modal to enable users to easily connect to your minting site with Coinbase Wallet or other major self-custody wallets
  • NFT minting front-end component that allows users to specify mint number
  • Explorer to view minted collection and tokens owned by connected user
  • Gas-efficient pre-sale feature using merkle tree allowlists (see the sketch after this list)
  • Token art pre-reveal feature and scripts to make this process seamless
  • Support for Royalties on NFT transfers (EIP-2981 standard)
  • Ability to reserve tokens for gifting or airdrops
  • Art engine to create generative artwork and metadata with built-in examples
  • Easy-to-run scripts to simplify various NFT contract interactions
  • Hardhat unit tests to ensure correctness of contract
  • Clear documentation and walkthrough tutorial of the kit and how to use it
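
For the pre-sale allowlist bullet above, here is a minimal sketch of how an allowlist merkle root and proof could be generated off-chain. It follows the common merkletreejs/keccak256 pattern and is not necessarily the kit’s own script; the addresses are placeholders.

```typescript
import { MerkleTree } from 'merkletreejs';
import keccak256 from 'keccak256';

// Placeholder allowlist addresses.
const allowlist = [
  '0x1111111111111111111111111111111111111111',
  '0x2222222222222222222222222222222222222222',
];

// Hash each address to form the leaves, then build a sorted-pair merkle tree.
const leaves = allowlist.map((addr) => keccak256(addr));
const tree = new MerkleTree(leaves, keccak256, { sortPairs: true });

// The root is typically stored in the ERC-721 contract; a proof is submitted with each pre-sale mint.
const root = tree.getHexRoot();
const proof = tree.getHexProof(keccak256(allowlist[0]));

console.log({ root, proof });
```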

Check out the NFT Dapp Starter Kit on GitHub and watch the video tutorial for instructions on how to start building your dapp. As always, we welcome your feedback or ideas in the GitHub repository for our team’s consideration and look forward to hearing from the community.

Happy Building!




Announcing the Relay VSCode extension

Tl;dr: Coinbase, in collaboration with the Relay team, has built a VSCode extension for the Relay GraphQL client. It is available for developers to use now.

By Terence Bezman, Coinbase Senior Software Engineer

In collaboration with the Relay team at Meta, Coinbase has built a VSCode extension that integrates with the Relay language server. It is now available for developer use.

The Relay framework describes itself as “the GraphQL client that scales with you”. As mentioned in a previous blog post, we currently use Relay to accelerate development for 300+ engineers on our core applications. A world-class IDE experience is an important part of improving engineer productivity, which is a top priority.

The extension connects your editor to the Relay language server, enabling the following (see the brief example after this list):

  • IntelliSense (autocomplete in your GraphQL tags)
  • Go to Definition for fragments, fields, GraphQL types, etc.
  • Diagnostics (Errors, Warnings)
  • Hover type information
  • GraphQL syntax highlighting
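
For a sense of where these features apply, here is a minimal, hypothetical Relay fragment; autocomplete, hover types, diagnostics, and go-to-definition all operate inside the graphql tag. The Asset type and fields are placeholders, not a real Coinbase schema.

```typescript
import React from 'react';
import { graphql, useFragment } from 'react-relay';

export function AssetHeader({ assetRef }: { assetRef: any }) {
  // IntelliSense, hover types, and diagnostics light up inside this tagged template.
  const asset = useFragment(
    graphql`
      fragment AssetHeaderFragment on Asset {
        name
        symbol
      }
    `,
    assetRef,
  ) as any;
  return (
    <h3>
      {asset.name} ({asset.symbol})
    </h3>
  );
}
```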

Motivation

Prior to this extension, syntax highlighting and IntelliSense were already available through existing GraphQL extensions in the VSCode Marketplace, so why did Coinbase decide to build another one?

From custom directives to its own compiler, Relay adds several features on top of GraphQL. A third-party editor integration would need to re-implement all of the work done in the Relay Compiler to achieve what we’ve built in this extension. This extension integrates directly with the Relay Compiler via a language server, giving us the full context from the compiler. This level of context unlocks the following features in your editor:

  • Showing Relay Compiler errors in your code
  • Jump to definition on Relay Resolver fields
  • Support for Relay directives

As Relay continues to build more features, Coinbase wants to remain at the forefront of this innovation and help set the tone for a better developer experience.

Maintainership

Part of what makes a great open source contributor is ownership and maintenance of the work. Coinbase has allocated engineering resources to create the first version of this VSCode extension and to support it going forward. It’s important that the community understands that we do not view this as a one-and-done project, but as a relationship we want to maintain for years to come.

Giving back to the community

Rather than be a passive consumer of Relay, Coinbase is working to be an active and engaged part of this community. Through our shared cooperation with Relay, we are working to understand the product at a deeper level — and leveraging that knowledge to assist the community in closing old issues and submitting pull requests to ensure the project’s success.

The Coinbase team is thankful for the amazing work the Relay team has done thus far to create a better developer experience with Relay. We want to honor the value this project has brought to our team and continue to innovate in an open-source way that benefits our community.




web3 on the platform of your choice — a closer look at Coinbase Wallet’s multi-platform approach


Tl;dr: The replatforming of Coinbase Wallet’s mobile app reflects our commitment to improving access to web3. In this blog, we discuss why the transition to React Native marks a critical turning point for both our users and our technical teams.

By Chintan Turakhia, Director of Coinbase Wallet Engineering, and Dan Coffman, Coinbase Wallet React Native Lead

gm

This week, we announced the debut of Coinbase Wallet’s new mobile iOS and Android apps built using React Native. This launch marks a critical inflection point for Coinbase Wallet and its users, and we’d like to shed light on the motivation and journey to get here.

Our goal with Coinbase Wallet is to be the default gateway to the web3 ecosystem. We debuted a mobile app on iOS and Android in 2017 to make the benefits of crypto, self-custody, and the nascent dapp ecosystem accessible to all — regardless of network or blockchain, country or currency, crypto savvy or crypto beginner.

In 2021, we saw an uptick in web3 engagement via desktop, driven largely by NFTs and DeFi dapps. This led us to launch a Coinbase Wallet browser extension using React, providing users with the option to engage with the crypto economy on a desktop platform. This also gave our product, engineering, and design teams an opportunity to entirely rethink how a self-custody product should look and feel for power users as well as newcomers to web3.

Launching the browser extension unfortunately meant that our engineering teams now needed to code the same features for three different platforms, since Wallet’s browser extension, iOS, and Android apps leveraged different codebases. As a consequence, our shipping velocity on the extension far exceeded our pace on mobile. We knew that this wasn’t a viable long-term solution to keep up with the innovation in web3. To maximize the efficiency of our developers and designers, and to ship a consistently reliable, safe, and simple multi-chain wallet on all platforms, we would need to migrate our mobile products to a common framework.

Enter React Native.

Build once, ship everywhere

We embrace the mantra of build once and ship everywhere. Replatforming Coinbase Wallet’s mobile apps to React Native means we can more easily ship new features to Wallet’s browser extension, iOS, and Android apps in tandem, streamlining workflows and allowing us to deliver the same great user experience across desktop and mobile.

In 2020, the Coinbase app successfully transitioned from native mobile to React Native, a software framework which allows us to ship the same TypeScript code and React UI on both iOS and Android. Our goal since then was simple: leverage those paved roads to ship a highly performant mobile app while affording users a consistent design experience using the Coinbase Design System (CDS). We were able to quickly leverage paved roads including react navigation, deeplinking, and configuration in code.

By unifying our Coinbase Wallet mobile app and browser extension into a single data layer that handles all business logic, we can continue to ship products quickly across three platforms. We moved away from class-based RxJS and shifted to functional context-based repos to enable greater leverage of React core libraries. The additional challenge was migrating the data layer while continuing to build and ship features on the Extension. Web3 pauses for nobody, and so we carefully orchestrated the replacement of our entire Wallet engine while still flying.
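
As a rough illustration of that shift, here is a minimal sketch (our assumption, not Wallet’s actual code) of a functional, context-based repository that replaces a class-based RxJS store; fetchBalances stands in for the shared data layer.

```typescript
import React, { createContext, useCallback, useContext, useState, ReactNode } from 'react';

interface Balance {
  asset: string;
  amount: string;
}

interface BalancesRepo {
  balances: Balance[];
  refresh: () => Promise<void>;
}

// Hypothetical call into the shared data layer used by both mobile and extension.
declare function fetchBalances(): Promise<Balance[]>;

const BalancesContext = createContext<BalancesRepo | undefined>(undefined);

export function BalancesProvider({ children }: { children: ReactNode }) {
  const [balances, setBalances] = useState<Balance[]>([]);
  const refresh = useCallback(async () => {
    setBalances(await fetchBalances());
  }, []);
  return (
    <BalancesContext.Provider value={{ balances, refresh }}>
      {children}
    </BalancesContext.Provider>
  );
}

// Components on any platform consume the repo through a plain React hook.
export function useBalances(): BalancesRepo {
  const repo = useContext(BalancesContext);
  if (!repo) throw new Error('useBalances must be used inside a BalancesProvider');
  return repo;
}
```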

Wallet will also be able to bring new features to market in a fraction of the time. A recent example is our DeFi portfolio, which took two months to build for the Wallet browser extension and only five days to port to our new React Native mobile app. The majority of the complex logic resided in the data layer, and since that layer is now shared across all platforms, we were able to build it once and ship it everywhere. Only the mobile client UI remained, and fortunately, with a common CDS, porting UI elements to mobile was simple. Thanks to these efficiencies, we expect the velocity at which Wallet launches new products to continue to increase in the months to come.

Not only will users benefit from Wallet delivering features faster, but the mobile experience as a whole will be snappier, more responsive, and more reliable.

Bridging the gap

Several of the features we released over the past year were built for the Wallet extension and have yet to be introduced to the Wallet mobile app. With our migration to React Native, we’re now ready to launch several great features in the mobile app for the first time, including support for storing, sending, and receiving Solana and SPL tokens, real-time price charts, an in-app dapp browser, token management, and a DeFi portfolio view.

How do you port 110+ features to a new React Native tech stack while adding new functionality to the existing Chrome extension and maintaining the existing native mobile product? The common data layer was a force multiplier that enabled the team to keep building features for the extension that would seamlessly port to RN mobile. We also prioritized feature parity with our existing mobile product while maintaining a high bar for quality and performance through benchmarking.

The team built foundations for a scalable and extensible performance monitoring system to measure client-side performance of the new app, including page load, UI thread blocking time and app responsiveness for every commit. We optimized loading and screen render times using a bespoke incremental rendering solution, memoizing components, and optimizing expensive hook computations. We also shipped countless improvements to the data layer through batching state updates and optimizing caching strategies, which benefited asset and NFT loading on both the new react native app and extension.
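
A small sketch of the kinds of render optimizations described above (assumed for illustration, not the app’s actual components): a memoized row plus a memoized expensive computation.

```typescript
import React, { memo, useMemo } from 'react';

interface Tx {
  id: string;
  amount: number;
  timestamp: number;
}

// Re-renders only when its own props change, not when unrelated rows update.
const TransactionRow = memo(function TransactionRow({ tx }: { tx: Tx }) {
  return <span>{tx.amount}</span>;
});

export function TransactionList({ txs }: { txs: Tx[] }) {
  // The sort is recomputed only when the input array changes.
  const sorted = useMemo(
    () => [...txs].sort((a, b) => b.timestamp - a.timestamp),
    [txs],
  );
  return (
    <>
      {sorted.map((tx) => (
        <TransactionRow key={tx.id} tx={tx} />
      ))}
    </>
  );
}
```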

More than a fresh coat of paint

If you’ve noticed the new look-and-feel of the mobile app, it reflects much more than a fresh coat of paint. Coinbase Wallet is now an integral part of the Coinbase Design System.

Coinbase’s Design System is a robust architecture of reusable design and motion components built within React. We’ve found that the CDS enables our product, engineering, and design teams to focus on building high-quality product experiences while ensuring a consistent experience for users across Coinbase products. Beautiful design elements, like the motion-driven “warm welcome” that greets new users, are just a small preview of how we are providing a visually rich experience for a global audience.

Since Coinbase Wallet’s browser extension was built using this design system, desktop users will now enjoy a more consistent experience within Wallet’s mobile app. As an added bonus of this consistent approach, users who are familiar with Coinbase’s flagship app will immediately feel at home in Coinbase Wallet’s new mobile experience.

Another exciting result of this work is that Wallet users now have an opportunity to personalize their mobile experience. Our research showed that users prefer dark mode, so the mobile app will open in dark mode by default. We’re also adding the ability to choose a theme color for Coinbase Wallet in the Settings tab.

Looking ahead

The new Coinbase Wallet mobile app began rolling out globally across iOS and Android this week, and we expect the rollout to be complete over the next few weeks. Make sure your app is up to date by visiting the App Store on iOS or Google Play on Android, and follow @CoinbaseWallet on Twitter for the latest news and updates.

Coinbase Wallet is a self-custody wallet providing software services subject to Coinbase Wallet Terms of Service and Privacy Policy. Coinbase Wallet is distinct from Coinbase.com, and private keys for Coinbase Wallet are stored directly by the user and not by Coinbase. Fees may apply. You do not need a Coinbase.com account to use Coinbase Wallet.




Scaling Node Operations at Coinbase

Tl;dr: This blog shares insights on how Coinbase is investing in new tools and processes to scale its node operations.

By Min Choi, Senior Engineering Manager — Crypto Reliability

Blockchain nodes power almost every user experience at Coinbase. We use them to monitor fund movements, help our customers earn their staking rewards, and build the analytics needed to support popular features within our applications. As such, being able to effectively manage blockchain nodes is vital to our core business and we are continuing to invest in ways to scale our node operations.

One of the most difficult aspects of node management is keeping up with the constant, and sometimes unpredictable, changes to the node software. Asset developers are consistently releasing new code versions and some blockchains, such as Tezos, leverage an on-chain governance model to take a community vote on all proposed changes. A decentralized governance model such as this makes it difficult to predict when a change will be introduced and prepare our internal systems in advance. An example of such a scenario is depicted in the below Messari alert.

Data provided by https://messari.io/

The consequences of not keeping up with these changes can be severe for our customers: long delays to balance updates in our core wallets, or slashed staking rewards. To help prevent these incidents, we’re focusing our investments in the following areas:

Asset Release Manager

This service gives us an extra pair of hands (or should I say “ARM”) to process common node upgrades. All puns aside, the ARM service monitors GitHub release activity for dozens of critical blockchains and automates the deployment of new node binaries to our non-production environments. This frees up our engineers to focus on service validations and work proactively with asset developers to resolve problems prior to production release.
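
As a rough sketch of the idea (not the actual ARM implementation), a watcher might poll the GitHub releases API and hand off to internal deployment and ticketing systems, which are hypothetical stubs here.

```typescript
interface GithubRelease {
  tag_name: string;
  published_at: string;
}

// Hypothetical downstream hooks into internal ticketing and deployment systems.
declare function fileTrackingTicket(repo: string, version: string): Promise<void>;
declare function deployToNonProd(asset: string, version: string): Promise<void>;

async function latestRelease(owner: string, repo: string): Promise<GithubRelease> {
  const res = await fetch(`https://api.github.com/repos/${owner}/${repo}/releases/latest`, {
    headers: { Accept: 'application/vnd.github+json' },
  });
  return (await res.json()) as GithubRelease;
}

// Poll a blockchain client's releases and kick off a non-production rollout when a new tag appears.
export async function checkForNodeUpgrade(
  asset: string,
  owner: string,
  repo: string,
  lastSeenTag: string,
): Promise<string> {
  const release = await latestRelease(owner, repo);
  if (release.tag_name !== lastSeenTag) {
    await fileTrackingTicket(`${owner}/${repo}`, release.tag_name);
    await deployToNonProd(asset, release.tag_name);
  }
  return release.tag_name;
}
```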

The below diagram shows the high level data flow for ARM.

Here’s a recent example of how the ARM service was leveraged to process a node upgrade for Algorand.

  • On May 9 at 12:44 PM PDT, Algorand version 3.6.2 was released.
  • On May 9 at 1:13 PM PDT, the ARM service filed a ticket to notify our engineers and track the incoming change.
  • On May 9 at 1:43 PM PDT, the required code change was automatically generated for build and deployment.
  • On May 9 at 2:13 PM PDT, the change was automatically deployed to all our non-production environments for Algorand.
  • On May 9 at 2:43 PM PDT, an error in one of the three deployments was detected and the ARM service escalated to an engineer to help investigate.
  • On May 10 at 6:27 AM PDT, the engineer resolved the deployment problem and began service validation testing in preparation for production deployment.

As this event chronology shows, the system isn’t completely touchless; engineers are still needed as part of the overall upgrade process. However, the ARM service allows us to process hundreds of these upgrade operations in parallel, saving countless hours of engineering time that can then be reinvested into quality assurance efforts.

Test-Runner

This is an orchestration service used to execute integration tests, both via Temporal workflows and via API calls to critical systems across Coinbase. As the name suggests, Test-Runner obtains and stores test results, aggregates them by metadata, and exposes an API to query the results. By making it simple to create these tests and share standardized test results across our engineering teams, we’re able to accelerate our asset addition and incident response processes. We put a lot of value in building reusable integration tests, as we view them as a foundation of our asset maintenance regime.

The below diagram shows the high level service architecture for Test-Runner.

Here are also a few basic examples of the types of tests that are in scope for Test-Runner.

  1. Balance transfers within Coinbase.
  2. Deposits and withdrawals in and out of Coinbase.
  3. Sweep and restore operations between cold and hot wallets.
  4. Simple trade operations (buy/sell).
  5. Rosetta validation.

Each time a node is upgraded, these tests are automatically triggered through our continuous integration (CI) pipeline, providing a clear validation of success or failure. This helps our engineers make quick and informed operational decisions such as rolling back to a previous version of the node binary.
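
To make the flow concrete, here is a hypothetical sketch of how such a post-upgrade validation could be expressed as a Temporal workflow; the activity names, result shape, and test list are assumptions, not Coinbase’s actual code.

```typescript
import { proxyActivities } from '@temporalio/workflow';

export interface TestResult {
  name: string;
  asset: string;
  passed: boolean;
  metadata: Record<string, string>;
}

interface Activities {
  runIntegrationTest(name: string, asset: string): Promise<TestResult>;
  storeResult(result: TestResult): Promise<void>;
}

const { runIntegrationTest, storeResult } = proxyActivities<Activities>({
  startToCloseTimeout: '10 minutes',
});

// Runs the standard suite for an asset after a node upgrade and records each result.
export async function validateNodeUpgrade(asset: string): Promise<TestResult[]> {
  const suite = ['balance-transfer', 'deposit-withdrawal', 'sweep-restore', 'trade', 'rosetta-validation'];
  const results: TestResult[] = [];
  for (const name of suite) {
    const result = await runIntegrationTest(name, asset);
    await storeResult(result);
    results.push(result);
  }
  return results;
}
```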

Blockchain Pods

As we add more blockchains to our support catalog, we’re investing in flexible engineering teams designed to collaborate on emerging priorities. Our pods are approximately 5–7 engineers in size, are made up of site reliability and software engineers, and can quickly adapt to shifting market conditions. For example, we most recently formed a pod to focus specifically on Ethereum’s upcoming transition from a Proof-of-Work (PoW) to a Proof-of-Stake (PoS) blockchain. The Merge is a very large and extremely complex change, requiring nearly all Coinbase systems to adjust, but it is also a one-time event that doesn’t justify the formation of a permanent engineering team.

We’re also in the process of forming new pods to focus on ERC-20 (Tokens) and ERC-721 (NFTs). In this way, we can pivot on the development of features that harness these standards for the betterment of our customers. By constantly forming and dissolving pods in this manner, we’re able to develop small economies of scale that quickly meet our customer needs. It also gives our engineers the flexibility to choose between areas of technological interest and build subject matter expertise that help them grow their careers at Coinbase.

Final Thoughts

Developing a comprehensive strategy for node management is a challenging endeavor. While we acknowledge that our own strategy is not without flaws, we take pride in operating at the cutting edge of blockchain technology. Every day, Coinbase engineers work tirelessly in partnership with the greater crypto community to overcome these operational challenges. So if you’re interested in building the financial system of the future, check out the openings on the Crypto Reliability (CREL) team at Coinbase.




Scaling Container Technologies at Coinbase with Kubernetes

Tl;dr: Our recent evaluation of Kubernetes underscored its suitability for scaling Coinbase into the future. In the past, a migration to Kubernetes raised concerns due to the operational burden of running and securing the control plane in-house. We’ve now concluded that managed Kubernetes offerings reduce this operational burden without compromising our stack security.

By Clare Curtis, Coinbase Staff Software Engineer

Almost two years ago we released a blog post detailing why Kubernetes is not part of our technical stack. At the time, migrating to Kubernetes would have created a whole new set of problems that outweighed any near-term benefits. However, as these technologies have matured, our newly-formed Compute Team devised a strategy for leveraging Kubernetes in a way that can deliver a more flexible and scalable version of our current system.

Coinbase has grown substantially since we first considered migrating to Kubernetes. With any growth of this kind, it is important to prioritize scalability concerns. As we continue to scale, one of the main areas in need of future-proofing is Coinbase’s compute platform. In mid-2020, our largest service was configured to run a relatively small number of hosts, whereas today it’s running 10x that number.

In this same period, we quadrupled the size of our engineering organization, causing a substantial increase in the number of deployments — each needing completely new hosts. This increase in deployments has raised concerns over future scalability, as we are already running into the technical limitations of current APIs and resources. Recurring issues with getting enough capacity and having it delivered in a reasonable timeframe caused an increase in failed deployments and required our largest services to dramatically slow down their release process.

While these issues are solvable, we decided to take this opportunity to evaluate whether it made sense to continue investing in a homegrown system or consider an open source alternative that would be much more scalable in the long term.

In our evaluation of Kubernetes, we found that one of the biggest advantages of a migration is that it decouples host provisioning from service deployment, moving the burden of managing host acquisition from individual teams to the broader Infrastructure team. This empowers the Infrastructure team to take a holistic approach to host management. Also, capacity constraints are less likely to affect deployments, and we reduce the amount of cloud provider specific knowledge that individual engineers need to maintain.

The Kubernetes community has created a wealth of knowledge and tooling that we can utilize to provide better support to teams and quickly enable new features. Additionally, as Kubernetes is extensible, there is still the option to build tooling internally and open source it for use within the wider community.

Security is incredibly important at Coinbase and securing Kubernetes clusters is a non-trivial undertaking. Transitioning from highly-isolated and single-tenant compute to a system which promotes multi-tenancy requires deliberate security design and consideration. Because we have high-security workloads where we have to guarantee isolation, we must run separate clusters and build automated tooling that handles all cluster operations. Giving individuals access to operate high-security infrastructure is not allowed.

Managed Kubernetes offerings, such as AWS EKS, take on the responsibility of operating, maintaining, and securing the control plane, reducing the operational burden of running many clusters. Reducing our operational burden and security responsibility lets us focus on building the orchestration and automation required to support many clusters across a large engineering organization. EKS has matured significantly over the past few years and has shown that it provides stable, operational Kubernetes while also integrating with features commonly used in EC2, such as attaching security groups to pods and IAM roles to service accounts. Having those integrations reduces the risk and cost of migration, as they allow us to migrate without changing the identity or access patterns of our current platform.

While a migration to Kubernetes raised concerns in the past, we’ve now concluded that managed Kubernetes offerings, such as AWS EKS, can reduce the operational burden without compromising security. Ultimately, we realized there is a clear ceiling to how far our homegrown system can scale, and while there is a large setup and migration cost associated with a move to Kubernetes, we are confident that it will be more flexible and scalable than our current system.




Part 2: Quantitative Crypto Insight: Stablecoins and Unstable Yield

By George Liu and Matthew Turk

Tl;dr: This blog analyzes centralized stablecoin lending yield for Compound Finance and shares our insights on performance, volatility, and factors that drive this yield on collateralized lending of stablecoins in DeFi. The analysis shows that this lending yield can outperform the risk-free yield in the TradFi market.

In part two of this quantitative research piece, we will examine stablecoin lending yield for the Compound Finance V2 decentralized finance (DeFi) protocol and share our insights on yield performance, volatility, and what factors are driving yield on collateralized lending of stablecoins through DeFi protocols. We also compare the “risk-free” yield in traditional finance (TradFi) to the concept of “low-risk” yield in DeFi, which we introduced in part one.

ACKNOWLEDGEMENT: While we are aware of the recent collapse of Terra’s algorithmic stablecoin TerraUSD (UST), our analysis here covers collateralized lending yield for centralized stablecoins. We’re focused specifically on Compound for USDC and USDT (fiat-backed stablecoins), which have disparate risks and opportunities.

We conclude in this piece that using stablecoins for low-risk (within DeFi) collateralized lending could outperform the risk-free investment in the traditional financial market.

USDT/USDC Yield Analysis

As mentioned in part one of this blog post, a Compound user who has placed their assets into a liquidity pool can calculate total lending yield using exchangeRate, which indicates the value of the interest that the lender can expect to receive over time. The return from time T1 to T2 can then be obtained simply as

R(T1,T2)=exchangeRate(T2)/exchangeRate(T1)-1.

Additionally, annualized yield for this type of collateralized lending (assuming continuous compounding) can be calculated as

Y(T1,T2) = (log(exchangeRate(T2)) - log(exchangeRate(T1))) / (T2 - T1)
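
In code, a minimal sketch of these two calculations (ours, not Compound’s), with times expressed in years:

```typescript
// Simple period return over [T1, T2] from two exchangeRate observations.
function periodReturn(exchangeRateT1: number, exchangeRateT2: number): number {
  return exchangeRateT2 / exchangeRateT1 - 1;
}

// Annualized, continuously compounded yield over [T1, T2]; t1 and t2 are in years.
function annualizedYield(exchangeRateT1: number, exchangeRateT2: number, t1: number, t2: number): number {
  return (Math.log(exchangeRateT2) - Math.log(exchangeRateT1)) / (t2 - t1);
}

// Example: a 30-day window (30/365 years) in which the exchange rate grows by 0.3%.
console.log(annualizedYield(0.0215, 0.0215 * 1.003, 0, 30 / 365)); // ≈ 0.0365, i.e. ~3.65% annualized
```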

While the Compound liquidity pools support many stablecoin assets such as USDT, USDC, DAI, FEI, etc., we are only going to analyze the top two stablecoins here, i.e. USDT and USDC, which have market capitalizations of $80B and $53B respectively. Together, they make up over 70% of the total stablecoin market.

Below are the plots of the annualized daily, weekly, monthly, and biannual yields generated according to the formulas in the previous section. The daily yield is somewhat volatile, while the weekly, monthly, and biannual yields are progressively smoother versions of the same series. USDT and USDC show relatively similar patterns, as both experience high yield and high volatility at the start of 2021. This indicates that there are systematic factors affecting the stablecoin lending market as a whole.

Source: The Graph

One hypothesis is that the systematic factors affecting the lending yield are crypto market data (such as BTC/ETH prices) and the corresponding volatilities. When BTC and ETH are in an ascending trend, some bull-chasing investors may borrow from the stablecoin pools to buy BTC/ETH, then use the purchased BTC/ETH as collateral to borrow more stablecoins, repeating this cycle until their leverage reaches the desired level. Additionally, when the market enters a high-volatility regime, there are more centralized and decentralized crypto transactions, which could increase the demand for stablecoins as well.

Now, to check the relationship between the stablecoin yield and the crypto market data, we perform a simple linear regression analysis to see how much variation in the yield can be attributed to the price and volatility factors, using the following formula:
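
The regression equation is shown as an image in the original post; a generic specification consistent with the description (our assumption, not the authors’ exact model) would be:

yield(t) = b0 + b1 * price(t) + b2 * volatility(t) + error(t)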

To measure the magnitude of these factors’ contribution, we use the R-Squared score, which has a range of [0, 100%]. A score of 100% would mean that the yield is completely determined by the contributing factors.

Regression of USDC/USDT on the BTC market and the ETH market respectively leads us to the following R-Squared table:

ETH market data has a better explanatory power (18% & 17%) than the BTC market data (16% & 11%) in determining the yield of USDC and USDT. This is unsurprising, particularly due to ETH’s increased popularity and expanded footprint in the DeFi market since the start of 2021. As seen with these results, crypto price and volatility factors did not fully explain the yield in stablecoins. We can conclude that there must be other factors that help to improve the score from the basic model.

We performed further exploratory analysis by introducing the historical stablecoin supply data and MACD technical indicator price data to the model. The stablecoin supply (the total number of stablecoins supplied to Compound liquidity pools) should — intuitively — affect the availability/scarcity of the stablecoins and indirectly impact the yield. MACD is an important momentum trading signal (subtracting the 26 period EMA from the 12 period EMA — in this case on price) as it could help momentum investors to decide when to leverage and when to deleverage.

We see a noticeable increase in R-Squared scores, as both USDC and USDT got a bump to a level around 60%-70% as shown below.

From this data we can conclude that stablecoin supply is a substantial contributing factor, as it alone is able to bring the score to around 60% for both stablecoins in either market. This suggests that supply is a major factor affecting the yield in the stablecoin lending market. This is very similar to the TradFi world, where the credit supply set by the Federal Reserve affects the general interest rate of the whole system.

The introduction of MACD data (on BTC and ETH price) brings mixed improvement. In the case of the BTC market, its independent contribution is far less than the supply factor, and the marginal benefit over the shoulder of supply is only a few percentage points. We noticed in the ETH market, however, that MACD has a greater independent contribution to the R-Squared value as compared to the BTC market. This suggests that stablecoin lending yields are more correlated with momentum based trading activity in ETH than in BTC.

An example of the regression coefficients for USDC lending yield in the ETH market is displayed below. The table suggests that higher ETH prices, volatility, and stablecoin supply are generally associated with lower USDC lending yield. At the same time, the stronger the MACD signal, the higher the yield.

Comparison to the Traditional Risk-Free Yield

While it is interesting to reveal what has driven the low-risk yield on stablecoin lending, it is also important to compare these yields with the counterpart in the TradFi market.

Because stablecoin lending yields are derived from the realized floating interest rates for collateralized loans on the Compound platform, we selected the General Collateral (GC) rate used in the traditional money market as the comparable risk-free rate, because it is also a floating rate with treasury debt as the loan collateral.

Below is a plot of the portfolio value of the investments that earn USDC lending yield, USDT lending yield, and GC rate yield respectively. The investments all start with $100 initial value on 2020–05–01, and end on 2022–05–01. As seen below, yield on USDT and USDC collateralized lending is higher than the GC rate by a large margin. On the other hand, risk-free investment that earns GC rate hardly grows for the same period.

The average interest rates in the table below also confirm that the GC rate averaged around 0.08%, while USDC and USDT lending yields were 3.71% and 4.51% respectively for this period. (We also checked the 2Y term yield on treasury debt on 2020–05–01, which was merely 0.2%.)

For the foreseeable future, it is reasonable to conclude that the low-risk rate, within the crypto market at least, will continue to outperform the risk-free rate in the TradFi market. One reason for this is the smart contract risk, or liquidation risk mentioned in part one of this blog. However, a larger reason is the slower growth in the stablecoin supply relative to the growth in the crypto economy as a whole. By comparison, the TradFi market has seen major credit growth since the start of the Covid-19 pandemic, which has helped to drive the risk-free rate to historical lows (see Fed balance sheet growth below).

Conclusions

This blog provided a broadly indicative analysis of the low-risk yields available from collateralized lending of stablecoins through DeFi protocols. While these yields may be very volatile on a daily basis, their general trend can be explained relatively well by BTC/ETH prices, volatilities, stablecoin supply and MACD (momentum trading activities). We also compared these yields with the risk-free rate in the TradFi market where we see consistent outperformance in the crypto market. To reiterate, this is not financial advice.

Next steps

We, as part of the Data Science Quantitative Research team, aim to get a holistic understanding of this space from a quantitative perspective. We are looking for people who are passionate about this effort to join our growing team. If you are interested in Data Science, and in particular Quantitative Research in crypto, come join us.

The analysis makes use of the Compound v2 subgraph made available through the Graph Protocol. Special thanks to Institutional Research Specialist, David Duong, for his contribution and feedback.

NOT INVESTMENT ADVICE

The content is for informational purposes only, is general in nature and should not be relied upon or construed as legal, tax, investment, financial, a promise, guarantee, or other advice. Nothing contained herein constitutes a solicitation, recommendation, endorsement, or offer by Coinbase or its affiliates to buy or sell any cryptocurrency or other instruments in any jurisdiction in which such solicitation or offer would be unlawful under the laws of such jurisdiction.




Embracing Decentralization: Integrating and Building Protocols at Coinbase

By Jesse Pollak, Senior Director of Engineering at Coinbase

Tl;dr:

  • Coinbase is committed to its vision of investing in web3 and the broader cryptoeconomy.
  • We’re embodying this vision by integrating, building, and supporting protocols at Coinbase.
  • Protocols are increasingly integrated into our products, a focus for teams across the company, and a major investment area for Coinbase Ventures.
  • To support this work, we’re growing our Smart Contract Engineering team and building protocol expertise across every role.

Over the past year, Coinbase has begun laying the groundwork to transform into a web3 company. We’re doing this because we expect that large portions of the global economy will move “on-chain” in the coming years. This transition will occur as people globally demand greater access to financial services and ownership in the success of the businesses they support. Coinbase is positioned to play a key role in increasing economic freedom by doubling down on web3 innovation and supporting the development of protocols that put powerful financial tools in the hands of people around the world. Today, we’re sharing an update on how we’re integrating, building, and supporting these protocols.

Protocols offer a range of products and services that allow people anywhere in the world to transact directly with each other. For example, Compound makes it easier for individuals to lend or borrow crypto assets, giving them access to capital and higher yields with no intermediaries involved. Another example is USD Coin, the largest stablecoin on Ethereum with a market capitalization of over $50B, which allows people to easily send and receive funds globally using a digital asset pegged to the US dollar. Protocols like these form the foundation of the cryptoeconomy that millions of people interact with daily.

In just over two years, the value locked in these protocols has gone from a few hundred million to over $200B. People all over the world want greater control over their financial futures and web3 protocols are making that a reality.

At Coinbase, we are embracing protocols across all our strategic pillars with four key initiatives:

  1. Integrating protocols into our products. In Coinbase Wallet, and our recently launched dapp wallet, we are natively integrating trading and yield features supported by protocols as well as enabling open access to protocols through our browser. We recently launched Coinbase NFT, which is powered by the 0x protocol to enable low fee NFT swaps. Additionally, through our DeFi Yield product, we are integrating with protocols like Compound to offer users in eligible jurisdictions higher yields on their holdings. We are excited about how protocols can enable better user experiences and aim to keep integrating them deeply into our products.
  2. Building protocols and supporting innovation. We support protocols, regardless of how they will directly integrate into Coinbase. We support protocol innovation through our Project 10% program, which funds moonshot ideas including the recent Coinbase Ventures investment in Backed. We’re also supporting protocol work in the ecosystem through our developer grants program and open source contributions like Rosetta, which makes it easier to integrate protocols into Coinbase. Today, we also announced that Coinbase Cloud is supporting a group of experienced founders and operators who are building the first enterprise-grade liquid staking protocol.
  3. Funding protocols in the broader ecosystem. Coinbase Ventures has invested in protocols throughout its lifetime and is an early investor in some of the most exciting web3 protocols, including Compound, UMA, Saddle, Radicle, Synthetix, Notional, Goldfinch, and others. In the first quarter of 2022 alone, Coinbase Ventures closed 70+ deals (300+ to date). It invests across the web3 technology stack and is expanding cross-chain investments beyond Ethereum with the goal of promoting an open multi-chain future of finance.
  4. Investing in top talent and core technology. We’re building a best-in-class Smart Contract Engineering team to support our protocol work and integrating protocol thinking into our product management practice. Through developer tooling, hackathons, and company wide knowledge sharing and experimentation, we are building a community that sets the standard for web3 talent. In the year ahead, we are also excited to contribute open source developer tools and libraries to support building protocols at scale.

Last year we wrote about Coinbase embracing decentralization by adding more assets, expanding internationally, integrating with third-party platforms, and emphasizing self-custody. We closed by saying:

Many of the most innovative use cases in crypto are being created in decentralized apps. By fully embracing this trend we can put crypto in the hands of more people around the world and thereby increase their economic freedom.

We’ve been hard at work embracing this vision by investing in web3 protocols at Coinbase — and while we’ve made extensive progress, we’ve got a long way to go. We’re excited for the journey.

If these challenges excite you, join our team and help build an open financial system for the world.




Rearchitecting apps for scale

How Coinbase is using Relay and GraphQL to enable hypergrowth

By Chris Erickson and Terence Bezman

A little over a year ago, Coinbase completed the migration of our primary mobile application to React Native. During the migration, we realized that our existing approach to data (REST endpoints and a homebuilt REST data fetching library) was not going to keep up with the hypergrowth that we were experiencing as a company.

“Hypergrowth” is an overused buzzword, so let’s clarify what we mean in this context. In the 12 months after we migrated to the React Native app, our API traffic grew by 10x and we increased the number of supported assets by 5x. In the same timeframe, the number of monthly contributors on our core apps tripled to ~300. With these additions came a corresponding increase in new features and experiments, and we don’t see this growth slowing down any time soon (we’re looking to hire another 2,000 across Product, Engineering, and Design this year alone).

To manage this growth, we decided to migrate our applications to GraphQL and Relay. This shift has enabled us to holistically solve some of the biggest challenges that we were facing related to API evolution, nested pagination, and application architecture.

API evolution

GraphQL was initially proposed as an approach to help with API evolution and request aggregation.

Previously, in order to limit concurrent requests, we would create various endpoints to aggregate data for a particular view (e.g., the Dashboard). However, as features changed, these endpoints kept growing and fields that were no longer used could not safely be removed — as it was impossible to know if an old client was still using them.

In its end state, we were limited by an inefficient system, as illustrated by a few anecdotes:

  1. An existing web dashboard endpoint was repurposed for a new home screen. This endpoint was responsible for 14% of our total backend load. Unfortunately, the new dashboard was only using this endpoint for a single, boolean field.
  2. Our user endpoint had become so bloated that it was a nearly 8MB response — but no client actually needed all of this data.
  3. The mobile app had to make 25 parallel API calls on startup, but at the time React Native was limiting us to 4 parallel calls, causing an unmitigatable waterfall.

Each of these could be solved in isolation using various techniques (better process, API versioning, etc.), which are challenging to implement while the company is growing at such a rapid rate.

Luckily, this is exactly what GraphQL was created for. With GraphQL, the client can make a single request, fetching only the data it needs for the view it is showing. (In fact, with Relay we can require they only request the data they need — more on that later.) This leads to faster requests, reduced network traffic, lower load on our backend services, and an overall faster application.
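
For illustration, a client can request exactly the fields a view needs in a single round trip; the field names below are hypothetical, not Coinbase’s actual schema.

```typescript
// Hypothetical dashboard query: one request, only the fields the view renders.
const DASHBOARD_QUERY = /* GraphQL */ `
  query DashboardQuery {
    viewer {
      displayName
      portfolio {
        totalBalance {
          value
          currency
        }
      }
    }
  }
`;
```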

Nested pagination

When Coinbase supported 5 assets, the application could make a couple of requests: one to get the assets (5) and another to get the wallet addresses (up to 10) for those assets, then stitch them together on the client. However, this model doesn’t work well when a dataset gets large enough to need pagination. Either you have an unacceptably large page size (which reduces your API performance), or you are left with cumbersome APIs and waterfalling requests.

If you’re not familiar, a waterfall in this context happens when the client has to first ask for a page of assets (give me the first 10 supported assets), and then has to ask for the wallets for those assets (give me wallets for ‘BTC’, ‘ETH’, ‘LTC’, ‘DOGE’, ‘SOL’, …). Because the second request is dependent on the first, it creates a request waterfall. When these dependent requests are made from the client, their combined latency can lead to terrible performance.

This is another problem that GraphQL solves: it allows related data to be nested in the request, moving this waterfall to the backend server that can combine these requests with much lower latency.
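
As a hypothetical example (field names assumed), the nested, paginated request looks like the following, so the assets-to-wallets waterfall happens server-side in a single round trip.

```typescript
const ASSETS_WITH_WALLETS_QUERY = /* GraphQL */ `
  query AssetsWithWalletsQuery {
    assets(first: 10) {
      edges {
        node {
          symbol
          wallets(first: 10) {
            edges {
              node {
                address
              }
            }
          }
        }
      }
    }
  }
`;
```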

Application architecture

We chose Relay as our GraphQL client library, which has delivered a number of unexpected benefits. The migration has been challenging in that evolving our code to follow idiomatic Relay practices has taken longer than expected. However, the benefits of Relay (colocation, decoupling, elimination of client waterfalls, performance, and malleability) have had a much more positive impact than we’d ever predicted.

Simply put, Relay is unique among GraphQL client libraries in how it allows an application to scale to more contributors while remaining malleable and performant.

These benefits stem from Relay’s pattern of using fragments to colocate data dependencies within the components that render the data. If a component needs data, it has to be passed via a special kind of prop. These props are opaque (the parent component only knows that it needs to pass a {ChildComponentName}Fragment without knowing what it contains), which limits inter-component coupling. The fragments also ensure that a component only reads fields that it explicitly asked for, decreasing coupling with the underlying data. This increases malleability, safety, and performance. The Relay Compiler in turn is able to aggregate fragments into a single query, which avoids both client waterfalls and requesting the same data multiple times.

That’s all pretty abstract, so consider a simple React component that fetches data from a REST API and renders a list (This is similar to what you’d build using React Query, SWR, or even Apollo):

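The original embedded gist is not reproduced here; the sketch below is a reconstruction consistent with the observations that follow, with component names taken from the text and implementation details assumed.

```typescript
import React, { useEffect, useState } from 'react';

interface Asset {
  id: string;
  name: string;
}

// Hypothetical REST helper standing in for the home-built data fetching library.
async function fetchJson<T>(url: string): Promise<T> {
  const res = await fetch(url);
  return (await res.json()) as T;
}

function AssetHeader({ asset }: { asset: Asset }) {
  // Receives the entire Asset object but only uses a single field.
  return <h3>{asset.name}</h3>;
}

function AssetPriceAndBalance({ assetId }: { assetId: string }) {
  // A second network call that cannot start until the parent has fetched and rendered.
  const [data, setData] = useState<{ price: string; balance: string } | null>(null);
  useEffect(() => {
    fetchJson<{ price: string; balance: string }>(`/api/assets/${assetId}/summary`).then(setData);
  }, [assetId]);
  if (!data) return null;
  return (
    <span>
      {data.price} / {data.balance}
    </span>
  );
}

function AssetListItem({ asset }: { asset: Asset }) {
  return (
    <li>
      <AssetHeader asset={asset} />
      <AssetPriceAndBalance assetId={asset.id} />
    </li>
  );
}

export function AssetList() {
  // The network request is hidden inside the component ("fetch-on-render").
  const [assets, setAssets] = useState<Asset[]>([]);
  useEffect(() => {
    fetchJson<Asset[]>('/api/assets').then(setAssets);
  }, []);
  return (
    <ul>
      {assets.map((asset) => (
        <AssetListItem key={asset.id} asset={asset} />
      ))}
    </ul>
  );
}
```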

A few observations:

  1. The AssetList component is going to cause a network request to occur, but this is opaque to the component that renders it. This makes it nearly impossible to pre-load this data using static analysis.
  2. Likewise, AssetPriceAndBalance causes another network call, but will also cause a waterfall, as the request won’t be started until the parent components have finished fetching their data and rendering the list items. (The React team describes this pattern as “fetch-on-render”.)
  3. AssetList and AssetListItem are tightly coupled — the AssetList must provide an asset object that contains all the fields required by the subtree. Also, AssetHeader requires an entire Asset to be passed in, while only using a single field.
  4. Any time any data for a single asset changes, the entire list will be re-rendered.

While this is a trivial example, one can imagine how a few dozen components like this on a screen might interact to create a large number of data-fetching waterfalls as components load. Some approaches try to solve this by moving all of the data fetching calls to the top of the component tree (e.g., associating them with the route). However, this process is manual and error-prone, with the data dependencies being duplicated and likely to get out of sync. It also doesn’t solve the coupling and performance issues.

Relay solves these types of issues by design.

Let’s look at the same thing written with Relay:

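Again, the original embed is not reproduced; the following is an illustrative Relay version consistent with the observations below, with fragment and field names assumed.

```typescript
import React from 'react';
import { graphql, useFragment, useLazyLoadQuery } from 'react-relay';

function AssetHeader({ assetRef }: { assetRef: any }) {
  // Only the fields declared here are readable by this component.
  const asset = useFragment(
    graphql`
      fragment AssetHeaderFragment on Asset {
        name
      }
    `,
    assetRef,
  ) as any;
  return <h3>{asset.name}</h3>;
}

function AssetPriceAndBalance({ assetRef }: { assetRef: any }) {
  const asset = useFragment(
    graphql`
      fragment AssetPriceAndBalanceFragment on Asset {
        price
        balance
      }
    `,
    assetRef,
  ) as any;
  return (
    <span>
      {asset.price} / {asset.balance}
    </span>
  );
}

function AssetListItem({ assetRef }: { assetRef: any }) {
  // The parent only knows it must pass the child's fragment ref, not what it contains.
  const asset = useFragment(
    graphql`
      fragment AssetListItemFragment on Asset {
        ...AssetHeaderFragment
        ...AssetPriceAndBalanceFragment
      }
    `,
    assetRef,
  ) as any;
  return (
    <li>
      <AssetHeader assetRef={asset} />
      <AssetPriceAndBalance assetRef={asset} />
    </li>
  );
}

export function AssetList() {
  // The Relay Compiler aggregates every fragment into this single query,
  // so all data is requested up front with no client waterfall.
  const data = useLazyLoadQuery(
    graphql`
      query AssetListQuery {
        assets {
          id
          ...AssetListItemFragment
        }
      }
    `,
    {},
  ) as any;
  return (
    <ul>
      {data.assets.map((asset: any) => (
        <AssetListItem key={asset.id} assetRef={asset} />
      ))}
    </ul>
  );
}
```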

How do our prior observations fare?

  1. AssetList no longer has hidden data dependencies: it clearly exposes the fact that it requires data via its props.
  2. Because the component is transparent about its need for data, all of the data requirements for a page can be grouped together and requested before rendering is ever started. This eliminates client waterfalls without engineers ever having to think about them.
  3. While requiring the data to be passed through the tree as props, Relay allows this to be done in a way that does not create additional coupling (because the fields are only accessible by the child component). The AssetList knows that it needs to pass the AssetListItem an AssetListItemFragmentRef, without knowing what that contains. (Compare this to route-based data loading, where data requirements are duplicated on the components and the route, and must be kept in sync.)
  4. This makes our code more malleable and easy to evolve — a list item can be changed in isolation without touching any other part of the application. If it needs new fields, it adds them to its fragment. When it stops needing a field, it removes it without having to be concerned that it will break another part of the app. All of this is enforced via type checking and lint rules. This also solves the API evolution problem mentioned at the beginning of this post: clients stop requesting data when it is no longer used, and eventually the fields can be removed from the schema.
  5. Because the data dependencies are locally declared, React and Relay are able to optimize rendering: if the price for an asset changes, ONLY the components that actually show that price will need to be re-rendered.

While on a trivial application these benefits might not be a huge deal, it is difficult to overstate their impact on a large codebase with hundreds of weekly contributors. Perhaps it is best captured by this phrase from the recent ReactConf Relay talk: Relay lets you “think locally, and optimize globally.”

Where do we go from here?

Migrating our applications to GraphQL and Relay is just the beginning. We have a lot more work to do to continue to flesh out GraphQL at Coinbase. Here are a few things on the roadmap:

Incremental delivery

Coinbase’s GraphQL API depends on many upstream services — some of which are slower than others. By default, GraphQL won’t send its response until all of the data is ready, meaning a query will be as slow as the slowest upstream service. This can be detrimental to application performance: a low-priority UI element that has a slow backend can degrade the performance of an entire page.

To solve this, the GraphQL community has been standardizing on a new directive called @defer. This allows sections of a query to be marked as “low priority”. The GraphQL server will send down the first chunk as soon as all of the required data is ready, and will stream the deferred parts down as they are available.

Live queries

Coinbase applications tend to have a lot of rapidly changing data (e.g. crypto prices and balances). Traditionally, we’ve used things like Pusher or other proprietary solutions to keep data up-to-date. With GraphQL, we can use Subscriptions for delivering live updates. However, we feel that Subscriptions are not an ideal tool for our needs, and plan to explore the use of Live Queries (more on this in a blog post down the road).

Edge caching

Coinbase is dedicated to increasing global economic freedom. To this end, we are working to make our products performant no matter where you live, including areas with slow data connections. To help make this a reality, we’d like to build and deploy a global, secure, reliable, and consistent edge caching layer to decrease total roundtrip time for all queries.

Collaboration with Relay

The Relay team has done a wonderful job and we’re incredibly grateful for the extra work they’ve done to let the world take advantage of their learnings at Meta. Going forward, we would like to turn this one-way relationship into a two-way relationship. Starting in Q2, Coinbase will be lending resources to help work on Relay OSS. We’re very excited to help push Relay forward!

Are you interested in solving big problems at an ever-growing scale? Come join us!




The Merge and the Ethics of Ethereum

By Yuga Cohler, Staff Software Engineer at Coinbase


The Merge is coming. The long-awaited transition of the Ethereum network’s consensus mechanism from Proof-of-Work (PoW) to Proof-of-Stake (PoS) is expected to arrive sometime this year. After the Merge, the already-running Beacon Chain will begin validating Ethereum mainnet’s blockchain through a system based on rewards and penalties (i.e. “stake”) rather than the costs associated with computational intractability (i.e. “work”). Six years in the making, the Merge will be a milestone in the history of cryptocurrencies with both material and philosophical implications.

Perhaps the most heralded aspect of the Merge is its resulting efficiency: Ethereum’s shift to PoS will lead to a projected 99.95% reduction in energy consumption compared to PoW. This evolution will be a welcome development at a time when energy costs are surging across the world. Of course, there are legitimate counterarguments against PoS — for example, the concentration of wealth that PoS can facilitate as well as the lack of comparable at-scale testing. Nevertheless, given the existence of a peer-to-peer blockchain network that continues to operate on PoW — namely, Bitcoin — the Merge makes sense as a strategic next step for Ethereum and should garner the goodwill of the environmentally conscious.

The essential accomplishment of the Merge, however, is a more humanistic one than energy efficiency. The Merge has proven to be an extremely complex task with many challenges. Yet, its completion will be achieved not through the dictum of a central authority, but through the organic coordination of like-minded individuals. Fundamentally, a successful Merge will prove the viability of decentralization as a social organizing principle.

I have borne witness to this process through my periodic attendance at the biweekly Ethereum All-Core Developers meetings — a regular forum for Ethereum contributors to share their progress, present improvement proposals, and plan work. What is most remarkable to me about these meetings is how democratic they are. Thirty people — some anonymous, others not; most with their cameras off — convening to decide the fate of a $300 billion financial network. Anyone can propose a topic ahead of the meeting and anyone can contribute their opinion. The discussions are substantive, generally focusing on the engineering tasks at hand, but also spanning the many disciplines that underlie cryptocurrency networks — economics, mechanism design, governance, and culture.

Ethereum founder Vitalik Buterin routinely attends these meetings. Despite his superlative status in the cryptocurrency community, Vitalik is treated no differently than any other core contributor: with respect and intellectual honesty, but without special deference to his position. Disagreements over engineering approaches are hashed out in an orderly manner, and challenging Vitalik is a matter of course when doing so is best for Ethereum as a whole.

In an industry known for individualistic billionaires wielding influence, Vitalik's approachable disposition is notable. So it makes sense that this ethos of democratic decentralization permeates not just the meetings, but every aspect of Ethereum. The financial protocols, the cultural products, the governance processes, and now even the consensus mechanism of Ethereum are all subservient to the principle of decentralization. Progress is made not through the fiat of a single sovereign, but through the good-faith coordination of unallied actors, none more entitled to power than another.

These ethics of Ethereum are the foundations of its credible neutrality, a property that will only become more valuable as time goes on. On one hand, traditional financial systems are increasingly vulnerable to the nation-states that control them for their own interests. On the other hand, even though alternative L1 blockchains might provide greater scalability, convenience, and agility, none can claim to be as objectively decentralized as Ethereum. Its commitment to axiomatic decentralization positions Ethereum as one of the only systems allowing for the storage and transfer of value with a high level of security, but without prejudice as to intent or origin. The Merge will be the latest alternative to “the inherent weaknesses of the trust based model” that Satoshi Nakamoto pointed out in the Bitcoin whitepaper, made possible by the shared values of the Ethereum ecosystem.

The moral logic of the Merge is decidedly optimistic, and it should be. Teamwork, cooperation, humility, curiosity, honesty: these are values we do not intuitively ascribe to capital markets, and yet, they are the hallmarks of Ethereum that have allowed it to progress towards one of the greatest engineering feats in the history of financial technology. Understanding this ethical foundation of a decentralized democracy should give us a renewed appreciation for the upcoming Merge, and guide us as we chart a course for the future of the cryptoeconomy.

This material is the property of Coinbase, Inc., its parent and affiliates (“Coinbase”). The views and opinions expressed herein are those of the author and do not necessarily reflect the views of Coinbase or its employees. This material summarizes information and articles with respect to cryptocurrencies or related topics that the author believes may be of interest. This material is for informational purposes only, and is not (i) an offer, or solicitation of an offer, to invest in, or to buy or sell, any interests or shares, or to participate in any investment or trading strategy, (ii) intended to provide accounting, legal, or tax advice, or investment recommendations or (iii) an official statement of Coinbase. No representation or warranty is made, expressed or implied, with respect to the accuracy or completeness of the information or to the future performance of any digital asset, financial instrument or other market or economic measure. The information is believed to be current as of the date indicated on the materials. Recipients should consult their advisors before making any investment decision. Coinbase may have financial interests in, or relationships with, some of the entities and/or publications discussed or otherwise referenced in the materials. Certain links that may be provided in the materials are provided for convenience and do not imply Coinbase’s endorsement, or approval of any third-party websites or their content. Coinbase, Inc. is not registered or licensed in any capacity with the U.S. Securities and Exchange Commission or the U.S. Commodity Futures Trading Commission.


The Merge and the Ethics of Ethereum was originally published in The Coinbase Blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

Coinbase Cloud Works with Acala Foundation to Support Liquid Staking

By Joe Lallouz, Head of Coinbase Cloud

Coinbase Cloud is pleased to announce its support for liquid staking through a collaboration with the Acala Foundation, starting with KSM liquid staking on Karura.

Liquid staking lets token holders stake their tokens while still putting them to work in DeFi — without being subject to unbonding periods. This offers token holders more opportunities to participate in the crypto economy.

Liquid staking is an important initiative that has the potential to bring even more participants into the growing Polkadot DeFi ecosystem, help unlock more value for token holders, and onboard more users into web3.

The significance of liquid staking

In traditional proof-of-stake networks, users who stake their assets are subject to an unbonding period during which they cannot withdraw their tokens. The length of this period differs by protocol: 28 days for Polkadot and 7 days for Kusama, for example. Additionally, even though users are earning rewards on their staked tokens, they are unable to use those tokens in other applications.

Liquid staking changes that: it lets users earn staking rewards as well as any rewards that accrue from using their tokens in DeFi applications.

Through liquid staking, users stake their tokens and receive a representative L-token in exchange (e.g. stake DOT and receive LDOT). The L-token represents both the principal staked asset and the staking yield that continues to accrue. L-tokens are tradable across all chains on the Polkadot and Kusama networks and are redeemable for the underlying asset at any time, so stakers are able to maximize their potential rewards. Note that, as in many other proof-of-stake networks, users also risk losing a portion of their tokens in the event that a validator is slashed.
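To make the mechanics concrete, here is a minimal, hypothetical sketch of L-token accounting. It is not Acala's or Karura's actual implementation; the pool structure, exchange-rate formula, and numbers are illustrative assumptions. The pool tracks the total underlying stake (principal plus accrued rewards) and the L-token supply, and the ratio between the two determines how many L-tokens a deposit mints and how many underlying tokens an L-token redeems for.

// Hypothetical L-token accounting; pool values and token names are illustrative only.
interface LiquidStakingPool {
  totalStaked: number;  // underlying tokens (e.g. KSM) held by the protocol, including accrued rewards
  totalLTokens: number; // L-tokens (e.g. LKSM) currently in circulation
}

// Exchange rate: how many underlying tokens one L-token currently redeems for.
function exchangeRate(pool: LiquidStakingPool): number {
  return pool.totalLTokens === 0 ? 1 : pool.totalStaked / pool.totalLTokens;
}

// Staking deposits the underlying asset and mints L-tokens at the current rate.
function stake(pool: LiquidStakingPool, amount: number): number {
  const minted = amount / exchangeRate(pool);
  pool.totalStaked += amount;
  pool.totalLTokens += minted;
  return minted;
}

// Rewards increase the underlying pool without minting new L-tokens,
// so each existing L-token becomes redeemable for slightly more of the underlying asset.
function accrueRewards(pool: LiquidStakingPool, rewards: number): void {
  pool.totalStaked += rewards;
}

// Example: stake 10 KSM into a pool, let rewards accrue, then check redemption value.
const pool: LiquidStakingPool = { totalStaked: 1000, totalLTokens: 950 };
const lksm = stake(pool, 10);           // about 9.5 LKSM at a rate of roughly 1.0526
accrueRewards(pool, 0.5);
console.log(lksm * exchangeRate(pool)); // slightly more than the 10 KSM originally staked

Because rewards grow the underlying balance while the L-token supply stays fixed, the staking yield accrues to L-token holders through a rising exchange rate rather than through new token issuance.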

Liquid staking launched on Karura in 2021, allowing users to stake KSM tokens on Karura in exchange for LKSM. To support the initiative, Coinbase Cloud is powering allowlisted validators that receive delegations from the community. LKSM offers liquidity for staked KSM, since users are not subject to an unbonding period and can unbond at any time for a small fee. This newfound liquidity lets users put their LKSM to work earning yield elsewhere, such as on Anchor Protocol through Acala's recently announced integration. Early unbonding allows users to exit staking positions instantly instead of waiting out the standard seven-day unbonding period, eliminating its opportunity cost.

Once live, the mechanism for liquid staking on Acala will function in the same way. Users can stake DOT and receive LDOT in return.

Liquid staking is set to lay the foundation for a number of new use cases, including minting aUSD (the native stablecoin of the Polkadot and Kusama ecosystem), creating new synthetic assets, and unlocking additional yield opportunities for aUSD and L-tokens. As the broader crypto and Polkadot DeFi ecosystems continue to grow, initiatives like liquid staking create opportunities to unlock additional value for token holders and help the network grow and scale securely through increased participation.

What’s Next?

Running high-performance validators with high uptime is critical to helping protocols scale and to ensuring that those who have staked their tokens continue to earn rewards. Coinbase Cloud has deep expertise in decentralized infrastructure and the Polkadot ecosystem, which positions us as an ideal infrastructure provider for the liquid staking initiative.

Coinbase Cloud has been closely involved with the Substrate ecosystem, having worked on Polkadot and Kusama since their earliest testnets and offering web3 builders a comprehensive set of infrastructure solutions for Substrate.

For those looking to participate and build more broadly across Polkadot, Kusama, Acala, and Karura, check out our protocol guides for Polkadot, Acala, and Karura, or explore our secure read/write infrastructure solutions for builders. Get in touch with our team to learn more.

Token holders looking to join the Polkadot DeFi ecosystem can participate in liquid staking on Karura today. Visit the Karura Wiki for more information.

*This document and the information contained herein are not a recommendation or endorsement of any digital asset, protocol, network, or project. However, Coinbase may have, or may in the future have, a significant financial interest in, and may receive compensation for services related to, one or more of the digital assets, protocols, networks, entities, projects, and/or ventures discussed herein. The risk of loss in cryptocurrency, including staking, can be substantial, and nothing herein is intended to be a guarantee against the possibility of loss.

This document and the content contained herein are based on information which is believed to be reliable and has been obtained from sources believed to be reliable, but Coinbase makes no representation or warranty, express, or implied, as to the fairness, accuracy, adequacy, reasonableness, or completeness of such information, and, without limiting the foregoing or anything else in this disclaimer, all information provided herein is subject to modification by the underlying protocol network.


Coinbase Cloud Works with Acala Foundation to Support Liquid Staking was originally published in The Coinbase Blog on Medium, where people are continuing the conversation by highlighting and responding to this story.
