Crypto Exchange Dilemma: Internal Market Makers or Transparency?

In the world of crypto exchanges, the use of internal market-making teams has become a contentious issue in recent years. Internal market makers are in-house trading desks that quote prices and trade on their own exchange’s platform, generating profit for the exchange from the activity they facilitate. Some insiders argue that these teams contribute to the liquidity and stability of an exchange’s markets, while others believe they create a conflict of interest that could harm investors. In this article, we take a closer look at the debate by examining the position of BitMEX CEO Stephan Lutz and the scrutiny surrounding Crypto.com’s internal trading teams.

BitMEX CEO’s statement on internal market makers

Stephan Lutz, CEO of BitMEX, has been a vocal opponent of internal market-making teams on crypto exchanges. In an interview with The Block, Lutz argued that exchanges that make money from proprietary trading should let go of their internal market-making desks, adding that there are already enough high-frequency trading firms and proprietary trading shops in the market to provide liquidity, which makes internal teams unnecessary.

Lutz’s argument rests on the conflict of interest an internal market maker can create. Because such a team sits inside the exchange and can potentially see all of the platform’s trading information, it could use that information to its own advantage and at the expense of the exchange’s users, prioritizing its profits over their interests.

Concerns over Crypto.com’s internal trading teams

Crypto.com, a popular crypto exchange, has drawn criticism for its use of internal trading teams. The exchange runs a team of traders that works to facilitate tight spreads and efficient markets on its platform. While Crypto.com has publicly stated that the team is treated the same way as any other third-party market maker, critics argue that its presence could still create a conflict of interest. In response to these concerns, a spokesperson for Crypto.com said the trading team keeps the exchange risk-neutral by hedging its positions across several venues: if the internal team takes a position in a particular asset on Crypto.com, it takes offsetting positions on other exchanges so that the firm carries no net directional exposure.
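
To make the hedging claim concrete, the bookkeeping behind “risk-neutral” is simply that long and short positions across venues net out to roughly zero. The short Python sketch below illustrates that idea; the venue names, position sizes, and tolerance band are hypothetical assumptions for illustration, not Crypto.com’s actual books or systems.

# Hypothetical illustration of cross-venue hedging: inventory accumulated on
# the home exchange is offset by positions opened elsewhere, so the desk's
# net directional exposure stays near zero.

positions_btc = {
    "own_exchange": +10.0,  # BTC bought while quoting on the home venue (assumed)
    "venue_a": -6.0,        # short hedge opened on an external venue (assumed)
    "venue_b": -4.0,        # remainder of the hedge on a second venue (assumed)
}

net_exposure = sum(positions_btc.values())
print(f"Net BTC exposure: {net_exposure:+.2f}")

# The desk counts as risk-neutral here only while net exposure stays inside a
# small tolerance band; otherwise it would need to rebalance its hedges.
TOLERANCE_BTC = 0.5
print("risk-neutral" if abs(net_exposure) <= TOLERANCE_BTC else "needs rebalancing")

In this simple view, “risk-neutral” describes the firm’s aggregate exposure, not the absence of trading activity on its own platform, which is exactly where critics say the conflict-of-interest question still applies.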

Comparison with BitMEX’s past allegations of running an internal trading team

Several years ago, BitMEX itself faced allegations of running an internal trading team for profit. At the time, the derivatives exchange was accused of using Arrakis Capital, an internal market maker, to trade against its own users. While BitMEX denied the allegations, it separated Arrakis Capital from the exchange to avoid even the appearance of impropriety.

The use of internal market makers by crypto exchanges remains a controversial issue. Some believe internal teams contribute to the liquidity and stability of an exchange’s markets; others argue they create a conflict of interest that could harm investors. Lutz wants exchanges that profit from proprietary trading to shut down their internal market-making desks, while Crypto.com defends its team as a way to keep spreads tight and markets efficient on its platform. Ultimately, whether an exchange relies on internal market makers or third-party firms will depend on its priorities, its risk tolerance, and its commitment to transparency and fairness.
