Money20/20: Interoperability issues in the tokenization industry must be addressed

Pallavi Thakur, Director of Strategy and Innovation at Swift, and Julien Clausse, Head of Asset Foundry at BNP Paribas, shared their insights on tokenization during a joint presentation at the Money20/20 conference.

In their collaborative discussion, Clausse and Thakur acknowledged the significant challenges of interoperability in tokenization, emphasizing the need for solutions at various levels. Thakur discussed how tokenization often leads to isolated networks or “islands” that struggle to communicate with each other, hindering its potential to revolutionize the securities market.

Thakur stated, “Tokenization is gaining momentum and has the power to reshape the securities market, but the issue of isolated networks remains a major obstacle.” She identified the network layer, token format layer, and data layer within tokens as key areas that must be addressed for seamless operation.

Both speakers emphasized the importance of overcoming the fragmented nature of the blockchain environment to ensure the success of tokenization. Clausse highlighted the complexity of achieving true interoperability, attributing the problem to the diverse range of blockchain projects in existence. He stressed the need for common networks and standardized adoption within the tokenization industry.

Building on this point, Clausse argued that establishing industry standards across diverse blockchains is essential to the future of tokenization. He suggested that such standards should emerge through consensus among industry stakeholders and be proven in practical, real-world applications.

Thakur and Clausse underscored the significance of practical use cases and industry collaboration in shaping the future of tokenization. They discussed examples of industry standards being applied, such as the tokenization of small-scale renewable energy projects by Swift, which has improved efficiency and transparency in financing and management. Both speakers urged the crypto industry to work together to address interoperability challenges and strive towards common standards for the advancement of tokenization.
