Etherscan’s Code Reader: Revolutionizing Ethereum Smart Contract Analysis with AI-Powered Insights

Ethereum block explorer and analytics platform Etherscan has recently launched a new tool called “Code Reader,” which uses artificial intelligence to retrieve and interpret the source code of a given contract address. The AI-driven tool is expected to offer deeper insight into contract code and to provide comprehensive lists of smart contract functions related to Ethereum data. However, amid the AI boom, some experts have cautioned about the limitations of current AI models.

Etherscan has launched an AI-driven tool called “Code Reader”

The Code Reader tool developed by Etherscan helps users retrieve and interpret the source code of a specific contract address. After a user inputs a prompt, Code Reader generates a response via OpenAI’s large language model, providing insights into the contract’s source code files. The tool is intended to help users gain deeper insight into a contract’s code through AI-generated explanations, obtain comprehensive lists of smart contract functions related to Ethereum data, and understand how the underlying contract interacts with decentralized applications.
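Etherscan has not published how Code Reader is implemented, but the workflow it describes, fetching a contract’s verified source code and passing it to an OpenAI model together with a user prompt, can be sketched roughly as follows. The Etherscan “getsourcecode” endpoint is a real public API, while the model name and helper functions below are illustrative assumptions rather than Etherscan’s actual code.

```python
# Illustrative sketch only: Etherscan's actual Code Reader implementation is not public.
# Assumes ETHERSCAN_API_KEY and OPENAI_API_KEY are set in the environment.
import os

import requests
from openai import OpenAI


def fetch_verified_source(address: str) -> str:
    """Fetch the verified source code for a contract address via the Etherscan API."""
    resp = requests.get(
        "https://api.etherscan.io/api",
        params={
            "module": "contract",
            "action": "getsourcecode",
            "address": address,
            "apikey": os.environ["ETHERSCAN_API_KEY"],
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"][0]["SourceCode"]


def explain_contract(address: str, question: str) -> str:
    """Send the contract source plus a user prompt to an OpenAI chat model."""
    source = fetch_verified_source(address)
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; Etherscan has not disclosed which model it uses
        messages=[
            {"role": "system", "content": "You explain Ethereum smart contract source code."},
            {"role": "user", "content": f"{question}\n\nContract source:\n{source}"},
        ],
    )
    return completion.choices[0].message.content


# Example usage (any verified contract address):
# print(explain_contract("0x...", "List this contract's external functions and what they do."))
```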

Code Reader’s capabilities and use cases

Code Reader’s core capability is retrieving the source code of a specific contract address and interpreting it with AI-generated explanations, giving users deeper insight into a contract’s code. It can also produce comprehensive lists of a contract’s functions related to Ethereum data, which helps users understand how the underlying contract interacts with decentralized applications.

Experts caution about the feasibility of current AI models

Amid the AI boom, experts have warned that current AI models face significant constraints around complex data synchronization, network optimization, and data privacy and security. According to a recent report by Singaporean venture capital firm Foresight Ventures, computing power resources will be the next big battlefield over the coming decade.

Computing power resources are set to be the next big battlefield

As AI becomes more prevalent across industries, demand has grown for training large AI models on decentralized distributed computing power networks. However, researchers say current prototypes face significant constraints, including complex data synchronization, network optimization, and data privacy and security concerns, and they expect computing power resources to be the next big battlefield in the coming decade.

Current constraints of decentralized distributed computing power networks

In decentralized distributed computing power networks, storing a large model with 175 billion parameters in single-precision floating point would require around 700 gigabytes: at 4 bytes per parameter, 175 billion parameters occupy roughly 700 GB of memory. Distributed training also requires these parameters to be frequently transmitted and updated between computing nodes, which is why researchers suggest that small AI models remain the more feasible choice in most scenarios.
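A quick back-of-the-envelope check of that figure, assuming the standard 4 bytes per single-precision (fp32) parameter:

```python
# Rough parameter-memory estimate for a 175B-parameter model stored in fp32.
params = 175e9          # 175 billion parameters
bytes_per_param = 4     # single-precision floating point
total_gb = params * bytes_per_param / 1e9
print(f"{total_gb:.0f} GB")  # -> 700 GB, matching the ~700 gigabyte estimate above
```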

Training large AI models requires significant resources

Training large AI models demands significant computing power, data storage, and network capacity. Distributed training requires model parameters to be frequently transmitted and updated between computing nodes, making it a complex process, and current prototypes still face significant constraints such as complex data synchronization, network optimization, and data privacy and security concerns. In most scenarios, small AI models remain a more feasible choice.

Small AI models are still a more feasible choice in most scenarios

Researchers have recommended small AI models as the more feasible choice for most scenarios, arguing that there is no need to chase large models out of FOMO (fear of missing out). Compared with large AI models that demand significant computing power, data storage, and network optimization, small models are often the more practical option.

As demand for training large AI models grows, decentralized distributed computing power networks are expected to become the next big battlefield in the coming decade, even though current prototypes face significant constraints around data synchronization, network optimization, and data privacy and security; for now, researchers suggest that small AI models remain the more practical choice in most scenarios. Meanwhile, Etherscan’s new AI-driven Code Reader offers a way to retrieve and interpret the source code of a specific contract address, helping users gain deeper insight into contract code and understand how the underlying contract interacts with decentralized applications.
