Validators DAO Releases Open-Source Reproducible Linux Server Performance Measurement Tools to Support Environment Selection for Solana Applications

ELSOUL LABO B.V. (Headquarters: Amsterdam, the Netherlands; CEO: Fumitake Kawasaki) and Validators DAO announce the public release of “Validators DAO Performance Testing Tools,” an open-source set of tools for measuring and verifying Linux server performance using reproducible methodologies.
These tools target Linux nodes including VPS, bare metal servers, and cloud instances, and are designed to allow anyone to measure, understand, and compare performance characteristics under the same conditions and with the same procedures. Rather than serving as benchmarks to promote specific services or products, the focus is on making the verification methods themselves openly available as practical decision-making tools for real-world operations.
Infrastructure Selection Assumptions in the Blockchain Era
In traditional Web application development, global reach and low latency were requirements limited to certain high-value use cases. However, in blockchain-based applications—particularly those built on high-speed chains such as Solana—transaction submission, stream processing, and real-time analytics occur routinely, and millisecond-level latency and its variance can directly affect outcomes.
In blockchain environments where all applications inherently take on financial characteristics, infrastructure selection itself becomes a prerequisite for application viability. Proceeding with development without understanding the effective performance of fundamental components such as CPU, memory, disk, and network can result in products that carry unexpected latency and instability risks.
Technologies Behind the Numbers and Differences in Effective Performance
In VPS and virtual machine environments, specifications such as vCPU count and memory capacity are commonly presented as performance indicators. However, these figures represent logical allocations and do not necessarily guarantee effective performance. One key factor behind this is overcommitment, an important operational technique in data center environments.
Overcommitment has been widely adopted as a means of efficiently utilizing physical resources based on the assumption that not all virtual machines will operate at maximum load simultaneously. Data center CPUs have also been developed with this usage model in mind, with continuous improvements in parallel processing performance.
At the same time, this structure inevitably introduces performance loss. CPU contention, cache variability, and scheduling effects can surface as non-negligible differences in certain workloads. Even environments labeled with the same “4 vCPU” specification may exhibit significantly different actual processing capabilities and stability.
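One visible symptom of overcommitment on a virtualized Linux guest is CPU "steal" time: cycles the hypervisor allocated to other tenants instead of this VM. The kernel exposes it in `/proc/stat`, so the effect described above can be observed directly. The following is a minimal sketch (the sampling interval and field handling are assumptions about a standard Linux `/proc/stat` layout, per proc(5)):

```python
# Sketch: estimate CPU "steal" time on a Linux guest by sampling /proc/stat.
# A persistently high steal fraction suggests CPU contention from
# overcommitment on the host. Field layout follows proc(5):
# cpu user nice system idle iowait irq softirq steal ...
import time

def cpu_times():
    """Return the aggregate per-mode CPU counters (in clock ticks)."""
    with open("/proc/stat") as f:
        fields = f.readline().split()[1:]  # first line is the "cpu" aggregate
    return [int(x) for x in fields]

def steal_fraction(interval=1.0):
    """Fraction of CPU time stolen by the hypervisor over `interval` seconds."""
    before = cpu_times()
    time.sleep(interval)
    after = cpu_times()
    delta = [b - a for a, b in zip(before, after)]
    total = sum(delta)
    steal = delta[7] if len(delta) > 7 else 0  # 8th field is "steal"
    return steal / total if total else 0.0

if __name__ == "__main__":
    print(f"CPU steal over 1s: {steal_fraction() * 100:.2f}%")
```

On a dedicated bare-metal server this fraction is typically zero; on a heavily overcommitted VPS it can be substantial, which is exactly the kind of difference that advertised vCPU counts do not reveal.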
Such differences can be confirmed numerically through appropriate measurement tools rather than through subjective perception or assumption. Selecting an environment without understanding its effective performance can ultimately impact development efficiency and product quality.
Positioning of Validators DAO Performance Testing Tools
Validators DAO Performance Testing Tools are an open-source collection of tools designed to allow anyone to verify effective Linux server performance using the same procedures across different environments such as VPS, cloud, and bare metal. By publishing both measurement methods and results, the tools enable performance differences between environments to be understood quantitatively rather than through intuition or experience.
The tools focus on fundamental performance factors that directly affect application behavior, including CPU processing capability, memory bandwidth, and disk read/write performance. These elements have a direct impact on final processing speed and stability in real-world operations such as indexers, RPC services, stream processing, and Solana node–adjacent workloads.
Execution completes with a single command, requires no special preparation or configuration, and follows the same procedure on any Linux server. While the command runs, the displayed figures make it easy to see at a glance which environment delivers higher performance.
Validators DAO Performance Testing Tools are not provided as metrics to evaluate specific environments, but as a common yardstick to help developers understand their own server resources and select appropriate environments according to their intended use cases.
About node_bench
The primary tool currently available, node_bench, is a benchmark tool designed to measure CPU, memory, and disk performance on Linux nodes in a reproducible manner. These metrics represent fundamental performance characteristics that cannot be avoided in real-world operations of high-speed applications, including those built on Solana.
In node_bench, CPU processing performance is measured using sysbench, memory performance is evaluated using STREAM, and disk performance is tested using fio with direct I/O and explicitly defined, fixed workloads. All execution results are saved as logs, including raw JSON data, allowing for later verification and independent analysis.
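The "explicitly defined, fixed workload" used for the fio stage can be expressed as a job file. The parameters below are illustrative assumptions, not node_bench's actual values; the point is that pinning every option down (and forcing direct I/O past the page cache) is what makes disk results comparable across environments:

```ini
; Illustrative fio job file -- parameter values are assumptions,
; not the exact workload node_bench ships with.
[global]
ioengine=libaio
direct=1          ; bypass the page cache so results reflect the device
bs=4k
size=1g
runtime=30
time_based

[randread]
rw=randread

[randwrite]
rw=randwrite
```

Running fio with `--output-format=json` yields the kind of raw JSON logs the tools preserve for later verification.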
Execution is performed via curl, and every command run during the process is displayed explicitly, eliminating the opacity that hidden or omitted steps would introduce.
Significance of Open-Source Publication
In performance testing, what matters is not the numerical results themselves, but the conditions and methodologies under which those numbers were obtained. The meaning of results varies significantly depending on what and how measurements are performed. Comparison becomes possible only when measurement methods are aligned.
Validators DAO publishes these testing tools and execution results as open source based on these principles. The repository welcomes pull requests containing execution results and measurements from other environments. By accumulating real measurement data generated using unified methodologies, the project aims to provide developers with reliable reference material for selecting appropriate server resources.
Validators DAO Official Discord
Developers facing performance challenges are encouraged to first measure their own resources and compare the results. The Validators DAO official Discord also serves as a venue for practical information exchange on this topic.
- Validators DAO Performance Testing Tools: https://github.com/ValidatorsDAO/testing-tools
- Validators DAO Official Discord: https://discord.gg/C7ZQSrCkYR
- ERPC Official Website: https://erpc.global/en

