Pure Storage FlashBlade//EXA: Verified Benchmarks vs. Marketing Claims

Pure Storage submits to STAC-M3 auditing—a credible approach. But claims of “30% better than competitors” and “10 TB/s from early testing” deserve the same scrutiny applied to any vendor.

Pure Storage announced next-generation storage products at Pure//Accelerate in June 2025, including FlashBlade//EXA targeting AI and HPC workloads. The announcements included specific performance claims: FlashBlade//S R2 “performs up to 30% greater than competitors across critical workloads,” FlashBlade//EXA achieves “more than 10 terabytes per second of read performance within a single namespace” in early testing, and FlashArray//ST delivers “over 10 million IOPS per five rack units” [1].

Pure Storage deserves credit for participating in independent benchmark programs. Their STAC-M3 submissions for quantitative trading workloads are audited and published with full methodology [2]. This transparency distinguishes Pure from vendors who publish only internal benchmarks. However, not all Pure claims meet this verification standard, and marketing materials mix audited results with unverified assertions.

What Pure Gets Right: STAC-M3 Participation

The Securities Technology Analysis Center (STAC) maintains benchmark suites for financial services workloads, with defined rules and independent auditing. Pure Storage has submitted FlashBlade results to STAC-M3, which measures performance for backtesting quantitative trading strategies—a demanding, latency-sensitive workload [2].

STAC-M3 results include full configuration disclosure: hardware specifications, software versions, network topology, and test parameters. Results are published on stacresearch.com where anyone can review methodology and compare across vendors. This represents the gold standard for benchmark credibility.

Pure’s participation demonstrates confidence in their technology and willingness to submit to independent verification. When Pure claims specific STAC-M3 results, those claims can be validated against published audit reports. This practice should be the industry norm.

Where Verification Falls Short

Not all Pure claims receive the same verification rigor.

“30% better than competitors”: The FlashBlade//S R2 announcement claims performance “up to 30% greater than competitors across critical workloads like genome sequencing, inference, and electronic design automation simulations” [1]. This claim raises several questions:

Which competitors? Dell PowerScale? NetApp ONTAP? Weka? VAST? Each competitive positioning tells a different story. A 30% advantage over one competitor’s previous-generation product differs from 30% over current-generation across the field.

What configurations were compared? Storage benchmarks depend heavily on configuration. A fully-populated, high-performance configuration compared against a competitor’s entry-level setup produces misleading percentages.

What methodology? Unlike STAC-M3 results, this competitive claim appears without published test parameters. The workloads mentioned (genomics, inference, EDA) each have different performance characteristics. “Up to 30%” suggests cherry-picked best-case results rather than consistent advantages.
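
To make the “up to” qualifier concrete, here is a minimal illustrative sketch (hypothetical ratios, not Pure’s or any competitor’s data) showing how a single best-case workload can yield a 30% headline while the aggregate advantage across workloads is much smaller:

```python
# Illustrative only: hypothetical per-workload speedup ratios, not measured data.
# Shows how an "up to 30%" headline can come from one best-case workload.
from statistics import geometric_mean

# Vendor-vs-competitor throughput ratios (1.0 = parity); values are made up.
speedups = {
    "genome_sequencing": 1.30,  # single best case -> the "up to 30%" headline
    "inference": 1.08,
    "eda_simulation": 0.97,     # slightly slower on this workload
}

headline = max(speedups.values()) - 1.0               # what the marketing quotes
aggregate = geometric_mean(speedups.values()) - 1.0   # a more representative figure

print(f"Headline: up to {headline:.0%} faster")
print(f"Geometric mean across workloads: {aggregate:.1%} faster")
```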

“10 TB/s from early testing”: The “in early testing” qualifier on FlashBlade//EXA’s claimed 10+ TB/s read performance explicitly acknowledges that this is preliminary data, not a verified benchmark. Early-testing results frequently improve or degrade as products mature, and publishing them in marketing before the product is available sets expectations that shipping performance may not match.

The honest disclosure that this is “early testing” is appreciated—it’s more transparent than presenting preliminary numbers as final specifications. But preliminary numbers shouldn’t appear in marketing materials until verified.
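
For scale, a rough back-of-envelope check (assuming, purely for illustration, 400 GbE client links at line rate; Pure has not published the test topology) shows what delivering 10 TB/s to clients implies:

```python
# Back-of-envelope scale check for the "10+ TB/s in a single namespace" claim.
# Assumptions are illustrative; the actual client count and fabric are unpublished.
claimed_tb_per_s = 10          # claimed aggregate read throughput, TB/s (decimal)
link_gbit_per_s = 400          # assume 400 GbE client links at line rate

aggregate_gbit_per_s = claimed_tb_per_s * 1000 * 8    # TB/s -> Gb/s
links_at_line_rate = aggregate_gbit_per_s / link_gbit_per_s

print(f"{aggregate_gbit_per_s:,.0f} Gb/s ≈ {links_at_line_rate:.0f} saturated 400 GbE links")
# Roughly 200 links: an aggregate across many clients, which says little about
# what any single host or modest cluster would observe.
```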

The FlashArray//ST Numbers

FlashArray//ST’s “over 10 million IOPS per five rack units” claim falls in a middle ground [1]. The number is specific enough to evaluate, but the announcement lacks methodology detail.

Ten million IOPS across five rack units works out to approximately 2 million IOPS per rack unit—achievable with modern NVMe arrays under favorable conditions. The question is: what IO size? What read/write ratio? What queue depth?

At 4KB random reads with sufficient queue depth, modern NVMe arrays commonly achieve millions of IOPS. At 4KB random writes with synchronous commits, the same hardware might deliver 10-20% of that figure. The “10 million IOPS” claim without IO characterization lacks the specificity needed for meaningful comparison.
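
A quick conversion shows why the missing IO size matters; the sizes below are illustrative, not disclosed by Pure:

```python
# Convert "10 million IOPS in 5 RU" into bandwidth at a few candidate IO sizes.
# The announcement does not state the IO size, so these values are illustrative.
iops_total = 10_000_000
rack_units = 5

print(f"Per rack unit: {iops_total // rack_units:,} IOPS")
for io_size_kib in (4, 8, 32):
    gib_per_s = iops_total * io_size_kib / (1024 * 1024)   # KiB/s -> GiB/s
    print(f"{io_size_kib:>2} KiB IOs -> {gib_per_s:,.1f} GiB/s aggregate")
```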

Pure could strengthen this claim by publishing methodology: IO size distribution, read/write ratio, queue depth, consistency model, and test duration. Until then, the number provides marketing impact without engineering utility.
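
As one illustration of what such a disclosure could look like, the sketch below assembles an fio invocation that makes each of those parameters explicit; the values are placeholders, not Pure’s methodology.

```python
# Example of an explicit benchmark parameter set, expressed as an fio command.
# Placeholder values for illustration; not Pure's published methodology.
params = {
    "name": "mixed-smallblock",
    "ioengine": "libaio",
    "direct": 1,          # bypass the page cache
    "rw": "randrw",       # random mixed workload
    "rwmixread": 70,      # 70% reads / 30% writes
    "bs": "4k",           # IO size
    "iodepth": 32,        # queue depth per job
    "numjobs": 16,
    "runtime": 600,       # test duration, seconds
    "time_based": 1,
}

command = "fio " + " ".join(f"--{key}={value}" for key, value in params.items())
print(command)
```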

The Credit Due

Pure Storage operates with more transparency than many competitors. STAC-M3 participation demonstrates willingness to face independent scrutiny. Their pricing model (Evergreen subscription with hardware refresh) provides cost predictability that customers value. Customer retention and Net Promoter Scores suggest genuine satisfaction beyond marketing hype.

The technology is real. FlashBlade serves demanding workloads in production at major enterprises. FlashArray maintains strong market position in enterprise block storage. Pure’s engineering delivers functional products that customers deploy successfully.

The criticism here focuses specifically on benchmark claims that lack the verification rigor Pure applies to STAC-M3 submissions. A vendor that submits to audited benchmarks demonstrates understanding of what verification means—which makes unverified competitive claims more conspicuous.

What Verification Would Complete

Pure could extend their STAC-M3 credibility to other claims:

SPECstorage submissions would provide independently audited performance data for file and object workloads, complementing STAC-M3’s financial services focus.

Published competitive methodology would transform “30% better than competitors” from marketing into engineering data. Specify the competitors, configurations, and test parameters, and publish the raw results.

Verified FlashBlade//EXA numbers, once the product ships, would replace “early testing” claims with audited benchmark submissions.

These additions would align Pure’s broader marketing with the verification standards they’ve established through STAC-M3 participation.

The Industry Standard Question

Storage vendors face competitive pressure to publish impressive numbers. Pure’s approach—mixing audited benchmarks with unverified marketing claims—is common industry practice. The question is whether the industry standard is appropriate.

Weka publishes SPECstorage results with full disclosure [3]. MinIO provides open-source benchmark tools for independent validation. These practices enable the technical verification that differentiates engineering from marketing.

Pure has demonstrated capability for this verification through STAC-M3. Extending that rigor to all performance claims would establish Pure as a benchmark transparency leader rather than an average participant in industry marketing conventions.

For Evaluators

Organizations considering Pure Storage should:

Reference STAC-M3 results for workloads similar to quantitative trading—the audited data provides reliable performance expectations.

Request methodology for competitive claims like “30% better than competitors.” Ask which competitors, what configurations, and what test parameters produced that number.

Wait for verified data on FlashBlade//EXA rather than relying on “early testing” figures. Product announcements and shipping performance frequently diverge.

Conduct proof-of-concept testing with actual workloads. Pure’s technology is mature enough that real-world testing provides more relevant data than any benchmark.

Pure’s willingness to participate in audited benchmarks suggests confidence in their technology. Customers benefit by holding Pure to the verification standard they’ve established for themselves.

The Bottom Line

Pure Storage demonstrates that verified benchmarks are achievable—their STAC-M3 participation proves it. This makes their unverified marketing claims—“30% better than competitors,” “10 TB/s from early testing”—more notable.

The technology is real. The products work. Customers deploy Pure successfully in demanding environments. The question isn’t whether Pure delivers value but whether their marketing claims meet the verification standard they’ve shown capable of achieving.

Pure could be an industry leader in benchmark transparency. They’ve demonstrated the capability through STAC-M3. Extending that rigor to all performance claims would serve customers better than mixing audited results with marketing numbers.

When a vendor proves they can submit to independent verification, unverified claims become a choice rather than a limitation.


References

[1] Pure Storage, “Pure Storage Unveils Next-Generation Storage Products to Deliver Performance at Any Scale,” June 2025. https://www.purestorage.com/company/newsroom/press-releases/pure-storage-unveils-next-gen-storage-products.html

[2] Pure Storage Blog, “Pure Storage STAC-M3 Benchmark Testing Results for High-Performance and Quant Trading.” https://blog.purestorage.com/news-events/pure-storage-stac-m3-benchmark-testing-results-quantitative-trading/

[3] SPEC, “SPECstorage Solution 2020 Results.” https://www.spec.org/storage2020/results/


StorageMath recognizes vendors who participate in audited benchmarks. Pure Storage’s STAC-M3 submissions demonstrate verification capability. We encourage extending that standard to all performance claims—customers deserve consistent transparency.