The world’s digitization has led the semiconductor market to take off. You’ve probably seen the headlines: Sales increased more than 20% to about $600 billion in 2021 and are anticipated to grow by 6% to 8% a year until 2030. With this growth comes challenges around the core processes involved in semiconductor chip manufacturing. One of these core processes is electronic design automation (EDA). 

When manufacturers face challenges, so do supply chains, which makes meeting these challenges critical. Chip shortages are forcing manufacturers to speed up EDA workloads to accelerate the chip design process. High-performance storage optimized for EDA workloads can dramatically accelerate chip design and ultimately enable faster delivery of new semiconductor products. 

Read on to learn more about what EDA is, why it’s important, the challenges EDA-based semiconductor chip manufacturers are facing, and the various ways they can address these challenges via modern data storage and data management. 

What Is EDA?

Electronic design automation (EDA), also known as electronic computer-aided design (ECAD), is a category of software, hardware, and services used for manufacturing, analyzing, and testing semiconductor chips. EDA tools make it much easier to design semiconductor chips, which often contain billions of components, a number that keeps growing as Moore's Law advances. 

Why Are EDA Solutions Important?

EDA tools play a critical role in semiconductor chip manufacturing for the following reasons:

  • They’re used to vet semiconductor manufacturing processes to ensure they deliver the required performance and density. This part of EDA is called technology computer-aided design (TCAD).
  • EDA solutions also verify that a design meets all manufacturing process requirements. Design deficiencies can lead to reliability risks, reduced capacity, and malfunctions. 
  • EDA tools are also used to monitor the post-manufacturing performance of semiconductor chips to ensure they continue to perform as expected throughout their lifetime. 

EDA Vendor Challenges

EDA tools and solutions rely heavily on data to function well. Like other data-intensive tools, EDA solutions face the challenges that come with ever-growing data volumes and faster data processing requirements. These issues translate into resource and cost pressures, which eventually lead to slow, failed, or scrapped initiatives. 

Specifically, EDA vendors face challenges around:

  • Time to market. Development cycles are speeding up, and many EDA vendors find themselves unable to keep up with competitors. 
  • Management complexity and the resulting loss of productivity. IT teams are resource-constrained yet must keep up with developer demands.
  • Data center costs from application infrastructures that still run on legacy storage.
  • Massive growth in compute resources during the design and manufacturing process, which has become a major source of investment. Sub-10nm chip design is now the standard. This process generates a massive amount of files during different phases of the workflow, and these files require high-performance and high-capacity data storage. 

Considering all of this, it’s no surprise that EDA vendors and manufacturers are struggling with their data-related costs and processes. 

Increasing design complexity, on-demand compute requirements, industry consolidation, and use of hybrid cloud for design workloads are forcing EDA vendors to rethink how they do things. 

How can they:

  • Scale more cores in their compute farms for incremental growth by enabling growth with headroom and non-disruptive upgrades?
  • Accelerate the software delivery pipeline by improving developer productivity and software quality?
  • Simplify storage management through better monitoring and reporting of EDA infrastructure endpoints?
  • Deliver designs faster with reduced job completion times by keeping the design pipeline flowing and optimizing for lower license costs?
  • Provide scalable and efficient performance by scaling storage and compute independently (disaggregated) and reducing data center footprints?
  • Mobilize data for hybrid cloud by enabling array- and host-based data mobility and building enterprise hybrid cloud solutions without compromises?

The answer: Pure. 

How Pure Storage Helps EDA Vendors 

Traditional storage architectures struggle to keep up with the I/O needs of today’s EDA workloads: they either saturate or scale poorly under high-concurrency demands. The high-performance computing (HPC) workloads that EDA tools run require massive compute resources to process jobs in a 24×7 queue across design and manufacturing. Accordingly, semiconductor organizations need to be able to provision infrastructure on demand to accelerate the chip design process and drive faster time to market.

Designed from the ground up for high-concurrency and high-performance environments, Pure Storage® FlashBlade//S™ is the ideal storage solution for EDA workloads. FlashBlade’s simplicity, performance, and scalability enable reduction of job run time—the primary challenge for EDA environments. 

Learn how Pure Storage FlashBlade//S Accelerates EDA Workloads

Specifically, FlashBlade//S gives EDA vendors:

1. Accelerated chip design cycles

Pure helps you optimize your chip design workflow for faster time to market and faster job completion time with minimum failures. It also helps you maximize compute density and increase concurrent job completion. 

2. Faster agile development 

With Pure, you get improved software developer productivity, greater business and operational benefits, and better infrastructure efficiency with integrated design tools.

3. Disaggregated compute and storage

Pure enables you to make the most of your compute and storage investments with a much higher IOPS-to-capacity ratio. Read performance is also significantly higher and more cost-effective than that of competitors relying on storage-class memory, which is limited in scalability and far more expensive.

4. Flexible hybrid cloud architecture

Pure provides powerful HPC capabilities for EDA with Microsoft Azure, delivering improved engineering productivity, scalable performance and capacity, cost efficiency through data reduction, faster recovery from failures, optimized EDA tool license costs, and data security and sovereignty with a connected hybrid cloud storage solution.

5. Lower cost and better efficiency

With Pure, you can lower costs by optimizing EDA software licensing and improve efficiency through automation, using REST APIs to provision storage and integrate it into EDA workflows.
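As a rough illustration of what REST-based provisioning automation can look like, the sketch below builds an HTTP request to create a scratch filesystem for an EDA job. The endpoint path (`/api/file-systems`), payload fields, and bearer-token header are illustrative assumptions for this example, not the actual FlashBlade API; a real integration would use the vendor's documented API or client library.

```python
# Hypothetical sketch of REST-based storage provisioning for an EDA
# workflow. Endpoint, payload fields, and auth scheme are assumptions
# for illustration only.
import json
import urllib.request


def build_provision_request(base_url: str, token: str,
                            name: str, size_bytes: int) -> urllib.request.Request:
    """Build (but do not send) a POST request that asks the array to
    create a filesystem of the given provisioned size."""
    payload = json.dumps({"name": name, "provisioned": size_bytes}).encode()
    return urllib.request.Request(
        url=f"{base_url}/api/file-systems",      # illustrative endpoint
        data=payload,
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",  # placeholder auth scheme
        },
    )


# Example: request a 10 TiB scratch area for a place-and-route job.
req = build_provision_request("https://array.example.com", "TOKEN",
                              "eda-scratch-01", 10 * 2**40)
print(req.get_method(), req.full_url)
```

A pipeline scheduler could call a helper like this before dispatching a job, then tear the filesystem down when the run completes, keeping capacity aligned with the active job queue.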

6. Integrations and observability

Pure gives you monitoring and reporting from a single pane for all endpoints, plus long-term retention of metrics data and connectivity to monitoring tools like Prometheus and Datadog. 
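For the Prometheus side of that connectivity, a scrape job pointed at a metrics exporter is typically all the wiring required. The exporter hostname and port below are hypothetical placeholders, not a documented endpoint:

```yaml
# prometheus.yml sketch: scrape storage metrics from a
# hypothetical exporter endpoint every minute.
scrape_configs:
  - job_name: "flashblade"
    scrape_interval: 60s
    static_configs:
      - targets: ["exporter.example.com:9491"]  # placeholder exporter address
```

Once scraped, the same metrics can feed dashboards and long-term retention alongside the rest of the EDA infrastructure's telemetry.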

Learn more about how FlashBlade can accelerate EDA workflows.

Proof point: Silicon Labs

Silicon Labs manufactures electronic chips that drive remote-controlled light bulbs, smart home systems, and industrial automation processes. Its EDA process for chip creation generates large numbers of small files and directories for each project. EDA workloads, which rely heavily on metadata, pose a throughput challenge for outdated storage systems. To address this challenge, Silicon Labs turned to Pure Storage FlashBlade® for high-performance data storage, not only for EDA but also for various other workloads.

Silicon Labs required storage solutions that could accelerate their design process, efficiently managing large volumes of unstructured data. Concurrently, they aimed to streamline their other workloads onto a unified system for enhanced efficiency and simplicity. With multiple locations, including some lacking in-house IT support, Silicon Labs needed a user-friendly solution from a vendor offering robust support.

Using Pure Storage, Silicon Labs was able to: 

  • Enhance innovation speed through high performance and reliability.
  • Streamline workloads on a single platform while boosting overall performance.
  • Achieve speeds up to 40 times faster for certain functions compared to legacy HPC storage.
  • Reduce data center operations footprint by 85%, simplifying operations.


And much more.