Simulation and emulation: The perfect match
Keywords: simulators, verification, EDA, register transfer level (RTL)
While pundits may predict the departure of simulation from the verification flow and the rise of hardware emulation, the two actually co-exist nicely... thank you very much.
In the 1980s, the logic simulator became mainstream, promoted by the Computer Aided Engineering (CAE) industry as the vehicle for design verification. At that time, all logic simulators operated at the gate level (by contrast, the analogue simulators of the day operated at the transistor level).
In the 1990s, simulation moved up the abstraction ladder to the register transfer level (RTL), supporting the two most popular hardware description languages (HDLs): Verilog and VHDL. Towards the end of that decade, the vendors, by then part of the EDA (Electronic Design Automation) industry that had grown out of the merger of CAE and CAD, supported both languages in the same tool. Today, the three major EDA players (Cadence, Mentor, and Synopsys) each offer their own HDL simulator, and each holds roughly one third of the market.
Accurate timing modelling, four logic states (0, 1, X, and Z), and multiple signal strengths give the HDL software simulator the breadth to verify both the functionality and the timing behaviour of any digital design in any industry segment... up to a point. Beyond roughly 100 million ASIC-equivalent gates, cache misses and memory swapping drag down execution speed. That size is not a hard limit; larger designs can still be simulated, but execution becomes unbearably slow.
To simulate one second of real data on a 100-million ASIC-gate design clocked at 500MHz in the real world, for example, a fast simulator running at 10Hz (ten design clock cycles per second) would take 50 million seconds, or almost 600 days for a large and complex design. No project schedule can absorb that, and one second of real data is more than hardware debugging requires anyway. Typical testbenches aimed at hardware debug under a software simulator therefore generate the equivalent of one millisecond or less of real time, which cuts execution to a day or less on a state-of-the-art PC configured with plenty of memory: a reasonable target.
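The arithmetic is easy to check. Here is a back-of-the-envelope sketch in Python, using only the figures quoted above (a 500MHz design clock and a 10Hz simulator); the helper name sim_days is ours, chosen for illustration.

    # Wall-clock time to simulate a given amount of real data,
    # using the article's figures: 500MHz clock, 10Hz simulator.
    CLOCK_HZ = 500e6      # design clock frequency
    SIM_SPEED_HZ = 10     # cycles per second a fast software simulator achieves

    def sim_days(real_time_s):
        """Days of wall-clock time to simulate real_time_s seconds of real data."""
        cycles = real_time_s * CLOCK_HZ
        return cycles / SIM_SPEED_HZ / 86_400   # 86,400 seconds per day

    print(f"1 second of real data:      {sim_days(1.0):,.0f} days")   # ~579 days
    print(f"1 millisecond of real data: {sim_days(1e-3):.2f} days")   # ~0.58 days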
To increase throughput, HDL software simulators can be run in parallel on PC farms, with each PC processing a self-contained testbench. In the example above, a farm of 1,000 PCs would process close to one billion cycles per day. That processing power is welcome, indeed necessary, for running the many small block tests in parallel that are typical of large regression test suites.
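The farm figure can be checked the same way; this sketch simply scales the single-simulator speed by the number of machines, assuming perfectly independent testbenches.

    # Aggregate throughput of a simulation farm: 1,000 PCs, each
    # simulating ~10 cycles/second on independent testbenches.
    SIM_SPEED_HZ = 10
    PCS = 1_000
    SECONDS_PER_DAY = 86_400

    cycles_per_day = SIM_SPEED_HZ * SECONDS_PER_DAY * PCS
    print(f"{cycles_per_day:,} cycles/day")   # 864,000,000: close to a billion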
Still, this approach is not suited to single monolithic tests that are sequential in nature and that require advancing the design to a "state of interest", such as the state reached after OS boot-up, before the real testing begins. That is to say, simulation farms are simply not adequate for executing embedded software. Processing embedded software requires executing several billion cycles in sequence, since a software program cannot be split into subsets and run in parallel; the task is inherently sequential. For this, hardware emulation is the perfect choice.
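To put rough numbers on that gap, the sketch below compares a sequential multi-billion-cycle run on a simulator and on an emulator. Both the 5-billion-cycle boot figure and the 1MHz emulation speed are illustrative assumptions, not figures from the article; real workloads and emulators vary widely.

    # Sequential workload: a hypothetical OS boot-up of 5 billion cycles.
    BOOT_CYCLES = 5e9        # ASSUMPTION: illustrative boot-up cycle count
    SIM_SPEED_HZ = 10        # software simulator speed, from the example above
    EMU_SPEED_HZ = 1e6       # ASSUMPTION: ~1MHz emulation speed, for illustration

    print(f"Simulator: {BOOT_CYCLES / SIM_SPEED_HZ / 86_400 / 365:,.1f} years")
    print(f"Emulator:  {BOOT_CYCLES / EMU_SPEED_HZ / 3_600:.1f} hours")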