Introduction
Coordinating simulation execution requires clearly defined phases. Without structured phase management, testbench components execute in unpredictable order, leading to race conditions and unreliable verification results. The three primary phases—Build, Run, and Wrap-up—provide a systematic framework for organizing testbench execution from initialization through final reporting.
Understanding these phases enables verification engineers to construct predictable, debuggable test environments. This article examines each phase in detail, breaking down their constituent steps and explaining how they coordinate testbench components with the design under test (DUT). Readers will gain practical knowledge for implementing phase-structured testbenches that produce consistent, repeatable results.
(toc) #title=(Table of Contents)
The Build Phase: Preparing the Environment
The Build phase establishes the foundational state required for test execution. This phase runs once at simulation start and never repeats during a single test iteration. Four sequential steps comprise the Build phase; the final two, DUT reset and DUT configuration, are closely coupled and are discussed together below.
Generate Configuration
Configuration randomization occurs before any testbench components are constructed. The configuration object determines how the DUT and surrounding environment will behave during the test. Typical configuration parameters include bus widths, protocol variants, timing parameters, and which verification components remain active.
For a memory controller testbench, configuration randomization might select between DDR3, DDR4, or LPDDR4 protocols, set burst length limits, and determine whether error correction coding is enabled. All these choices are fixed before environment construction begins.
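The memory controller example above can be sketched in Python-like pseudocode. The class, its fields, and the legal value sets are hypothetical illustrations, not any real library's API; the key point is that one randomization call fixes every choice before construction begins:

```python
import random

# Hypothetical configuration object for a memory controller testbench.
# Field names and value sets are illustrative only.
class MemCtrlConfig:
    PROTOCOLS = ("DDR3", "DDR4", "LPDDR4")

    def randomize(self, seed=None):
        rng = random.Random(seed)
        self.protocol = rng.choice(self.PROTOCOLS)
        self.max_burst_len = rng.choice((4, 8, 16))
        self.ecc_enabled = rng.random() < 0.5
        return self

# Randomize exactly once, before any testbench component is constructed.
cfg = MemCtrlConfig().randomize(seed=7)
```

Seeding the randomization makes a failing test reproducible: rerunning with the same seed regenerates the identical configuration.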
Build Environment
With configuration determined, the testbench allocates and connects its components. Testbench components—unlike RTL design components—exist solely within the verification environment. Examples include bus functional models (BFMs), monitors, scoreboards, and coverage collectors.
If the generated configuration specifies three active bus drivers on different interfaces, the Build Environment step creates and initializes exactly three driver instances. Connections between drivers, monitors, and the DUT ports are established during this step.
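A minimal sketch of configuration-driven construction, with hypothetical `Driver` and `Monitor` placeholder classes: the environment builds exactly as many component instances as the configuration names, no more and no fewer.

```python
class Driver:
    """Placeholder BFM driver bound to one interface (illustrative)."""
    def __init__(self, interface):
        self.interface = interface

class Monitor:
    """Placeholder passive monitor bound to one interface."""
    def __init__(self, interface):
        self.interface = interface

class Environment:
    """Constructs exactly the components the configuration calls for."""
    def __init__(self, active_interfaces):
        self.drivers = [Driver(i) for i in active_interfaces]
        self.monitors = [Monitor(i) for i in active_interfaces]

# Three active interfaces in the config yield exactly three driver instances.
env = Environment(["bus0", "bus1", "bus2"])
```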
Reset and Configure the DUT
After environment construction completes, the DUT receives a reset sequence. Reset brings all DUT state elements to known values. Following reset exit, the testbench applies the generated configuration values to the DUT. Configuration registers are written, mode bits are set, and the DUT enters the operational state defined by the configuration object.
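The reset-then-configure ordering can be illustrated with a toy DUT model (all names here are invented for the sketch): reset first forces every state element to a known value, and only then are configuration registers programmed.

```python
class ToyDut:
    """Minimal DUT stand-in: a register file plus a reset action."""
    def __init__(self):
        self.regs = {}

    def apply_reset(self, num_regs=4):
        # Reset drives every state element to a known value (zero here).
        self.regs = {addr: 0 for addr in range(num_regs)}

    def write_reg(self, addr, value):
        self.regs[addr] = value

def reset_and_configure(dut, cfg_writes):
    dut.apply_reset()
    # After reset exit, program the generated configuration into the DUT.
    for addr, value in cfg_writes:
        dut.write_reg(addr, value)
    return dut

dut = reset_and_configure(ToyDut(), [(0, 0x1), (2, 0xFF)])
```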
The Run Phase: Test Execution
The Run phase contains the active stimulus application and response checking. This phase typically consumes the majority of simulation time.
Start Environment
Testbench components begin operation. BFMs start monitoring interfaces, stimulus generators begin producing transactions, and scoreboards activate their checking logic. Starting the environment ensures that all verification infrastructure is ready before the test itself runs.
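A sketch of the start step, assuming each component exposes an explicit start action (a simplification of how real verification frameworks phase component startup): everything must be live before the first stimulus reaches the DUT.

```python
class Component:
    """Any testbench component with an explicit start step (illustrative)."""
    def __init__(self, name):
        self.name = name
        self.running = False

    def start(self):
        self.running = True

def start_environment(components):
    # Every BFM, monitor, and scoreboard must be running before
    # stimulus begins, so no early DUT activity escapes checking.
    for comp in components:
        comp.start()
    return all(comp.running for comp in components)

env = [Component(n) for n in ("bfm", "monitor", "scoreboard")]
ready = start_environment(env)
```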
Run the Test
The test executes and the simulation waits for completion. Directed tests have obvious completion points: the final stimulus item is sent, and the test explicitly terminates the simulation. Random tests present greater complexity, because there is no predetermined final transaction.
A robust approach uses the testbench layer hierarchy to determine completion. Starting from the top layer, the verification environment waits for each layer to drain all inputs received from the layer above. After draining completes, the current layer must become idle—no pending transactions, no scheduled events. This process repeats downward through each layer. Time-out checkers provide safety in case the DUT or testbench enters a locked state.
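The top-down draining scheme above can be sketched as follows. The `Layer` model and cycle-counting watchdog are simplified stand-ins (a real testbench would drain concurrent processes, not a queue), but the control flow matches the text: each layer empties its inputs and goes idle before the next layer down is checked, with a timeout guarding against lockup.

```python
from collections import deque

class Layer:
    """One testbench layer holding transactions received from above."""
    def __init__(self, pending):
        self.pending = deque(pending)

    def is_idle(self):
        return not self.pending

    def step(self):
        # One simulation "cycle": consume a single pending transaction.
        if self.pending:
            self.pending.popleft()

def wait_for_completion(layers, max_cycles=1000):
    """Drain top-down; the cycle budget acts as a lockup watchdog."""
    cycles = 0
    for layer in layers:            # ordered from top layer downward
        while not layer.is_idle():
            layer.step()
            cycles += 1
            if cycles > max_cycles:
                raise TimeoutError("DUT or testbench appears locked")
    return cycles

cycles_used = wait_for_completion([Layer([1, 2]), Layer([3]), Layer([])])
```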
The Wrap-up Phase: Final Collection and Reporting
The Wrap-up phase handles post-execution cleanup and results analysis. Two steps complete the testbench execution cycle.
Sweep
After the lowest testbench layer completes its draining process, transactions still propagating through the DUT must exit. The Sweep step waits for these final transactions to emerge from DUT outputs. This delay ensures the scoreboard receives all DUT responses before final checking begins.
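A toy pipelined DUT makes the Sweep step concrete (the 3-stage model is an invented illustration): after stimulus stops, the testbench keeps clocking idle cycles until every transaction that entered the DUT has emerged, or a cycle budget expires.

```python
from collections import deque

class PipelinedDut:
    """Toy 3-stage pipeline: an input emerges on the output 3 cycles later."""
    def __init__(self, depth=3):
        self.stages = deque([None] * depth)
        self.outputs = []

    def cycle(self, txn=None):
        done = self.stages.popleft()
        if done is not None:
            self.outputs.append(done)
        self.stages.append(txn)

def sweep(dut, sent_count, max_cycles=100):
    """Clock idle cycles until every sent transaction has emerged."""
    for _ in range(max_cycles):
        if len(dut.outputs) == sent_count:
            return True
        dut.cycle()              # idle input: nothing new enters the DUT
    return False                 # budget exhausted: transactions were lost

dut = PipelinedDut()
dut.cycle("txn_a")
dut.cycle("txn_b")
all_out = sweep(dut, sent_count=2)
```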
Report
With the DUT idle, the testbench examines all verification components for incomplete or lost data. Scoreboards may hold transactions that entered the DUT but never exited, possibly dropped due to DUT errors or protocol violations. Monitors may have partial transactions in progress.
The reporting step aggregates this information into a final pass/fail determination. Failed tests require special handling: functional coverage data should be deleted before saving results. Coverage from failed tests represents incomplete or invalid design states and would corrupt coverage closure analysis.
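The reporting logic can be sketched as a single aggregation function (argument names and the report format are invented for illustration). Note the failure path: coverage is discarded before anything is saved, exactly as the text prescribes.

```python
def final_report(scoreboard_orphans, monitor_partials, coverage_db):
    """Aggregate leftover state into pass/fail; drop coverage on failure."""
    passed = not scoreboard_orphans and not monitor_partials
    if not passed:
        # Coverage from a failed test reflects incomplete or invalid
        # design states and would corrupt closure analysis, so discard it.
        coverage_db.clear()
    return {"pass": passed,
            "lost_transactions": len(scoreboard_orphans),
            "partial_transactions": len(monitor_partials)}

coverage = {"bin_a": 3, "bin_b": 1}
result = final_report(scoreboard_orphans=["txn_9"],
                      monitor_partials=[],
                      coverage_db=coverage)
```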
Practical Implementation Overview
The following table summarizes the three phases and their verification objectives:
| Phase | Steps | Primary Objective |
|---|---|---|
| Build | Generate config, Build environment, Reset DUT, Configure DUT | Establish deterministic starting state |
| Run | Start environment, Execute test with completion detection | Apply stimulus and collect responses |
| Wrap-up | Sweep remaining transactions, Generate report | Complete checking and determine outcome |
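The table's fixed ordering can be expressed as a skeleton test runner. The `Testbench` class below is an illustrative stub whose steps merely record their names, so the one invariant the phase model guarantees, each step running exactly once and in order, is directly checkable:

```python
class Testbench:
    """Skeleton recording the fixed step order across all three phases."""
    def __init__(self):
        self.trace = []

    def _step(self, name):
        self.trace.append(name)

    # Build phase
    def generate_config(self):     self._step("generate_config")
    def build_environment(self):   self._step("build_environment")
    def reset_dut(self):           self._step("reset_dut")
    def configure_dut(self):       self._step("configure_dut")
    # Run phase
    def start_environment(self):   self._step("start_environment")
    def run_the_test(self):        self._step("run_the_test")
    # Wrap-up phase
    def sweep(self):               self._step("sweep")
    def report(self):              self._step("report")

def execute(tb):
    # Phases never interleave; each step runs exactly once, in order.
    for step in (tb.generate_config, tb.build_environment, tb.reset_dut,
                 tb.configure_dut, tb.start_environment, tb.run_the_test,
                 tb.sweep, tb.report):
        step()
    return tb.trace

trace = execute(Testbench())
```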
Common Implementation Challenges
Timeout checker configuration requires careful attention. Setting timeouts too short causes false failures on legitimate but slow operations. Setting timeouts too long wastes simulation cycles. A typical approach uses 10× the expected maximum transaction latency as a starting value, adjusted based on empirical results.
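A minimal cycle-counting watchdog illustrates the 10x heuristic (the class and its method names are invented for this sketch): the checker is restarted whenever progress is observed, and only fires after the full timeout window passes with no activity.

```python
class Watchdog:
    """Cycle-counting timeout checker; restart it on any observed progress."""
    def __init__(self, max_expected_latency, margin=10):
        # 10x heuristic: generous enough to tolerate legitimately slow
        # operations, tight enough not to waste cycles when truly hung.
        self.limit = margin * max_expected_latency
        self.count = 0

    def kick(self):
        # Call on every observed transaction or state change.
        self.count = 0

    def tick(self):
        # Call once per simulation cycle.
        self.count += 1
        if self.count > self.limit:
            raise TimeoutError("no progress within timeout window")

wd = Watchdog(max_expected_latency=50)   # limit = 500 cycles
```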
Layer draining order must follow the data flow direction. Draining from top to bottom ensures that higher-layer test intent propagates completely through the verification stack before lower layers declare idleness.
Conclusion
The Build-Run-Wrap-up phase model provides a proven framework for testbench execution coordination. The Build phase establishes configuration and constructs the environment. The Run phase executes stimulus while detecting natural completion points through layer draining. The Wrap-up phase sweeps residual transactions and produces final reports. Verification engineers implementing this phase structure achieve more predictable simulation behavior and simpler debug workflows.