

Hardware and Software
Verification


ELEC97106/97107/70056

Autumn term 2022

Visiting Prof.

Hardware Verification –


(with acknowledgement to Dr Chris Brown and other colleagues at Arm)

Something about me

• – one of the founders of Arm (spin-off from Acorn in 1990)
• Two hats:

• Director of Functional Safety in CPU group at Arm in Cambridge
• Visiting Professor in Dependable Embedded Computing in E&EE (part-time)

• Previously I was a Royal Academy of Engineering VP

• When I come to the department, you may find me in room 901
• But you will most likely have to rely on Teams or email to contact me

• Contact details:
• (also )

Lecture schedule for Hardware Verification

• Week 1 – shared lecture (today) – Introduction to Hardware Verification
• Week 2 – Thursday 20th October – Introduction to SystemVerilog

• Exercise to compile and run simple testbench using Questasim
• Introduce main coursework

• Week 3 – Thursday 27th October – SystemVerilog continued and SystemVerilog assertions (SVA)

• Exercise – build and run simulation of Arm-based SOC; add assertions
• Week 4 – Thursday 3rd November – Formal Verification

• Exercise to prove assertions and find bugs using JasperGold
• Week 5 – Thursday 10th November – Code and functional coverage

• Continue with main coursework
• (Weeks 6-9 – Software Verification)
• Week 10 – Thursday 15th December – Wrap-up (shared lecture)

• Mondays at 9am in Room 909B
• GTAs will be there to answer your questions

• Jianyi Chang – H/W Verification
• – S/W Verification
• – S/W Verification

• I won’t always be able to attend these but will monitor Teams for any messages and can join a Teams meeting if needed

Objectives of Hardware Verification module

• Appreciate the challenges in verifying complex components and systems
• Give you an appreciation of hardware verification methodologies
• Learn the basics of SystemVerilog for verification
• Assertion-based verification using SystemVerilog Assertions (SVA)
• Introduction to Formal Verification and proving properties
• Understand the importance of metrics: Code and Functional Coverage
• Practical experience with writing a test plan, creating a testbench, writing a checker, measuring coverage, using Mentor QuestaSim
• Practical experience with a Formal Verification (FV) tool (JasperGold)

Recommended reading

• SystemVerilog For Verification 2nd Edition, Spear, Springer https://www.dawsonera.com/abstract/9780387765303
• 3rd edition by Tumbush is available as a printed textbook

• SystemVerilog Language Reference Manual (LRM) –
https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8299595

• Formal Verification – An Essential Toolkit for Modern VLSI Design
Seligman, Schubert and Kumar, https://doi.org/10.1016/C2013-0-18672-2

• SystemVerilog Assertions (SVA)
• Mentor QuestaSim – Help pages within tool
• JasperGold – Help pages within tool

An Introduction to Hardware Verification

• Verification Challenges
• Bugs and Errata
• Multi-layered verification strategy
• Verification Techniques

• Testbenches and testplans
• Stimulus; Coverage; Checking
• Formal Verification
• SystemVerilog and SVA

• Coursework

What is Verification?

• Verification is the process of ensuring that a design meets the specification

• In other words: Does this product do what was intended?

• Alternatively:
• Quantifying the risk that a design deviates from its functional specification

• “How sure are you that there are no bugs left?”

What is Verification?

If it’s not been tested, it’s broken

The Verification Challenge

• The demand for high technology products is
growing, requiring constant innovation

• As the complexity of products grows, so does
the challenge of verification

• Even as design complexity goes up, there is pressure to release new products more frequently

• Verification can take the majority of the time
on a large project

(Example applications: autonomous driving, robotic surgery)

Why is Verification a critical activity?

• Complex systems can fail
• The aim is to reduce the risk of system failure due to a systematic fault by ensuring complete and comprehensive system verification

Credit: La Cara Salma [CC BY-SA 3.0 (https://creativecommons.org/licenses/by-sa/3.0)]

The challenge of verifying a CPU

• Verification of a complex CPU is highly challenging
• Verify that the design conforms to the Architecture
  • ISA, memory system architecture, exceptions, debug architecture, …
• Verification of the microarchitecture
  • Pipelining, register hazarding, branch prediction, cache behaviour, …
• Verification effort typically exceeds design effort
  • Many man-years

(Diagram: CPU block diagram showing TLB, Data Cache, BIU, Data Buffer, Instruction fetch, Integer Unit and ETM)

Requirements Traceability

• A functional safety case requires evidence of meeting requirements
• The traceability chain links the specifications – Product Specification, Engineering Specification, Unit Specification, Testbench Specification and Testplan – down to the Implementation, Verification Tests and Coverage Results

Verification Challenges = State Space Explosion

• State space = number of unique states the logic can get into
• For instance, say the number of control flip-flops in a CPU = 1000
  • Reachable state space = 2^1000 ≈ 10^301
• Number of atoms in the observable universe ≈ 2^266
• Number of seconds since the Big Bang ≈ 5×10^17
• The realistic reachable state space is actually much bigger
• How can we meet the verification challenge in the finite time allocated to a project and still meet the quality goals?

Bugs and Errata

• A bug is a defect in a design that causes it to behave incorrectly according to the specification, and that is found before a product is released
• An erratum (plural errata) is a correction to something that has already been released
  • One well-publicized example is Intel’s “FDIV” erratum in early Pentium processors
• Verification engineers aim to find the bugs BEFORE they become errata
• Errata cost a lot more to fix than bugs!

Verification Engineer’s Toolkit:
A multi-layered approach

• Unit Level Testbenches
• Multi-Unit Level Testbenches
• System Level Testbenches
• Constrained Random Testing
• Functional Verification
• Architecture Compliance Tests
• Soak Testing
• Protocol Checkers
• Peer Review
• Silicon Test Chips

Verification Techniques

• Bottom-Up Verification – Unit and Multi-Unit Testing
• What is a Testbench?
• What is a Testplan?
• Overview of industry-standard verification techniques

• Stimulus
• Coverage
• Checking

• Formal verification

What is Dynamic Verification?

• Put the HDL Design Under Test (DUT) in a simulation environment (testbench)
• Drive stimulus onto the inputs of the DUT
• Model the expected outputs from the same inputs
• A checker compares the outputs of the DUT against the model
• Stimulus can be:
  • Directed tests
  • Constrained random testing
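The loop above can be sketched as a minimal testbench. The combinational 8-bit adder DUT `adder` and all signal names below are illustrative assumptions, not part of the course material:

```systemverilog
module tb_adder;
  logic [7:0] a, b;
  logic [8:0] sum;

  // Design Under Test placed in the simulation environment
  adder dut (.a(a), .b(b), .sum(sum));

  initial begin
    repeat (100) begin
      // STIMULUS: drive values onto the DUT inputs
      a = $urandom;
      b = $urandom;
      #1;
      // CHECKING: compare the DUT output against a reference model
      if (sum !== a + b)
        $error("Mismatch: %0d + %0d gave %0d", a, b, sum);
    end
    $finish;
  end
endmodule
```

The reference model here is trivially `a + b`; for a real DUT the model is usually a separate behavioural implementation of the specification.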

Finding Bugs Efficiently: Bottom-up Verification
Verification is a multi-layered problem

• A variety of techniques are required to exhaustively verify designs
• Use a bottom-up testing methodology, starting with unit level testbenches
  • Designs are partitioned based on clean interfaces to allow thorough testing of sections of the design in isolation (Unit level)
  • Sensible subsets of the design are then tested in conjunction to stress key interactions (Multi-Unit level)
  • The full IP is put through stress (Top level) before moving to a larger representative system containing the key IPs such as CPUs, GPUs and Interconnects (System level)
• The aim is to find all bugs at the lowest level before moving up to the next level of abstraction

Bottom-Up Verification Methodology
Exhaust finding bugs at the lowest level before moving up a level of abstraction

• Using a CPU Bus Interface Unit (BIU) as an example, the following slides illustrate the levels of verification carried out
• The BIU is tested in a standalone testbench with Bus Functional Models (BFMs) to mimic the interactions with the rest of the DUT
  • This provides the quickest path to initial bring-up and basic operation bugs
• Once this technique saturates, we move to multi-unit testing

(Diagram: unit level – the BIU tested standalone within a CPU core alongside the TLB, Data Cache, Data Buffer, Instruction Fetch, Data Processing, Debug APB I/F and ETM blocks)

(Diagram: multi-unit level – two CPU cores combined with the L2 and SCU in a CPU MP2 Cluster, presenting an AMBA 4 ACE or AMBA 5 CHI interface)

(Diagram: system level – CPU-A and CPU-B MP2 Clusters and a GPU connected via an Interconnect)

What is a Simulator?
• Logic simulation is the use of simulation software to predict the behaviour of digital circuits described in hardware description languages

• Simulation can be performed at varying degrees of physical abstraction, such as at the transistor level, gate level, register-transfer level (RTL), electronic system-level (ESL), or behavioural level

Source: Wikipedia

• Simulators normally provide a waveform viewer and debug capability

• RTL simulation is often the largest element of a verification strategy


What is a Testplan?
• The document outlining how the DUT will be checked for compliance to the specification
• Typical Contents:

• Overview of the testbench architecture
• Specific tests required to exercise areas of the specification
• Details of coverage to be collected to ensure specification is met

• Sometimes split into two documents
• Testbench Specification
• Coverage Plan

What is a Testbench?
• The collective name for the code written to ensure a DUT is compliant to the specification
• Contains:

• STIMULUS: Some way of driving values into the DUT
• CHECKING: Some way of checking the DUT has behaved correctly
• COVERAGE: To measure the success of the stimulus

• We can think of this as the three sides of a triangle
• We will now consider each of these in turn

The Key to Successful Verification

(Diagram: the verification triangle – Stimulus, Checking and Coverage surrounding our Design Under Test (DUT))

Stimulus: Directed Testing

• A set of tests per feature
• Disadvantages:
  • If a test is not written to stimulate an area, bugs will remain
  • Writing/maintaining a large directed test set is manually intensive
  • Often tests cover similar parts of the state space, giving poor return on simulation time
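A directed test might look like the sketch below: one hand-written scenario targeting one feature. The 32-bit adder signals and the corner case chosen are illustrative assumptions:

```systemverilog
// Directed test: force a full carry ripple through a 32-bit adder
// (signal names `a`, `b`, `sum` are assumed for illustration)
task automatic directed_full_carry_test;
  a = 32'hFFFF_FFFF;        // all ones
  b = 32'h0000_0001;        // +1 ripples a carry through every bit
  #1;
  if (sum !== 33'h1_0000_0000)
    $error("full carry ripple failed: sum = %h", sum);
endtask
```

Every such scenario has to be thought of, written and maintained by hand, which is why directed testing alone scales poorly.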

Stimulus: Random Testing

• Random stimulus
• Disadvantage: no focus to the testing; again, many bugs go undetected
  • Think of a 32-bit adder with purely random inputs: what are the chances of hitting interesting cases such as all zeros or a full carry ripple?

Stimulus: Constrained Random Testing

• Random stimulus with constraints to focus the testing into areas of interest
• Disadvantage: we believe this should be better than directed-only or random-only testing, but we have no way of knowing
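In SystemVerilog, constraints are attached to random class members. The sketch below biases stimulus for the 32-bit adder example towards corner values that pure randomness would almost never hit; the class and constraint names are illustrative:

```systemverilog
class adder_txn;
  rand bit [31:0] a, b;

  // Weighted distribution: corner cases get 10% of the stimulus
  constraint corners {
    a dist { 32'h0000_0000       := 5,   // all zeros
             32'hFFFF_FFFF       := 5,   // all ones
             [1 : 32'hFFFF_FFFE] :/ 90 };
  }
endclass

// Usage sketch:
//   adder_txn txn = new();
//   assert (txn.randomize());
```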

Stimulus: The Hybrid Approach:
Coverage-Driven Constrained Random Testing + Directed Tests

(Diagram: constrained random stimulus supplemented by directed tests, e.g. Directed Test C1, to reach specific corners)


So What is Coverage?

• CODE COVERAGE
  • Automatically collected by the simulator for “free”
  • Collects information about the lines of RTL:
    • Lines hit
    • Branches hit
    • Expressions hit
    • Signal toggles hit

• FUNCTIONAL COVERAGE
  • Automatically collected by the simulator based on user input
  • Collects information about the user-defined scenarios or requirements
  • Shows the success (or failure) of constrained random (or directed) stimulus
  • Allows linkage of requirements that span multiple features

• STATISTICAL COVERAGE
  • Collects engineer-defined statistics within a simulation and across a group of simulations
  • Enables a greater understanding of what is really happening within the simulations
  • Enables fine-tuning of constraints to improve stimulus
  • Functional/code coverage may show that we have executed all instructions, but might not show that we have executed one instruction a billion times and every other instruction only 100 times

Code Coverage

always @* begin                        // Statement
  if (a == b) begin                    // Branch (if)
    z = x + y;                         // Statement
  end
  else if ((a > b) || (a > c)) begin   // Branch (else if): Expression with two Conditions
    z = x - y;                         // Statement
  end
  else begin                           // Branch (else)
    z = 0;                             // Statement (body assumed for illustration)
  end
end

Good initial metric of RTL code, but it does not tell us about the interactions/relationships between unrelated pieces of code.

Functional Coverage

• Functional Coverage is a user-written encapsulation of a requirement of the DUT
• The amalgamation of all these requirements gives us our overall functional coverage score
• The following example is a simple 6-instruction, 10-register CPU, after executing three instructions:

  ADD r0, r1, r2
  SUB r4, r5, r0
  MUL r7, r8, r9

• Instruction Coverage = 50% hit (hit: ADD, SUB, MUL; missed: AND, OR, NOT)
• Destination Coverage = 30% hit (hit: r0, r4, r7; missed: r1, r2, r3, r5, r6, r8, r9)
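The two requirements above might be captured as a SystemVerilog covergroup like the sketch below. The enum, signal and group names are illustrative assumptions:

```systemverilog
typedef enum logic [2:0] {ADD, SUB, MUL, AND, OR, NOT} opcode_t;

covergroup cpu_cov @(posedge clk);
  // Instruction coverage: one bin per opcode (6 bins)
  cp_opcode : coverpoint opcode;
  // Destination coverage: one bin per register r0..r9 (10 bins)
  cp_dest   : coverpoint dest_reg { bins regs[] = { [0:9] }; }
endgroup

cpu_cov cov = new();  // samples automatically on each clock edge
```

The simulator then reports which bins were hit, giving exactly the 50%/30% style scores shown above.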

Coverage: Are we done?

• “Once we get to 100% Code Coverage and 100% Functional Coverage, are we done?”
• No! Functional Coverage is only as good as the coverage points you define. There may be bugs in scenarios or relationships between features that have not been thought of and have not yet been stimulated by the constrained random stimulus
• “How do we solve this?”
• We can’t ever prove that we’re finished and there are no more bugs (see earlier on state space and complexity)
• We need to demonstrate a very high level of confidence in the robustness of the design
• Deciding that we have reached the required level of confidence is very hard…
• Metrics, metrics, metrics…

Metrics – Tracking the Bug Rate Curve

(Graph: rate of bugs found over the life of the project, tapering off as the 1st release approaches)


Checking: Test-based checking vs. monitoring

• Test-based checking: a directed test provides both the stimulus and its own self-checking
• Monitoring: the checking lives in the testbench, so directed tests provide stimulus only and the same monitor checks every test
• A complete testbench combines constrained random stimulus (with constraints), directed tests and testbench monitors

• Unit level testing uses powerful techniques such as:
  • Coverage-Driven Constrained Random Testing (Stimulus and Coverage)
  • Models to ensure that different stages or features of the DUT are checked appropriately (Checking)

• All three sides of the triangle are required to ensure we have tested our design as planned:
  • If the stimulus is insufficient, then bugs will remain; coverage will alert us to this
  • If coverage is not present, then we know that we have exercised the design to some extent, but we do not know how thoroughly
  • If checks are missing, then we may have stimulated and covered a feature, but the design may be behaving incorrectly

What is Formal Verification?

• “Formal” – applying automated mathematical techniques to reason about a design
• Static testing (simulation is dynamic)
  • No stimulus
• Reason about properties of a design
  • A property is a logical statement about the design
  • Either true or false
• Can give exhaustive proofs
  • The property is shown to be true for all possible cases
• Formal verification is an additional technique to increase confidence in the design
  • Complementary to the simulation-based approach
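The kind of property a formal tool can prove exhaustively looks like the sketch below: a request on a handshake interface is always granted within four cycles. The interface and all signal names are illustrative assumptions:

```systemverilog
property p_req_granted;
  @(posedge clk) disable iff (!rst_n)
    req |-> ##[1:4] grant;   // a request implies a grant in 1 to 4 cycles
endproperty

a_req_granted: assert property (p_req_granted);
```

In simulation this assertion checks only the cases the stimulus happens to reach; a formal tool either proves it for all possible input sequences or produces a counterexample trace.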

Coursework

• Some simple exercises in writing SystemVerilog
• Create SystemVerilog testbench and run in QuestaSim
• Write some SystemVerilog assertions and run these in simulation
• Run JasperGold formal verification tool and try to prove properties
• Main coursework for submission and assessment:

• Modify some supplied RTL code for a peripheral according to specification
• Write verification plan
• Create unit-level testbench to verify peripheral; apply constrained random stimulus
• Write properties and try to prove these using FV
• Compile and run the Cortex-M0 example SOC with the modified peripheral
• Write assembler code to run some top-level tests
• Write functional coverage points and measure code coverage and functional coverage

• Course materials will be found on the ‘Files’ tab in the Teams group

Example Arm-based SoC

(Diagram: an example Arm-based SoC – an Arm Cortex-M0 microprocessor with JTAG/Serial Wire debug, connected via the ARM AMBA 3 AHB-Lite system bus to low-latency peripherals, RAM, a management unit, and an AHB-to-APB bus bridge serving peripherals including UART, VGA, GPIO and Timer)

© 2017 Arm Limited

Coursework arrangements

• For the coursework you will work in pairs
• Each pair should email with your names
• Each pair will then be allocated a private channel in Teams
• You will only be assessed on the ‘Main Coursework’ for the H/W Verification part of the module, not the other exercises

License set-up for EDA tools

• For remote access, see: https://www.imperial.ac.uk/admin-services/ict/self-service/connect-communicate/remote-access/remotely-access-my-college-computer/remote-desktop-access-for-students/
• Get an account on the mill3 server
  • If you don’t have an account on mill3, give your name to the GTA
• Use nfshome for your working directory
• Add to your .bashrc file:
  • source /usr/local/mentor/QUESTA-CORE-PRIME_10.7c/settings.sh
  • source /usr/local/cadence/JASPER_2018.06.002/settings.sh
• Then use the command ‘vsim’ to run Mentor QuestaSim
  • First you need to run ‘vlib work’ to create a working library, then ‘vlog file1.v file2.sv’ to compile the code
• Use the command ‘jg’ to run JasperGold

Next lecture: SystemVerilog and SVA

• SystemVerilog is a superset of Verilog, with some differences
• We will cover some SV design constructs that are also useful for verification
  • But we will concentrate on the language features that are most useful for verification
• SystemVerilog Assertions (SVA) – a way to define properties about your design
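As a first taste of SVA ahead of next week, a concurrent assertion stating that a FIFO is never read while empty might look like this. The FIFO and its signal names are illustrative assumptions:

```systemverilog
a_no_underflow: assert property (
  @(posedge clk) disable iff (!rst_n)
    rd_en |-> !empty          // a read implies the FIFO is not empty
) else $error("FIFO read while empty");
```

The same property text can be checked dynamically in simulation or handed to a formal tool to be proved.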
