First Advisor

Fei Xie

Term of Graduation

Winter 2021

Date of Publication

1-28-2021

Document Type

Dissertation

Degree Name

Doctor of Philosophy (Ph.D.) in Computer Science

Department

Computer Science

Language

English

Physical Description

1 online resource (xii, 119 pages)

Abstract

Modern system design involves integrating all components of a system on a single chip, known as a System-on-a-Chip (SoC). The ever-increasing complexity of SoCs and rapidly shrinking time-to-market have pushed design abstraction to the electronic system level (ESL) in order to increase design productivity. SystemC is a widely used ESL modeling language that plays a central role in the modern SoC design process. ESL SystemC designs usually serve as executable specifications for the subsequent SoC design flow. Therefore, undetected bugs in ESL SystemC designs may propagate to low-level implementations or even final silicon products. In addition, modern SoC design often involves intellectual properties supplied by outsourced design services and untrusted third-party vendors, as well as intensive use of electronic design automation tools provided by different vendors. Given this situation, modern SoCs are vulnerable to malicious implants such as hardware Trojans. Bugs and Trojans in silicon products can be extremely expensive and dangerous, especially in safety-critical systems. It is therefore critical to detect bugs and Trojans as early as possible in the SoC design process. However, this is a challenging task for SystemC designs due to their object-oriented features and inherent concurrency, as well as the stealthy nature of hardware Trojans.

We propose a framework for validating SystemC designs with automated test generation. We first develop an approach for generating high-quality test cases for SystemC designs using symbolic execution. To improve scalability, we further propose a test generation approach for SystemC designs based on binary-level concolic testing techniques. To evaluate the quality of the generated test cases, we adopt code coverage and assertion-based verification techniques. We further extend our test generation framework to detect hardware Trojans in behavioral SystemC designs.

In addition, we develop a comprehensive suite of benchmark designs for SystemC verification and validation. SystemC verification has been studied for around two decades; however, different verification approaches have so far been evaluated on different sets of SystemC designs, some of which have not been updated to conform to the latest SystemC Standard. The lack of common benchmarks makes it difficult to compare the performance of various approaches. Our benchmark suite covers many application domains and SystemC core features, and conforms to the latest SystemC Standard.

To evaluate the efficiency, effectiveness, and scalability of our test generation framework, we have applied it to the benchmark suite that we developed. Our experimental results demonstrate that the test cases generated by our approaches achieve high code coverage and detect design errors effectively. In our experiments, our framework detected two severe errors: one functional error and one out-of-bounds access. We have also applied our hardware Trojan detection approach to an open-source SystemC benchmark containing various hardware Trojans, and it detected those Trojans effectively and efficiently. Extensive experiments with our framework show that it scales to designs of practical size.

Rights

© 2021 Bin Lin

In Copyright. URI: http://rightsstatements.org/vocab/InC/1.0/ This Item is protected by copyright and/or related rights. You are free to use this Item in any way that is permitted by the copyright and related rights legislation that applies to your use. For other uses you need to obtain permission from the rights-holder(s).

Persistent Identifier

https://archives.pdx.edu/ds/psu/35137
