NNSmith ASPLOS’23 Artifact!


Getting Started!

Prerequisites

  1. OS: a Linux system with Docker support;

  2. Hardware: an x86 CPU, 16GB RAM, 512GB storage, and a good network connection to GitHub and Docker Hub.

Warning: super-multi-core & SLURM-based test-beds are not recommended

We observed performance issues when running NNSmith-ONNXRuntime on a SLURM-managed cluster and on a 64-core workstation, so users may want to avoid such setups. If they are unavoidable, set export NNSMITH_CORE=1 (inside the container) to stabilize the performance.

We are not sure of the exact cause, but it could be that ONNXRuntime uses all CPU cores by default, causing over-threading.
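
For example (a minimal sketch; NNSMITH_CORE is the environment variable mentioned above, and the echo is only a sanity check):

# Inside the container: limit NNSmith to a single core to avoid over-threading.
export NNSMITH_CORE=1
echo $NNSMITH_CORE   # should print 1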

Before you start, please make sure you have Docker installed.

docker --version # Test docker availability
# Docker version 20.10.12, build e91ed5707e

Otherwise, please follow the Docker installation page.
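
For reference, one common way to install Docker on Ubuntu-like systems is Docker's convenience script (a sketch only; the official installation page covers other distributions and options):

# Download and run Docker's convenience installer (inspect the script first if you prefer).
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
# Optional: allow running docker without sudo (log out and back in afterwards).
sudo usermod -aG docker $USER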

Use tmux to run long experiments in the background

The experiments can take about one day, so it is recommended to start them inside a tmux session, detach it to keep the job running in the background, and come back when the experiments are finished.

Create a tmux session.

tmux new -s nnsmith-artifact   # create a tmux session.

To leave the job running in the background:

  • Ctrl + b

  • d

To resume the session:

tmux at -t nnsmith-artifact
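
If you forget the session name, you can list existing sessions first (tmux attach is the long form of tmux at):

tmux ls                          # list existing tmux sessions
tmux attach -t nnsmith-artifact  # equivalent to `tmux at -t nnsmith-artifact`
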
Install/Import the image!

# Option A: pull the pre-built image from Docker Hub
docker pull ganler/nnsmith-asplos23-ae

# Option B: import the image from the release tarball
tar xf NNSmith-ASPLOS23-Artifact.tar.gz
export NNSMITH_DOCKER=ganler/nnsmith-asplos23-ae:latest
cat nnsmith-ae.tar | docker import - $NNSMITH_DOCKER
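
Either way, you can sanity-check that Docker now sees the image (assuming the default latest tag):

docker images ganler/nnsmith-asplos23-ae   # the image should be listed here
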
Download pre-generated LEMON models

Evaluating the LEMON baseline in NNSmith’s setting is complicated (why?). For reviewers’ convenience, the LEMON models (55.7GB) are pre-generated/converted. Nevertheless, you can refer to Generate LEMON models from scratch to re-generate them.

You can download the pre-generated models from OneDrive[1] and then use the script below to extract the files.

md5sum lemon-onnx.tar    # check data integrity
# ab7b8416a841ef8ba9bb09acc3dd6a21  lemon-onnx.tar
tar xvf lemon-onnx.tar   # About 2~4 minutes
# Models are stored in ./lemon-onnx
# rm lemon-onnx.tar      # No longer needed
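
As a quick sanity check (the size is approximate, based on the 55.7GB figure above):

du -sh ./lemon-onnx      # should be roughly 56G
ls ./lemon-onnx | head   # peek at a few of the extracted ONNX models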

For ASPLOS’23 AE reviewers

Please contact us via HotCRP if you have trouble getting the LEMON models; we can open test-bed access with the environment pre-installed for you.

Kick the tire!
# Run docker image
docker run -it --name ${USER}-nnsmith -v $(realpath ./lemon-onnx):/artifact/data/lemon-onnx ganler/nnsmith-asplos23-ae
# By using this command, you will "get into" the image like entering a virtual machine.
# The session will be kept under the name "${USER}-nnsmith"

# Inside the image;
cd /artifact
git remote set-url origin https://github.com/ganler/nnsmith-asplos-artifact.git
git pull origin master
source env.sh     # Use a virtual environment
bash kick_tire.sh # 40 seconds
# Runs NNSmith fuzzing for 20 seconds on each of TVM and ONNXRuntime.

If you can see /artifact/nnsmith/kk-tire-ort and /artifact/nnsmith/kk-tire-tvm (the bug report folders), congratulations on successfully running the artifact!
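
A quick way to confirm both folders exist (paths taken from the message above):

ls -d /artifact/nnsmith/kk-tire-ort /artifact/nnsmith/kk-tire-tvm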

Next step

Please go to Evaluating artifact for detailed evaluation steps.