Developer Blog

Networking

Building an NVIDIA Pure SONiC Image

Pure SONiC is the version of SONiC that eliminates vendor dependence. Because it is community-developed, publicly available, and 100% open source, you can build a Pure SONiC image that stays in sync with whichever community branch you choose. Every line of code in SONiC, and in the NVIDIA implementation of SAI (Switch Abstraction Interface), is just one click away on the master branch.

NVIDIA is committed to your success when you choose Pure SONiC. To ensure that Pure SONiC is hardened and qualified, NVIDIA recommends building the image from specific public hashes, which pin the Git repository to a known point in time. Each public hash communicated to users is verified on all NVIDIA platforms through extensive QA. Moreover, NVIDIA acknowledges the need for valuable documentation: release notes and user manuals are tied to a specific public hash.

Building a Pure SONiC image

Here’s how I built my Pure SONiC image, ZTP included, to run on my NVIDIA Mellanox Spectrum Open Ethernet switches. My solution was inspired by the Build SONiC Switch Images tutorial on GitHub. By default, ZTP is disabled in the repo’s build config file.
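For reference, the ZTP knob lives in the repo's ./rules/config file. In my checkout of the 201911 branch, the relevant line, once enabled, looks roughly like this (the exact comment wording may differ between branches, so check your own checkout):

```makefile
# ./rules/config (fragment)
# ENABLE_ZTP - enables the Zero Touch Provisioning feature in the image
ENABLE_ZTP = y
```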

Spectrum switches come preinstalled with ONIE (Open Network Install Environment), a boot loader that provides an environment for installing any network OS on a bare-metal switch system. ONIE allows end users to automate network OS installation as part of data center provisioning, much as bare-metal Linux servers are provisioned automatically.

My build server had a 24-core CPU, 250 GB of build storage, and 64 GB of RAM, running Ubuntu 16.04 with Docker 18.03.0-ce, Python, and jinja2. I found that my build configuration required at least 100 GB of free disk space; the final build directory consumed about 30 GB. If build time is business-critical, I recommend upgrading the CPU and RAM so that more cores can work in parallel and shorten the build.
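Before kicking off a multi-hour build, a quick sanity check of the machine can save a failed run. This is only a sketch: the 100-GB threshold comes from my experience above, and the script warns rather than aborts.

```shell
#!/bin/bash
# Report cores, RAM, and free disk space in the current directory,
# and warn when free space is below the ~100 GB this build needed.
REQUIRED_DISK_GB=100

CORES=$(nproc)
MEM_GB=$(awk '/MemTotal/ {printf "%d", $2 / 1024 / 1024}' /proc/meminfo)
DISK_GB=$(df --output=avail -BG . | tail -1 | tr -dc '0-9')

echo "CPU cores: ${CORES}, RAM: ${MEM_GB} GB, free disk: ${DISK_GB} GB"

if [ "${DISK_GB}" -lt "${REQUIRED_DISK_GB}" ]; then
    echo "warning: less than ${REQUIRED_DISK_GB} GB free; the build may fail" >&2
fi
```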

For automation and code reuse purposes, I separated the code into three short files:

  • build.cfg: Initializes common environment variables; sourced by the other files.
  • sonicgit.sh: Pulls the source from the public Git repository.
  • build.sh: Executes the build.

Running the scripts executes the build process.

Step 1: Create build.cfg

# An example of a hash that was qualified by NVIDIA
SONICBRANCH=201911
COMMITHASH="bea968b"
BLDBRANCH="${SONICBRANCH}"
BUILD_NUMBER="00005"
let BLDNUM="${BUILD_NUMBER}"  # evaluate as a number; strips the leading zeros
 
#ZTP is disabled by default per community decision. I found it useful to enable it
#in my build; more options are available in the file ./rules/config
ENABLE_ZTP="y"
 
SONIC_IMAGE_VERSION="SONIC.${SONICBRANCH}.${BLDNUM}-${COMMITHASH}_Internal"
# double quotes here so the variables expand when this file is sourced
SONIC_OVERRIDE_BUILD_VARS="
SONIC_IMAGE_VERSION=${SONIC_IMAGE_VERSION}
BUILD_NUMBER=${BLDNUM} ENABLE_ZTP=y"
BLDDIR="./sonic-buildimage_${BLDBRANCH}_${BUILD_NUMBER}_${COMMITHASH}_ZTP"
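One subtlety worth calling out: the override string must be wrapped in double quotes so that the branch, build number, and hash variables expand when build.cfg is sourced, rather than being passed to make literally. A minimal check of the expansion, with the values hard-coded here purely for illustration:

```shell
# With double quotes, the branch/hash variables expand immediately:
SONICBRANCH=201911
COMMITHASH="bea968b"
BLDNUM=5
SONIC_OVERRIDE_BUILD_VARS="SONIC_IMAGE_VERSION=SONIC.${SONICBRANCH}.${BLDNUM}-${COMMITHASH}_Internal ENABLE_ZTP=y"
echo "${SONIC_OVERRIDE_BUILD_VARS}"
# prints: SONIC_IMAGE_VERSION=SONIC.201911.5-bea968b_Internal ENABLE_ZTP=y
```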

Step 2: Create sonicgit.sh

#!/bin/bash
source ./build.cfg
 
if [ -d "${BLDDIR}" ]; then
   echo "directory ${BLDDIR} already exists, aborting git clone"
   exit 1
fi
 
# git clone the top-level
# source code from the public repository, SONICBRANCH=201911
 
git clone -b ${SONICBRANCH} https://github.com/Azure/sonic-buildimage.git
 
# move the cloned source to a build-specific named directory
# avoid overwriting earlier versions that you may need.
mv ./sonic-buildimage "${BLDDIR}"
 
# If you are making any changes on top of the checked-in branch, create a
# build branch based on the specific commit hash.
# This git branch information shows up in the built image
# when you run 'show version' from the switch command line.
 
cd "${BLDDIR}"
git checkout -b "${BLDBRANCH}" ${COMMITHASH}
 
# the git clone step only pulls the top-level module;
# the underlying submodules must be recursively
# initialized and updated.
 
git submodule update --init --recursive
 
#display the status
echo "${BLDDIR}"
git status | grep branch

Step 3: Create build.sh

#!/bin/bash
 
source ./build.cfg
 
#Helper functions start###
function checkErrors()
{
    X=$(grep -i -c "${1}" "${2}")
    if [ "${X}" != "0" ]; then
       grep -i -n "${1}" "${2}"
    fi
}
 
function doSetup()
{
  CONFIGZTP="ENABLE_ZTP=${ENABLE_ZTP}"
  CONFIGSONIC="${CONFIGZTP}"
}
 
function doMakeConfig()
{
  #Execute make configure once to configure ASIC
  #make configure PLATFORM=[ASIC_VENDOR]
  make configure PLATFORM=mellanox
}
 
# Build SONiC image
function doMake()
{
  LOGFILE="../logs/${BLDDIR}.log"
  mkdir -p ../logs
  echo "time make SONIC_BUILD_JOBS=24 ${SONIC_OVERRIDE_BUILD_VARS} target/sonic-mellanox.bin" > "${LOGFILE}"
  time make SONIC_BUILD_JOBS=24 ${SONIC_OVERRIDE_BUILD_VARS} target/sonic-mellanox.bin | tee -a "${LOGFILE}"
  checkErrors "fail" "${LOGFILE}"
  checkErrors "warning" "${LOGFILE}"
  checkErrors "error" "${LOGFILE}"
}
#Helper functions end###
 
cd "${BLDDIR}"
doSetup
doMakeConfig
doMake
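The checkErrors helper is just a grep wrapper. Here is a standalone run of the same function against a throwaway log, to show what its output looks like (the log contents are invented for the demo):

```shell
#!/bin/bash
# Same helper as in build.sh: count case-insensitive matches of $1
# in file $2 and, if there are any, print them with line numbers.
function checkErrors()
{
    X=$(grep -i -c "${1}" "${2}")
    if [ "${X}" != "0" ]; then
        grep -i -n "${1}" "${2}"
    fi
}

DEMOLOG=$(mktemp)
printf 'step 1 ok\nWarning: deprecated flag\nstep 2 ok\n' > "${DEMOLOG}"
checkErrors "warning" "${DEMOLOG}"
# prints: 2:Warning: deprecated flag
rm -f "${DEMOLOG}"
```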

Summary

The build process described in steps 1-3 results in a Pure SONiC image with ZTP enabled. Some might argue that the image demonstrates open networking at its best: building an open-source operating system and eliminating vendor dependence.
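Once the build finishes, the image sits at target/sonic-mellanox.bin. I installed it through ONIE: serve the file over HTTP from any machine the switch can reach, then fetch it from the ONIE prompt with onie-nos-install. The URL below is a placeholder for your own server:

```
ONIE:/ # onie-nos-install http://<build-server>/sonic-mellanox.bin
```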

Considering deploying SONiC in 2021? Feel free to reach out to me in the comments. I would love to have a discussion and hear your feedback.