@christiangalsterer
Last active October 23, 2023 05:18
Script to download the National Vulnerability Database files from https://nvd.nist.gov
#!/bin/bash
# https://gist.github.com/christiangalsterer/5f55389b9c50c74c31b9
# Copyright 2015 Christian Galsterer
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Downloads the National Vulnerability Database files from https://nvd.nist.gov
# If no argument is given, the files are downloaded to the current directory.
# Alternatively, a target directory can be passed as the first argument.
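# Usage (script filename is hypothetical; use whatever you saved it as):
#   ./nvd-download.sh            # download into the current directory
#   ./nvd-download.sh /tmp/nvd   # download into /tmp/nvd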
# export https_proxy=<ADD HERE YOUR PROXY IF NEEDED>
START_YEAR=2002
END_YEAR=$(date +'%Y')
DOWNLOAD_DIR=.
CVE_12_MODIFIED_URL='https://nvd.nist.gov/download/nvdcve-Modified.xml.gz'
CVE_20_MODIFIED_URL='https://nvd.nist.gov/feeds/xml/cve/nvdcve-2.0-Modified.xml.gz'
CVE_12_BASE_URL='https://nvd.nist.gov/download/nvdcve-%d.xml.gz'
CVE_20_BASE_URL='https://nvd.nist.gov/feeds/xml/cve/nvdcve-2.0-%d.xml.gz'
if [[ $# -eq 1 ]]; then
    DOWNLOAD_DIR=$1
fi
START_TIME=$(date +%s)
download () {
    echo
    echo "Starting download of $1"
    # Write into DOWNLOAD_DIR explicitly: with -O, wget ignores -P, so the
    # original -P/-O combination silently dropped the target directory.
    OUTPUT_FILE="$DOWNLOAD_DIR/${1##*/}"
    if ! wget --no-check-certificate "$1" -O "$OUTPUT_FILE"; then
        echo "ERROR: Download of $1 failed."
        exit 1
    fi
    echo "Extracting $OUTPUT_FILE"
    if ! gzip -df "$OUTPUT_FILE"; then
        echo "ERROR: Extraction of $OUTPUT_FILE failed."
        exit 1
    fi
    echo "Download of $1 successfully completed."
    echo
}
echo "Starting download of NVD files ..."
download "$CVE_12_MODIFIED_URL"
download "$CVE_20_MODIFIED_URL"
for ((i=START_YEAR; i<=END_YEAR; i++)); do
    download "${CVE_12_BASE_URL//%d/$i}"
done
for ((i=START_YEAR; i<=END_YEAR; i++)); do
    download "${CVE_20_BASE_URL//%d/$i}"
done
END_TIME=$(date +%s)
DURATION=$((END_TIME-START_TIME))
echo "Download of NVD files successfully completed in $DURATION seconds."
@Big-al commented Sep 18, 2023

This functionality will be deprecated in September 2023.

Until then, it is still possible to grab a backup of the NVD files with

for y in {2002..2023}; do curl -LO "https://nvd.nist.gov/feeds/json/cve/1.1/nvdcve-1.1-$y.json.gz"; done

That was a very clever and simple solution. Thank you.
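Since the feeds themselves are being retired, the stated successor is the NVD CVE API 2.0. A minimal sketch of how its paginated request URLs are built (endpoint and parameter names are from the public API; the actual fetch is left commented out so the sketch runs offline):

```shell
# Sketch: build a paginated NVD CVE API 2.0 request URL.
BASE='https://services.nvd.nist.gov/rest/json/cves/2.0'
PER_PAGE=2000    # documented maximum page size
START_INDEX=0    # advance by PER_PAGE per request until totalResults is exhausted

URL="${BASE}?resultsPerPage=${PER_PAGE}&startIndex=${START_INDEX}"
echo "$URL"
# curl -s "$URL" > "page-${START_INDEX}.json"   # uncomment to actually fetch
```

Unlike the yearly feed files, the API paginates across the whole dataset, so a full backup is a loop over startIndex rather than over years.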

I'll add that for anyone seeking to ingest this into another platform, you can decompress the JSON files with the same loop approach by swapping curl for gunzip.

I used GNU parallel to speed this up, although it wasn't really necessary:
parallel --progress gunzip ::: nvdcve-1.1-{2002..2023}.json.gz

Be aware that each file wraps the actual CVE data in a header, so the structure looks like this:

{
  "CVE_data_type" : "CVE",
  "CVE_data_format" : "MITRE",
  "CVE_data_version" : "4.0",
  "CVE_data_numberOfCVEs" : "6769",
  "CVE_data_timestamp" : "2023-08-17T07:02Z",
  "CVE_Items" : [ {... **cve objects** ...} ]
}

You can easily pick the items out with jq if you want to stay in the terminal, or in Python by iterating over each file, extracting the objects, and yielding them into whatever you're uploading to:

    import json

    def load_cve_items(filename):
        with open(filename, "r") as file:
            return json.load(file).get("CVE_Items", [])

This leaves you with nicely formatted json objects for each cve from the export, to do with as you please.
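For the jq route, a minimal self-contained sketch (the stub file below only mimics the header structure shown above, with hypothetical data; real feed files are far larger):

```shell
# Stub feed file mimicking the NVD 1.1 header structure (hypothetical data)
cat > nvdcve-1.1-2023.json <<'EOF'
{"CVE_data_type":"CVE","CVE_data_numberOfCVEs":"1",
 "CVE_Items":[{"cve":{"CVE_data_meta":{"ID":"CVE-2023-0001"}}}]}
EOF

# Strip the header: emit one CVE object per line (NDJSON)
jq -c '.CVE_Items[]' nvdcve-1.1-2023.json > cves.ndjson

# Or list just the CVE IDs
jq -r '.CVE_Items[].cve.CVE_data_meta.ID' nvdcve-1.1-2023.json
```

The NDJSON form is convenient for bulk loaders that expect one JSON document per line.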
