Validation
This guide explores several areas that can be used to improve the quality of programmatic changes. These areas are well-established concepts in the software development world and apply to Infrastructure as Code in a similar fashion. It is worth noting that many of these areas are addressed in commercial offerings from Cisco such as Nexus Dashboard Insights. Features such as pre-change validation and delta analysis can be used to verify business intent and whether the configuration meets your requirements. That being said, a plethora of open-source tools is available that allows users to develop their own set of validations.
Linting, semantic and syntactical validation
To improve the quality of your code, it is recommended to verify that user input is both valid and meaningful. This means ensuring that all user input is relevant, accurate, and appropriate for the intended purpose.
Linting is a process of analyzing code for programmatic and stylistic errors. It helps to identify potential issues such as syntax errors, missing semicolons, excessive whitespace, and formatting inconsistencies. Linting can help make sure that code follows a standard style and complies with best practices, making it easier to read and debug. By automating certain parts of the process, linting can help developers save time and write better quality code.
There are many ways to lint code, ranging from online tools such as yamllint.com, which lets users copy and paste code, to the more commonly used CLI-driven tools. A common linter for YAML is the yamllint Python package, which can be installed with:

pip install yamllint

yamllint is also packaged for all major operating systems; see installation examples (dnf, apt-get…) in the documentation.

Running yamllint . within a directory will lint all *.yaml files and show any errors:
$ yamllint ../tenant_PROD.nac.yaml
  24:43     error    trailing spaces  (trailing-spaces)
  25:14     error    syntax error: expected <block end>, but found '<block sequence start>' (syntax)
  26:15     error    wrong indentation: expected 15 but found 14  (indentation)
The behavior of yamllint can be customized with a .yamllint configuration file, for example to define which files are linted and to exclude the .terraform directory:

---
yaml-files:
  - '*.yaml'
  - '*.yml'
  - '.yamllint'

ignore: |
  .terraform
When using a Git repository, it is advised to include these steps in a pre-commit hook. This is a client-side action that runs each time you commit a change. To do so, you can create a .pre-commit-config.yaml file. Note that it is possible to write your own scripts, but one of the advantages of pre-commit is that you can leverage a large ecosystem of hooks made by other people. Many other examples for pre-commit can be found at https://pre-commit.com/hooks.html

Create .pre-commit-config.yaml with the following content:
repos:
  - repo: https://github.com/adrienverge/yamllint.git
    rev: v1.28.0
    hooks:
      - id: yamllint
To make use of this configuration, you must first install and initialize pre-commit:

> pip install pre-commit
> pre-commit install
pre-commit installed at .git/hooks/pre-commit
Note that pre-commit install must be run from the root of the repository.
And make sure to add .pre-commit-config.yaml with git add:
git add .pre-commit-config.yaml
Next time you run git commit, the hook will initiate yamllint:
> git commit -m "updating configuration"
yamllint.................................................................Failed
- hook id: yamllint
- exit code: 1

data/tenant_PROD.nac.yaml
  29:43     error    trailing spaces  (trailing-spaces)
  30:14     error    syntax error: expected <block end>, but found '<block sequence start>'
  31:15     error    wrong indentation: expected 15 but found 14  (indentation)
The downside of pre-commit hooks is that they run exclusively on your system. If a contributor to your project does not have the same hooks installed, they may commit code that violates your pre-commit hooks. If you use GitHub, you can integrate pre-commit hooks into your CI workflow; at the time of writing, this is only available for GitHub. Visit https://pre-commit.com for more information. For GitLab CI users, it is possible to run a job that checks whether the pre-commit hooks were properly applied. Below is an example of adding a linting stage to your .gitlab-ci.yml.
yamllint:
  stage: linting
  image: registry.gitlab.com/pipeline-components/yamllint:latest
  script:
    - yamllint data/
Semantic validation is the process of checking the meaning behind data to ensure accuracy and correctness. This can involve validating data against a set of rules, making sure it conforms to certain expectations. Syntactic validation is the process of checking that data is structured and formatted correctly, ensuring that information entered into a system meets the requirements for it to be accepted and processed. Syntactic validation can involve using regular expressions to check for specific patterns, or comparison operators to check whether values meet certain criteria.
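As a minimal, made-up illustration in Python (the vlan value and the VLAN range check below are invented for this example and not taken from any of the tools discussed later), the difference between the two could look like this:

import re

vlan = "vlan-4096"

# Syntactic check: the value matches the expected "vlan-<number>" pattern
if not re.fullmatch(r"vlan-\d+", vlan):
    print(f"{vlan}: syntactically invalid")
# Semantic check: the value must also be meaningful, i.e. a VLAN ID between 1 and 4094
elif not 1 <= int(vlan.removeprefix("vlan-")) <= 4094:
    print(f"{vlan}: syntactically valid, but semantically invalid")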
The open-source nac-validate Python tool may be used to perform syntactic and semantic validation of YAML files. Syntactic validation is done by basic YAML syntax validation (e.g., indentation) and by providing a Yamale schema against which all YAML files are validated. Semantic validation is done by providing a set of rules (implemented in Python) which are then validated against the YAML data. Every rule is implemented as a Python class and should be placed in a .py file located in the --rules path.

Each .py file must have a single class named Rule. This class must have the following attributes: id, description and severity. It must implement a classmethod() named match that takes a single argument, data, which is the data read from all YAML files. It should return a list of strings, one for each rule violation, with a descriptive message.
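Such a rule file might be structured as follows; this is a minimal sketch with a made-up rule ID, file name and check (verifying that tenant names are uppercase), not one of the published example rules:

# .rules/tenant_name.py (hypothetical example)
class Rule:
    id = "101"
    description = "Verify that tenant names are uppercase"
    severity = "MEDIUM"

    @classmethod
    def match(cls, data):
        results = []
        # 'data' holds the merged content of all YAML files
        for tenant in data.get("apic", {}).get("tenants", []):
            name = tenant.get("name", "")
            if name != name.upper():
                results.append("apic.tenants.name - {} is not uppercase".format(name))
        return results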
Python 3.10+ is required to install nac-validate. It can be installed using pip:
pip install nac-validate
It may also be integrated via a pre-commit hook with the following .pre-commit-config.yaml, assuming the default values are used (.schema.yaml, .rules/).
repos:
  - repo: https://github.com/netascode/nac-validate
    rev: v1.0.0
    hooks:
      - id: nac-validate
If the schema or validation rules are located elsewhere, the required CLI arguments can be added like this:
repos:
  - repo: https://github.com/netascode/nac-validate
    rev: v1.0.0
    hooks:
      - id: nac-validate
        args:
          - '--non-strict'
          - '-s'
          - 'my_schema.yaml'
          - '-r'
          - 'rules/'
An example .schema.yaml for ACI can be found here.
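To give an impression of the Yamale syntax, a heavily simplified, hypothetical fragment covering only bridge domain subnets could look like the following; the actual ACI schema is far more extensive and its exact structure may differ:

# simplified, hypothetical Yamale schema fragment (not the actual ACI .schema.yaml)
apic:
  tenants: list(include('tenant'), required=False)
---
tenant:
  name: str()
  bridge_domains: list(include('ten_bridge_domain'), required=False)
ten_bridge_domain:
  name: str()
  vrf: str(required=False)
  subnets: list(include('ten_bridge_domain_subnets'), required=False)
ten_bridge_domain_subnets:
  ip: str()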
When validating your *.yaml code against the above .schema.yaml, you can perform checks. For example, if you change the Bridge Domain setting subnets: list(include('ten_bridge_domain_subnets'), required=False) in the default .schema.yaml to subnets: list(include('ten_bridge_domain_subnets'), required=True), this makes sure that a subnet is always added to a Bridge Domain. If the subnet is omitted, the validation will fail:
> nac-validate --non-strict data/
ERROR - Syntax error 'data/tenant_PROD.nac.yaml': apic.tenants.0.bridge_domains.1.subnets: Required field missing
ERROR - Syntax error 'data/tenant_PROD.nac.yaml': apic.tenants.0.bridge_domains.2.subnets: Required field missing
Note that the --non-strict flag is added above, which allows unexpected elements in the .yaml files. In other words, it is not required to have a matching check in .schema.yaml for each resource defined in the .yaml files.
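For reference, a minimal, hypothetical data fragment that would satisfy such a required subnets check could look like this (the tenant, VRF and bridge domain names are illustrative only):

apic:
  tenants:
    - name: PROD
      vrfs:
        - name: prod-vrf
      bridge_domains:
        - name: prod-bd
          vrf: prod-vrf
          subnets:
            - ip: 10.1.201.1/24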
More complex logic can also be applied to run semantic validation. Below is an example subnet_overlap.py to avoid creating overlapping subnets:
import ipaddress


class Rule:
    id = "100"
    description = "Verify VRF subnet overlap"
    severity = "HIGH"

    @classmethod
    def match(cls, data):
        results = []
        try:
            for tenant in data["apic"]["tenants"]:
                for vrf in tenant["vrfs"]:
                    # get a list of all bridge domain subnets of a vrf
                    subnets = []
                    for bd in tenant["bridge_domains"]:
                        if bd["vrf"] == vrf["name"]:
                            for subnet in bd.get("subnets", []):
                                subnets.append(
                                    ipaddress.ip_network(subnet["ip"], strict=False)
                                )

                    # check subnet overlap with every other subnet
                    for idx, subnet in enumerate(subnets):
                        if idx + 1 >= len(subnets):
                            break
                        for other_subnet in subnets[idx + 1 :]:
                            if subnet.overlaps(other_subnet):
                                results.append(
                                    "apic.tenants.bridge_domains.subnets.ip - {}.{}.{}".format(
                                        tenant["name"], vrf["name"], subnet
                                    )
                                )
        except KeyError:
            pass
        return results
The following error is returned when overlapping subnets have been specified in any of the *.yaml files in the data/ folder:
> nac-validate --non-strict data/
ERROR - Semantic error, rule 100: Verify VRF subnet overlap (['apic.tenants.bridge_domains.subnets.ip - prod.prod.prod-vrf.10.1.201.0/24'])
For more ACI example rules, click here.