DevSecOps - Begin, Operate, Optimize
By Rohit Salecha, Principal Security Consultant
This article looks at DevSecOps as a process and how it integrates into the SDLC: how to kickstart a DevSecOps programme, how to evaluate the tools that feed it, and how to optimize the resulting pipeline.
How to Kickstart DevSecOps?
The prerequisite for kickstarting a DevSecOps process is provisioning a Vulnerability Management Tool (VMT) that integrates into the pipeline and stores the vulnerabilities reported by the various tools. This allows us to evaluate our progress later in the DevSecOps journey, as tracking the count of vulnerabilities is a key indicator of the maturity of your DevSecOps pipeline. The VMT needs to ingest the output spooled from these tools and display it in a normalized fashion.
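As a rough illustration of that ingestion step, the sketch below maps a hypothetical tool's JSON report onto a common finding schema and pushes it to an assumed VMT API endpoint. The endpoint URL, report layout and field names are illustrative, not any particular product's interface.

```python
import json
import sys
import urllib.request

# Hypothetical VMT endpoint and normalized finding schema -- adjust to your tooling.
VMT_API = "https://vmt.example.internal/api/findings"

def normalize(raw_finding, source_tool):
    """Map a tool-specific finding onto one common schema the VMT understands."""
    return {
        "tool": source_tool,
        "title": raw_finding.get("title") or raw_finding.get("name", "unknown"),
        "severity": str(raw_finding.get("severity", "UNKNOWN")).upper(),
        "component": raw_finding.get("component", ""),
        "description": raw_finding.get("description", ""),
    }

def ingest(report_path, source_tool):
    """Read a tool's JSON report, normalize every finding and POST it to the VMT."""
    with open(report_path) as fh:
        report = json.load(fh)
    findings = [normalize(f, source_tool) for f in report.get("findings", [])]
    payload = json.dumps(findings).encode()
    req = urllib.request.Request(
        VMT_API, data=payload, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)  # in practice: authentication, retries, error handling
    return len(findings)

if __name__ == "__main__":
    count = ingest(sys.argv[1], sys.argv[2])
    print(f"Ingested {count} findings into the VMT")
```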
Next, take a small application run by a team that is already in a DevOps process with a well-established pipeline. Add each tool to the pipeline step by step, e.g. for Software Composition Analysis, integrate the tool into the pipeline and observe the vulnerabilities captured in the VMT. Optionally, have the VMT decide whether builds should fail when HIGH severity issues cross a certain threshold.
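A build gate along those lines could be a minimal sketch like the following, assuming the pipeline can run a script against the normalized findings for the current build and fail the stage on a non-zero exit code. The threshold value and file layout are assumptions.

```python
import json
import sys

# Assumed policy: fail the pipeline stage when HIGH severity findings for this
# build exceed the agreed threshold.
HIGH_THRESHOLD = 5

def should_fail(findings, threshold=HIGH_THRESHOLD):
    high_count = sum(1 for f in findings if f.get("severity") == "HIGH")
    print(f"HIGH severity findings: {high_count} (threshold: {threshold})")
    return high_count > threshold

if __name__ == "__main__":
    # argv[1]: path to the normalized findings for the current build
    with open(sys.argv[1]) as fh:
        findings = json.load(fh)
    sys.exit(1 if should_fail(findings) else 0)
```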
Evaluating the Tools
Selecting the right tools is key to establishing a stable pipeline. The tools should have the following characteristics:
- An API/CLI interface so they can be easily invoked from within the pipeline.
- Output in JSON/XML format, which makes it easier to build resilient parsers to ingest data into the VMT.
- A minimal installation procedure; dockerised tools are preferred as they are easier to port across different pipelines.
- Stable spooled output whose schema does not change between versions. We once updated a tool to the latest version and it ended up breaking the parser used to ingest the data into the VMT; a defensive parser (see the sketch after this list) fails loudly in that situation instead of silently corrupting data.
- Sensible licensing arrangements. A vendor charging per execution becomes expensive if you are executing your DevSecOps pipeline more than 100 times a week; a per-project or per-application licence is recommended.
- Constantly look out for new tools and keep challenging the pipeline with them.
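The parsing sketch below illustrates the schema-stability point: it validates the fields the ingestion step depends on before anything reaches the VMT, so a changed report format in a new tool version breaks the pipeline stage visibly. The report structure and required field names are assumptions for an unspecified SCA tool.

```python
import json

# Assumed minimal schema for a hypothetical SCA tool's JSON report.
REQUIRED_FIELDS = {"name", "severity", "component"}

class ReportSchemaError(Exception):
    """Raised when a tool's report no longer matches the schema we parse."""

def parse_report(path):
    with open(path) as fh:
        report = json.load(fh)
    findings = report.get("findings")
    if not isinstance(findings, list):
        raise ReportSchemaError("expected a top-level 'findings' list")
    for i, finding in enumerate(findings):
        missing = REQUIRED_FIELDS - finding.keys()
        if missing:
            raise ReportSchemaError(f"finding {i} is missing fields: {sorted(missing)}")
    return findings
```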
Optimizing the Pipeline
Once the tools have been selected and are spooling data into the VMT, the next stage is optimization. Running every tool on every build adds time to each build and works against the DevOps principle of builds completing in less than 15 minutes.
The best way to optimize the pipeline is to skip running certain tools under certain conditions (a change-based stage selector along these lines is sketched after this list):
- For HTML/CSS-only changes, a spot decision can be taken to disable the Software Composition Analysis and infrastructure scanning stages.
- If new dependencies are being added with little code modification, then only the SCA stage needs to run, skipping the SAST stage.
- If there are no infrastructure changes and the previous scan reported an acceptable level of vulnerabilities, then skip the infrastructure scan.
- Tools like dependency-check, Clair and OpenVAS require continuous syncing with the online databases holding their vulnerability data. Running these updates while a scan is in progress consumes too much time, so these tools must be updated periodically when the pipelines are not operational.
- Epic changes/major builds must not skip any stages; alternatively, periodically run non-optimized scans over the entire build.
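One way to implement these spot decisions is to derive the stages to run from the files changed since the last build. The sketch below approximates the rules above; the stage names, file patterns and directory layout are illustrative assumptions, and a real policy would be refined per application.

```python
import subprocess

def changed_files(base_ref="origin/main"):
    """List files changed between the base branch and the current commit."""
    out = subprocess.run(
        ["git", "diff", "--name-only", base_ref, "HEAD"],
        capture_output=True, text=True, check=True,
    )
    return [line for line in out.stdout.splitlines() if line]

def select_stages(files):
    """Decide which security stages to run for this change set."""
    stages = {"sast", "sca", "infra-scan"}
    if files and all(f.endswith((".html", ".css")) for f in files):
        # HTML/CSS-only change: drop dependency and infrastructure scanning
        stages -= {"sca", "infra-scan"}
    if not any(f.endswith(("pom.xml", "package.json", "requirements.txt")) for f in files):
        stages.discard("sca")  # no dependency manifests touched
    if not any(f.startswith(("terraform/", "ansible/")) or f.endswith(".tf") for f in files):
        stages.discard("infra-scan")  # no infrastructure code touched
    return stages

if __name__ == "__main__":
    # The pipeline can consume this output to enable/disable its security stages.
    print(" ".join(sorted(select_stages(changed_files()))) or "none")
```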
Finally, DevSecOps is not a one-size-fits-all process, and each application may require a different strategy and approach to running the tools. It is therefore extremely important to cross-skill individuals who can lead the DevSecOps process for their own application and act as the point of contact with the security team mentoring them.