SOFTWARE IS INFRASTRUCTURE
The realization that software is becoming an essential component of our everyday lives was reflected yet again at this year's Black Hat. Even more solutions are being touted to deal with the ever-growing exposure of software to malicious threats. Unfortunately, most of these solutions deal with the symptoms of our current predicament without addressing the fundamental truth - software is built insecurely, despite our best efforts.
What is required is a change of perspective. Software is infrastructure.
This is particularly true in safety-critical systems. Think of recent advances in the automotive industry, aeronautics, and medical devices. None of them would have been possible without the introduction of software as part of the innovation. This, however, has an unfortunate side-effect - the fusion of hardware and software makes these systems essentially cyber-physical systems. The problem is that the processes we've developed to deal with the challenges of modern software development have, in general, not yet reached the level of maturity required for systems where life and death are at stake.
What's missing from the process is the concept of resilience. Resilience is the ability to resist catastrophic failure in the face of adverse conditions. It is an essential requirement for safety-critical cyber-physical systems, especially when these systems are expected to function for decades, not merely years.
A number of technologies help address the challenge of building resilient systems, but by themselves they each address only a fraction of the problem. Let's look at their respective strengths and weaknesses:
- Software Composition Analysis allows organizations to find outdated software dependencies. By using non-vulnerable versions of these components, security can be immediately improved. The challenge is that this sense of safety is point-in-time: there is no guarantee that having the latest components makes your application secure against future threats.
- Static Analysis (SA) can be applied to a program's source code, but works with an abstraction that does not operate against the code that actually executes. In addition, even the best tools require organizational effort to employ, as the technique suffers from a fundamental issue of False Positives (the mis-identification of issues which are in fact _not_ defects). The application of SA is further complicated by the ever-increasing size of code bases. Even if a tool flags issues at a rate of only 5% of lines, applying it to projects of 1 MLoC to 10+ MLoC (millions of Lines of Code) yields roughly 50,000 - 500,000 reported defects, a volume that requires significant time and developer resources to triage. Imagine when the tool being used has an even higher FP rate…
- Dynamic Analysis tools (such as protocol fuzzers, Interactive Application Security Testing - IAST - and vulnerability scanners) are useful in the context of acceptance testing, but applying them requires understanding of when in the Software Development Life Cycle (SDLC) they fit. These tools generally work on fully developed/deployed applications, which fundamentally shifts them to the right of the SDLC. There is a cost associated with this lag in the developer feedback cycle.
- Software Auditing and Penetration Testing can also be used to secure software, but at significant cost (as they require a degree of expertise) and limited by human scale. This option is generally only available to organizations with the resources to hire/purchase these services, which leaves the majority of companies unnecessarily exposed.
So what’s the solution?
Coverage-guided fuzzing is a technique gaining popularity, empowered by recent advances in cloud-scale infrastructure. Fuzzing is the process of generating pseudo-random inputs and feeding them into a program to see if it behaves in an unexpected manner. Surprisingly, this technique is very effective at discovering new defects with stability/security implications. Hackers have been known to use fuzzing to discover new vulnerabilities.
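The core loop is small enough to sketch. Below is a toy coverage-guided fuzzer - the `target` function, its "FUZZ" magic prefix, and the one-byte mutation strategy are all invented for illustration; real tools like libFuzzer or AFL instrument compiled code rather than returning coverage sets:

```python
import random

def target(data: bytes) -> set:
    """Toy program under test: reports which 'branches' an input reached,
    and crashes when the input spells out the magic prefix b"FUZZ"."""
    cov = set()
    for i, b in enumerate(b"FUZZ"):
        if i < len(data) and data[i] == b:
            cov.add(i)  # one coverage point per matched byte
        else:
            return cov
    raise RuntimeError("crash: magic input reached")

def fuzz(seed: bytes = b"AAAA", iterations: int = 200_000):
    """Mutate corpus entries, keeping any input that reaches new coverage."""
    rng = random.Random(0)  # deterministic for illustration
    corpus, seen = [seed], target(seed)
    for _ in range(iterations):
        data = bytearray(rng.choice(corpus))
        data[rng.randrange(len(data))] = rng.randrange(256)  # mutate one byte
        try:
            cov = target(bytes(data))
        except RuntimeError:
            return bytes(data)  # crashing input found
        if not cov <= seen:  # new coverage: promote input into the corpus
            seen |= cov
            corpus.append(bytes(data))
    return None
```

Because inputs that make partial progress are kept and mutated further, the fuzzer climbs the four-byte prefix one byte at a time - something a purely random guess (1 in 256^4) would almost never manage.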
The cutting edge of this technique combines fuzzing with Symbolic Execution (SE). While fuzzing can be thought of as brute-force mutational input testing, SE can look at the execution context of a program and discover interesting paths for analysis which fuzzing by itself would have difficulty making progress against.
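A toy example of the kind of guard that motivates the combination (the `parse` function and its magic constant are invented for this sketch): random mutation has roughly a 1-in-2^32 chance of guessing the four magic bytes, while a symbolic executor can hand the branch condition to a constraint solver and obtain a satisfying input directly.

```python
def parse(data: bytes) -> str:
    # Brute-force mutation almost never penetrates this comparison.
    if len(data) >= 4 and int.from_bytes(data[:4], "big") == 0xDEADBEEF:
        return "deep path"
    return "shallow path"

# A symbolic executor would model the guard as the constraint
#   int.from_bytes(data[:4], "big") == 0xDEADBEEF
# and ask a solver for a model; the satisfying input is simply:
witness = (0xDEADBEEF).to_bytes(4, "big")
print(parse(witness))  # the path random fuzzing alone would miss
```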
In addition, test cases are automatically generated as part of the analysis. These test cases are important because:
- They can function as regression tests for future versions of the software without additional developer effort. Instead of waiting for defects/vulnerabilities to be reported against future versions, you can test the most current version of a dependency to ensure the integrity of the program's behavior.
- A discovered defect has direct/measurable impact on the running program and is extremely unlikely to result in a False Positive.
- They can be reduced to a minimum set of cases which exercise the discovered execution paths. Running this reduced set is much faster than running a full analysis of the program and can be easily incorporated into a DevOps pipeline, giving developers immediate visibility into regressions/defects discovered through analysis.
- They can be used to provide defect reproducers so the developers can quickly identify where the code needs to be fixed. In essence, tests give the type of context an experienced auditor/pen-tester can provide.
- As analysis progresses, new test cases are generated.
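In a DevOps pipeline, replaying the generated cases might look like the sketch below - the parser, the corpus contents, and the `replay` helper are all hypothetical stand-ins:

```python
def parse_record(data: bytes) -> int:
    """Stand-in for the program component under analysis."""
    if not data:
        raise ValueError("empty record")
    return data[0]  # e.g. a record-type tag

# Inputs a previous fuzzing/SE run generated, checked in alongside the code.
GENERATED_CASES = [b"\x01payload", b"\x7fELF", b"A" * 64]

def replay(cases) -> bool:
    """Re-exercise previously discovered execution paths in seconds,
    instead of re-running the full analysis on every commit."""
    for case in cases:
        parse_record(case)  # any exception here is a regression
    return True
```

Wiring a replay step like this into the same CI stage that runs unit tests gives developers fast feedback, while the slower full analysis can run on a nightly or weekly cadence.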
While no analysis can ever claim to find all possible bugs, having a collection of test cases that evolves with the program gives organizations confidence that a program which has undergone analysis will be resilient.