Sunday, December 8, 2013

Better living through tools

In an earlier post http://vzimmer.blogspot.com/2013/09/end-of-summer-13.html I spoke about architecture versus implementation, and the process of successively refining an architecture into some implementation artifact. I didn't elaborate on the techniques for demonstrating correspondence between the design goals and the implementation behavior. In the industry this is often naively treated as a simple matter of testing, but I believe it goes further than that, namely to the broader concern of representing architectural intent in a product.

With respect to software development and testing, though, there are many different approaches. Recent efforts like Test Driven Development (TDD) have shown promise in practice, and I enjoyed Grenning's Test Driven Development in Embedded C on the topic. Similarly, James Whittaker's How Google Tests Software and its associated blog http://googletesting.blogspot.com/2011/01/how-google-tests-software.html provide a very pragmatic approach to the problem, namely that all developers write tests while a centralized test organization takes on both a consultative and an automation/infrastructure role. In our world of UEFI and EDK2 we have the Self-Certification Tests (SCTs) http://www.uefi.org/sites/default/files/resources/UPFS11_P3_UEFI_SCT_HP_Intel.pdf.
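
To give a flavor of the TDD style Grenning describes, here is a minimal sketch in C. The set_bit helper and the test names are my own invention for illustration (Grenning's book builds on harnesses like Unity and CppUTest rather than bare assert); the point is simply that each test states an expected behavior before, or alongside, the code that satisfies it.

    #include <assert.h>
    #include <stdint.h>

    /* Code under test: a hypothetical helper that sets one bit in a
       register image. In the TDD rhythm, the tests below come first. */
    static uint32_t set_bit(uint32_t reg, unsigned bit)
    {
        return reg | (1u << bit);
    }

    /* Each test documents one expected behavior of the unit. */
    static void test_set_bit_sets_only_the_requested_bit(void)
    {
        assert(set_bit(0x00u, 3) == 0x08u);
        assert(set_bit(0xF0u, 0) == 0xF1u);
    }

    static void test_set_bit_is_idempotent(void)
    {
        assert(set_bit(set_bit(0x00u, 3), 3) == set_bit(0x00u, 3));
    }

    int main(void)
    {
        test_set_bit_sets_only_the_requested_bit();
        test_set_bit_is_idempotent();
        return 0; /* reaching here means every assertion passed */
    }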

Now let's look at the problem from a broader level, namely expressing architectural intent. To me this is nowhere more important than in the world of building trustworthy systems, especially the "Defender's Dilemma" that Jeremiah reminds us of in slide 7 of http://www.uefi.org/sites/default/files/resources/UEFI_Summerfest_2013_-_Microsoft_Hardware_Security_Test_Interface.pdf. Namely, the attacker only has to discover one flaw, whereas the defender has to leave no gaps. And it is the 'gaps' that matter in this post, since flaws can span from the architecture down into the implementation.

To that end of mapping architectural intent directly to code, I have continually been intrigued by the work of Gernot Heiser http://www.cse.unsw.edu.au/~gernot/ at NICTA on Trustworthy Systems http://www.ssrg.nicta.com.au/projects/TS/. The trophy piece of that effort is seL4, a formally verified microkernel with machine-checked correspondence between the C code, an executable Haskell model, and a high-level specification, the proofs being carried out in a theorem prover. Scaling this approach more generally is undoubtedly a BHAG http://en.wikipedia.org/wiki/Big_Hairy_Audacious_Goal, but it does serve as a beacon to show what can be done given sufficient focus and incentives.

Gernot's effort is not alone, of course. There is the verification of CMU's SecVisor http://www.cs.cmu.edu/~jfrankli/tr/franklin_secvisor_verification.pdf, the UTexas hypervisor verification http://arxiv.org/pdf/1110.4672.pdf, and applications of formal methods to industrial problems like http://swtv.kaist.ac.kr/courses/cs350-08/ase08_submitted.pdf.

Beyond seL4, though, there are other efforts that NICTA incubates under the banner of Trustworthy Systems, as best described in http://www.nicta.com.au/pub?doc=4163. One of the authors of the latter paper is Leonid Ryzhyk, and in Section 4.2 the paper references work on the long-term goal of device driver synthesis, or correct-by-construction development for this class of system software.

And it is this holy grail of 'correct-by-construction' systems code that I want to mention next in a little more detail. Intel recently published a paper, Device Driver Synthesis http://noggin.intel.com/content/device-driver-synthesis, in the Intel Technology Journal, Volume 17, Issue 2, December 2013, an issue titled Simics Unleashed - Applications of Virtual Platforms http://www.intel.com/content/www/us/en/research/intel-technology-journal/2013-volume-17-issue-02-intel-technology-journal.html, that goes into some detail on a real instance of code synthesis.

Regarding driver synthesis, an overview of the effort may best be described in a picture, such as

[Figure: overview of the driver synthesis flow, from the paper]

above. The idea entails taking a model of the hardware to be managed by a driver, plus a formal interface describing how the driver interacts with the system software environment, and then synthesizing the reactive code for the driver. The ideal would be automation that simply emits code, but given the human aspects of software development, such as maintenance, review, and evolution, the process can instead act as an interactive session in which the user adds code as part of synthesis and the tool ensures those additions are correct. The effort also focuses on making the resultant code something with seemly names that meets other psychological constraints on working with code, such as cyclomatic complexity http://en.wikipedia.org/wiki/Cyclomatic_complexity.
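
To make the shape of the output concrete, here is a hypothetical sketch in C of the kind of reactive code such a tool might emit for a trivial device: a one-register transmitter whose device model says the data register may only be written while a BUSY bit is clear. The register layout and the os_send_byte entry point are invented for illustration; the actual Termite inputs are formal device and OS-interface specifications, not C.

    #include <stdint.h>

    /* Hypothetical device model: one status register, one data register. */
    #define UART_BASE    0x1000u
    #define REG_STATUS   (*(volatile uint32_t *)(uintptr_t)(UART_BASE + 0x0u))
    #define REG_DATA     (*(volatile uint32_t *)(uintptr_t)(UART_BASE + 0x4u))
    #define STATUS_BUSY  (1u << 0)

    /*
     * Synthesized-style entry point: the OS-interface contract says "send
     * one byte"; the device model says the data register may only be
     * written while BUSY is clear. The reactive code below is the trivial
     * strategy that satisfies both constraints.
     */
    void os_send_byte(uint8_t b)
    {
        while (REG_STATUS & STATUS_BUSY)
            ; /* wait until the device can accept data */
        REG_DATA = b; /* fire the transmit action */
    }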

Within Intel, I had the pleasure of engaging with Mona Vij, who has led the team in Intel Labs evolving this technology since the summer of 2012. She and the Intel and external university researchers have proven valuable, innovative parties with whom to engage. You can see the elements of our collaboration via the UEFI aspects of the effort in the paper. I believe realizing a vision of this type of work-flow would complement other efforts for the UEFI community, such as http://sourceforge.net/apps/mediawiki/tianocore/index.php?title=UEFI_Driver_Wizard.

For additional details, the Termite page http://www.ertos.nicta.com.au/research/drivers/synthesis/home.pml calls out the collaboration. More details on the engagement between Intel and the university can be found at the synthesis project page linked below.

From the perspective of evolving my skills as a technologist, the engagement offered an interesting view into another approach to system software: better living through tools. It also opened my eyes to look beyond my old friend C to a world of functional languages like Haskell, DSL creation, the use of Binary Decision Diagrams (BDDs) and SAT solvers, and hardware modeling languages like DML and SystemC, too.
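
As one small taste of those techniques, here is a toy sketch in C of the core BDD idea: a diagram of nodes where each internal node tests one boolean variable and branches to the cofactor for that variable. Everything here (the node layout, the x0 AND x1 example) is my own illustration; real BDD packages add variable ordering, node sharing, and reduction on top of this core idea.

    #include <stdbool.h>

    /* Toy BDD node: internal nodes test a variable and branch; terminal
       nodes (var == -1) carry the constants false/true. */
    struct bdd_node {
        int var;                        /* variable index; -1 for a terminal */
        bool value;                     /* meaningful only for terminals */
        const struct bdd_node *lo, *hi; /* children for var = 0 / var = 1 */
    };

    /* Evaluate the boolean function under a variable assignment by
       walking from the root down to a terminal. */
    static bool bdd_eval(const struct bdd_node *n, const bool *assignment)
    {
        while (n->var >= 0)
            n = assignment[n->var] ? n->hi : n->lo;
        return n->value;
    }

    /* Example: the function (x0 AND x1) as a three-node diagram. */
    static const struct bdd_node F  = { -1, false, 0, 0 };
    static const struct bdd_node T  = { -1, true,  0, 0 };
    static const struct bdd_node X1 = {  1, false, &F, &T  };
    static const struct bdd_node X0 = {  0, false, &F, &X1 };
    /* bdd_eval(&X0, (bool[]){ true, true }) evaluates to true */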

The industrial advantages of functional languages, albeit Lisp rather than Haskell, find an interesting discussion in the writings of Paul Graham http://www.paulgraham.com/avg.html. I recommend reading his essays, including the book version Hackers & Painters http://www.amazon.com/Hackers-Painters-Big-Ideas-Computer/dp/1449389554.

The above paper will give you a feel for the effort, but if you are hungry for more details on the underlying mechanics, I recommend visiting http://www.ssrg.nicta.com.au/projects/TS/drivers/synthesis/, too.

So again, these are my thoughts and not a plan-of-record of my employer, as my obligatory blog bio reminds people. But what I did want to do with this post is engage system software engineers in a conversation about thinking differently about how we write specifications, the process by which we refine those specifications to code, how we ensure that the code matches the specifications, and finally, how we evolve code + spec over the life of our technologies.


PS
September 2014 update: Termite is now open source at https://github.com/termite2

1 comment:

Tim Lewis said...

Fifteen years ago, at Phoenix, we actually purchased a C++ front end from Edison and grafted on Prolog-like constructs to allow the description of chipset register behavior and the expression of goals. Our test piece was to see if we could express the IDE controller cycle timing from the ATA spec as requirements, express the chipset register values as combinations, and then let the compiler produce final code which would, in effect, program the hard drive controller. In the end, it turned out the 440BX EDS was not sufficiently detailed. There were often statements like "set this to 1" with no rationale or connection to a specification. Some of them turned out to be signal strength settings, etc. But others were "debug feature" enablement which would have almost required access to the VHDL or Verilog in order to understand their full importance.