Tuesday
September 4, 2012

Tech Talk Video: Comprehensive Analysis of the Android Ecosystem

We are pleased to announce the availability of a new tech talk video: "Comprehensive Analysis of the Android Ecosystem", presented by Iulian Neamtiu. For more details, please visit the talk announcement page.

For more videos, please visit our Vimeo channel.

Friday
August 24, 2012

Tech Talk: Formal Verification of Monad Transformers

Galois is pleased to host the following tech talk. These talks are open to the interested public--please join us! (There is no need to pre-register for the talk.)

Please note the unusual time for this talk. It is on Thursday at 11am.

title: Formal Verification of Monad Transformers

speaker: Brian Huffman

time: Thursday, 30 August 2012, 11:00am

location:
Galois Inc.
421 SW 6th Ave. Suite 300,
Portland, OR, USA
(3rd floor of the Commonwealth building)

abstract:

We present techniques for reasoning about constructor classes that (like the monad class) fix polymorphic operations and assert polymorphic axioms. We do not require a logic with first-class type constructors, first-class polymorphism, or type quantification; instead, we rely on a domain-theoretic model of the type system in a universal domain to provide these features. These ideas are implemented in the Tycon library for the Isabelle theorem prover, which builds on the HOLCF library of domain theory. The Tycon library provides various axiomatic type constructor classes, including functors and monads. It also provides automation for instantiating those classes, and for defining further subclasses. We use the Tycon library to formalize three Haskell monad transformers: the error transformer, the writer transformer, and the resumption transformer. The error and writer transformers do not universally preserve the monad laws; however, we establish datatype invariants for each, showing that they are valid monads when viewed as abstract datatypes.
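
To make the objects under discussion concrete, here is a minimal Haskell sketch of a writer transformer with the monad laws written out as comments. This is an illustration only, not the Tycon/Isabelle formalization from the talk; the names WriterT, returnW, and bindW are ours.

    -- A hand-rolled writer transformer: computations in m that also
    -- accumulate output of some monoid type w.
    newtype WriterT w m a = WriterT { runWriterT :: m (a, w) }

    -- returnW and bindW play the roles of return and (>>=) for WriterT.
    returnW :: (Monoid w, Monad m) => a -> WriterT w m a
    returnW a = WriterT (return (a, mempty))

    bindW :: (Monoid w, Monad m)
          => WriterT w m a -> (a -> WriterT w m b) -> WriterT w m b
    bindW (WriterT m) k = WriterT $ do
      (a, w)  <- m
      (b, w') <- runWriterT (k a)
      return (b, w `mappend` w')

    -- The monad laws at issue, stated informally:
    --   returnW a `bindW` k      ==  k a                               (left identity)
    --   m `bindW` returnW        ==  m                                 (right identity)
    --   (m `bindW` k) `bindW` h  ==  m `bindW` (\x -> k x `bindW` h)   (associativity)
    -- In a total setting these follow by equational reasoning; in the
    -- domain-theoretic HOLCF model, partial values mean they hold only
    -- under datatype invariants like those the talk establishes.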

bio:

Brian Huffman is a recent PhD graduate from the Department of Computer Science at Portland State University, now working as a postdoc at the Technical University of Munich. He has been an avid Haskell programmer since 2001, and has contributed to the development of the Isabelle interactive theorem prover since 2005, with an emphasis on tools for verifying lazy functional programs. Several of Brian's formalizations can be found online at the Archive of Formal Proofs.

Tuesday
August 21, 2012

Tech Talk: Abstract "Anything": Theory and Proof Reuse via DTP

Galois is pleased to host the following tech talk. These talks are open to the interested public--please join us! (There is no need to pre-register for the talk.)

title: Abstract "Anything": Theory and Proof Reuse via DTP

speaker: Larry Diehl

time: Tuesday, 28 August 2012, 10:30am

location:
Galois Inc.
421 SW 6th Ave. Suite 300,
Portland, OR, USA
(3rd floor of the Commonwealth building)

abstract:

Many advanced branches of mathematics involve structures that abstract over well known specific instances, e.g. abstract algebraic structures, equivalence relations, various orders, category theoretic structures, etc. In typed functional programming (e.g. with type classes in Haskell), encoding such structures amounts to enforcing the definition of elements and operations for a particular structure instance on one hand, and being able to use said elements and operations in generic definitions on the other hand. With Dependently Typed Programming (DTP) we can go one step further and define propositions/properties for abstract structures, and subsequently require proofs in particular instances.
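
As a point of comparison (not taken from the talk), here is a small Haskell sketch of the type-class encoding mentioned above: the class fixes the elements and operations of an abstract structure and lets generic definitions use them, but its laws can only be stated as comments or test properties, whereas a dependently typed language such as Agda can demand proofs of them. The names MyMonoid, unit, and (<+>) are illustrative.

    -- Hypothetical class fixing the operations of a monoid-like structure.
    class MyMonoid a where
      unit  :: a
      (<+>) :: a -> a -> a
      -- Laws (documentation only here; in Agda these would be proof obligations):
      --   unit <+> x       == x
      --   x <+> unit       == x
      --   (x <+> y) <+> z  == x <+> (y <+> z)

    -- A concrete instance: lists under append.
    instance MyMonoid [a] where
      unit  = []
      (<+>) = (++)

    -- A generic definition written against the abstract operations only,
    -- so it works for any instance.
    mconcat' :: MyMonoid a => [a] -> a
    mconcat' = foldr (<+>) unit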

This talk will tell the story of how examples of particular instances inspire an abstract definition (including its propositional properties), how to then instantiate the original concrete examples in the new abstract definition, and finally create further abstract definitions that depend on previously defined ones. Emphasis will be placed on how easily concrete proofs can be used as evidence for abstract propositions, and how proofs about abstract structures parameterized by other abstract structures may reuse their proofs (similar to the more common concept of reusing operations in subsequent abstract definitions). The talk will use Agda as its demonstration language, but proofs will mostly be in the form of equational reasoning that should look familiar to the non-expert.

bio: Larry Diehl is an incoming PhD student of Computer Science at Portland State University. He has most recently worked for Engine Yard building infrastructure for deploying web applications to cloud providers, and previously got his bachelor's degree from the University of Central Florida. Larry has been merrily programming in Agda as a hobby for the last few years, and some of his work can be seen on GitHub under the user larrytheliquid.

Tuesday
August 21, 2012

Galois Internship Available

Galois, Inc. has one or more internships available in Portland, Oregon, USA.

PROJECT OVERVIEW:

The project is a 4+ year research effort on high-assurance autonomous vehicles. Galois will be working on three aspects of this problem:

  • Synthesizing software components from Haskell-based embedded DSLs.
  • Building/porting a hardware platform for testing our prototypes.
  • Performing static/dynamic analysis on C/C++ code to detect vulnerabilities.

We intend to release as open source most (if not all) of the source code developed on the project, and we will be publishing our research results.

Returning for multiple internship terms is a possibility.

LOGISTICS:

The length and start date of the internship are both negotiable: anytime from this fall through next fall is acceptable. Ideally, you can be at Galois for at least 3 continuous months. The internship is paid competitively, and the intern will be responsible for her own living arrangements (although we can certainly help you find arrangements). Galois is located in the heart of downtown Portland, with multiple public transportation options and world-class bicycle infrastructure, so living here without an automobile is a viable option.

QUALIFICATIONS:

  • MUST-HAVES:

    • The ability to be geographically located in Portland during the internship.
    • Good knowledge of C/C++.
    • Some experience with low-level/embedded design.
    • Excellent software engineering ability and aptitude.
  • NICE-TO-HAVES (not necessary, but let me know if you have these qualifications!):

    • Experience with Haskell and particularly embedded DSLs.
    • Experience with microcontrollers (particularly ARM Cortex M).
    • Experience with control systems.
    • Interest in/experience with real-time systems and RTOSes.
    • Interest in/experience with software security.
    • Interest in/experience with static analysis.
    • Experience with ArduPilot or other autopilot systems.
    • Good writing skills/experience in writing technical papers.
  • DO NOT NEED: a specific degree (we're interested in hearing from anyone from post-docs to undergrads).

ABOUT GALOIS:

Galois, Inc. is located in Portland, Oregon, with a mission to create trustworthiness in critical systems. We're in the business of taking blue-sky ideas and turning them into real-world technology solutions. We've been using Haskell to develop real-world systems for over 10 years.

To get a sense of life at Galois, one of our previous interns documented his internship here.

TO APPLY:

In one email,

  • Please attach a C.V. (PDF, plain text, or markdown only).
  • In the body of the email, include a brief plain-text note stating your interest and experience and any other relevant details.

Send this to Lee Pike with the subject line "Internship 2012". If you follow these directions, you'll get a confirmation from me.

Thursday
August 2, 2012

Is the NIST Risk Management Framework poised to become a national cybersecurity standard?

A lot of organizations, including small businesses and critical infrastructure operators, might soon get new technical security requirements from the federal government. This will probably be very costly, especially for small businesses that don't already implement the kinds of security measures that are standard for large federal contractors. I'll give a brief overview of two efforts: a bill in the US Senate called the Cybersecurity Act of 2012 (CSA) that just failed, but gives us an idea of how Congress is thinking about securing critical infrastructure operators, and a new federal contracting rule that's closely related to parts of CSA in its goals and technical details. Both of these efforts focus on NIST's Risk Management Framework, and if you're not already familiar with this process, now might be the time to get up to speed.

NIST's cybersecurity standards

Almost 10 years ago, Congress passed a law requiring federal agencies to develop and follow computer security plans that conform to standards set by the National Institute of Standards and Technology (NIST). In turn, NIST implemented the Risk Management Framework (RMF), which is a set of interlocking security standards that includes processes and specific technical requirements. By law, federal agencies have to follow these standards, and the GAO regularly calls out organizations that aren't doing enough.

Lately, the federal government has started to push rules and legislation that would bring NIST standards more forcefully into the private sector.

DoD contractors may get new security requirements

The first example is a proposed federal contract rule that would apply to many companies that do business with the DoD, including 76% of the small businesses that work with the DoD. In its current draft form, the rule would mean that businesses need to apply complex and detailed NIST technical security requirements (called "controls") to unclassified data kept on their own systems. A small business might be required to drastically modify its internal communications and collaboration systems, since the controls state that it must apply the "Principle of Least Privilege," meaning users can access only the data they need to do their job. Another control would require the use of FIPS-certified cryptography to protect unclassified information. This is just a small sampling; other controls cover training, security auditing, configuration management, contingency planning, authentication, incident response, visitor controls, physical protections, and more.

Risk management vs. legislating technical requirements

The NIST framework normally distinguishes between systems based on the potential negative impacts of a security incident and recommends different controls for different types of systems based on a risk assessment. My reading of the contracting rules as currently described is that they dictate specific technical controls without performing any risk assessment.

The upside of including a specific set of controls might be that small companies don't have to perform a complex risk assessment to decide which controls to implement. The NIST risk management framework is designed for extremely large government agencies, and there is currently no guidance for applying it to small businesses.

There is a downside, though, to dictating specific controls without a risk assessment. Businesses might have to implement overly restrictive systems that aren't justified by the risks. For instance, many of the proposed security controls would normally be applied only to systems where a security incident would cause a "serious" effect like significant financial loss or significant harm to individuals. A few of the controls, like the FIPS certification of cryptography, are usually recommended only for systems with potential "catastrophic" effects.

Will these new rules make it hard for small businesses to work with the federal government?

The goal of these rules is clearly to improve security for organizations that handle DoD data, but the compliance requirements for small federal contractors will likely be expensive (according to the proposal itself) and put them at a disadvantage in the commercial and government markets.

At the same time, DARPA programs like Cyber Fast Track are seeking out non-traditional small businesses to create innovative security solutions, and that program is designed to work well for firms that have not previously worked with the federal government. It's reasonable to expect that, if these rules are put into place and applied to such companies, it will become harder for programs like Cyber Fast Track to bring in new talent.

The Cybersecurity Act of 2012

Another major effort by the federal government to bring better cybersecurity into the private sector was the Cybersecurity Act of 2012 (CSA). This bill just failed in the Senate, but it was strongly backed by President Obama in a Wall Street Journal op-ed. Even though the bill didn't pass, it gives us some insight into the direction Congress would like to go. The law would have given NIST the power to set cybersecurity standards for operators of critical infrastructure like the electric grid.

The law would have required risk assessments across all the critical infrastructure industries, established performance requirements, and provided for civil penalties for failure to meet those requirements. It would also have established standards for certified third-party assessors who could measure an organization's security. An organization that complied with the security requirements or passed this type of assessment would not have been held liable if there were a damaging cyber attack. The bill doesn't seem to say that the NIST RMF would be used, but rather that NIST would set the standards and would use a risk assessment in doing so. It's reasonable to guess that the standards would look a lot like the existing RMF.

Challenges in applying NIST's framework

Taken together, the existing laws and potential new laws and regulations point to the major influence of the NIST Risk Management Framework over improving cybersecurity efforts, large and small. I'm a big fan of the framework, but there are significant gaps in it. For instance, it is pretty complex, and NIST should build a streamlined version for small businesses rather than giving contractors specific technical requirements without any risk assessments.

Furthermore, federal cybersecurity requirements might make it difficult for businesses (especially small businesses) to use cloud services, create new barriers to entry for innovative small cloud service providers, and give established large contractors a leg-up in secure cloud services in private industry. The rules that govern federal agencies are confusing enough that last year, Microsoft and Google engaged in a war of words about whether their products met federal standards and could be used for government services. This type of confusion might spread to private industry if these rules are adopted over the next few years, slowing the pace of cloud adoption.

Is the NIST Risk Management Framework poised to become a national cybersecurity standard?

There is no guarantee that these laws and regulations will pass, but changes are coming in cybersecurity, and it's very likely that NIST's approach will win the day since it's so widespread in the federal government. If you think you might be affected by the new rules, you should familiarize yourself with the NIST process so you can plan for the new regulations, and potentially even influence them before they are put in place.