Overview
Proposer
Thijs Kinkhorst, SURF
Area

SECURITY & PRIVACY

Type of work

DEVELOPMENT RESEARCH

Output


History
Original proposal

The SAML 2.0 protocol relies on XML signatures as the foundation of its security. A SAML assertion is signed with XMLDsig and the SP must properly validate this signature. If it does not, practically anyone in the world can trivially feed it forged assertions and thereby log in as any user, and this can neither be easily detected nor even seen by the IdP. XMLDsig (and SAML) is notoriously complex and allows for many ways to create one or more signatures for any document. This means an implementation can easily fall victim to accepting improperly signed data - even common implementations in our world, like Shibboleth and SimpleSAMLphp, have had issues here in the past. Besides these common products, which at least are periodically audited for such problems, a much larger risk is custom implementations that use different or even home-grown libraries. Most of the time only the happy path is tested (does login work?), while the unhappy path (are invalid assertions rejected?) is not.
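
As an illustration of what "properly validating" entails, here is a minimal sketch in Python, assuming the signxml and lxml libraries; the response/certificate pair is hypothetical and this is not code from any existing SAML product.

    # Minimal sketch of the validation an SP must perform (assumes the Python
    # "signxml" and "lxml" libraries; illustrative only, not an existing product).
    from lxml import etree
    from signxml import XMLVerifier

    def assertion_is_trustworthy(saml_response_xml: bytes, idp_cert_pem: str):
        """Return the signed subtree if the IdP's signature validates, else None."""
        tree = etree.fromstring(saml_response_xml)
        try:
            result = XMLVerifier().verify(tree, x509_cert=idp_cert_pem)
        except Exception:
            # Missing, broken, or wrongly-keyed signatures all end up here.
            return None
        # Crucial detail: only the element the verifier reports as signed may be
        # trusted; reading attributes from the original tree instead re-opens
        # signature-wrapping attacks.
        return result.signed_xml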

Given the paramount importance of signature validation, we should have a way to test whether SPs check signatures correctly. Although this can already be done manually, what is lacking is a scalable way to test an eduGAIN-sized set of service providers (repeatedly) and, for a large proportion of that set, determine whether signatures are processed correctly. This requires devising tests to fire off at these SPs and heuristics to determine automatically whether the tests passed or failed.
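
One possible shape for such an automated test plus pass/fail heuristic is sketched below in Python; the ACS URL, the tampered-response input and the success heuristic are assumptions for illustration, not an existing tool.

    import base64
    import requests

    def probe_sp(acs_url: str, tampered_response_xml: bytes) -> str:
        """POST a deliberately tampered SAMLResponse to an SP's assertion
        consumer service and guess from the HTTP reaction whether it was accepted."""
        form = {"SAMLResponse": base64.b64encode(tampered_response_xml).decode()}
        resp = requests.post(acs_url, data=form, allow_redirects=False, timeout=10)
        # Heuristics only: a session cookie or a redirect into the application
        # after a tampered login strongly suggests the signature was not checked.
        if resp.status_code in (302, 303) and "Set-Cookie" in resp.headers:
            return "LIKELY VULNERABLE: tampered assertion appears accepted"
        if resp.status_code >= 400:
            return "OK: tampered assertion rejected"
        return "INCONCLUSIVE: needs manual review"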

Some ideas of specific scenarios to test, all of which we have seen fail in real life (a code sketch of how such test cases could be generated follows the list):

  • Signature not checked at all, modified message accepted
  • Modified message with signature rejected, but message without any signature accepted
  • Multiple signatures on the same message / signature wrapping attacks
  • A correctly signed part of the message, while an unsigned part carrying attributes is accepted.
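
A sketch, using lxml, of how these negative test cases could be derived from one known-good signed SAMLResponse; the namespaces are the standard SAML 2.0 and XMLDsig ones, the function names are purely illustrative.

    from copy import deepcopy
    from lxml import etree

    NS = {
        "saml": "urn:oasis:names:tc:SAML:2.0:assertion",
        "ds": "http://www.w3.org/2000/09/xmldsig#",
    }

    def case_modified_but_signed(response):
        """Change a signed attribute value while keeping the original signature intact."""
        doc = deepcopy(response)
        for value in doc.findall(".//saml:AttributeValue", NS):
            value.text = "attacker@example.org"
        return doc

    def case_signature_stripped(response):
        """Remove every Signature element; a correct SP must now refuse the message."""
        doc = deepcopy(response)
        for sig in doc.findall(".//ds:Signature", NS):
            sig.getparent().remove(sig)
        return doc

    def case_wrapped(response):
        """Insert a modified, unsigned copy of the assertion next to the signed one,
        so the signature still validates but a naive SP may read the wrong assertion."""
        doc = deepcopy(response)
        original = doc.find(".//saml:Assertion", NS)
        evil = deepcopy(original)
        evil.set("ID", "_evil")
        for sig in evil.findall(".//ds:Signature", NS):
            sig.getparent().remove(sig)
        original.addprevious(evil)  # one of several possible wrapping placements
        return doc
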
Description of the activity

The goal of the activity is to deliver a (software or service) solution that assists federation operators of NREN federations in testing, at scale, several core security aspects of Service Providers' SAML deployments within their federation.
Deployment scenarios, to be confirmed with stakeholders, might include:

  • Self-testing by an SP as part of the route towards becoming a production deployment
  • (Automated) Testing the SP deployment as part of the initial onboarding into the federation by FedOps
  • (Automated) Testing the SP deployment as part of periodic review by FedOps
  • Institution-initiated testing of an SP as part of a compliance review, e.g. with respect to GDPR compliance, for a service they have a contract with

This topic should include the technical implementation of the use cases we would like to test against. In addition, it needs to discuss and, if need be, develop a means to support FedOps in deploying the test suite both technically and operationally.
Next to the technical and operational requirements, we also need to understand the potential legal aspects, so that we can include all of these in the design of the test suite.

Activities:

  • Run at least 1 workshop with the community of Federation Operators to collect and discuss use cases, requirements and deployment scenarios
  • Gather at least 3 federation operators who are willing to act as stakeholders and help test the tool in a controlled environment
  • Discuss feasibility, risks and risk mitigation possibilities with a legal advisor; describe the design considerations that result from this discussion
  • Discuss challenges around the use cases and describe proposed resolutions, allowing stakeholders to review
  • Select and implement use cases into the test suite
  • Develop a deployment plan with stakeholders to scale up the use of the tool to real-world use cases
  • Optional: consider what would be needed to extend these tests to OIDC RPs
Ownership & Utilisation

The following parties will use the results of this activity:

T&I Service
R&E Community
External Party


Results & Deliverables

The following results were created and delivered:


1 Comment

  1. Nice comment from Scott Cantor:

    People just assume their vendors are doing the right stuff....I'm here to tell you they ain't.

    You have multiple, likely critical, apps right now that are on fire and you simply don't know it.

    Just in case you needed additional motivation! (wink)