

Topic submission deadline

The submission period for the second cycle (Sep 23 - Apr 24) has ended. We will notify the proposers of the selected topics soon. The Call for Ideas for cycle 8 (Apr 24 - Dec 24) is open.

The submission deadline for the next cycle is 15 March 24.


Scalable testing for insecure SAML signature validation

The SAML 2.0 protocol relies on XML signatures as the foundation of its security. A SAML assertion is signed with XMLDSig, and the SP must properly validate this signature. If it does not, essentially anyone in the world can trivially present it with forged assertions and thereby log in as anyone, which also cannot easily be detected, or even seen, by the IdP. XMLDSig (and SAML) is notoriously complex and allows many ways to create one or more signatures for any document. As a result, an implementation can easily end up accepting improperly signed data; even widely used implementations in our world, like Shibboleth and SimpleSAMLphp, have had issues here in the past. Beyond these common products, which are at least periodically audited for such problems, a much larger risk lies in custom implementations that use different or even home-grown libraries. Most of the time the happy path is tested (does login work?), but the unhappy path (do invalid assertions fail?) much less so.

Given the paramount importance of signature validation, we should have a way to test whether SPs check signatures correctly. Although this can be done manually already, what's lacking is a scalable way that can test e.g. eduGAIN-like size of service providers (repeatedly) and for a large proportion of that set, determine if signatures are processed correctly. This requires to devise tests to fire off at these SPs and heuristics to determine automatically whether the tests passed or failed.
Some ideas of specific scenarios to test, all of which we've seen in real life to fail:
  • Signature not checked at all, modified message accepted
  • Modified message with signature rejected, but message without any signature accepted
  • Multiple signatures on the same message/signature wrapping attacks
  • A signature that correctly covers part of the message, while an unsigned part containing attributes is nevertheless accepted.
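A scalable harness would automate tampering steps like these. The sketch below is a hypothetical, deliberately offline example of the first two scenarios: stripping the XML signature and modifying the asserted identity before replaying the message to an SP. The sample response, endpoint handling, and pass/fail heuristics (e.g. does the SP set a session cookie or return a non-error page?) are assumptions left to a real implementation.

```python
# Sketch of the "modified message without any signature" test case.
# A real harness would capture a genuine response in flight, tamper with it,
# and replay it to the SP's assertion consumer service.
import base64
import xml.etree.ElementTree as ET

NS = {
    "saml": "urn:oasis:names:tc:SAML:2.0:assertion",
    "ds": "http://www.w3.org/2000/09/xmldsig#",
}

def strip_signatures(saml_xml: str) -> str:
    """Remove every ds:Signature element from the response."""
    root = ET.fromstring(saml_xml)
    for parent in root.iter():
        for sig in parent.findall("ds:Signature", NS):
            parent.remove(sig)
    return ET.tostring(root, encoding="unicode")

def tamper_nameid(saml_xml: str, new_nameid: str) -> str:
    """Change the asserted identity without re-signing."""
    root = ET.fromstring(saml_xml)
    for nameid in root.iter("{urn:oasis:names:tc:SAML:2.0:assertion}NameID"):
        nameid.text = new_nameid
    return ET.tostring(root, encoding="unicode")

def encode_for_post(saml_xml: str) -> str:
    """SAML responses are base64-encoded in the HTTP-POST binding."""
    return base64.b64encode(saml_xml.encode()).decode()

# Hypothetical minimal response used only to exercise the helpers.
response = """<samlp:Response xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol"
    xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion"
    xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
  <saml:Assertion>
    <ds:Signature><ds:SignatureValue>...</ds:SignatureValue></ds:Signature>
    <saml:Subject><saml:NameID>alice@example.org</saml:NameID></saml:Subject>
  </saml:Assertion>
</samlp:Response>"""

attack = encode_for_post(tamper_nameid(strip_signatures(response), "admin@example.org"))
# A correct SP must reject this payload; acceptance indicates broken validation.
```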

While supporting new federations in setting up their infrastructure, IdPs and SPs, we generally still have little automation in place. Everything is done very manually and takes a lot of time. Speaking specifically of SPs, both the installation and configuration of the services themselves and the operations required to federate them (i.e. make them fully functional SAML2 Service Providers, so that they can be offered in a federated, e.g. eduGAIN, fashion) are still left almost entirely to manual setup.

It would be useful to enhance the level of support we provide, with the aim of quickly deploying an initial set of services: the ones that could de facto start to attract users towards the newly deployed federation infrastructure and the federated IdPs.

The idea here is to propose a new T&I Incubator cycle aimed at the following tasks:

  • Identifying an initial set of 2-3 services we’d like to promote as SPs to the new identity federations (e.g. Wiki, Moodle, Joomla, eduMEET, FileSender, ...).
  • Designing an automation-based solution for the services we’d like to deploy, possibly using containers or automated deployment tools like Ansible or Puppet (which we should aim to make easy for early service deployers), or any script with clear, easy-to-use documentation that does as much of the initial installation and configuration work as possible, keeping the residual manual intervention to a minimum.
  • Defining both technical and strategic roadmaps to ensure the sustainability of these deployment solutions: how will they be upgraded/ported to new versions, and which task, permanent activity in the GN project, or part of the community could endorse the future work to keep the developed solution working.
    This proposal is about using a full Incubator cycle to develop an initial solution, work on it, and design in a clear way how things can be made sustainable after the T&I cycle is over. More information on the proposal on  
    Simple IdM software tailored for R&E institutions

    Deploying IdM software from scratch in an R&E context is not easy. There are many moving parts, like LDAP, RADIUS, OIDC and SAML, that all require their own installation, setup, configuration and maintenance.

    Existing solutions are usually some combination of expensive, outdated, complicated, limited in functionality, poorly matched to R&E requirements, and difficult to install, operate, update or deploy securely.

    Most solutions are focused on big enterprise deployments with thousands of users, leaving small(er) organizations without an easy-to-deploy IdM.

    What is needed is software that makes it very easy for small to medium organizations to host and deploy their own IdM which requires minimal babysitting and is easy to configure through the web by designated administrators.

    What if you could simply install the software in ~5 minutes and configure it through the web interface in 5 more minutes and updating wouldn't be more complicated than running "apt upgrade" without having to worry about breaking your setup?

    More details on the proposal can be found here:

    Janos Mohacsi (KIFÜ)

    Title | Proposer | Description | Supporter (+1)

    Peter Brand (ACOnet)

    Anass Chabli (RENATER)

    Automation of deployment and configuration of initial set of SPs for new federations
    Davide Vaghetti (GARR)

    Janos Mohacsi (KIFÜ)

    François Kooman (DeiC)

    Anass Chabli (RENATER)

    Arnout Terpstra (SURF)
    REFEDS Assurance Profile-like information for verifiable claims
    Mihály Héder (KIFÜ/SZTAKI)

    Here is an example of a verifiable credential from the W3C VC v2.0 draft. The bold parts are relevant for us.

      "@context": [
      "id": "http://university.example/credentials/3732",
      "type": ["VerifiableCredential", "ExampleDegreeCredential"],
      "issuer": "https://university.example/issuers/565049",
      "validFrom": "2010-01-01T00:00:00Z",
      "credentialSubject": {
        "id": "did:example:ebfeb1f712ebc6f1c276e12ec21",
        "degree": {
          "type": "ExampleBachelorDegree",
          "name": "Bachelor of Science and Arts"

    People who consume such credentials (an academic degree, earned credits, passed exams, etc.) are naturally curious about the circumstances of the achievement. Was it an in-person, mixed, or fully online course? Was the identity of the exam participant verified, and how? Was it a supervised event? Unless there is an assurance vocabulary to express these facts, the types themselves will proliferate, e.g. there will be an OnlineBachelorDegree, OnlineBachelorDegreeWithInPersonMajorExams, etc. The problem is very similar to what RAF solves for the context of an authentication. For instance, the RAF Identity Assurance Profile introduces the concept of identity evidence and discusses in-person and supervised remote proofing. This is exactly what is needed for a claim about an exam taken. As an example, the default kind of badge (claim) earned on Coursera would be IAP/low, as ultimately a Coursera account is self-asserted. However, everybody who does not cheat and pays for a course would benefit from a badge/claim in which their identity is more rigorously assured.

    The proposal is to identify how elements of RAF could be re-used in the VC context as well as extended with other elements, to express the supervised closed-room exams, etc.
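As one illustration of what such re-use might look like, the sketch below attaches RAF-like assurance qualifiers to the degree claim from the example above. The `assurance` property and its value URIs are hypothetical assumptions for this proposal; RAF currently defines its vocabulary for authentication context, not for VC claims.

```python
# Hypothetical sketch: RAF-style assurance metadata attached to a VC claim.
# The "assurance" property and the URIs/values below are assumptions,
# not part of the W3C VC data model or the REFEDS specifications.
credential_subject = {
    "id": "did:example:ebfeb1f712ebc6f1c276e12ec21",
    "degree": {
        "type": "ExampleBachelorDegree",
        "name": "Bachelor of Science and Arts",
        # RAF-like qualifiers on the achievement itself (assumed vocabulary):
        "assurance": {
            "identityProofing": "https://refeds.org/assurance/IAP/medium",
            "examSupervision": "in-person",
        },
    },
}
```

With such qualifiers, a single ExampleBachelorDegree type can carry the in-person/online and supervision facts, instead of those facts proliferating into ever more credential types.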

    Investigate Google WEI & Apple Private Access Tokens
    Mihály Héder (KIFÜ/SZTAKI)

    Google Web Environment Integrity is a method for websites to verify that the client platform (User Agent a.k.a. browser + operating system) is indeed genuine and has not been "tampered with".
    The protocol relies on integrity attestations.

    The proposal has received strong criticism; critics mostly claim that it is just a harmful way of achieving DRM. For a summary, see the Wikipedia entry:

    The insight from the CEO of the Vivaldi browser is especially interesting: they apparently already need to spoof the user-agent string in order to be able to use Google Docs, despite the fact that Vivaldi is based on Chromium.

    The proposers purport it to be a replacement for browser-fingerprint-based anti-abuse methods.

    They also claim that it is a better alternative to Apple's similar Private Access Tokens, another attestation scheme that works between Apple devices and Cloudflare. In defense of WEI they further claim that it may help sunset the increasingly useless CAPTCHAs.

    WEI is already supported by Chrome on Android.

    It could turn out to be crucial that our community understands these protocols and develops its own relationship to them. Also, while attestation about personal devices indeed seems quite privacy-endangering, as does the prospect of enhanced DRM, there may be legitimate use cases for classroom devices or around the integrity of wallets.

    The proposal is to explore and try out WEI and write a report for the community. The timing of this proposed activity is perhaps also a strategic concern: if the WEI proposal has no good reception, there is no point in wasting resources on it, but if there is uptake then we should react.

    Janos Mohacsi (KIFÜ)

    Webwallet for research and education use case
    Stefan Liström (SUNET)

    Europe is working towards a wallet-based identity ecosystem. The Architecture and Reference Framework (ARF) serves as a basis for the implementation of the proposal for the European Digital Identity Framework.
    The current framework assumes all interactions will be handled via an app on a mobile phone. While this may suffice for many users, it will leave out groups that cannot or will not use such devices. In addition, it creates a dependency on the vendors of the devices and the software they run on. Finally, users may not be willing to store and aggregate work related data on a personal device. 
    This activity will investigate whether a browser-based wallet can be created that supports (parts of) the ARF. To confirm usability for our community, the browser-based wallet should be tested with the same scenarios as were previously tested in the incubator using mobile-based wallets (Using Distributed Identity for managing researcher access).

    Janos Mohacsi (KIFÜ)

    Trust fabric for wallets
    Leif Johansson (SUNET)

    Europe is working towards a wallet-based identity ecosystem. The Architecture and Reference Framework (ARF) is intended to serve as a basis for the implementation of the proposal for the European Digital Identity Framework. Two protocols are at the core of the specification: ISO 18013-5:2021 (mDL) and OpenID4VC + Verifiable Credentials. The current version of the ARF declares organizational trust out of scope. However, for a real-world ecosystem it is clear that an interoperable trust fabric will be needed, to support the OpenID4VC + Verifiable Credentials ecosystem but also the mDL-based scenarios.
    The OIDC federation specification seems to have many characteristics that would allow such a wallet ecosystem to be defined. This activity will investigate and test the use of the OIDC federation protocol as a trust fabric for a wallet ecosystem.
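As a rough sketch of the mechanism under investigation: in OpenID Federation, every entity publishes a signed entity statement about itself, and superiors publish statements about their subordinates; a verifier walks the chain from a leaf (e.g. a wallet provider) up to a trust anchor it already trusts. The toy model below uses plain dicts standing in for signed JWTs, and all entity names are hypothetical.

```python
# Toy model of OpenID Federation trust-chain resolution. Real entity
# statements are signed JWTs fetched from /.well-known/openid-federation;
# here they are plain dicts keyed by (issuer, subject) to show the walk only.

# (issuer, subject) -> statement; "authority_hints" points at superiors.
STATEMENTS = {
    # Leaf: a hypothetical wallet provider describes itself.
    ("wallet.example", "wallet.example"): {"authority_hints": ["intermediate.example"]},
    # An intermediate vouches for the wallet provider...
    ("intermediate.example", "wallet.example"): {"authority_hints": []},
    # ...and describes itself, pointing at the trust anchor.
    ("intermediate.example", "intermediate.example"): {"authority_hints": ["anchor.example"]},
    # The trust anchor vouches for the intermediate.
    ("anchor.example", "intermediate.example"): {"authority_hints": []},
}

def resolve_chain(leaf: str, trust_anchor: str) -> list:
    """Walk authority_hints from the leaf's self-statement up to the anchor.

    Returns the chain of (issuer, subject) statements a verifier would need
    to validate, or raises LookupError if no path to the anchor exists.
    """
    chain = [(leaf, leaf)]
    entity = leaf
    while entity != trust_anchor:
        hints = STATEMENTS[(entity, entity)]["authority_hints"]
        if not hints:
            raise LookupError(f"no path from {leaf} to {trust_anchor}")
        superior = hints[0]  # a real resolver would try all hints
        chain.append((superior, entity))        # superior's statement about entity
        if superior != trust_anchor:
            chain.append((superior, superior))  # superior's self-statement
        entity = superior
    return chain
```

The activity would test whether this chain-walk, plus the metadata policies that entity statements can carry, is expressive enough to anchor both wallet providers and relying parties in one fabric.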

    Scalable, interoperable revocation
    Stefan Liström (SUNET)

    Revocation is not only a mandatory privacy-enhancing feature for end users, it is also a core security feature. Both use cases for revocation need to be implemented in a future EUDI wallet ecosystem. There is, however, currently no clear solution for interoperable, scalable revocation in the EUDI. This activity investigates and describes the possible approaches to scalable, interoperable revocation handling. The activity should try to test at least two of the approaches against the scalability and interoperability requirements that may be needed for the EUDI wallet ecosystem.
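One family of candidate approaches, status lists in the style of the W3C Bitstring Status List and the IETF Token Status List drafts, can be sketched as follows: the issuer publishes a compressed bit array in which bit i encodes the revocation state of the credential issued with status index i. The list size, index values, and LSB-first bit order below are illustrative assumptions, not taken from either draft.

```python
# Sketch of a bitstring status list: one bit of revocation state per issued
# credential. The issuer publishes the (compressed) list at a stable URL;
# verifiers fetch it and test the bit at their credential's status index,
# so the published artifact scales with list size, not with check volume.
import zlib

LIST_SIZE = 131_072  # bits; a large list also provides some herd privacy

def set_revoked(status_list: bytearray, index: int) -> None:
    status_list[index // 8] |= 1 << (index % 8)

def is_revoked(status_list, index: int) -> bool:
    return bool(status_list[index // 8] & (1 << (index % 8)))

def publish(status_list: bytearray) -> bytes:
    """Issuer side: compress for distribution (mostly-zero lists shrink well)."""
    return zlib.compress(bytes(status_list))

# Issuer: revoke the credential carrying (hypothetical) status index 4711.
status = bytearray(LIST_SIZE // 8)
set_revoked(status, 4711)
compressed = publish(status)

# Verifier: fetch, decompress, and check the credential's bit.
fetched = bytearray(zlib.decompress(compressed))
```

Testing such an approach against the EUDI requirements would then focus on list sizing, update frequency, and whether verifiers across member states agree on encoding and bit order, which is exactly the interoperability question this activity raises.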