...

While every care is taken to ensure that all eduGAIN services function reliably, the selected operational model allows service updates and modifications to be made at short notice, which carries a small risk of downtime needed to restore a system snapshot.

Operational Team procedures

...

The aggregation, signing and publishing of the eduGAIN metadata aggregate is done on an hourly basis.

All information about the system status, federation metadata channel information, federation public keys, etc. is kept in the eduGAIN database and retrieved from there as required during the aggregation process.

  • At half past every hour, metadata acquisition is started on mds-feed and is performed in the following steps:
    • mds-feed downloads federation metadata feeds using a conditional GET (a sketch of this step follows the list).

    • if the conditional GET results in the download of a new metadata file, that file is passed through the local validator instance; if validation succeeds, the downloaded file is used as input for the aggregator, and if it fails, the previous correct copy of the feed is used instead

    • the newest available validated copy of the federation metadata feed is kept for future use
    • the validated metadata files are passed to a pyFF flow; see also [eduGAIN-meta], Metadata combination and collision handling

    • pyFF aggregates and then signs the resulting feed; currently the signing is done with key files stored on the mds-feed host

    • the resulting file is analysed, broken into entities and used to update the edugain-db (a sketch of the entity splitting follows the list)

    • the final output is uploaded with sftp to the technical host, using a dedicated user account on that host (a sketch of the upload follows the list)

  • At 45 minutes past every hour, the new copy of the eduGAIN metadata aggregate is copied to the final destination directory; when the copy is complete, an mv action is performed to substitute the production file atomically (a sketch of this substitution follows the list).

  • Finally, the new eduGAIN metadata aggregate file is copied to the history repository and compressed.

  • At midnight (CET), hourly copies of metadata are deleted from the repository, leaving only a single daily file. These daily files can then be used as a source for various data analyses (a sketch of the history and pruning steps follows the list).
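
For illustration, a minimal Python sketch of the fetch-and-fallback step: a conditional GET with a cached ETag, falling back to the newest validated copy when nothing new or valid is retrieved. The cache location and the validate() helper are hypothetical placeholders, not the actual mds-feed tooling.

import os
import requests

CACHE_DIR = "/var/cache/mds-feed"   # hypothetical location of validated copies


def validate(xml_bytes):
    """Placeholder for the local validator instance."""
    raise NotImplementedError


def fetch_feed(name, url):
    """Conditionally download one federation feed; return the path of the
    newest validated copy to be used as aggregator input."""
    cached = os.path.join(CACHE_DIR, name + ".xml")
    etag_file = cached + ".etag"
    headers = {}
    if os.path.exists(etag_file):
        with open(etag_file) as f:
            headers["If-None-Match"] = f.read().strip()

    try:
        resp = requests.get(url, headers=headers, timeout=30)
        if resp.status_code == 200 and validate(resp.content):
            # New, valid metadata: refresh the cached copy and its ETag.
            with open(cached, "wb") as f:
                f.write(resp.content)
            if "ETag" in resp.headers:
                with open(etag_file, "w") as f:
                    f.write(resp.headers["ETag"])
    except requests.RequestException:
        pass  # network failure: fall back to the cached copy
    # On 304 Not Modified, a download error or a validation failure,
    # the previously validated copy remains in place and is used instead.
    return cached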
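
The entity-splitting step can be sketched with lxml; the SAML metadata namespace below is standard, but the database update is only a placeholder, since the edugain-db interface is not described here.

from lxml import etree

MD_NS = "urn:oasis:names:tc:SAML:2.0:metadata"


def iter_entities(aggregate_path):
    """Yield (entityID, serialised EntityDescriptor) pairs from the
    signed aggregate."""
    tree = etree.parse(aggregate_path)
    for ed in tree.getroot().iter("{%s}EntityDescriptor" % MD_NS):
        yield ed.get("entityID"), etree.tostring(ed)


# Hypothetical usage; update_edugain_db() stands in for the real update:
# for entity_id, xml in iter_entities("edugain.xml"):
#     update_edugain_db(entity_id, xml)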
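
The upload to the technical host might look as follows with paramiko; the host name, account and remote path are invented for the example.

import paramiko


def upload_aggregate(local_path):
    """Upload the signed aggregate over SFTP with the dedicated account."""
    ssh = paramiko.SSHClient()
    ssh.load_system_host_keys()
    ssh.connect("technical.example.org", username="mds-upload",
                key_filename="/home/mds/.ssh/id_ed25519")
    try:
        sftp = ssh.open_sftp()
        # The file lands outside the production directory; the atomic
        # substitution happens later on the technical host itself.
        sftp.put(local_path, "/srv/edugain/incoming/edugain.xml")
        sftp.close()
    finally:
        ssh.close()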
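
The copy-then-mv substitution works because a rename within one filesystem is atomic; this is why the new file is first copied into the final destination directory. A sketch, with illustrative paths:

import os
import shutil

INCOMING = "/srv/edugain/incoming/edugain.xml"    # uploaded by mds-feed
PRODUCTION = "/var/www/mds/edugain.xml"           # file served to clients


def publish_aggregate():
    # Stage the new copy next to the production file ...
    staged = PRODUCTION + ".new"
    shutil.copy2(INCOMING, staged)
    # ... then swap it in with a single atomic rename (the "mv" step),
    # so clients never see a partially written aggregate.
    os.replace(staged, PRODUCTION)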
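
Finally, the history and midnight-pruning steps could be sketched like this; the repository layout and file naming are assumptions made for the example.

import glob
import gzip
import os
import shutil
from datetime import datetime

HISTORY = "/srv/edugain/history"   # hypothetical history repository


def archive_aggregate(production_path):
    """Compress the newly published aggregate into the history repository."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M")
    target = os.path.join(HISTORY, "edugain-%s.xml.gz" % stamp)
    with open(production_path, "rb") as src, gzip.open(target, "wb") as dst:
        shutil.copyfileobj(src, dst)


def prune_hourly_copies(day):
    """At midnight, keep a single file for the given day (YYYYMMDD) and
    delete the remaining hourly copies."""
    copies = sorted(glob.glob(os.path.join(HISTORY, "edugain-%s-*.xml.gz" % day)))
    for path in copies[:-1]:    # keep the last copy of the day
        os.remove(path)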

Handling of aggregation alerts

...