Conveners: Roland Hedberg, Rainer Hörbe
Abstract: A project focused on creating testing tools for different forms of access configurations (including an option for fingerprinting deployed IDP/SP software) that can be used by the whole community of identity experts.
Tags: SAML, Test tools
The front end hasn't been built yet, but the design thinking is already in place. The idea is continuous integration for a set of IDPs: if an IDP switches to another software version, the tests run automatically; as soon as the IDP changes, the test has to be run again.
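The re-run trigger described above could be sketched as a simple fingerprint comparison over the IDP's published metadata; the function names here are illustrative, not taken from any existing tool:

```python
import hashlib

def metadata_fingerprint(metadata_xml: str) -> str:
    """Hash the IDP metadata so a change can be detected between CI runs."""
    return hashlib.sha256(metadata_xml.encode("utf-8")).hexdigest()

def should_rerun_tests(current_xml: str, last_seen_fingerprint: str) -> bool:
    """Re-run the test suite whenever the published metadata differs
    from what was fingerprinted at the last test run."""
    return metadata_fingerprint(current_xml) != last_seen_fingerprint
```

A CI job would store the fingerprint of the last tested metadata and poll with `should_rerun_tests` to decide whether the suite needs to run again.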
Automation is a problem when authentication requires a person to sign in interactively. There are several options for mechanising this step.
Q: How do the people do testing?
A: One form is unit tests, which you configure manually; the unit tests themselves are fairly automated.
The tricky thing is getting the configuration data in place: if many SPs in a federation test with (slightly) different configurations, it becomes a copy-paste hell that is hard to manage.
Other use cases would differ. The first tests would target IDPs and check functionality, e.g. whether the IDP is able to consume the metadata and verify signatures. For SPs the process would be more automated; it could be used as a tool for a service that is asking to join the federation to test its SP.
Rainer: I am interested in the different use cases that people have for using SAML tests.
Nick's FO use case:
A local IDP machine runs an IDP tester, which holds the tests I want to run...
The test points at the IDP; a legend shows what kinds of errors or reports you might get.
If the saml2int profile is executed, another set of tests is run. If there was some kind of error, the code can be checked by clicking on the question mark to see what went wrong or where the error in the code was.
The IDP login form pops up in the browser in any case, as the automation hasn't been implemented yet.
A proxy is also a possibility. Peter couldn't enter the credentials with plain HTML alone. To mechanise it, the essential information is: the request has to match the URL, there is a login page with a form containing the expected set of input boxes, you fill them in, press the button, mechanise it, and you're done.
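A minimal sketch of that mechanisation step, using only the Python standard library: find the form's action URL and input boxes in the login page, then fill in the credentials. The field names in the example are hypothetical; a real tool would also carry over hidden field values.

```python
from html.parser import HTMLParser

class LoginFormParser(HTMLParser):
    """Collect the form's action URL and the names of its input boxes."""
    def __init__(self):
        super().__init__()
        self.action = None
        self.fields = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "form" and self.action is None:
            self.action = attrs.get("action")
        elif tag == "input" and attrs.get("name"):
            self.fields.append(attrs["name"])

def fill_login_form(html: str, username: str, password: str):
    """Return (action_url, payload) ready to POST with any HTTP client."""
    parser = LoginFormParser()
    parser.feed(html)
    payload = {}
    for name in parser.fields:
        lowered = name.lower()
        if "user" in lowered:
            payload[name] = username
        elif "pass" in lowered:
            payload[name] = password
        else:
            payload[name] = ""  # real code would preserve hidden-field values
    return parser.action, payload
```

Once the action URL and payload are known, the login is a single POST, which is all the automation needs.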
Pietu: The providers are mostly commercial and they are interested in testing the functionality.
An informal, lightweight community should be created where the idea can be worked on.
There is a test repository where you collect the configurations. Question to the group: would it be a Jenkins server or a cloud-based application, or would you download it into your own infrastructure?
Current practice: when an IDP wants to join the AAI, it has to join the federation, the SP logs have to be checked for errors, and once the red/green results come up it can be moved to the production environment.
The idea is to deploy the IDP test with two or three common configurations under different entity IDs, so everything would be in the test federation metadata; then we don't have to check everything manually but can see what happened where.
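Generating those per-configuration entity IDs could look like the following sketch, which builds one EntityDescriptor per test configuration for the test federation metadata (the entity ID scheme shown is an assumption, not a fixed convention):

```python
import xml.etree.ElementTree as ET

# SAML 2.0 metadata namespace
SAML_MD = "urn:oasis:names:tc:SAML:2.0:metadata"

def build_test_metadata(base_entity_id: str, configs: list) -> ET.Element:
    """Build an EntitiesDescriptor holding one EntityDescriptor per test
    configuration, each with its own entityID, so the whole set can sit
    in the test federation metadata at once."""
    root = ET.Element(f"{{{SAML_MD}}}EntitiesDescriptor")
    for config_name in configs:
        entity = ET.SubElement(root, f"{{{SAML_MD}}}EntityDescriptor")
        # Derive a distinct entityID per configuration (illustrative scheme)
        entity.set("entityID", f"{base_entity_id}/{config_name}")
    return root
```

Each descriptor would then be filled in with the key material and endpoints for that configuration.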
The federation can run the tests continuously. If a user is having a problem and you want to see what happened, these tests can be quite useful: you can point the user to them, and with probably a single button click the result of the test would be sent to tech support automatically. That could be one useful thing to do.
How can tests be utilised and what are the test cases? Are we testing IDP and SP?
A test federation should be assembled so the tests can be practised there.
The idea is to have a list of specs where each chapter contains a number of requirements (these are the existing labs), and each requirement is supported by a number of test operations. Each test is generated from a small amount of configuration.
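The spec/requirement/test-operation structure described above could be modelled as follows; the class and field names are illustrative, not taken from an existing tool:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """One numbered requirement within a spec chapter."""
    identifier: str
    test_operations: list = field(default_factory=list)

@dataclass
class SpecChapter:
    """A chapter of a spec, holding its requirements."""
    title: str
    requirements: list = field(default_factory=list)

def operations_for_spec(chapters):
    """Flatten the spec into the full list of test operations to run."""
    return [op
            for chapter in chapters
            for req in chapter.requirements
            for op in req.test_operations]
```

Given a spec expressed this way, a runner only needs a little configuration (endpoints, credentials) to execute the flattened operation list.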
Test case: fingerprint IDP/SP version.
It would be good to create a database of tests. Fingerprinting should report the version number of the software, helping to flag insecure deployments; operators can then be notified to upgrade their version of Shibboleth.
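The flagging step could be sketched like this: parse the fingerprinted version string and compare it against a table of minimum safe versions. The version numbers and banner format below are hypothetical, not real advisory data:

```python
import re

# Hypothetical minimum safe versions; a real deployment would maintain
# this table from the vendors' security advisories.
MINIMUM_SAFE = {"shibboleth": (3, 4, 1)}

def parse_version(banner: str):
    """Extract (product, version tuple) from a fingerprinted banner string."""
    match = re.search(r"(shibboleth)[/ ](\d+)\.(\d+)\.(\d+)", banner.lower())
    if not match:
        return None
    return match.group(1), tuple(int(g) for g in match.groups()[1:])

def needs_upgrade(banner: str) -> bool:
    """Flag deployments whose fingerprinted version is below the safe minimum."""
    parsed = parse_version(banner)
    if parsed is None:
        return False  # unknown software: cannot judge from the fingerprint
    product, version = parsed
    return version < MINIMUM_SAFE.get(product, (0, 0, 0))
```

The operator notification would then be driven by the entries for which `needs_upgrade` returns true.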
You really want to test the security, e.g. whether an SP rejects a broken signature. There was a product already on the market that did not check the signature at all.
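One way to build such a negative test is to take a correctly signed response and deliberately corrupt its signature before sending it: a correct SP must then reject the message, and acceptance reveals that the signature is not actually being verified. A sketch using only the standard library, not tied to any specific SP product:

```python
import xml.etree.ElementTree as ET

# XML Digital Signature namespace
DSIG = "http://www.w3.org/2000/09/xmldsig#"

def corrupt_signature(response_xml: str) -> str:
    """Flip the first character of the SignatureValue so the signature
    can no longer validate, while the rest of the message stays intact."""
    root = ET.fromstring(response_xml)
    sig = root.find(f".//{{{DSIG}}}SignatureValue")
    if sig is None or not sig.text:
        raise ValueError("no signature value to corrupt")
    replacement = "B" if sig.text[0] != "B" else "C"
    sig.text = replacement + sig.text[1:]
    return ET.tostring(root, encoding="unicode")
```

The test then posts the corrupted response to the SP and asserts that the login is refused.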
Otherwise, check general functionality, e.g. whether updated metadata is being consumed.
SP testing is less automated: a preconfigured test is handed over to the SP operator, who has to complete the configuration, run the test, and feed the results back to the FO.