Relatively complex authorization strategies are somewhat challenging to set up. I have a functional configuration, but I do question whether there are gotchas that should be documented.
What I was able to get working is:
Apache Virtual Host 1
* SSL terminated by a hardware accelerator
* read/write access to each repository
* Location 1 [SVNParentPath /the-path/svn-parent-loc]
  * all projects covered
* Location 2 [SVNPath /the-path/svn-parent-loc/project-X]
  * this project is covered by the parent path as well
  * access/authorization specific to a special limited user group
Apache Virtual Host 2
* SSL passed directly through to Apache
* client certificate authentication/authorization
* read-only access for all repositories
* Location 3 [SVNParentPath /the-path/svn-parent-loc]
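The layout above might be sketched in Apache configuration roughly as follows. This is an illustrative sketch, not my exact config: the URL prefixes, file paths, group name, and auth directives are assumptions (Apache 2.4 syntax), and the restricted project is given its own URL prefix so its SVNPath does not collide with the parent-path Location.

```apache
# Virtual Host 1: SSL terminated upstream by the hardware accelerator,
# read/write access, Basic auth (directives illustrative).
<VirtualHost *:80>
    ServerName server.x.y

    # Location 1: all projects under the parent path
    <Location /svn>
        DAV svn
        SVNParentPath /the-path/svn-parent-loc
        AuthType Basic
        AuthName "Subversion"
        AuthUserFile /the-path/htpasswd
        Require valid-user
    </Location>

    # Location 2: project-X additionally exposed under its own prefix,
    # restricted to a limited group
    <Location /svn-restricted/project-X>
        DAV svn
        SVNPath /the-path/svn-parent-loc/project-X
        AuthType Basic
        AuthName "Subversion (restricted)"
        AuthUserFile /the-path/htpasswd
        AuthGroupFile /the-path/htgroups
        Require group project-x-team
    </Location>
</VirtualHost>

# Virtual Host 2: SSL handled by Apache itself, client-certificate
# authentication, read-only access to all repositories.
<VirtualHost *:443>
    ServerName server2.x.y
    SSLEngine on
    SSLCertificateFile    /the-path/server.crt
    SSLCertificateKeyFile /the-path/server.key
    SSLCACertificateFile  /the-path/ca.crt
    SSLVerifyClient require

    # Location 3: parent path, limited to read-only DAV methods
    <Location /svn>
        DAV svn
        SVNParentPath /the-path/svn-parent-loc
        <LimitExcept GET PROPFIND OPTIONS REPORT>
            Require all denied
        </LimitExcept>
    </Location>
</VirtualHost>
```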
This does work: I can access all required resources with the appropriate credentials, and without appropriate credentials, access to the protected resources is denied. It was somewhat painful to set up. I recommend using a file move to a different directory as a sanity-check test case of proper operational behavior. The SSL accelerator causes the biggest headache: a file move will typically trigger a failure if the server is misconfigured. Usually the problem is in the server name (I had to specify http://server.x.y instead of https://server.x.y). This last statement assumes you are using a rewrite-type rule, as the various docs mention, to handle the self-referential URL issues produced in this case. I think the various Subversion/Apache documentation gets you close, but this last part could use more and better examples (and maybe corrections).
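For reference, the reason a move is the best sanity check: DAV MOVE/COPY requests carry the client-visible https:// URL in the Destination header, while the offloaded Apache believes it serves http://, so mod_dav rejects the request. One commonly documented workaround (a sketch; requires mod_headers and should be verified against your Apache version) is to rewrite that header before mod_dav sees it:

```apache
# Behind the SSL accelerator, edit the DAV Destination header from
# https:// to http:// early in request processing so mod_dav's
# self-referential URL check passes on MOVE/COPY.
RequestHeader edit Destination ^https: http: early
```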
I do have some concern about the same SVN DAV resources being available through multiple virtual hosts and Location elements. It seems possible that caching of various items (metadata, etc.) could return stale results in some use cases. This fear is driven partly by the fact that each Location element specifies its own SVN DAV directives. I have not looked into mod_dav_svn to see whether there is any intelligent aggregation of duplicated SVNPath information (for example) or whether everything is kept completely distinct. My slight discomfort is mitigated by the fact that most resources are accessed by no more than one or two users, and usually via the same host/location. In the few use cases where I expect differences in access, I think tweaks to cache/timeout values can further reduce the chance of stale data causing problems, if issues ever appear. Time will tell on this. If time ever permits, I will try to review the SVN code myself or form an intelligent question for the various related forums/lists.
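If stale-cache behavior ever does show up, mod_dav_svn exposes a few server-side caching knobs that could be dialed back as an experiment. A hedged sketch (these directives exist in mod_dav_svn 1.7+; check your version's documentation before relying on them, and note they trade correctness worries for performance):

```apache
# Shrink or disable mod_dav_svn's server-side caches while diagnosing
# suspected stale results across duplicated Location blocks.
SVNInMemoryCacheSize 0     # 0 deactivates the in-memory object cache
SVNCacheTextDeltas off     # do not cache text deltas between revisions
SVNCacheFullTexts off      # do not cache full file contents
```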
I don't think posting extra detail is wise in this area, but hopefully what I did post may help someone solve a setup issue.
Showing posts with label certificate auth.
Friday, May 30, 2014
Monday, October 7, 2013
Jenkins jobs and subversion access via certificate and batch related stuff
Jenkins is a very nice solution to a number of needs beyond automated builds. Using it as a general batch solution is feasible at smaller scales. This has improved further recently with the Credentials plugin, which makes it pretty easy to set up jobs that use certificates to access Subversion instead of requiring a specific person's credentials, which must be changed much more regularly. Of course, this works best when an organization has the infrastructure to securely manage certificates.
[2014/05/20] Works! It seems easiest to set up two separate Apache virtual servers, though: one with Basic auth and the other with certificate auth.
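On the client side of such a job, Subversion can present a client certificate via its runtime configuration area. A minimal sketch of the `~/.subversion/servers` file for the Jenkins user (the group name, host, and certificate path are hypothetical; `ssl-client-cert-file` and `ssl-client-cert-password` are the standard option names):

```ini
[groups]
buildsvn = server2.x.y

[buildsvn]
ssl-client-cert-file = /var/lib/jenkins/certs/build-bot.p12
ssl-client-cert-password = changeit
```

Storing the passphrase in plain text here is the usual trade-off; with the Credentials plugin the certificate can instead be held in Jenkins' credential store and supplied to the job.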
Related to general batch processing, the Elastic Axis plugin is a great solution. It is a perfect fit when you have load-balanced servers implementing an application and you need to run a job on any one, but only one, of them. You can take an application node offline, and Jenkins/Elastic Axis will happily pick an available node from the remaining configured nodes.
My wish list for Jenkins includes a few enhancements that would improve things considerably, mainly revolving around new or improved support for high availability: multiple masters with fail-over and a clear, clean clustering solution. A database-backed job store might be a big plus, depending on how those features were implemented. I would love to be able to easily integrate Jenkins into our DR environment, but as it stands that is a more manual integration.