Although many frameworks exist in the Java world, there are far fewer pointers on this topic for Perl. The aim of this post is to give a few practical hints on how we set up our continuous testing and deployment system with Hudson for multiple, dependent Perl distributions.
Be ready, it's easy!
The problem
Well, a test server checks every 10 minutes whether new code has been committed to Subversion (or another SCM); we try to apply TDD, so the *.t tests are committed along with the lib. Then:
- tests are run;
- in case of failure, a mail is sent;
- in case of success, the code is deployed and ready to be installed, and the testing of dependent distributions is launched.
Before this, I had mainly used two solutions:
- crontab scripts: harder to handle dependencies, reporting, mail and... to hand over to colleagues;
- CruiseControl: the configuration is XML based, and even with a few helper scripts, it is rather repetitive and tedious for Perl.
On the Perl distribution side
Modules are quite classic, ready for CPAN deployment. I personally create a new module with the module-starter command and then open the directory as a new Perl project with the Eclipse EPIC plugin.
Dependencies are set in Build.PL and test files reside in the t/ directory.
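As an illustration, a minimal Build.PL might look like the following fragment (the module name and the prerequisites are hypothetical placeholders, not the ones from our actual project):

```perl
use strict;
use warnings;
use Module::Build;

# Hypothetical distribution: module name and prerequisites are examples only.
my $build = Module::Build->new(
    module_name    => 'My::SearchEngine',
    license        => 'perl',
    requires       => {
        'perl'           => '5.8.0',
        'LWP::UserAgent' => 0,
    },
    build_requires => {
        'Test::More' => 0,
    },
);
$build->create_build_script;    # generates the ./Build script used below
```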
Testing / deploying is just a matter of:
perl Build.PL
./Build test
./Build dist
Configuring Hudson
Installation is straightforward on an Ubuntu server:
apt-get install hudson
New Job
Each distribution is tested by one Hudson job. A job is configured via a few straightforward steps.
SCM dependency
A job can be linked to an SCM and launched each time a commit is done (in this example, we check every 5 minutes).
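Concretely, the polling is set in the job's "Poll SCM" schedule field, which takes a cron-style expression; a five-minute poll would look like this (this goes in the Hudson form field, not in a crontab):

```
*/5 * * * *
```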
Launching tests
A job is a succession of steps, and Hudson offers an "Execute shell" step type. As you can imagine, that will be a script (the default interpreter is sh, but you can configure bash in the global config if you prefer).
A very basic step is:
perl Build.PL
prove --formatter TAP::Formatter::JUnit t > test-output.junit.xml
Two remarks at this stage:
- we prefer prove because it can output JUnit-formatted reports (better for parsing, and for charts of the number of successful tests and of the overall integration), although it is not compulsory;
- we write the output to an XML file, and we can later configure the job to collect the JUnit test results from this file.
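For completeness, the files in t/ are plain Test::More scripts producing TAP, which prove then reformats. A minimal (hypothetical) example:

```perl
use strict;
use warnings;
use Test::More tests => 2;

# Hypothetical module name, for illustration only.
use_ok('My::SearchEngine');
ok( My::SearchEngine->can('new'), 'constructor is available' );
```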
Splitting a project among several distributions is certainly a healthy habit. Thus, there are some dependencies among these modules, and they must be taken into account. The inter-job flow is defined via Hudson, but we need some more tuning from the Perl point of view.
We can set up a $SEARCH_ENGINE_PERL_INSTALL_BASE directory variable (well, we are building search engines here) in the global Hudson config.
Then, the actual test process becomes a two-step build: the second step deploys (installs) the module into this directory, while the test step takes it into account in its include path.
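Put together, the "Execute shell" build step could look like the following sketch (Module::Build's --install_base option and the PERL5LIB environment variable are standard; the $SEARCH_ENGINE_PERL_INSTALL_BASE variable comes from the global config described above):

```shell
# Hudson "Execute shell" build step (sketch)
# make previously installed dependent distributions visible to the tests
export PERL5LIB="$SEARCH_ENGINE_PERL_INSTALL_BASE/lib/perl5:$PERL5LIB"

perl Build.PL
prove --formatter TAP::Formatter::JUnit t > test-output.junit.xml

# deploy into the shared install base so that dependent jobs can use it
./Build install --install_base "$SEARCH_ENGINE_PERL_INSTALL_BASE"
```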
Deployment
What we need is to build a tar.gz archive to be installed on other machines. Once the archive is classically built, we FTP it to a directory later accessible via HTTP.
The build-manifest.sh script is there to build the MANIFEST file automatically, based on the *.pm, *.pl and CGI scripts.
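We don't reproduce build-manifest.sh here, but the idea can be sketched in a few lines of shell (the file names below are demo placeholders; a real script would run at the root of the distribution):

```shell
# Sketch of what a build-manifest.sh could do: regenerate the MANIFEST
# from the *.pm, *.pl and CGI files. Demo files are created first so the
# sketch is self-contained.
set -e
mkdir -p demo/lib demo/cgi-bin
touch demo/Build.PL demo/lib/Foo.pm demo/cgi-bin/search.cgi demo/tool.pl

( cd demo
  { echo Build.PL
    find lib cgi-bin -type f \( -name '*.pm' -o -name '*.cgi' \)
    find . -maxdepth 1 -type f -name '*.pl' | sed 's|^\./||'
  } | sort > MANIFEST )
cat demo/MANIFEST
```

Running this lists Build.PL, cgi-bin/search.cgi, lib/Foo.pm and tool.pl in the generated MANIFEST.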
The global picture
And here we go.
The Hudson home page gives a global view (and a direct button to launch a build in case you don't want to wait for the cron).
I love the weather icons (the worse the weather, the more unsuccessful build attempts in the recent past).
Et voilà! Thanks a lot to the Hudson team!