Tag - ocaml


Monday, August 22 2016

Release of OASIS 0.4.7

I am happy to announce the release of OASIS v0.4.7.

Logo OASIS small

OASIS is a tool to help OCaml developers to integrate configure, build and install systems in their projects. It should help to create standard entry points in the source code build system, allowing external tools to analyse projects easily.

This tool is loosely inspired by Cabal, which is the same kind of tool for Haskell.

You can find the new release here and the changelog here. More information about OASIS in general on the OASIS website.

Pull request for inclusion in OPAM is pending.

Here is a quick summary of the important changes:

  • Drop support for OASISFormat 0.2 and 0.1.
  • New plugin "omake" to support build, doc and install actions.
  • Improve automatic tests (Travis CI and AppVeyor)
  • Trim down the dependencies (removed ocaml-gettext, camlp4, ocaml-data-notation)

Features:

  • findlib_directory (beta): install libraries in sub-directories of the findlib hierarchy.
  • findlib_extra_files (beta): install extra files with ocamlfind.
  • source_patterns (alpha): provide a module-to-source-file mapping.

This version contains a lot of changes and is the result of a huge amount of work. The addition of OMake as a plugin is a big step forward. The overall work has been targeted at making OASIS more library-like. This is still a work in progress, but we made some clear improvements by getting rid of various side effects (like the requirement of using "chdir" to handle "-C", which led to propagating ~ctxt everywhere and to the design of OASISFileSystem).

I would like to thank again the contributors to this release: Spiros Eliopoulos, Paul Snively, Jeremie Dimino, Christopher Zimmermann, Christophe Troestler, Max Mouratov, Jacques-Pascal Deplaix, Geoff Shannon, Simon Cruanes, Vladimir Brankov, Gabriel Radanne, Evgenii Lepikhin, Petter Urkedal, Gerd Stolpmann and Anton Bachin.

Thursday, October 23 2014

Release of OASIS 0.4.5

On behalf of Jacques-Pascal Deplaix

I am happy to announce the release of OASIS v0.4.5.

Logo OASIS small

OASIS is a tool to help OCaml developers to integrate configure, build and install systems in their projects. It should help to create standard entry points in the source code build system, allowing external tools to analyse projects easily.

This tool is loosely inspired by Cabal, which is the same kind of tool for Haskell.

You can find the new release here and the changelog here. More information about OASIS in general on the OASIS website.

Here is a quick summary of the important changes:

  • Build and install annotation files.
  • Use builtin bin_annot and annot tags.
  • Tag .mly files on the same basis as .ml and .mli files (required by menhir).
  • Remove the 'program' constraint from C-dependencies. Previously, when a library had C sources and e.g. an executable depended on that library, changing the C sources and running '-build' did not trigger a rebuild of the library. By adding these dependencies (or rather, by removing the constraint), it seems to work fine.
  • Some bug fixes

Features:

  • no_automatic_syntax (alpha): disable the automatic inclusion of -syntax camlp4o for packages that match the internal heuristic (a dependency ending with .syntax or being a well-known syntax extension).
  • compiled_setup_ml (alpha): fix a bug when using multiple arguments to the configure script.

This new version is a small release to catch up with all the fixes/pull requests present in the VCS that had not yet been published. This should make the life of my dear contributors easier -- thanks again for being patient.

I would like to thank again the contributors to this release: Christopher Zimmermann, Jerome Vouillon, Tomohiro Matsuyama and Christoph Höger. Their help is greatly appreciated.

Tuesday, March 25 2014

Release of OASIS 0.4.3

I am happy to announce the release of OASIS v0.4.3.

Logo OASIS small

OASIS is a tool to help OCaml developers to integrate configure, build and install systems in their projects. It should help to create standard entry points in the source code build system, allowing external tools to analyse projects easily.

This tool is loosely inspired by Cabal, which is the same kind of tool for Haskell.

You can find the new release here and the changelog here. More information about OASIS in general on the OASIS website.

Here is a quick summary of the important changes:

  • Added -remove switch to the setup-clean subcommand designed to remove unaltered generated files completely, rather than simply emptying their OASIS section.
  • Translate path of ocamlfind on Windows to be bash/win32 friendly.
  • The Description field is now parsed into more structured text (paragraphs/verbatim).

Features:

  • stdfiles_markdown (alpha): set the default extension of StdFiles (AUTHORS, INSTALL, README) to '.md'. Use markdown syntax for standard files. Use comments that hide the OASIS section and digest. This feature should help direct publishing on GitHub.
  • disable_oasis_section (alpha): allows DisableOASISSection to be specified in the package, with a list of expandable filenames. Any generated file specified in this list doesn't get an OASIS section digest or comment headers and footers, and is therefore regenerated each time `oasis setup` is run (and any changes made are lost). This feature is mainly intended for use with StdFiles so that, for example, INSTALL.txt and AUTHORS.txt (which often won't be modified) can have the extra comment lines removed.
  • compiled_setup_ml (alpha): allow precompiling setup.ml to speed things up.

This new version closes 4 bugs, mostly related to the parsing of _oasis. It also includes a lot of refactoring to improve the overall quality of the OASIS code base.

The big project for the next release will be to set up a Windows host for regular builds and tests on this platform. I plan to use WODI for this setup.

I would like to thank again the contributors to this release: David Allsopp, Martin Keegan and Jacques-Pascal Deplaix. Their help is greatly appreciated.

Sunday, February 23 2014

Release of OASIS 0.4.2

I am happy to announce the release of OASIS v0.4.2.

Logo OASIS small

OASIS is a tool to help OCaml developers to integrate configure, build and install systems in their projects. It should help to create standard entry points in the source code build system, allowing external tools to analyse projects easily.

This tool is loosely inspired by Cabal, which is the same kind of tool for Haskell.

You can find the new release here and the changelog here. More information about OASIS in general on the OASIS website.

Here is a quick summary of the important changes:

  • Change BSD3 and BSD4 to BSD-3-clause and BSD-4-clause to comply with DEP5, add BSD-2-clause.

BSD3 and BSD4 are still valid but marked as deprecated.

  • Enhance .cmxs support through the generation of .mldylib files.

When one of the modules of a library has the name of the library, ocamlbuild tends to just transform this module into a .cmxs. Now, the use of a .mldylib file fixes that problem and the .cmxs really contains all the modules of the library.

  • Refactor oasis.cli to be able to create subcommand plugins.
    • Exported modules now start with CLI.
    • Display plugins in the manual.
    • Designed so that it can be thread-safe.
    • Try to minimize the number of functions.
    • Better choice of names and API.
    • A subcommand plugin 'dist' to create tarballs is in preparation, as a separate project.
  • Remove the plugin-list subcommand; this command was limited and probably not used. A better alternative will appear in the next version.
  • Sub-command setup-dev is now hidden and will soon be removed.

I published a quick intermediate version, 0.4.1, a few days after the previous release; it was a bug fix related to threads. I also decided to skip last month's release: I was in the US at the time and didn't have enough time to work on OASIS (Christmas vacation and a trip to the US). This month I am back on track.

This new version doesn't feature a lot of visible changes. I mostly worked on the command line interface code, in order to be able to create external plugins. A first external plugin is almost ready, but needs some more polishing before release. This first plugin project is a port of a script that I have used for a long time and that was present in the source code of oasis (oasis-dist.ml). It will be a project of its own, with a different release cycle. The point of this plugin is to create a .tar.gz out of an OASIS-enabled project.

I also must admit that I am very happy to see contributors sending me pull requests through GitHub. It helps me a lot, and I also realize that the learning curve to get into the OASIS code is steep. This last point is something I will try to improve.

Friday, December 13 2013

Release of OASIS 0.4.0

I am happy to announce that OASIS v0.4.0 has just been released.

Logo OASIS small

OASIS is a tool to help OCaml developers to integrate configure, build and install systems in their projects. It should help to create standard entry points in the source code build system, allowing external tools to analyse projects easily.

This tool is loosely inspired by Cabal, which is the same kind of tool for Haskell.

You can find the new release here and the changelog here. More information about OASIS in general on the OASIS website.

I have recently resumed my work on OASIS, and this will hopefully be the version that leads to quicker iteration in the development of OASIS. The development process was slowed down by the fact that I feared introducing new fields in _oasis or causing regressions. This was a pain, and I decided to change my development model.

Features

The most important step is the introduction of the AlphaFeatures and BetaFeatures fields. They allow introducing pieces of code that will only be activated if certain features are listed in those fields. This should help keep the project always ready for a release.

The features also cover other aspects, like flag_tests and flag_docs, which were introduced in OASIS v0.3.0. In fact, the features API is now used to introduce all enhancements while keeping backward compatibility with regard to OASISFormat. Rather than defining a ~since_version:0.3 for fields, we use a feature that handles the maturity level of the enhancement. When I feel a specific feature is ready to ship, I just change InDev Alpha to InDev Beta and then to SinceVersion 0.4. In the long term, once we no longer support any version of OASIS that predates the SinceVersion, the feature will always be true and I will fully integrate it into the code.
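
To give a rough idea of the mechanism, here is a small illustrative OCaml sketch; the names are made up for this post and the real OASIS implementation differs (in particular, it does proper version comparison instead of comparing strings):

 (* Illustrative sketch only: not the actual OASIS internals. *)
 type stage = Alpha | Beta

 type maturity =
   | InDev of stage          (* must be listed in AlphaFeatures/BetaFeatures *)
   | SinceVersion of string  (* enabled for any OASISFormat >= this version *)

 type feature = { name : string; maturity : maturity }

 let is_enabled ~oasis_format ~alpha_features ~beta_features feature =
   match feature.maturity with
   | InDev Alpha -> List.mem feature.name alpha_features
   | InDev Beta -> List.mem feature.name beta_features
   | SinceVersion v -> oasis_format >= v

 let () =
   let f = { name = "pure_interface"; maturity = InDev Alpha } in
   assert (is_enabled ~oasis_format:"0.4"
             ~alpha_features:["pure_interface"] ~beta_features:[] f)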

The only constraint around features is: if you use the AlphaFeatures or BetaFeatures field, you must use the latest OASISFormat...

Features section in the manual.

Example of features available:

  • section_object: allow creating objects (.cmo/.cmx) in _oasis
  • pure_interface: an OCamlbuild feature that allows handling a .mli without a .ml file

Automate

Another topic is the automation of release testing. For OASIS v0.3.0, I ran tests on all platforms manually, late in the development of v0.3.0, and it was painful to fix. So I have decided to set up a Jenkins instance that automates testing on Linux. In the long term, I plan to also set up a Mac OS X builder and start looking at Windows as well. This should help me catch errors early and fix them quickly.

However, for v0.4.0 I have decided to just release what I have, which has mainly been tested on Linux. The point here is to release and iterate quickly, rather than wait for perfection. Hopefully end-user testing will quickly uncover new bugs.

Time boxed release

In the coming months, I will try to do time-boxed releases. I will try to release a version of OASIS on the 15th of every month. The point here is to iterate faster and avoid long delays between releases.

See you in 1 month for the next release.

Sunday, November 10 2013

opam2debian, a tool to create Debian binary package out of OPAM

One week ago, a thread started on the opam-devel mailing list about the possibility of creating binary snapshots to distribute OPAM packages. Distributing binary packages with everything already compiled is pretty useful when you want to make sure that everyone has the same version of packages installed and you don't want to spend time configuring all your colleagues' computers.

As a matter of fact, I was also interested in seeing that happen. I have several computers where I want to install a set of packages, and I want to snapshot the OPAM archives when I am ready for an upgrade. For a long time I have tested another source distribution: GODI. I even wrote a puppet module to drive it. This was a fun experience, but as with any source-only distribution, there are some drawbacks. In particular, even if it is fully automated, you get a lot of errors and timeouts when trying to build automatically from source.

This is how opam2debian started. I wanted a replacement for puppet-godi that takes advantage of OPAM and the ability to distribute my set of packages everywhere quickly.

The goals of this project are:

  1. generate a Debian binary package with all the dependencies set on external packages (like libpcre)
  2. use OPAM to build everything
  3. use standard Debian process to build package
  4. create a snapshot of the OPAM repository that allows rebuilding exactly the same thing on different arches

The real challenge behind opam2debian was to be able to use the standard Debian process to create packages, to get all the power of the dependency computation provided by the Debian maintainer scripts. My great find on this topic was a tool called proot, which allows you to bind-mount a directory as an unprivileged user. This matters because it allows creating a package using a directory where you are not allowed to write (i.e. the /opt/opam2debian directory, which is owned by root). This makes it possible to build on any Jenkins builder without root access, while following the standard Debian way to build packages.

Usage example

I need to compile a set of packages to install on all my Jenkins builders. Here is the list of packages I need, and I also want to use the latest OCaml version. I want to build a Debian package for Debian Wheezy, which only has OCaml 3.12.1.

Building the package:

$ opam2debian create --build --name opam2debian-test2 --compiler 4.01.0 \
 ocamlfind fileutils ocaml-data-notation ocaml-expect ounit \
 ocamlmod ocamlify oasis yojson sexplib extlib pcre-ocaml \
 calendar ocaml-inifiles ocamlnet ocurl gettext inotify ocaml-sqlexpr \
 ocamlrss ocaml-xdg-basedir

Getting the package list right can be tricky, because the process stops at the first error -- and if it is the last package that fails to build, that can take a long time. I have implemented a --keep-build-dir option in case you want to tune the package list live.

The program will build everything, and it can take quite a while. At the end you get a file opam2debian-test2_20131105_amd64.deb which has a reasonable size of 232MB. That is big, but it looks like the biggest directory is $OPAMROOT/4.01.0/build; I am not sure if I can remove it, but we may save some space here (the build directory represents 50% of the package size).

Then you can just do a standard Debian installation:

$ sudo dpkg -i opam2debian-test2_20131105_amd64.deb

And you can use it:

$ eval $(opam config env --root /opt/opam2debian/opam2debian-test2/)
$ which ocamlfind
 /opt/opam2debian/opam2debian-test2/4.01.0/bin/ocamlfind
$ ocamlfind list
 [...]
 stdlib              (version: [distributed with Ocaml])
 str                 (version: [distributed with Ocaml])
 threads             (version: [distributed with Ocaml])
 threads.posix       (version: [internal])
 threads.vm          (version: [internal])
 type_conv           (version: 109.41.00)
 unidiff             (version: 0.0.2)
 unix                (version: [distributed with Ocaml])
 userconf            (version: 0.3.1)
 xdg-basedir         (version: 0.0.3)
 xmlm                (version: 1.1.1)
 yojson              (version: 1.1.5)

A nice thing to note is that it is a standard OPAM install. You can install, update and upgrade with OPAM from there, as root. However, I would recommend not doing that, and just rebuilding a newer Debian package to upgrade.

Install

You will need the opam and proot Debian packages, available in Debian jessie and sid. You will also need various OCaml libraries (cmdliner, calendar, fileutils and jingoo).

Download the opam2debian tarball on the forge, build and install it.

The project is hosted on github.

Open issues

Of the initial list of project goals, not everything is completed. I still have several open issues.

In particular, goal 4 (create a snapshot of the OPAM repository) was blocked by a bug in opam-mk-repo that prevents snapshots (see my pull request to solve it).

Another issue is the licenses of the included files. I should list them all, and I need to figure out a way to extract every license.

Submit bugs directly to github.

Sunday, September 29 2013

OUnit 2.0, official release

After 1.5 months of work, I am proud to officially release OUnit 2.0.0. This is a major rewrite of OUnit to include various features that I think were missing from OUnit1. The very good news is that the port of the OASIS test suite has proven that this new version of OUnit can drastically improve the running time of a test suite.

OUnit is a unit test framework for OCaml. It allows one to easily create unit-tests for OCaml code. It is based on HUnit, a unit testing framework for Haskell. It is similar to JUnit, and other XUnit testing frameworks.

Download OUnit v2.0.0

Documentation of v2.0.0

Website

The basic features:

  • better configuration setup
    • environment variable
    • command line options
    • configuration files
  • improved output of the tests:
    • allow vim quickfix to jump in the log file where the error has happened
    • output HTML report
    • output JUnit report
    • systematic logging (verbose always on), but output log in a file
  • choose how to run your test:
    • run tests in parallel using processes (auto-detect number of CPU and run as many worker processes)
    • run tests concurrently using threads
    • use the old sequential runner
  • choose which test to run with a chooser that can do smart selection of tests:
    • simple: just run test in sequence
    • failfirst: run the tests that failed in the last run first, and skip the previously successful tests if the failed ones are still failing
  • some refactoring:
    • bracket: now uses registration in the test context, which makes it easier to use
    • remove all useless functions in the OUnit2 interface
  • non-fatal section: allow failures inside a non-fatal section without immediately aborting the whole test
  • allow using OUnit1 tests inside OUnit2 (to smooth the transition)
  • timer that makes tests fail if they take too long, only when using the processes runner (I was not able to do it cleanly with the threads and sequential runners)
  • allow parametrizing filenames so that you can use OUNIT_OUTPUT_FILE=ounit-$(suite_name).log and have $(suite_name) replaced by the test suite name
  • create locks to avoid accessing the same resources within a single process or the whole application (typically to avoid doing a chdir while another thread is doing a chdir elsewhere)
  • create an in_testdata_dir function to locate test data, if any

Migration path

OUnit 2.0.0 still provides the OUnit module, which is exactly the same as the last OUnit 1.X version. This way, you are not forced to migrate. However, this means that you will gain no advantage from the new release, and may even see some slowdown due to the increased complexity of the code. So I strongly recommend upgrading to OUnit2.

Here is a checklist for the migration (a minimal example of a migrated test follows the list):

  • replace all open OUnit by open OUnit2
  • the test function now takes a test_ctxt argument, so replace all fun () -> ... by fun test_ctxt -> ...
  • brackets are now inlined, so bracket setUp f tearDown becomes let x = bracket setUp tearDown test_ctxt in
  • make sure that you don't change global process state (like chdir or Unix.putenv) and that you don't rely on one test setting something up for the next test
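
For illustration, here is what a small migrated test can look like (this is a sketch, not taken from an actual project):

 open OUnit2

 (* A minimal OUnit2 test: the test function takes a test_ctxt instead of
    unit, and the suite is run with run_test_tt_main. *)
 let test_addition test_ctxt =
   assert_equal ~printer:string_of_int 4 (2 + 2)

 let suite = "example" >::: ["addition" >:: test_addition]

 let () = run_test_tt_main suite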

The OASIS test suite migration

In order to check that everything was working correctly, I have migrated the OASIS test suite to OUnit2. This is a big test suite (210 test cases) and it includes quite long sequences of tests (end-to-end tests, from calling oasis setup to compiling and installing the results). This was really time consuming, and I hoped to see a significant speedup for the tests with OUnit2.

You can see the result, in terms of code, of the full migration here.

Here are the results on my Intel Core i7 920/SSD:

  • Pristine test suite (210 tests):
    • oUnit v1: 52.36s (i.e. latest OUnit v1.x, reference time)
    • oUnit1 over oUnit2: 60.39s (OUnit v2.0.0 using the OUnit v1 layer)
  • Migration to OUnit2 (166 tests):
    • processes (8 shards): 10.12s
    • processes (autodetect, 4 shards): 12.99s
    • sequential: 58.77s

OUnit v2 benchmark

The migration was quite heavy because this test suite had a big design problem: it used in-place modification of the test data. I think I picked this design because I thought it was a good way to decrease the running time. As a matter of fact, this was a huge mistake that kept producing failed test cases because one of the previous tests had failed. I have refactored all this, and now we start by copying the test data into a temporary directory, which ensures that everything always starts from pristine test data.
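
The pattern looks roughly like this; it is an illustrative sketch, not the actual OASIS test-suite code (it assumes OUnit2's bracket_tmpdir, FileUtil.cp from the fileutils library, and a placeholder data directory):

 open OUnit2

 (* Copy the versioned test data into a temporary directory registered in
    the test context: each test starts from pristine data and the copy is
    removed automatically at tear down. *)
 let with_pristine_data test_ctxt data_dir =
   let tmpdir = bracket_tmpdir test_ctxt in
   FileUtil.cp ~recurse:true [data_dir] tmpdir;
   Filename.concat tmpdir (Filename.basename data_dir)

 let test_example test_ctxt =
   let dir = with_pristine_data test_ctxt "test/data/simple-project" in
   (* ... run the end-to-end test inside [dir] ... *)
   assert_bool "test data copied" (Sys.is_directory dir)

 let () = run_test_tt_main ("pristine_data" >:: test_example)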

During the redesign, I decided to reduce the number of tests by merging some of them. This should have no big impact on the running time, although it means this is not a pure 1:1 comparison with OUnit v1; it is still testing exactly the same things. This explains the loss of ~50 tests, which have in fact been merged into other tests.

The overall speedup is 4x compared to OUnit v1 when using processes. However, there is a 12% increase compared to OUnit v1 for the sequential runner, and a 15% increase when using the OUnit v1 compatibility layer. While this is not a great score, I hope it is small enough to be offset by the huge win of being able to run tests in parallel with processes.

And now the magic!

At this point, if you read the numbers carefully, you will have noticed that there is a 4.5x speedup when you compare the sequential runner and 4 processes for OUnit2. Since we are only actively testing in 4 shards, it looks strange: I don't expect a super-linear speedup just from using processes. I have checked that every test was indeed running and found no solution to this mystery. Right now, I think it is due to the fact that we are running fewer tests in more processes, which should lighten the load on the GC (which may not trigger at all). I am not sure about this explanation and will welcome any bug report that shows a problem in the implementation of either the sequential or the processes runner. Still, this is great.

Help still wanted

If you find any bugs with OUnit v2, this is the time to submit a bug. OUnit BTS

If you want to try to fix bugs by yourself, please checkout the latest version of OUnit:

 $> darcs get http://forge.ocamlcore.org/anonscm/darcs/ounit/ounit

Patches always welcome.

Wednesday, September 25 2013

OUnit 2.0 progress, September 2013

Continuing last month's progress report on OUnit2. The release is just a few days away: I am testing a real-life application, and the core of the work is already in the VCS.

The basic features:

  • better configuration setup
    • environment variable
    • command line options
    • configuration files
  • improved output of the tests:
    • allow vim quickfix to jump in the log file where the error has happened
    • output HTML report
    • output JUnit report
    • systematic logging (verbose always on), but output log in a file
  • choose how to run your test:
    • run tests in parallel using processes (auto-detect number of CPU and run as many worker processes)
    • run tests concurrently using threads
    • use the old sequential runner
  • choose which test to run with a chooser that can do smart selection of tests:
    • simple: just run test in sequence
    • failfirst: run the tests that failed in the last run first, and skip the previously successful tests if the failed ones are still failing
  • some refactoring:
    • bracket: now uses registration in the test context, which makes it easier to use
    • remove all useless functions in the OUnit2 interface
  • non-fatal section: allow failures inside a non-fatal section without immediately aborting the whole test
  • allow using OUnit1 tests inside OUnit2 (to smooth the transition)
  • timer that makes tests fail if they take too long, only when using the processes runner (I was not able to do it cleanly with the threads and sequential runners)
  • allow parametrizing filenames so that you can use OUNIT_OUTPUT_FILE=ounit-$(suite_name).log and have $(suite_name) replaced by the test suite name
  • create locks to avoid accessing the same resources within a single process or the whole application (typically to avoid doing a chdir while another thread is doing a chdir elsewhere)
  • create an in_testdata_dir function to locate test data, if any

Still remaining to do, but quite straightforward:

  • sys admin (website, release process)
  • update the whole documentation

Some things that I decided not to do for OUnit 2.0 release:

  • introduce a 'cached' state to avoid rerunning a test if you can programmatically determine that the result will be the same.

The main development is now done, but before releasing I decided to test it first on a real-scale application. The first big migration to OUnit2 will be the OASIS test suite. This is a pretty big test suite (100+ tests) that takes a fair amount of time to run. I hope that during the next week I will be able to port the whole test suite and come back with some timing results.

You can follow my progress on porting OASIS to OUnit 2.0 on GitHub.

Help wanted

If you have a long standing issue with OUnit, this is the time to submit a bug. OUnit BTS

If you want to try the dev version of OUnit:

 $> darcs get http://forge.ocamlcore.org/anonscm/darcs/ounit/ounit

Patches always welcome.

You can read the documentation of the devel version on the website.

Friday, September 6 2013

OUnit 2.0 progress, August 2013

After a long pause, I have resumed my work on OUnit2. It is going quite well.

The basic features:

  • better configuration setup
    • environment variable
    • command line options
    • configuration files
  • systematic logging (verbose always on), but output log in a file
  • allow vim quickfix to jump in the log file where the error has happened
  • output HTML report
  • output JUnit report
  • choose how to run your test:
    • run tests in parallel using processes (auto-detect number of CPU and run as many worker processes)
    • run tests concurrently using threads
    • use the old sequential runner
  • refactoring of the bracket, now easier to use
  • refactoring of OUnit2 interface (remove all useless functions)
  • non-fatal section: allow failures inside a non-fatal section without immediately aborting the whole test
  • allow using OUnit1 tests inside OUnit2 (to smooth the transition)

I still need to do the following:

  • a test chooser that does smart selection of tests:
    • run the tests that failed in the last run first
    • before re-running the ones that were OK, check that all failing tests are now passing; otherwise skip the already-passing tests.
  • timer that makes tests fail if they take too long
  • allow parametrizing the output filename so that you can use OUNIT_OUTPUT_FILE=ounit-$(name).log and have $(name) replaced by the test suite name
  • create locks to avoid accessing the same resources within a single process or the whole application (typically to avoid doing a chdir while another thread is doing a chdir elsewhere)
  • better logging when using multiple workers
  • add more tests for the new runners
  • introduce a 'cached' state to avoid rerunning a test if you can programmatically determine that the result will be the same.
  • create an in_testdata_dir function to locate test data, if any
  • sys admin (website, release process)
  • update the whole documentation

There is still a lot of work, but the current results are already quite good. The speed improvement of the processes runner is a good way to shorten test times (HINT: testers needed!).

Focus on: the new bracket.

In OUnit 1, a bracket was very functional:

 bracket
    (fun () -> "foo")  (* setup *)
    (fun foo -> ())    (* test body *)
    (fun foo -> ())    (* tear down *)

So for a common bracket, like bracket_tmpfile:

   bracket_tmpfile
       (fun (fn, chn) ->
             (* Do something with chn and fn *)

The problem is that if you were using 2 or 3 temporary files, the level of indentation was high. I have decided to switch to a more imperative approach, registering the teardown function inside the test context:

   let (fn1, chn1) = bracket_tmpfile ctxt in
   let (fn2, chn2) = bracket_tmpfile ctxt in
      ....

This is shorter and clearer (albeit less functional).
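
For reference, here is what a complete test using this style can look like; this is an illustrative sketch (the assertions are arbitrary), with the removal of the temporary files registered in the context by bracket_tmpfile itself:

 open OUnit2

 (* Two temporary files in a single flat test body: no nesting needed,
    cleanup is handled by the test context. *)
 let test_two_tmpfiles test_ctxt =
   let (fn1, chn1) = bracket_tmpfile test_ctxt in
   let (fn2, chn2) = bracket_tmpfile test_ctxt in
   output_string chn1 "hello";
   close_out chn1;
   close_out chn2;
   assert_bool "first temp file exists" (Sys.file_exists fn1);
   assert_bool "second temp file exists" (Sys.file_exists fn2)

 let () = run_test_tt_main ("bracket_demo" >:: test_two_tmpfiles)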

Focus on: non fatal section

Sometimes you want to verify a set of properties, but to have a clear vision of what is going wrong, you need more than one assert.

In OUnit1, you can do:

 assert_equal exp1 v1;
 assert_equal exp2 v2

But if exp1 <> v1, you quit immediately and you will never know whether exp2 = v2.

In OUnit2, you can do:

 non_fatal ctxt (fun ctxt -> assert_equal exp1 v1);
 non_fatal ctxt (fun ctxt -> assert_equal exp2 v2)

In this new version, both equalities are tested, and the result of the test will be the worst failure you get (or success if both of them succeed).

Help wanted

If you have a long standing issue with OUnit, this is the time to submit a bug. OUnit BTS

If you want to try the dev version of OUnit:

 $> darcs get http://forge.ocamlcore.org/anonscm/darcs/ounit/ounit

Patches always welcome.

Special thanks to Thomas Wickham who has entirely written OUnitRunnerThreads and kickstarted the processes runner.

Friday, August 16 2013

OASIS website updated

Logo OASIS small

The OASIS website had not been updated for a while, so I decided to take a shot at making it more up to date. This blog post is about the pipeline I have put in place to automatically update the website. It is the first end-to-end 'continuous deployment' project I have achieved.

Among the user visible changes:

  • an invitation to circle the OASIS G+ page, which is now the official channel for small updates in OASIS.
  • an invitation to fork the project on Github since it is now the official repository for OASIS.
  • some links to the documentation for the bleeding-edge version of the OASIS manual and API.

The OASIS website repository is also on Github. Feel free to fork it and send me a pull request if you see any mistakes.

The website is still using a lot of markdown processed by pandoc. But there are some new technical features:

  • no more index.php, we use a templating system to point to latest version
  • we use a Jenkins job to generate the website daily or after any update in OASIS itself.

Since I have started using Python quite a lot, I decided to use it for this project. It has a lot of nice libraries, and it helped me quickly get something done for the website (and it provides plenty of ideas for equivalent tools in OCaml).

The daily generation: Jenkins

I have a Jenkins instance running, so I decided to use it to build the new website once a day, with updated documentation and links. This Jenkins instance also monitors changes to the OASIS source code, so I can do something even more precise: regenerate the website after every OASIS change.

I also use the Jenkins instance to generate a documentation tarball for the OASIS manual and API. This makes it easy to display the latest manual and API documentation, so I can quickly browse the documentation and spot errors early.

Another good point about Jenkins is that it can store SSH credentials. So I created a build user, with its own SSH key, on the OCaml Forge, and I use it to publish the website at the end of the build.

Right now Jenkins does the following:

  • trigger a build of the OASIS website:
    • every day (cron)
    • when a push in OASIS website repository is detected
    • when a successful build of OASIS is achieved.
  • get documentation artifact from the latest successful build of OASIS
  • build the website
  • publish it

Data gathering

To build the website I need some data:

  • documentation tarballs containing the API (HTML from OCamldoc) and the manual (Markdown)
  • the list of published OASIS versions
  • links to each tarball (documentation and source tarball)

The OCaml Forge has a nice SOAP API, but one needs to be logged in to access it. This is unfortunate, because I just want to access public data. The only way I found to gather my data was to scrape the OCaml Forge.

Python has a very nice scraping library for that: beautifulsoup.

I use beautifulsoup to parse the HTML downloaded from the Files tab of the OASIS project and extract all the relevant information. I use curl to download the documentation tarballs (for released versions and for the latest development version).

Code

Template

Python also has a very nice templating library: mako.

I feed the data I have gathered to mako and process all the .tmpl files in the repository to create the matching files.

Among the things that I have transformed into templates:

  • index.php has been transformed into index.mkd.tmpl; it used to be a hackish PHP script scraping the RSS feed of updates, and it is now a clean template.
  • robots.txt.tmpl, see the following section for explanation
  • documentation.mkd.tmpl, in order to list all versions of the documentation.

Fix documentation and indexing

One of the problems of providing access to all versions of the documentation is that people can end up reading an old version. To prevent that, I use two different techniques:

  • prevent search engines from indexing old versions.
  • warn users that they are reading an old version.

To prevent search engines from indexing the old files, I have created a robots.txt that lists all the URLs of old documentation. This should be enough to keep search engines from indexing the wrong pages.

To warn users that they are reading the wrong version, I have added a "you are not viewing the latest version" box. This part was tricky, but beautifulsoup v4 provides a nice API to edit HTML in place. I just had to find the right CSS selector to define the position where I want to insert my warning box.

Code

Publish

The ultimate goal of the project is 'continuous deployment'. Rather than picking which version to deploy and doing the process by hand, I let Jenkins deploy every version.

Deploying the website used to be a simple rsync, but for this project I decided to use a fancier method. I spent a few hours deciding which framework was best for automatic deployment. There are two main frameworks around: capistrano (Ruby) and fabric (Python).

Fabric is written in Python, so I picked it because it was a good fit for the project. Fabric's biggest feature is that it is an SSH wrapper.

The fabric script is quite simple; to understand it, you just have to know that local runs a local command and run runs a command on the target host.

The fabfile.py script does the following:

  • create a local tarball using the OASIS website html/ directory
  • upload the tarball to ssh.ocamlcore.org
  • uncompress it and replace the htdocs/ directory of the oasis project
  • remove the oldest tarballs, we keep a few versions to be able to perform a rollback.

Given this new process, the website is updated in 3 minutes automatically after a successful build of OASIS.

Thursday, April 4 2013

Sekred, a password helper for puppet.

Puppet is a nice tool but it has a significant problem with passwords:

  • it is recommended to store puppet manifests (*.pp) and related files in a VCS (e.g. git)
  • it is not recommended to store passwords in a VCS

This leads to complex situations and various workarounds that more or less work:

  • serve passwords from a separate file/DB or do an extlookup on the master (pre-set passwords)
  • store passwords on the server and get them through a generate function (random passwords, but generated on the master)

Most of these workarounds are complex, don't let you easily share the passwords you have set, and most of the time store them somewhere other than the target node.

So I have decided to create my own solution: sekred (LGPL-2.1).

The idea of sekred is to generate the password on the target node and make it available to the user that needs it. The user then just has to ssh into the host and get the password.

Pro:

  • the password is generated and stored on the node
  • no VCS commit of your passwords
  • no DB storage of your passwords besides the local filesystem of the host
  • no need to use a common pre-set password for all your hosts; the password is randomly generated for a single host
  • to steal the password you need to crack the host first, and if you already have root access on the host, stealing a randomly generated password is pointless

Cons:

  • the password is stored in clear text
  • the password is only protected by the filesystem ACL

Let's see some concrete examples.

Setting mysql root password

This is a very simple problem. When you first install mysql on Debian Squeeze, the root password is not set. That's bad. Let's set it using sekred and puppet.

node "mysqlserver" {

  package {
    ["mysql-server",
     "mysql-client",
     "sekred"]:
      ensure => installed;
  }

  service {
    "mysqld":
      name => "mysql",
      ensure => running,
      hasrestart => true,
      hasstatus => true;
  }

  exec {
    "mysql-set-root-password":
      command => "mysqladmin -u root password $(sekred get root@mysql)",
      onlyif => "mysql -u root",  # Trigger only if password-less root account.
      require => [Service["mysqld"], Package["mysql-client", "sekred"]];
  }
}

And to get the root password for mysql, just log into the node "mysqlserver":

$> sekred get root@mysql
Cie8ieza

Setting password for SSH-only user

This example is quite typical of the broken fully automated scenario with passwords:

  • you set up a remote host that is only accessible through SSH
  • you create a user and set their SSH public key to authorize access
  • your user cannot access their account, because SSH prevents password-less account login!

In other words, you need to log into the node, set a password for the user and mail it back to them... That somewhat defeats the "automation" provided by puppet.

Here is what I do with sekred:

define user::template () {
  user {
    $name:
      ensure => present,
      membership => minimum,
      shell => "/bin/bash",
      ....
  }
  include "ssh_keys::$name"

  # Check password less account and set one, if required.
  $user_passwd="$(sekred get --uid $name $name@login)"
  exec {
    "user-set-default-password-$name":
      command => "echo $name:$user_passwd | chpasswd",
      onlyif => "test \"$(getent shadow $name | cut -f2 -d:)\" = \"!\"",
      require => [User[$name], Package["sekred"]];
  }
}

So the command "test \"$(getent shadow $name | cut -f2 -d:)\" = \"!\"" test for a password-less account. If this is the case, it creates a password using sekred get --uid $name $name@login and set it through chpasswd.

Note that $user_passwd uses a shell expansion that will only be evaluated when the command runs, on the host. The --uid flag of sekred assigns ownership of the password to the given user id.

So now the user (foo) can log into the node and retrieve their password using sekred get foo@login.

Try it!

Sekred was a very short project but I am pretty happy with it. It solves a long standing problem and helps to cover an extra mile of automation when setting up new nodes.

The homepage is here and you can download it here. Feel free to send patches, bugs and feature requests (here, login required).

Saturday, February 26 2011

OCaml Debian News

... or don't shoot yourself in the foot.

This is not a big secret: Debian Squeeze has been released. Right after this event, the OCaml Debian Task Force was back in action -- with Stephane in the leading role. He has planned the transition to OCaml 3.12.0. We will proceed in two steps: a small transition of a reduced set of packages that can be transitioned before 3.12, and then the big transition.

The reason for the small transition is to avoid having to dep-wait (wait for dependencies) on packages uploaded by humans. In a not-so-distant past, the OCaml Debian Task Force members uploaded packages by hand and waited for a full rebuild to go to the next step. This was long and cumbersome. We now use binNMUs: binary-only uploads -- with no source changes -- processed automatically by the release team and its infrastructure. This is far more effective and helps us reduce the duration of the transition...

The small transition is happening now!!! Don't update/upgrade your critical Debian installations with OCaml packages; you'll get a lot of removals if you do so. N.B. these removals are part of the famous {{Enforcing type-safe linking using package dependencies}} paper.

As a side note, I am happy to announce that a full round of new OCaml packages has landed in Debian unstable:

People aware of my current work should notice that all the dependencies of OASIS are now in Debian unstable: ocaml-data-notation, ocamlify, ocaml-expect. This is a hint about the next OCaml Debian package I will upload. You can also have a look at OASIS-enabled packages (all the OASIS dependencies, ocaml-sqlexpr and ocaml-extunix). These packages have been generated using oasis2debian, a tool to convert _oasis into debian/ packaging files.

After these transitions, we will continue with standard upgrade work (e.g. camomile to 0.8.1).

Sylvain Le Gall is an OCaml consultant working for OCamlCore SARL

Friday, October 22 2010

Compiling pcre-ocaml with Visual Studio 2008

One big change in the recent OASIS v0.2 release is the replacement of Str by Pcre. The big advantage of Pcre is that it can be used in a multi-threaded environment, whereas that is not recommended with Str. Since OASIS is used by the OCsigen part of OASIS-DB, we need to make it work safely with Lwt, with multiple users at the same time.

Note that we could probably use Str directly with Lwt, because Lwt is not really multi-threaded. But we want to be safe on this point, and Pcre is a very powerful library.

But the OCaml Pcre library depends on an external C library (pcre). This is not a problem on Linux et al., where it is shipped with the OS by default. But on Windows, you need to build it yourself. We want to do it using Microsoft Visual Studio 2008, mainly because OCaml was compiled with it -- and it seems the most natural way to build a C library on Windows. As usual, building Open Source C libraries using MS Visual Studio is not the most common way to proceed. However, the use of CMake makes it quite simple. Here is how we can compile pcre and pcre-ocaml for Windows:

  • Download pcre and unzip it. We use 7.7, which is not the latest version, but the one matching the GODI configuration
  • Create c:\pcre\build and c:\pcre\pcre-7.7; beware that you should avoid spaces in the names, as there is a small bug in OCaml with spaces in C options
  • Download and install cmake
  • Run cmake-gui and point the sources to the location of your unzipped pcre directory
  • Run Configure a first time
  • Choose Visual Studio 9 2008 generator
  • You should get the configuration displayed as follows:

CMake-GUI with pcre

  • Change CMAKE_INSTALL_PREFIX to c:\pcre\pcre-7.7, don't select BUILD_SHARED_LIBS
  • Run Configure again
  • Run Generate
  • Start MS Visual Studio 2008
  • Open the PCRE solution located in c:\pcre\build

Visual Studio 2008 with pcre

  • Choose Solution: Release
  • Generate the target ALL_BUILD
  • If you don't have already administrator rights, restart Visual Studio with administrator rights
  • Generate the target INSTALL
  • Go to c:\pcre\pcre-7.7\lib and copy pcre.lib to libpcre.lib, to avoid name clashes with the future OCaml library
  • Download and unzip pcre-ocaml
  • Start a Visual Studio command line and run a cygwin shell inside it (e.g. my script to run cygwin shell with Visual Studio 2008)
  • Change directory to lib directory of the unzipped source of pcre-ocaml
  • Apply this patch, if it is not already applied
  • Edit make_msvc.bat and set the variables PCRE_H and PCRE_LIB to the right values: set PCRE_H=C:\pcre\pcre-7.7\include and set PCRE_LIB=C:\pcre\pcre-7.7\lib\libpcre.lib
  • Run make_msvc.bat

The best way to test your newly created OCaml pcre library is to try to compile an example from the examples directory of pcre-ocaml itself.

Sylvain Le Gall is an OCaml consultant at OCamlCore SARL

Wednesday, October 20 2010

Unison on windows tips

The big advantage of Unison on Windows is that it makes it quite easy to synchronize between Windows and Linux. For those who need to work on Windows with the same set of files as on Linux, this is a big plus. Other tools do it as well, but the 2-way sync of Unison is quite nice. When you need to compile software on Linux and Windows, you can modify both sides at the same time and (almost) never have problems.

On Windows, the .unison directory and unison.log are located in your %HOMEPATH%, which is the parent directory of the classic Documents folder. In the .unison directory, you will find the .prf files that describe your unison profiles. As usual, default.prf in this directory is the default profile.

Basic

The basic tips are:

  • use fastcheck = true in your default.prf
  • disable directory indexing

Disable directory indexing

You can also disable virus live scan -- if you think it is safe!!!!

SSH

Using ssh under Windows is always a challenge. As a matter of fact, this tool doesn't fit the Windows context and is not as well integrated as on Linux/BSD.

There is Putty, which can help you. It has good support for a remote shell, but it is not very easy to set up with Unison. Putty and OpenSSH don't have precisely the same set of options, and Unison relies on some that are not available in Putty. There is a script called ssh2plink.bat that can help you use Putty's plink with Unison. I used it for a while, but it didn't give the expected throughput.

The best option is to use the ssh command provided by Cygwin. In this case you get both good throughput and Unison integration. I explain here how to configure Cygwin's ssh to use an SSH key.

You can skip the following steps if you wish to use a password or if you have already set up your ssh to connect to the target computer.

Launch Cygwin's setup.exe and select openssh for installation.

To add a SSH key, launch the cygwin shell:

$ ssh-keygen -t rsa
Generating public/private rsa key pair.
[...]

Copy the file .ssh/id_rsa.pub to your target computer's .ssh/authorized_keys. Be aware that the file format can be Windows EOL style (in this case, use dos2unix to convert the file), and if you copy/paste from a DOS box, some line breaks are added; remove them from authorized_keys so that the key is on a single line.

Once you have installed your ssh key on the target computer, try to connect directly from the cygwin shell.

$ ssh XXX

Now you can add sshcmd = c:\cygwin\bin\ssh.exe to your default.prf.

Using Cygwin's ssh allows you to get ~2MB/s (or more), whereas you only get ~100KB/s using ssh2plink.bat.

If you have any other tips to improve Unison on Windows, I will be happy to test them and post them here.

Friday, September 10 2010

Dirty fix for omlet vim extension

omlet (or here) is a vim extension for writing OCaml code.

In my opinion, it has better indentation than the standard OCaml vim support. Unfortunately, this has a cost: the vim indentation code is more complex. And it has a few bugs :-(

The main bug is that it doesn't like unbalanced comment opening "(*" and closing "*)" tags. From time to time, it enters an infinite (or very long) loop when such a tag is left in your file. It can be very far away from the point you are editing.

This isn't too problematic, because unbalanced tags are a syntax error. The real problem is that it also matches these tags inside strings. So whenever you start using a regular expression like "(.*)", the whole indentation fails.

But there is a very ugly solution to this problem!

Problematic code:

 let parse_rgxp =
   Pcre.regexp ~flags:[`CASELESS] 
     "^(?<license>[A-Z0-9\\-]*[A-Z0-9]+)\
      (?<version>-[0-9\\.]+)?(?<later>\\+)?\
      ( *with *(?<exception>.*) *exception)?$"

Solution: add ignore "(*":

 let parse_rgxp =
   ignore "(*";
   Pcre.regexp ~flags:[`CASELESS] 
     "^(?<license>[A-Z0-9\\-]*[A-Z0-9]+)\
      (?<version>-[0-9\\.]+)?(?<later>\\+)?\
      ( *with *(?<exception>.*) *exception)?$"

Very, very ugly: you balance comment tags in dead code -- very very bad ;-)

PS: another solution, when the plugin enters the infinite loop, is to hit Ctrl-C. This will stop it and let you do your own indentation.

Wednesday, September 1 2010

OCaml 3.12 with Debian Sid right now!

Some careful readers of Planet OCamlCore may wonder why the OCaml packages in Debian have not yet been upgraded to 3.12.0. For the Planet Debian readers, this is the latest version of the Objective Caml programming language.

The answer is simple: Debian Squeeze froze on 6th August. This means that Debian folks focus on fixing release-critical bugs and avoid doing big transitions in unstable (Sid). In particular, the Debian OCaml maintainers have decided to keep OCaml 3.11.2 for Squeeze, because the delay was really too short: OCaml 3.12 was out on 2nd August.

Great work has already been done by S. Glondu and the rest of the Debian OCaml maintainers to spot possible problems. The result was a series of bugs submitted to the Debian BTS. This effort started quite early and has been updated with the various OCaml release candidates.

S. Glondu has also built an unofficial Debian repository of OCaml 3.12.0 packages here.

Let's use it to experiment with OCaml 3.12.0.

schroot setup

Following my last post about schroot and CentOS, we will use a schroot to isolate our installation of unofficial OCaml 3.12.0 packages.

approx

approx is a caching proxy server for Debian archive files. It is very effective and simple to set up. It is already on my server (Debian Lenny, approx v3.3.0). I just have to add a single line to create a proxy for the OCaml 3.12 packages:

 $ echo "ocaml-312   http://ocaml.debian.net/debian/ocaml-3.12.0" >> /etc/approx/approx.conf
 $ invoke-rc.d approx restart

approx is written in OCaml, in case you wonder how I came to it.

debootstrap and schroot

We create a chroot environment with Debian Sid:

# PROXY = host where approx is installed, debian/ points to official Debian repository of 
# your choice. 
$ debootstrap sid sid-amd64-ocaml312 http://PROXY:9999/debian

We create a section for sid-amd64-ocaml312 in /etc/schroot/schroot.conf (Debian Lenny):

[sid-amd64-ocaml312]
description=Debian sid/amd64 with OCaml 3.12.0
type=directory
location=/srv/chroot/sid-amd64-ocaml312
priority=3
users=XXX
root-groups=root
run-setup-scripts=true
run-exec-scripts=true

Replace XXX by your login.

And we install additional software:

 $ schroot -c sid-amd64-ocaml312 apt-get update
 $ schroot -c sid-amd64-ocaml312 apt-get install vim-nox sudo

OCaml 3.12 packages

Now we can start the setup to access OCaml 3.12.0 packages.

The repository is signed by S. Glondu's GPG key (see here). We need to get it and inject it into apt:

$ gpg --recv-key 49881AD3 
gpg: requesting key 49881AD3 from hkp server keys.gnupg.net
gpg: key 49881AD3: "Stéphane Glondu <steph@glondu.net>" not changed
gpg: Total number processed: 1
gpg:              unchanged: 1
$ gpg -a --export 49881AD3 > glondu.gpg
$ schroot -c sid-amd64-ocaml312 apt-key add glondu.gpg

The following part is done in the schroot:

$ schroot -c sid-amd64-ocaml312
# PROXY = host where approx is installed
(sid-amd64-ocaml312)$ echo "deb http://PROXY:9999/ocaml-312 sid main" >> /etc/apt/sources.list
(sid-amd64-ocaml312)$ cat <<EOF >> /etc/apt/preferences
Package: *
Pin: release l=ocaml
Pin-Priority: 1001
EOF
(sid-amd64-ocaml312)$ apt-get update 
...
(sid-amd64-ocaml312)$ apt-cache policy ocaml
  Installed: (none)
  Candidate: 3.12.0-1~38
  Version table:
     3.12.0-1~38 0
       1001 http://atto/ocaml-312/ sid/main amd64 Packages
     3.11.2-1 0
        500 http://atto/debian/ sid/main amd64 Packages
(sid-amd64-ocaml312)$ apt-get install ocaml-nox libtype-conv-camlp4-dev libounit-ocaml-dev...

That's it. The apt-cache policy command shows that OCaml 3.12 from the ocaml-312 repository has a higher priority for installation.

Good luck playing with OCaml 3.12.0.

Monday, August 2 2010

OCaml 3.12.0 is out: watch the movie

I have been quite busy the last few months. But anyway, I found time and solved various technical pitfalls to be able to bring you the first movie of the OCaml Meeting:

Foreword by X. Leroy at OCaml Meeting 2010 (subtitle: OCaml 3.12.0 features presentation)

In this video, Xavier Leroy tells us about the features in OCaml 3.12.0. This version is now released, so it is high time to release the matching movie.

I will release other movies of the OCaml Meeting during August, and will try to explain the various pitfalls I encountered -- and the OCaml solutions I used to solve them.

Friday, September 18 2009

OCaml cryptokit and Java PBEWithMD5AndDES

During one of my projects I needed to interact with the Java Cryptography Extension. Some data had been encrypted using PBEWithMD5AndDES, and I needed to access it from OCaml.

I took a look at the cryptographic libraries available for OCaml in Debian: cryptgps and cryptokit. I chose cryptokit, because its author is well known: Xavier Leroy.

This article was my starting point. Of course, I kept in mind that the reference is there and that there is a good article covering it.

Here is the result in OCaml:

 (* The functions used below (hash_string, transform_string, Hash, Cipher,
    Padding) come from the cryptokit library. *)
 open Cryptokit

 let decrypt passphrase salt ?(iterationCount=41) str =
   let key, iv =
     let rec hash_aux iter str =
       if iter > 0 then
         (* Rehash string *)
         hash_aux
           (iter - 1)
           (hash_string
              (Hash.md5 ())
              str)
       else
         (* Key = first 8 bytes of the MD5 hash *)
         String.sub str 0 8,
         (* IV = last 8 bytes of the MD5 hash *)
         String.sub str 8 8
     in
       (* Hash n times combination of passphrase and salt,
           return key and iv 
         *)
       hash_aux
         iterationCount
         (passphrase ^ salt)
   in
     transform_string
        (Cipher.des
           ~pad:Padding.length
           ~iv:iv
           key
           Cipher.Decrypt)
       str

The only missing piece of information was the padding algorithm to use (Padding.length). For that, I needed to browse the RSA documentation and experiment a little.
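
For the record, a call would look something like this; the file name, passphrase and salt are placeholders, and the salt and iteration count must match the ones used by the Java PBEParameterSpec:

 (* Hypothetical usage of the decrypt function above; all values are
    placeholders. *)
 let () =
   let ic = open_in_bin "data.des" in
   let len = in_channel_length ic in
   let buf = Buffer.create len in
   Buffer.add_channel buf ic len;
   close_in ic;
   let ciphertext = Buffer.contents buf in
   print_endline (decrypt "my passphrase" "12345678" ~iterationCount:41 ciphertext)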

Rewriting PBEWithMD5AndDES is quite straightforward with cryptokit and OCaml. It takes 25 lines in both C# and OCaml (only counting LoC: no comments, no empty constructors or declarations in C#). I thought this task would require 2 or 3 days, but it was done in 4 hours...

Many thanks to cryptokit ;-)

Friday, January 23 2009

OCaml meeting 2009, subscription about to end

The subscription will end on January 25th. For people willing to come to this unique OCaml event, this is your last chance!

We have 40 participants for now. I am quite happy with this number, because it is almost the same as last year -- except that we are doing it in Grenoble rather than Paris. Anyway, I see that a lot of people who were there last year will come back to this year's event. There are also a few more people, in particular more from the CAML consortium members.

I have stepped down from the first talk, about OCamlCore.org, in favor of Stefano Zacchiroli. Zack has really helped me on this project since the beginning (with Pietro Abate and Romain Beauxis). I am very happy that he can do the talk.

I will myself give a talk about my latest project for Talend. This was a great project, aiming to be as fast as a standard C application, but in OCaml. The point is that OCaml helped me quickly build a working prototype and then optimize it for speed. All in all, after 6 months I was able to deliver a working and fast application to Talend. This kind of release cycle is harder to achieve in other programming languages. OCaml's compile-time verification helped me a lot.

I hope that in the 2 remaining days, some more people will subscribe.

Wiki Subscription

Sunday, November 9 2008

OCaml meeting 2008 participants

Unfortunately I forgot to post last year's participant picture. Here is the official picture.

Thanks to R.W.M Jones for reminding me.
