Archive for August, 2011

Michael Hudson-Doyle

Viewing LAVA results in android-build

It seems like it’s taken a long time to get all the pieces hooked up, but I’m very happy to report that you can finally see the results of testing an Android build in LAVA directly on the build page!  If you go to the build page for an automatically tested build, such as https://android-build.linaro.org/builds/~linaro-android/panda/#build=250, you should see (after a short wait while some Ajax requests complete) a table listing the test runs and a summary of the pass/fail counts:

It may not seem all that earth-shattering, but many bits and pieces needed to be put together to get this to work:

  • We needed to build the scheduler, so that we could submit this job to run on the first Panda board to become available.
  • We needed to build the infrastructure to allow the job to be submitted with some measure of security from a remote host.
  • We needed to work on the build system scripts to submit the job without disclosing the authorization token.
  • We needed to add views to the scheduler and dashboard that present data in an Ajax-friendly way.
  • We needed to work on the build system frontend to make use of these views and format the data.
  • And finally, we needed to test and deploy all of these changes.
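
Submitting the job without disclosing the authorization token can be sketched roughly like this. Everything below is an illustrative assumption (the endpoint URL, field names, and token handling), not the actual build-system scripts; only the idea of keeping the token out of the job document itself comes from the list above.

```python
# Hypothetical sketch: keep the auth token in a file outside the job
# definition, so the submitted JSON (and any logs of it) never contain it.
import json

def build_submission(job, token_file):
    with open(token_file) as f:
        token = f.read().strip()
    # The token rides in the connection URL (HTTP basic-auth style),
    # never inside the job document itself.
    url = "https://%s:%s@validation.example.org/RPC2" % (job["user"], token)
    payload = json.dumps({"job_name": job["name"],
                          "device_type": job["device_type"]})
    return url, payload
```

The point of the split return value is that the payload is safe to log or publish, while the URL carrying the credential never needs to leave the submitting host.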

So I think I’m justified in being happy to have this finally working in production :-)  Of course, it’s just a start: we want to build similar facilities for the other working groups to use, if nothing else.

Michael Hudson-Doyle

New stuff in LAVA 2011.08

We’ve just deployed the 2011.08 release of the LAVA components to http://validation.linaro.org.  It feels like we’re getting to the point where we can start working on the really interesting features now, which is nice after what feels like months of building infrastructure.  While the official release highlights are a good list of what happened, here is a summary from my point of view of the changes from what was previously deployed:

lava-dashboard

  • I added a put_ex method for uploading a bundle and getting the URL of the bundle in response rather than just its SHA1.
  • The main change though is a shiny new image status page.
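
The difference between the two upload calls can be sketched like so. The URL scheme and hashing below are assumptions for illustration; only the SHA1-versus-direct-URL contrast reflects the actual change.

```python
# Illustrative only: put() historically identified a bundle by its SHA1,
# while put_ex() hands back a URL directly, saving a lookup round trip.
import hashlib, json

def bundle_sha1(bundle):
    # A bundle is a JSON document; hash a canonical serialization of it.
    content = json.dumps(bundle, sort_keys=True)
    return hashlib.sha1(content.encode()).hexdigest()

def bundle_url(bundle, base="https://validation.example.org/dashboard"):
    # put_ex-style response: a link the caller can use immediately.
    return "%s/bundles/%s/" % (base, bundle_sha1(bundle))
```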

lava-dispatcher

  • I changed the dispatcher to print out the URL of the result bundle (on a specific file descriptor, so the scheduler can cleanly pick it out from amongst the useful but noisy output the dispatcher usually produces).
  • Le Chi Thu worked on being able to define test suites to run outside the lava-dispatcher/lava-test source code.
  • There were also sundry changes to make things more reliable.
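
The “print the URL on a specific fd” trick can be sketched as follows; the fd plumbing is the point, while the command line and message format here are made up for the example.

```python
# Sketch: the child writes its one machine-readable line (the result
# bundle URL) to a dedicated pipe fd, while its ordinary noisy output
# goes to stdout, so the parent can cleanly separate the two.
import os
import subprocess

def run_with_result_fd(make_argv):
    read_fd, write_fd = os.pipe()
    # pass_fds keeps write_fd open (with the same number) in the child.
    proc = subprocess.Popen(make_argv(write_fd), pass_fds=(write_fd,))
    os.close(write_fd)           # close the parent's copy; child keeps its own
    proc.wait()
    with os.fdopen(read_fd) as f:
        return f.read().strip()  # e.g. the result-bundle URL
```

A scheduler-like parent would record the returned URL against the job, regardless of whatever else the child printed.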

lava-scheduler

  • Many changes here – I spent most of the month hacking on the scheduler.
  • Record the link to the results as output by the dispatcher and show it on the job page.
  • I made the scheduler daemon reconnect to the database if the connection fails or drops.  This means we could finally change to starting the daemon on boot, without having to care if postgres was fully initialized or not.
  • I changed the way the dispatcher is run to allow the scheduler daemon to be restarted while there are running jobs.
  • Jobs can now be cancelled while they are in the submitted or running states.
  • There is now a simple device view, and admins can ask for no more jobs to be submitted to a board.
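
The reconnect-on-failure behaviour boils down to a loop like this. This is a minimal sketch, not the scheduler daemon’s actual code; the exception type, retry count, and back-off are illustrative.

```python
# Sketch: retry connecting so a daemon can start before the database
# (e.g. postgres at boot) is ready, and recover if the connection drops.
import time

def with_reconnect(connect, work, retries=5, delay=0.01):
    conn = None
    for _ in range(retries):
        try:
            if conn is None:
                conn = connect()   # (re)open the connection
            return work(conn)
        except OSError:            # illustrative failure type
            conn = None            # drop the broken connection
            time.sleep(delay)      # brief back-off before retrying
    raise RuntimeError("database did not become available")
```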

lava-server

  • The main changes here were to allow the other LAVA components to provide content to put on the front page.

lava-test

  • The work to support out-of-tree tests had an impact on lava-test, too.
  • We added support for the smem, bootchart and xrestop tests, as well as the usual reliability fixes.
  • For a while, the bootchart test managed to overwrite the kernel on the master/recovery partition of the SD cards, which was a problem, especially when the new kernel had broken ethernet support…

What’s on for next month?  From my side, it’s likely to be more ‘joining the dots’, starting with enabling the Android team to see the results of a test run on the build page.  Other likely developments are making our story for running tests on Android a bit clearer, refactoring the way configuration is done for the dispatcher (and making Debian packages for it), supporting more hardware in the lab, putting the final pieces together to allow daily testing of the kernel working group’s output, documentation, email notifications of various things, and I’m sure various things that I’ve forgotten or haven’t cropped up yet.  It’s going to be another busy month!

Michael Hudson-Doyle

Linaro Validation at Linaro Connect

A few weeks ago now, most of the Linaro engineers met at “Linaro Connect”, the new name for our get-together.  Linaro bootstrapped its processes by borrowing heavily from Ubuntu, including the “two planning meetings, two hacking meetings” pattern.  Over the last year, though, it’s become clear that this isn’t totally appropriate for Linaro, and while we’re sticking to the same number of meetings, four a year, each meeting now has the same status and will be a mix of planning and hacking.  Based on a sample size of 1, this seems to be a good idea – last week’s meeting was excellent.  Very intense, which is why I never got around to blogging during the event, but also very productive.

The validation team had a dedicated hacking room, and on Monday we set up a “mini-Lab” that we could run tests on.  This took a surprisingly (and pleasingly) short amount of time, although we weren’t as neat about the cabling as we are in the real lab:

The main awkwardness in this kind of setup, where you connect to the serial ports via USB rather than through a console server, is that the device names of the USB serial dongles are not predictable, so naming boards becomes a challenge.  Dave worked out a set of hacks that mostly make this work, although I know nothing about the details.
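
I don’t know what Dave actually did, but a common way to tame unpredictable USB-serial device names (purely an assumption on my part, with made-up vendor/product/serial IDs) is a udev rule keyed on each dongle’s unique serial number:

```
# /etc/udev/rules.d/99-lab-serial.rules  (illustrative IDs and names)
# Create a stable symlink per dongle, keyed on its serial number, so
# "panda01" always means the same board's console regardless of the
# order the dongles were plugged in or enumerated.
SUBSYSTEM=="tty", ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6001", \
    ATTRS{serial}=="FTG5XYZ1", SYMLINK+="serial/panda01"
```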

Now that a few weeks have passed I can’t really remember what we did next :)  There was a lot of hacking and a lot of talking.  These are some things I remember:

  • I spent some time talking to the Android developers about getting the results of the tests to display on the build page. Luckily there were no new surprises and I managed to come up with a plan for getting this to work (have the process that runs the tests and uploads the bundle to the dashboard print out the URL to the result bundle and have the lava scheduler read this and record the link).
  • We all talked to the kernel team about how to test their work on an automated basis.
  • I talked to Michael Hope about the toolchain builds that are currently done in his basement, although we mostly deferred that conversation until after the event itself.
  • There was a lot of talk about making the front page of the validation server show something more useful.
  • I implemented a prototype for replacing QATracker with something that could guide a user through manual tests and upload the results directly to the dashboard.
  • We talked to ARM about possibly using some of the LAVA components we have built for their internal testing.
  • There was talk about the practicalities of using the LAVA lab to measure the effect of power management changes.

I’m sure there was lots of other stuff, but this should give some impression of how much went on!


