Canonical Voices

Ara

I have been asked to write a chapter for a book about the experiences of people involved in Open Source, with the idea of “If I had known then what I know today”. I asked if I could reprint my contribution here. I hope it is interesting for people who care about Open Source testing.

Dogfooding Is Not Enough

I have been involved with Open Source since my early days at university in Granada. There, with some friends, we founded the local Linux User Group and organized several activities to promote Free Software. But from the time I left university until I started working at Canonical, my professional career was in the proprietary software industry, first as a developer and later as a tester.

When working on a proprietary software project, testing resources are very limited. A small testing team continues the work that developers started with unit testing, using their expertise to find as many bugs as possible so the product can be released in good shape for end users. In the free software world, however, everything changes.

When I was hired at Canonical, apart from fulfilling the dream of having a paid job in a free software project, I was amazed by the possibilities that testing a free software project brought. The development of the product happens in the open, and users can access the software in its early stages, test it and file bugs as they encounter them. For a person passionate about testing, this is a new world full of possibilities. I wanted to make the most of it.

Like many people, I thought that dogfooding, or using the software that you are aiming to release, was the most important testing activity we could do in open source. But if “given enough eyeballs, all bugs are shallow” (one of the key lessons of Raymond’s “The Cathedral & The Bazaar”), and Ubuntu had millions of users, why were very important bugs still slipping into the release?

The first thing I found when I started working at Canonical was that organized testing activities were few or nonexistent. The only somewhat organized testing took the form of emails sent to a particular mailing list, calling for testing of a package in the development version of Ubuntu. I don’t believe this can be considered a proper testing activity, but just another form of dogfooding. This kind of testing generates a lot of duplicate bugs, as a really easy-to-spot bug will be filed by hundreds of people. Unfortunately, the hard-to-spot but potentially critical bug, if someone files it at all, is likely to go unnoticed amid the noise created by those hundreds of duplicates.

Looking better

Is this situation improving? Are we getting better at testing in FLOSS projects? Yes, I really believe so.

During the last few Ubuntu development cycles we have started several organized testing activities. The range of topics is wide, including areas like new desktop features, regression testing, X.org driver testing and laptop hardware testing. The results of these activities are always tracked, and they have proven really useful to developers, who can now know whether new features are working correctly instead of assuming they work because of the absence of bugs.

Regarding tools that help testing, many improvements have been made:

  • Apport has helped increase the level of detail of the bugs reported against Ubuntu: crash reports include all the debugging information, their duplicates are found and marked as such, people can report bugs based on symptoms, etc.
  • Launchpad, with its upstream connections, has made it possible to have a full view of bugs, recognizing that bugs in Ubuntu are usually bugs in the upstream projects, and letting developers know whether the bugs are being solved there.
  • Firefox, with its Test Pilot extension and program, drives the testing without having to leave the browser. This is, I believe, a much better way to reach testers than a mailing list or an IRC channel.
  • The Ubuntu QA team is testing the desktop in an automated fashion and reporting results every week, allowing developers to have a very quick way to check that there have not been any major regressions during the development.

Although testing FLOSS projects is getting better, there is still a lot to be done.

Looking ahead

Testing is a skilled activity that requires lots of expertise, but in the FLOSS community it is still seen as an activity that doesn’t require much effort. One of the reasons could be that the way we do testing is still very old-fashioned and does not reflect the increase in complexity of the free software world over the last decade. How can it be that, with the amount of innovation we are generating in open source communities, testing is still done the way it was in the 80s? Let’s face it: fixed testcases are boring and easily get outdated. How are we going to grow a testing community that is supposed to find meaningful bugs if its main required activity is updating testcases?

But how do we improve testing? Of course, we cannot completely get rid of testcases, but we need to change the way we create and maintain them. Our testers and users are intelligent, so why create step-by-step scripts? Those could easily be replaced by an automated testing tool. Instead, let’s just have a list of activities you perform with the application and some properties it should have, for example, “Shortcuts in the launcher can be rearranged” or “Starting up LibreOffice is fast”. Testers will figure out how to do it, and will create their testcases as they test.
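As a toy illustration of this activity-style approach, here is a minimal sketch; the data format, field names and helper function are all invented for this example, not an existing tool:

```python
# Hypothetical sketch: a testcase is just a property to verify, not a
# step-by-step script. Testers record the steps they actually followed.
charters = [
    {"app": "unity", "property": "Shortcuts in the launcher can be rearranged"},
    {"app": "libreoffice", "property": "Starting up LibreOffice is fast"},
]

def record_result(charter, tester, verdict, notes=""):
    """Build a test report; the notes become the (ad hoc) testcase."""
    return {
        "property": charter["property"],
        "tester": tester,
        "verdict": verdict,  # "pass", "fail" or "blocked"
        "notes": notes,      # the steps the tester actually followed
    }

result = record_result(charters[1], "ara", "pass",
                       notes="Cold start measured at 2.1s on a Dell Mini 9")
```

The point of the sketch is that the step-by-step detail lives in the tester’s notes, written as they test, instead of in a fixed script that goes stale.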

But this is not enough; we need better tools to help testers know what to test, when and how. What about an API that lets developers send messages to testers about updates or new features that need testing? What about an application that tells us what part of our system needs testing, based on testing activity? In the case of Ubuntu we have the data in Launchpad (we would need testing data as well, but at least we have bug data). If I want to start a testing session against a particular component, I would love to see the areas that haven’t been tested yet and a list of the five bugs with the most duplicates for that particular version, so I avoid filing those again. I would love to have all this information without leaving the desktop that I am testing. This is something Firefox has started with Test Pilot, although they are currently mainly gathering browser activity. Google is also doing some research in this area.
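The “five bugs with the most duplicates” idea boils down to a simple ranking. A hedged sketch follows; the bug data and field names are invented, and a real tool would pull them from Launchpad instead:

```python
# Hypothetical sketch: rank bugs for a component by duplicate count so a
# tester sees the most-reported issues before filing anything new.
def top_duplicated_bugs(bugs, component, limit=5):
    relevant = [b for b in bugs if b["component"] == component]
    relevant.sort(key=lambda b: b["duplicates"], reverse=True)
    return relevant[:limit]

# Invented example data standing in for a Launchpad export.
bugs = [
    {"id": 646638, "component": "update-manager", "duplicates": 12},
    {"id": 646890, "component": "unity", "duplicates": 40},
    {"id": 646758, "component": "unity", "duplicates": 3},
]

for bug in top_duplicated_bugs(bugs, "unity"):
    print(bug["id"], bug["duplicates"])
```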

Communication between downstream and upstream, and vice versa, also needs to get better. During the development of a distribution, many of the upstream versions are also under development, and they already have a list of known bugs. If I am testing Firefox through Ubuntu, I would love to have a list of known bugs as soon as the new package reaches the archive. This could be done with an agreed syntax for release notes, which could then be easily parsed so that bugs are automatically filed and linked to the upstream bugs. Again, all of this information should be easily available to the tester, without leaving the desktop.
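To make the idea concrete, here is a sketch of parsing such a syntax; the `Known-Bug:` field is an invented example of the kind of agreed convention proposed above, not an existing standard:

```python
import re

# Hypothetical convention: one "Known-Bug: #<id> - <title>" line per
# known upstream bug in the release notes.
KNOWN_BUG = re.compile(r"^Known-Bug:\s*#?(\d+)\s*-\s*(.+)$")

def parse_known_bugs(release_notes):
    """Extract known bugs so they could be filed and linked automatically."""
    bugs = []
    for line in release_notes.splitlines():
        match = KNOWN_BUG.match(line.strip())
        if match:
            bugs.append({"upstream_id": int(match.group(1)),
                         "title": match.group(2)})
    return bugs

notes = """Firefox 4.0b6 release notes
Known-Bug: #591234 - Flash plugin crashes on some 64-bit systems
Known-Bug: #592001 - Tab bar rendering glitch
"""
print(parse_known_bugs(notes))
```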

Testing, done this way, would let the tester concentrate on the things that really matter, the things that make testing a skilled activity: the hidden bugs that haven’t been found yet, the special configurations and environments, inventing new ways to break the software. And having fun while testing.

Wrapping Up

From what I have seen in the last three years, testing has improved a lot in Ubuntu and in the other FLOSS projects I am somehow involved with, but this is not enough. If we really want to increase the quality of open source software, we need to start investing in testing and innovating in the ways we do it, the same way we invest in development. We cannot test 21st-century software with 20th-century testing techniques. We need to react. “Open Source is good because it is open source” is not enough anymore. Open Source will be good because it is open source and has the best quality that we can offer.

 

Ara

Discontinuing Ubuntu Ready

If a user goes to the Ubuntu certification site, what they will find, apart from a list of certified systems, is two different types of certification: Ubuntu Certified and Ubuntu Ready.

Having two commercial hardware validation programmes confuses customers, as it is quite difficult to understand the differences between the two. For this, and other problems with the programme, we have decided to discontinue the Ubuntu Ready programme in 11.10.

Presenting Ubuntu Friendly

Instead of just removing Ubuntu Ready, we would like to start a new, non-commercial hardware validation programme, created by Canonical in coordination with the rest of the community. This new programme is called Ubuntu Friendly (although the name might change).

The great thing about this new Ubuntu programme is that it will be completely community driven. There won’t be any commercial requirements for systems to be Ubuntu Friendly.

Although all the specifics of the programme will be discussed at UDS Oneiric, the basic ideas of the programme are:

  • Anyone will be able to test their systems and provide test results.
  • Anyone will be able to review and provide feedback on the results (something like triaging bugs in the Bugsquad). All the test results will be public (except those made private by the certifiers).
  • Only a small subset of people (certifiers) will be able to issue the Ubuntu Friendly certificate, based on results (in the same way that Bug Control is a subset of the Bugsquad with more permissions to work with Ubuntu bugs). There will be a formal, specified way to apply to become a certifier.
  • Many positive results (and not just one) for a given model and hardware configuration will be needed to mark the system as Ubuntu Friendly.
  • All the client tools to test Ubuntu, and the tests themselves, will be open source.

UDS Oneiric

Most of the specifics of the programme will be discussed during UDS Oneiric, in two sessions on Wednesday:

This blueprint will contain the work needed to be done in terms of the programme itself: description of the programme, governance, etc.

This blueprint will contain the work needed to be done in terms of the technical infrastructure that is required to make the programme possible: testing tools, backend infrastructure, etc.

If you are interested in hardware validation and would like to share your ideas and make this project possible, feel free to subscribe to the blueprint and attend the sessions at UDS. And remember that remote participation is also possible!

Ara

After two and a half years in the Ubuntu QA team I have moved to the Hardware Certification team, in Platform Services. I was very happy to be given the opportunity to join the Hardware Certification team and I couldn’t refuse. I am very excited to be able to help in the goal of making Canonical profitable.

How will that affect my collaboration with Ubuntu QA?

It has been a great pleasure to work all this time with a great group of people (both Canonical and non-Canonical). Obviously, as working directly on Ubuntu QA won’t be part of my paid job, I won’t be able to dedicate as much time to it as I did before. So, first of all, sorry if I don’t get back to you as quickly as I did in the past. But I love the Ubuntu QA community and I am still part of it. I will still be working on some of the things I did before, in my spare time. Some projects, like the Desktop Testing Program or desktop testing automation, are very close to my heart and I would like to stay involved with them somehow. I will try to attend the Ubuntu QA meeting at least twice per month and I will stay active on the ubuntu-qa mailing list.

The Hardware Certification Team

Having a successful certification program is beneficial both for vendors, assuring their customers that the hardware will work with Ubuntu, and for the Ubuntu community, who get access to a list of systems that work out of the box. We agree that sometimes we don’t expose the Hardware Certification program to our community as much as we should. There are people very active in Ubuntu development who don’t know that Canonical has a public website where anyone can check which systems are certified to work with Ubuntu, or even that the certification program exists. We don’t keep it a secret, though: the Hardware Certification team has been attending UDS, our main testing tool is open source and installed by default in Ubuntu and, as I said, the systems that have been certified are published. I understand that we can do a better job of explaining how the hardware certification program itself works and keeping it more in the open, accepting suggestions and criticism.

To start with, Victor, the Hardware Certification team manager, has written a wiki page explaining a bit how the certification program works, and we now have a public project in Launchpad where you can file bugs related to the project or ask questions. We would love to hear your thoughts and feedback about the program. In return, we will do our best to improve the program in ways that make it more useful for our community.

Ara

Magomatic

It’s been a while since the last time I blogged about Mago, but Natty is going to be an exciting cycle for desktop testing automation (a lot is happening!) and I would like to present some of the work we have been doing.

Today I will write about Magomatic, a new side project related to Mago.

If you have tried to add a new testcase to an existing Mago wrapper, you will have seen that it is pretty straightforward. Most of the things you need are already there, and you only need to add the code of the test, without thinking about the accessibility information of the application (OK, sometimes you have to, but it is quite simple to get started). If you have ever tried to add a new application wrapper to Mago, however, I guess you found the process a bit difficult: you need to understand how the accessibility information is presented by LDTP, you need to create the application Python file, and you also have to create a test suite Python file and an XML data file. This is time consuming, and I thought it could (and should) be automated.

So I created Magomatic. And how does it work?

Magomatic uses templates and accessibility information to create those files for you. Using it is pretty straightforward:

  1. Open the application you want to create the wrapper for.
  2. Run Magomatic:
    $ bzr branch lp:magomatic
    $ cd magomatic/bin
    $ ./magomatic
  3. When prompted, you will need to select the window you want to create the wrapper for with the mouse pointer.
  4. Done! Under the data/ folder you will find a folder named after the application, containing the files you need to add to Mago so you can start coding your tests.

This is a work in progress, but the main and most important functionality is already there. We really hope this will lower the barrier to entry to Mago and that more people will join us in adding new tests during the Natty cycle.

Ara

The HW Certification team at Canonical is hiring three engineers, each based in a specific location: one in our offices in Lexington (MA), USA; one in Taipei, Taiwan; and one in Montréal, Canada.

The HW Certification team, part of Platform Services, provides certification as a service to partners. Basically, if a computer vendor wants to get one of their machines certified to work nicely with Ubuntu, they can buy this service.

The variety of work on our side is big and fun: from receiving the machines to be certified (some of them brand new machines, not yet public!), to commercial relationships, by way of a lot (a lot!) of technical work: writing testcases, maintaining our testing infrastructure, etc.

For me, working at Canonical has been (and still is) the best professional experience I have had so far. It is great to work in such a great environment, with very smart people, making Ubuntu better for everybody. I truly recommend Canonical as a workplace. Please ping me on IRC if you would like to know more about what it is like to work here.

So, if you live in any of these three locations or are willing to relocate (Canonical does not offer a relocation package), please have a look at the following job descriptions and send your resume to victorp AT canonical DOT com.

Ara

As you may know, during the Maverick cycle we introduced statistics from our testers into the Ubuntu Hall Of Fame. Well, now, thanks to my good friend Daniel Holbach, we have not only individual statistics, but also statistics by LoCo team.

This is how it looks on the Hall Of Fame. Nifty, isn’t it?

Ara

On First Experiences

Everybody agrees on how important the out-of-box experience is for a product. If users’ first experience with any kind of product is frustrating, it is very likely that they’ll return it and never look back.

With operating systems, this first experience is usually turning on the computer and replying to some basic questions. For the major OSes like Windows or Mac OS, the software comes preinstalled on the computer. For many users, therefore, the OS is the computer itself: something that you turn on and it starts working.

For Linux it is quite different. Despite the efforts that some companies (including Canonical) are making so that it becomes easier and easier to buy a computer with a Linux distribution preinstalled, nowadays the first experience a user has with Linux is, most of the time, a CD and an installation process. Well, if you want to have one of the best first Linux experiences ever, wait until Sunday and install the brand new Ubuntu 10.10. Its installation process is, ladies and gentlemen, pure joy:

I would like to congratulate the Ubiquity team for the fantastic work they do every six months. They make the first experience of those people who start with Linux better every time. It is great to have the opportunity to work with you, guys.

Ara

As you might know, UNE, the Ubuntu version for netbooks, is undergoing major re-engineering work for Maverick Meerkat, soon to be Ubuntu 10.10. The old interface, which included packages like netbook-launcher and maximus, is going to be replaced by Unity. If you don’t know what Unity is, the nice people behind OMGubuntu published a nice review of the brand new UNE interface some days ago.

I will give you a clue: it does look very different from what you’re used to. That’s why we want to collect as many reports as possible of people upgrading from UNE 10.04 to 10.10 (with Unity).

My experience upgrading

OK, I don’t want to ask people to test something I haven’t tested myself (and I have the means to do it). I took my Dell Mini 9 (well, technically it is Canonical’s, but anyway) and installed Ubuntu Netbook Edition 10.04.1 in Spanish. The installation went very well, and fast.

After rebooting, I updated my 10.04 installation and started the upgrade to Unity. Although the upgrade itself did not have any major problems, it took almost six hours! Of course, I reported this as bug 646638. I talked with Michael Vogt on IRC and he will investigate.

Once the upgrade finished, everything worked as expected: the language was still Spanish and there were no major crashes. Nevertheless, the global menu stayed there, showing “File Editar”, even when no applications were running. I filed that as bug 646890 in Unity.

Apart from that, I found a couple of bugs in the Dash, but they are unrelated to the upgrade itself. These are bugs 646758 and 646756.

Your experiences upgrading

My system is not a real system. I use the Dell Mini 9, with an 8GB SSD, for testing purposes. I don’t use it on a daily basis; I reinstall almost every flavour of Ubuntu on it at every milestone. My upgrade experience was from a nice, clean UNE 10.04 to 10.10, with no PPAs or third-party software installed.

That’s why we need real feedback from people who use UNE 10.04 regularly about their upgrades to 10.10. If you want to participate in our testing effort, just follow these steps:

  1. Create an account in our tracker
  2. Upgrade to Maverick
  3. Report back!

Use my testing report as an example. You can see that you can add comments (like the system you used, or general impressions) and, of course, add bug numbers if you encounter any.

Thanks and happy testing!

Ara

So far, so good

It’s been more than two years since I started working at Canonical and, although I have been blogging about my daily job here, I have never talked about how important this job is to me.

I love testing software. Yes, I know it seems strange to love an activity that some people find a bit tedious, but I do. I was a full-time developer when I discovered that I liked testing software. Testing gives you the opportunity to see the product as a whole, without losing the technical part of the job. So, when I got the opportunity to work at Canonical as a member of the Ubuntu QA team, it was like a dream job. Not only was I going to be able to test free software as my daily job, but I was also going to test ALL the free software that is included in Ubuntu.

When I was hired, one of my first missions was to create a way to test the desktop in a repeatable, easy-to-maintain way. That’s how the Mago project started, a couple of years ago. Working on this project has been great, as it has meant working closely with the LDTP upstream developers. I have contributed to LDTP through bug reports and patches, and by helping with the release of LDTP in Ubuntu. I always tried to make sure that the latest LDTP was successfully released and uploaded to Ubuntu.

I especially remember when we were trying to get the latest LDTP in before the Ubuntu 10.04 Feature Freeze. Nagappan, the main LDTP upstream developer, and I worked closely on IRC to meet the deadline. Together, we fixed issues, verified them, got everything together and got it uploaded to Ubuntu just in time. It was the perfect example of Open Source collaboration.

Mago, itself, is free software, released under the GPLv3. People inside and outside Canonical have contributed to it with bug reports, patches, new features and, of course, new tests to test the desktop applications, often GNOME applications.

But, apart from Mago and desktop testing automation, I am especially happy to be able to test all the open source bits that make up Ubuntu: from the kernel to the desktop, from brand new topics such as multitouch to all-time classics such as Firefox.

I work for Canonical, testing free software, trying to make it better for everybody. I need to remind myself every day how privileged I am.

Ara Pulido

Multitouch testers in the Hall of Fame

Thanks to Daniel Holbach, the people testing uTouch and reporting back in the Multitouch Testing Tracker now appear in the Ubuntu Hall of Fame.

If you have been helping to test MT and your name does not appear there, it is due to a bug in the ISO tracker that affects testers who haven’t set their Launchpad ID properly. This bug has been fixed and will be released in our next roll-out of the testing tracker but, in the meantime, there is a workaround that I explained in a previous post.

Please, if you are helping testing uTouch, check out how to appear in the Hall Of Fame.


Ara Pulido

GUADEC 2010 Videos

This year, due to family commitments, I was unable to attend GUADEC. Although the reason I couldn’t attend made me very happy, I was also sad that I wasn’t going to be able to attend one of my favourite FOSS conferences.

Happily, and thanks to Flumotion,  the videos are now available for download.

I have started downloading some of them and, of course, the first talk that I watched was the one given by my good friend, excellent hacker and accessibility advocate, Eitan Isaacson. In his talk, Eitan explains, in a non-technical way, why it is important to have accessibility in mind when designing any kind of product: from buildings to software. If you are a software designer or developer, I really recommend watching his talk. I am sure you will start thinking about accessibility when designing your next application.


Ara Pulido

Testing your multitouch device

Maverick is coming with multitouch & gestures support!

OK, right, this is not news; a lot of people have already been talking about it, both inside the Ubuntu community and outside it. I cannot express how excited I am about multitouch support and the possibilities it opens up (phones? tablets? the-next-great-small-device?). But first, we need to test it!

So, maybe, you have a multitouch device. OK, maybe you don’t. Maybe you just have a single touch device (a touchpad, a tablet). OK, maybe you don’t. Maybe you just have a mouse. In all those cases we need your help. Obviously, our main interest is in getting feedback from people with multitouch devices, but, we also need to see if regressions were introduced in the process.

So, how can you help?

Setting Up Instructions

  1. Install the utouch package.
  2. Create an account in our tracking system.
    1. Go to http://multitouch.qa.ubuntu.com.
    2. Click on “Log In” and “Create New Account”
  3. Once you’re done with the tracker, subscribe to the Ubuntu Multitouch dev mailing list, where you will be able to contact the developers if you run into any problems.
  4. If you want, send an email to the mailing list introducing yourself.
  5. You’re all set!

Testing Instructions

We will announce new testing cycles on the mailing list. The tests will appear in the tracker, and everybody is free to submit their results at any time while the testing cycle is open.

Every testing cycle, you will see the tracker (http://multitouch.qa.ubuntu.com) reset. You need to click on a set of testcases to see the list:

[Screenshot: applications.png]

The list of testcases appears, with a summary of how many results have been reported:

[Screenshot: list.png]

To read the instructions on how to perform the testing and report your result, click on any of them. The testcase view shows a form for reporting your result and a link to the testcase wiki to guide you through the process.

/!\ Please note that the link “Additional instructions are available” is actually the link to the testcase description. The link is worded incorrectly and we will fix it in the next roll-out of the tracker.

[Screenshot: testcase.png]

If the testcase passed, just mark it as passed.

Filing bugs

The uTouch packages come with Apport hooks that make it easy to file bugs with the information relevant to the developers. To file a new bug, open a terminal window and type:

$ ubuntu-bug utouch

After the relevant information has been collected, it will be sent to Launchpad, where you will be able to describe your problem. As easy as that! You have to love Apport!

You can also point to bugs that have been already reported.

Update: note that the images are cropped screenshots; to see the full text, please visit http://multitouch.qa.ubuntu.com.


Ara Pulido

A couple of weeks ago we launched the Desktop Testing Program. You can read more about it in the original announcement but, basically, we have some infrastructure to track test results for desktop applications, a wiki that stores the testcases description and a large community willing to test every Ubuntu milestone.

The Alpha 3 testing cycle went very well, but we still need more testcases to make the Beta testing cycle even better.

Mathieu Trudel-Lapierre, one of the upstream Network Manager developers, stepped up and wrote some testcases for Network Manager. As an upstream developer, he wanted Network Manager to be part of the testing program, to get test results at every Ubuntu milestone. His tests will be part of the Desktop Testing Program starting with Maverick Beta.

If you are an upstream (or would like to collaborate somehow with your favourite upstream project), you can review the available tests in our testcases wiki, and, if the application is already there, make sure that the tests still apply and write more to cover new features. If your application is not there, just create a new page and start adding new testcases. In both cases you can follow our syntax guidelines.

I think this is a great opportunity for upstreams to have their projects tested on a regular basis by a great community, in a repeatable way, with results they can browse. I just wish more upstreams knew about it. If only this blog were syndicated on Planet GNOME…


Ara Pulido

Today, one day after reaching the third Maverick milestone, Alpha 3, I am happy to announce the birth of a new testing project and team in Ubuntu: the Desktop Testing Team.

Every time we release a new Ubuntu milestone, testers are encouraged to install it and play around with it, filing bugs as they go. We want to go a bit further and use a more methodical approach for those people who love testing and want to help improve Ubuntu that way.

How will it work?

For every milestone of the Ubuntu development release, we will provide a series of testcases for you to run against that milestone. As soon as the milestone is officially released, you will have two weeks to complete the tests (although we encourage you to run them as soon as possible, to allow developers enough time to fix the bugs).

One of the good things about this program is that you, as testers, will always know what to test, you will be able to check out new features before anybody else, and you will gain experience of the Ubuntu development process. There will also be a mailing list for sharing your experiences and bugs, and for getting direct feedback from the developers.

We will be using a test tracker to record your results, and positive feedback (a test passing correctly) will also be really helpful. Right now, when things work correctly, developers have to guess that from the absence of bug reports.

When will it start?

Right now! Although we don’t have a lot of testcases yet, we wanted to start the program just after the release of Maverick Alpha 3. The first weeks of the program are going to be busy and fun. Apart from testing and updating results, we will be introducing ourselves on the mailing list, and discussing which testcases need updates and which applications we should add when we reach Maverick Beta.

How can I participate?

Start by joining the Launchpad team and subscribing to the mailing list. Introduce yourself on the mailing list; tell us about yourself and the applications you are most interested in. Create an account in the tracker (if you already have an account at iso.qa.ubuntu.com it will work as well). Blog about it, tell your friends, tweet it. And, of course, start testing Maverick Alpha 3. We are going to make Ubuntu better. And GNOME. And many other applications that are part of Ubuntu.

You can find the full documentation on how to test on the Desktop Testing wiki page.


Ara Pulido

Alpha 3 ISO Tracker New Features

As Martin Pitt announced, we are in the Maverick Meerkat Alpha 3 release week. As for every milestone, we will be coordinating the testing of the different images we produce in the ISO Tracker. This time, however, the ISO Tracker comes with some new features (and some bug fixes) that will make your testing easier.

Coverage report is back (including optional testcases!)

After way too many milestones and releases, the coverage report is back! This report is useful for seeing which testcases need some testing and which ones haven’t been covered. This is especially important for optional testcases, which are now included in the report. Optional testcases do not need to be covered for every image, but do need to be covered at least once. By visiting this report, testers will clearly see which optional testcases have not been covered. So go ahead, use this new feature, and cover some of those not-so-optional testcases!

Started tests in the landing page

In the global list of images, apart from the finished tests and the tests that failed (in red), we now show the number of testcases that have been marked as “Started”. This number is shown in yellow.

“Not Complete” Filter

Apart from the usual status filters, we have included the “Not Complete” filter, which will show the images that have at least one uncovered testcase.

Delete your own result

Testers were complaining that, if they marked a test as Started and then something prevented them from finishing it, they couldn’t delete their results. Now it is possible to delete your own results. If you made a mistake, nothing prevents you from going back.

Improved administrator interface

Although this feature will only be useful for administrators, I wanted to include it in this list. Apart from adding new milestones and builds, it is now also possible to add new products and new testcases using the web interface. This will accelerate the addition of new products and will help other teams adopt the ISO Tracker more quickly.


Read more
Ara Pulido

ISO testers for the Hall of Fame

I am glad to announce that, starting with the Maverick Alpha 3 ISO testing cycle, activity in the ISO tracker will be reflected in the Ubuntu Hall Of Fame.

As a beta of the new feature, you can already check the Alpha 2 heroes at the Ubuntu Hall Of Fame under “Top ISO Testers”. If you helped during Maverick Alpha 2 ISO testing and are wondering why you are not there, let me explain.

The ISO tracker and Launchpad are not connected, but the Hall Of Fame needs your Launchpad ID to fetch some other user information. Fortunately, there is a Launchpad ID field in your ISO tracker user profile.



So, if you want your awesomeness to be reflected in the Hall Of Fame, please update your profile in the ISO tracker and fill in your correct Launchpad ID.


Read more
Ara Pulido

Alpha 2 week ahead

This week is a release week. On Thursday, the Release Team will be releasing Maverick Alpha 2, the second development release for this cycle. As every release week, this is going to be a busy one.

  • I am syncing my ISO images to have them more or less prepared when the first candidate images start to appear. I am using dl-ubuntu-test-iso to sync my ISOs. If you want to use it as well, you just need to install the ubuntu-qa-tools package.
  • I am going to prepare some virtual machines with different flavours of Lucid installed, to test the upgrades to Maverick as soon as possible.
  • I will spend most of the week testing the different images and reporting back my findings to the ISO tracker. If you want to help with ISO testing this week, make sure you read the documentation first.
  • On Friday, once Maverick Alpha 2 is released, I will be upgrading my own machine to Maverick.
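For anyone who wants to follow along, the syncing step above can be sketched as the commands below. This is only a sketch: the package name comes from the post, but the exact options that dl-ubuntu-test-iso accepts may differ, so check its help output or the ubuntu-qa-tools documentation first.

```shell
# Install the QA tools package that ships dl-ubuntu-test-iso
sudo apt-get install ubuntu-qa-tools

# Sync the current development ISO images; re-running the tool
# updates your local copies when new candidate images appear,
# which is much cheaper than downloading each image from scratch
dl-ubuntu-test-iso
```

Keeping a locally synced set of ISOs means that when the first candidate images land during release week, only the changed parts need to be fetched.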

Read more
Ara Pulido

Background

Firefox 3.0 and xulrunner 1.9 are now unsupported by Mozilla. Rather than backporting security fixes to them, we are moving to a support model where we will introduce major new upstream versions in stable releases. The reason for this is that Mozilla’s support periods are gradually becoming shorter, and it will be more and more difficult for us to maintain our current support model in the future.

What we are going to do

We are going to release Firefox 3.6.4 as a minor update to the 3.6 series in Lucid. This will also be rolled out to Hardy, Jaunty and Karmic (along with xulrunner 1.9.2.4). The update for Lucid is quite trivial, but the update in Hardy, Jaunty and Karmic is not quite as simple.

Before releasing these updates to the public, we need testing in Firefox, the extensions in the archive and distributions upgrades after those updates. We have published all these packages in a PPA and we will track test results before moving anything to the archive.

How you can help

We need people running *Hardy* (Jaunty and Karmic will see a similar call for testing in the following days) on bare metal or in a virtual machine. If you are willing to help, follow the instructions below:

  1. Add the Mozilla Security PPA to your software sources

    You need to manually edit your /etc/apt/sources.list and add the following lines:


    deb http://ppa.launchpad.net/ubuntu-mozilla-security/ppa/ubuntu hardy main
    deb-src http://ppa.launchpad.net/ubuntu-mozilla-security/ppa/ubuntu hardy main

    After saving the file, you have to run:


    sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 7EBC211F
    sudo apt-get update
    sudo apt-get dist-upgrade

  2. You have to have an account in our tracking system. Go to http://mozilla.qa.ubuntu.com and click on “Log In” and “Create New Account”
  3. Explore your Firefox installation

    Basically, we want people to perform the same activities that they do daily, without issues. To make testing easier, here is a checklist of things worth checking:

    • The upgrade to Firefox 3.6.4 goes smoothly.
    • The extensions get upgraded as well.
    • All the Firefox plugins (i.e. Flash, Java) still work.
    • The extensions work correctly.
    • Full distributions upgrades are not broken.
    • Upgrades work with only the security pocket enabled (i.e., hardy-updates disabled)
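After the dist-upgrade finishes, a quick sanity check helps before working through the list above. This is a sketch only: `firefox` is the standard package name, but on Hardy your installed binary may be named differently, and the exact version string will vary.

```shell
# Confirm the browser now reports the new 3.6.x version
firefox --version

# Check that the installed package really comes from the
# ubuntu-mozilla-security PPA rather than the regular archive:
# the candidate version should be listed under the PPA's URL
apt-cache policy firefox
```

If `apt-cache policy` still shows the archive version as installed, double-check that the PPA lines were added to /etc/apt/sources.list and that `apt-get update` ran without errors.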

To report your findings you need to use the test tracker.

Once you have selected the Hardy image, you will see a set of “testcases”, with a summary of how many reports have been sent. Obviously, the most important one is “Firefox”.

List of testcases

Once you open one of the testcases, you will be able to report back your findings if something went wrong. Even if everything went fine, it is always good to report back the success (“Passed”) with a comment on the activities you performed.

Report back

Use the “Firefox” testcase for general testing (upgrade, rendering, plugins, etc.) and the rest of the testcases if you want to report something more specific (Upgrades to Lucid, specific extensions errors, etc.)

IMPORTANT!! How to file bugs

As we are testing a PPA, not an official Ubuntu package, if you find an issue it is NOT OK to file a bug in Launchpad. Instead, please explain your issue in the Comments field of the tracker and mark the test as Failed.

The tracker requires a bug number in order to mark a test as Failed. To bypass this requirement, just use the bug number “1” ;-)

Thanks for helping to keep Ubuntu stable releases secure!


Read more
Ara Pulido

I have been busy with the release of Ubuntu 10.04 LTS and, although it may seem that Mago activity has decreased, there are some news related to the project that I want to share before UDS.

Ubuntu 10.04 LTS has been released with Mago 0.2, the release of Mago compatible with LDTP 2.0. Earlier this year, the LDTP team released a complete rewrite of the testing framework in Python. After LDTP 2.0 arrived in Lucid, Mago suffered some weeks of instability until it was working again with the new API. Also, I gained upload rights for the ldtp and mago packages last week; hopefully, this will be reflected in more activity during the Maverick cycle.

There is going to be work related to GUI testing during the Maverick cycle, and some of it has already been captured as blueprints for discussion during UDS:

Mago internationalization:
Mago currently works only with applications running under the “C” locale. We need to modify it to work with different locales. This will be useful for local Ubuntu derivatives and for testing language packs.

Mago Daily:
We aim to be running Mago tests on a daily basis. One of the biggest challenges to achieve this is having perfect integration of Mago with Checkbox. We will be discussing previous problems and will try to find a solution.

Roundtable: GUI Testing:
We will be discussing the different solutions for GUI testing available, their advantages and disadvantages. Sikuli, Mago, kvm-autotest, among others.

So, if you are coming to UDS (or want to participate remotely) and are interested in automated GUI testing, feel free to subscribe to those blueprints and participate in the discussion. See you all there!

Update:
James Tatum, a Mago contributor who is also coming to UDS, pointed me to another blueprint for discussion.

Simplify the creation of tests in Mago:
Adding applications to the Mago library is cumbersome. To foster the creation of more test cases, we will discuss ways to make this easier.


Read more
Ara Pulido

Ubuntu Global Jam
As many of you already know, this weekend we are celebrating the Ubuntu Global Jam, an event where all the participating LoCo teams gather together and contribute to make Ubuntu even better. There are lots of ways to contribute, from developing, to translations, documentation, packaging or testing. This time we have introduced a new and very valuable way to contribute: Upgrade Jams.

The objective of the jam is simple, and everybody can take part in it before moving on to the rest of the tasks: upgrade your own machines to Lucid Lynx Beta 1 and report back your experience. You can find information about how to run an Upgrade Jam on the wiki.

Remember! When you arrive at your local Ubuntu Global Jam, and before contributing to the rest of the activities, upgrade your system to Lucid! The ISO tracker is already waiting for your results!


Read more