I have previously complained about the number of gadgets that seem to be piling up by my bedside table, charging quietly every night: laptops, tablets, phones, Kindles (yes, the plural is not a typo).
On top of that, I am growing frustrated with my DVR. Last week the new series of “The Mentalist” was broadcast in the UK. I set it to record in advance, but somehow it clashed with something else and did not get recorded. Even with the missed show only one click away on the TV channel’s website, it turned out that my only options were to wait four days for a repeat on TV or go upstairs and watch it on the office desktop. Why is it so complicated!?
Centralised Content & Specialised Consumption Devices
So it turns out that I am not going to give up my E-Ink screen for reading books. Why? Because it doesn’t hurt my eyes like a tablet screen does. Nor am I going to convince my son that watching Peppa Pig on the iPad is not any better than watching it on TV. Why? Dunno, he isn’t talking yet.
The future for me looks like it is going to involve a lot of different devices, and I am fine with that as long as:
The good news is that the technology to allow all of this to happen is already being designed. Point number 3 is the easy one! You just need an Ubuntu One account. Points 1 and 2 I had considered incompatible for a long time, until I heard about big.LITTLE.
big.LITTLE is going to be BIG
big.LITTLE is a System-on-a-Chip (SoC) design that pairs a cluster of up to four top-notch Cortex-A15 cores with a cluster of up to four very low-powered Cortex-A7 cores. The beauty is that they share a very similar feature set and architecture. ARM expects to be able to switch between them, depending on the tasks being performed, without the operating system noticing the difference.
In a nutshell, it’s like being able to choose between a Prius engine and a Ferrari engine without having to change cars! Just pick the one that better suits today’s journey.
This is one of the technologies that is going to ignite the next personal computing revolution. I’ll tell you all about the other ones soon.
Here is Ronald, doing a great job at explaining why Ubuntu on ARM is AWESOME!!!
If you follow the Canonical blog, you will have seen that a new white paper has been published on how to implement UEFI Secure Boot in a manner that can be used by all users, including Linux users. The paper is signed and authored by Matthew Garrett from Red Hat, Jeremy Kerr from Canonical, and James Bottomley, Linux kernel developer.
Since Microsoft talked about their plans for Secure Boot at /Build2011, a lot has been said on the matter. With more than 16,000 people signing the Free Software Foundation statement on “Secure Boot vs Restricted Boot”, it is clear that this is an issue that needed some attention.
It is great to see companies like Red Hat and Canonical getting together and coming up with recommendations that benefit the whole industry. The paper is well worth a read. Enjoy!
I have been using Ubuntu 11.10 on ARM for a couple of days now and I have to say: it rocks! Ubuntu has a long history of supporting ARM systems-on-a-chip (SoCs), going back to 2008, but Ubuntu 11.10 is a significant milestone.
Canonical announced back in August that Ubuntu Server 11.10 would include the first ARM version of the product, and here it is. While this is just the first step on an exciting journey, it is worth celebrating that the voyage has started. I look forward to seeing what 12.04 LTS brings us in this space!
It is hard to really grasp the full experience of Ubuntu on ARM when you are playing with a development board. For this reason, we have released a demo image for the Nvidia Tegra 2-based Toshiba AC100.
Running Unity 2D, it shows that Ubuntu on ARM is a great computing platform, in a very compact design and with very long battery life. For all these reasons, this is my system of choice to take to UDS-P.
If you have a Toshiba AC100, I encourage you to install Ubuntu 11.10 on it!
Powered by the Texas Instruments OMAP4430 processor, the Panda Board packs in “a dual-core 1 GHz ARM Cortex-A9 MPCore CPU, a PowerVR SGX540 GPU, a C64x DSP, and 1 GB of DDR2 SDRAM”, providing an affordable and competitive design tool for the embedded mobile space.
Ubuntu 11.10 on ARM is available as both headless and full images for the Panda Board. You can find download links and installation instructions here, along with Ubuntu 11.10 for OMAP3 (Beagle Board).
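If you have never flashed an ARM board before, the usual route is to write the pre-installed image straight onto an SD card. As a rough sketch (the image file name here is an assumption, and /dev/sdX is a placeholder, not a real device):

```shell
# Sketch: writing a pre-installed Ubuntu ARM image to an SD card.
# The file name below is an assumption; /dev/sdX is a placeholder for
# your SD card device (check with: dmesg | tail). dd overwrites the
# target without asking, so double-check the device first!
zcat ubuntu-11.10-preinstalled-server-armel+omap4.img.gz | sudo dd of=/dev/sdX bs=4M
sync
```

Then pop the card into the board and boot; the pre-installed image takes care of the rest on first boot.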
The i.MX53 family is oriented towards automotive solutions. Ubuntu 11.10 on ARM is the first release of Ubuntu to provide support for the i.MX53 Quick Start Board. You can find download links and installation instructions here.
Both the TI OMAP4 and Freescale images are based on the Linaro outputs for those SoCs. This has greatly increased our capacity to support ARM development boards.
I can’t believe I am saying this, but I am no longer interested in the phone industry. The thing is that I have been paying attention to gadget news all this year and I am pretty interested in the new Kindles, yet I have not been interested in phones for a long time now.
Android has managed to make the phone industry boring. All the phones look the same, they run the same apps, they run the same services… YAWN! Do you feel the same way? The problem, for me, is that one or two years ago a phone was the only tech item you really needed to access all your services and do anything you could possibly want.
Since then, thanks to tablets and e-readers amongst others, the phone is no longer the ultimate convergence device. I am back to carrying multiple gadgets and a never-ending battery-charging nightmare. Can someone invent the next evolution in computing devices? Please…
So, this week Apple is launching the iPhone 5 – we will see…
Are you running an up-to-date version of Oneiric? Do you have 15 minutes spare? YOU can help Ubuntu Friendly today! Read on…
The Ubuntu Friendly program is now in its test phase. One thing that we could really do with is some more real user data to test the website views. Ubuntu Friendly feeds from test submissions in Launchpad.
So what do I need to do?
You need to run the recently improved System Testing tool. This tool is in the default Oneiric image, and its run time has been reduced to under 15 minutes (disclaimer: this depends on how powerful your system is!).
If you are not sure how to find this tool, just go to the Unity search lens, type “System Testing” and click on the icon that looks like a computer screen with a tick mark.
Just follow the instructions and, if you don’t mind, ping me a comment back on this post with how long it took you to run it and any other feedback you might have!
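If you prefer the command line, the tool can also be started from a terminal. A small sketch, assuming the front-end still ships as the `checkbox-gtk` command (an assumption on my part; the lens route above is the supported one):

```shell
# Assumption: the System Testing front-end is provided by the
# checkbox-gtk package and command of the same name.
checkbox-gtk
```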
Go on, it is Friday don’t you know…
We frequently get asked what we test in the certification program. While we do have a simple page covering this topic, sometimes we are asked for further details. We have now updated the certification program guide with a more comprehensive description of the test cases. We review, and update if necessary, the list of test cases for each release:
Note that these test cases only apply to hardware that actually supports the functionality. For example, we do not run the Bluetooth tests on a laptop that does not list Bluetooth in its specifications.
Here is what the program guide says for Oneiric:
We use three different lists:
- Whitelist, or features that are required for certification. If any of the tests in the whitelist fails, the certification will fail.
- Greylist, or features that are tested, but that don’t block certification. If any of the tests under the greylist fail, a note will be added to the certificate to warn the potential customer or user.
- Blacklist, or features that are not currently tested. We will consider adding more tests as needed.
- ia32 (x86), x86_64 and ARM processors are tested to ensure proper functionality.
- Stress tests are performed to ensure that they work during high utilization as well.
- Proper detection
- General usage
- Stress testing
Hard drive(s) tests are conducted to validate proper operation:
- High load
Optical drives (CD/DVD):
- Primary display (laptop panels or primary video port on desktops)
- Multiple-Monitor (where supported, we test multi-head display (2 heads))
- External video connections (HDMI, DisplayPort, VGA, RGB, etc.)
- Multiple resolutions
- Speakers and Headphones
- Microphone (Built-in, External)
- USB Mic, USB Headphones
USB controllers. Several USB devices are used to ensure all USB ports operate as expected:
Bluetooth controllers. Several Bluetooth devices are used to ensure they work:
- File transfer
Built-in Web cams
- Lid open
- Lid close
- Internal keyboard
- Touch screens (single touch)
Primary special keys (volume, mute)
Suspend/Resume (30 iterations)
Tested after resume:
- Display resolutions
- USB controllers
External Expansion Port
Firewire external storage devices
Data Card ports
Hibernate/Resume (30 iterations)
Data cards that are not SD or SDHC (for example MMC)
- Hybrid graphics: if either UMA or discrete graphics works out of the box (all ports working), we will note which card is the one that is certified.
- Whether proprietary drivers are necessary to enable 3D graphics.
- Wi-Fi slider: if the slider to turn the Wi-Fi on/off is not working, but the Wi-Fi can be disconnected through the UI controls, this failure is accepted (and noted).
Secondary special keys:
- Media Control
- Fingerprint readers
- HDMI/DisplayPort audio
- Surround audio
- Multitouch touchpads
- Multitouch screens
- Specific USB 3.0 devices
- 3G connections
After spending some time last week locked in a room thinking about how to better display hardware information to consumers for Ubuntu Friendly, I started to wonder if we could apply some of the ideas to the certification site.
Following some discussions on my previous blog post, I have come up with a wireframe design that I hope will address these points:
What do you think, will this help? Does this address users concerns?
I was reading the Ubuntu Forums when I saw a thread called “Ubuntu-certified hardware is not accurate!” This grabbed my attention.
The main issue seemed to be that the user who started the thread wanted to know whether he should buy the Lenovo X220 or not. He had looked around and seen that the system is Certified (pre-install only) for 10.10, but found several user comments on the web pointing at problems with stock Ubuntu.
I was planning to reply with an explanation when I found this great reply from williumbillium:
First of all, the X220 works well with Ubuntu. I bought one last week and for the most part the laptop is well supported and IMO the current issues are either minor (probably wouldn’t cause the laptop to fail certification) or will likely be fixed soon. I’m documenting my experience on the wiki.
I believe that the “special image of Ubuntu” referenced on the certification page must be a business only deal. I’ve contacted Lenovo about it and been told that it’s not available.
That said, I saw a number of bugs fixed by Canonical employees before the laptop was even released so I believe that us consumers are benefiting from the fact that it’s certified.
Finally, I would not recommend installing 10.10 on this machine unless you have a particular reason to. Since it’s using brand new hardware (Sandy Bridge) it really needs the latest kernel to work well. I don’t have most of the issues mentioned on this ThinkWiki page for example.
The reason why williumbillium “saw a number of bugs fixed by Canonical employees” is that Canonical has commercial engagements with companies like Lenovo to make Ubuntu work well on their systems. These engagements result in:
Following this process, the Canonical team has successfully certified with standard Ubuntu over fifty systems for 11.04 that previously did not work well with Ubuntu. And more are in the pipeline for 11.10…
Good news for embedded device developers trying to bring up a Linux software stack on their systems: Ubuntu Core is getting ready for Oneiric.
The first thing you are going to ask me is: what is Ubuntu Core? Well, here is what the Ubuntu wiki says:
Ubuntu Core is a minimal rootfs for use in the creation of custom images for specific needs. Ubuntu Core strives to create a suitable minimal environment for use in Board Support Packages, constrained or integrated environments, or as the basis for application demonstration images.
Ubuntu Core delivers a functional user-space environment, with full support for installation of additional software from the Ubuntu repositories, through the use of the apt-get command.
So what does it all mean? Ubuntu Core is all about making it easy to get started with a functional software stack that needs to fit into a tiny space.
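To make that concrete, here is a minimal sketch of trying Ubuntu Core out on an x86 machine. The download URL and file name are assumptions on my part; check cdimage.ubuntu.com for the real ones.

```shell
# Sketch only: fetch and unpack an Ubuntu Core rootfs, then chroot in.
# URL and file name are assumptions; adjust them to the real release paths.
wget http://cdimage.ubuntu.com/ubuntu-core/releases/oneiric/ubuntu-core-11.10-core-i386.tar.gz
mkdir core-rootfs
sudo tar -C core-rootfs -xzf ubuntu-core-11.10-core-i386.tar.gz

# Grow the minimal environment with apt-get, just as on a full system:
sudo chroot core-rootfs apt-get update
sudo chroot core-rootfs apt-get install -y openssh-server
```

The point of the exercise: you start from a known-minimal rootfs and add only what your board or product actually needs.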
I have seen the pain of many Symbian hackers bringing up new hardware with only a massive system configuration to work with. Where do you start debugging?
Undoubtedly, the best way to work is to start with a minimal system configuration, which you can use in the early stages of board support software development, and slowly add to it only what you need. Keeping the software from bloating is a cornerstone of Bill of Materials (BOM) management.
A good example of this is the Ubuntu IVI Remix, which is built up from Ubuntu Core and has recently achieved GENIVI compliance. You can also check the Canonical site for more details on the benefits of Ubuntu Core.
So how small is SMALL? Well, it is around 100 MB, although it compresses to a 32 MB download. So pretty small!
Undeniably, a Long Term Support release is all about maintenance. In the certification team, we will be focusing our efforts in the next release cycle on improving our Stable Release Updates (SRU) testing for certified hardware.
While we get a fair amount of feedback on regressions introduced by proposed SRUs on clients, we do not hear very often from the server community, and therefore have less idea of where to improve our testing. I would like to assume that this is because we are catching all the regressions, but what are the chances of that?
If you are running Ubuntu Server, I would like to hear about your experience with SRUs and any serious hardware-specific regressions that you may have encountered. Is there anything that regularly goes wrong with SRUs that we should be looking for?
Thanks, and I look forward to discussing this more with you at UDS-P!
As reported previously, the Dell Vostro 3300 has been plagued by continuous problems with external monitors. I am happy to report that since this week I am running Natty (11.04) with a dual-monitor set-up and a perfect image on both.
The downside is that I am currently using a custom kernel. It was created by Seth Forshee to fix the “Intel Core i3 External Monitor Wavy Output” bug. Thanks, Seth! You are my hero!
This bug has been a long-standing issue, with over 40 users reported to be affected in Launchpad. That may not seem very high, but if you go through the comments, you will see the variety of hardware impacted by this.
Seth has provided several custom kernels:
If you give them a try, please add your feedback to the bug! In case you are not sure how to install them, here is what I do:
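The installation boils down to installing the .deb packages with dpkg and refreshing the boot menu. A sketch of the steps (the file names are placeholders for the actual packages Seth provides):

```shell
# Sketch: installing a custom kernel from .deb packages.
# The file names are placeholders for the actual packages.
sudo dpkg -i linux-image-*.deb linux-headers-*.deb

# Make sure the new kernel shows up in the boot menu, then reboot into it.
sudo update-grub

# After rebooting, confirm which kernel you are actually running:
uname -r
```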
I hope that this fix makes it upstream and through the SRU process soon, so I can keep installing kernel updates!
The Ubuntu Certification team is fully distributed and has now been running Scrum for over nine months. The team has members in Canada, the US, Europe and Asia. I have been blogging about several parts of our Scrum experience; now it is time to piece it all together!
We run in two-week iteration cycles within a larger six-month release cadence. Here is what those two weeks look like:
Day 1 (Thursday) – Planning session
We run the planning session (30 minutes) just after the previous iteration’s demo session – no room to breathe! The reason for doing this is simply down to time zones and trying to get as many people as possible into these sessions.
We host the planning session on Mumble, where we review the backlog for the next iteration. We found it a bit dull for the Product Owner just to explain what each story was about. Instead, we ensure everyone’s participation by agreeing on the definition of done for each story. This eliminates any misunderstanding of what needs to be delivered and ensures that everyone is paying attention.
Just after the planning session, the Scrum team gets together to flesh out the task board for the iteration. At this point the stories are re-sized via IRC planning poker: at the count of three from the Scrum Master, everyone pastes a t-shirt size into the IRC channel.
Following the planning poker, the team discusses possible implementations and writes down tasks in the IRC channel, to be later transferred by the Scrum Master into the backlog.
We run two scrums (no longer than 15 minutes each) a day: a reduced one at 9.30 UK time with Europe and Asia, and a larger one at 15.00 UK time including the UK and US. We run both using Mumble, but Google+ is also a good option.
Day 5 (Wednesday) – Backlog review with the Scrum Master
On Wednesday, Ara and I review the progress of the backlog and discuss any stories that might need to be refocused, unblocked or delayed to a later iteration.
Day 6 (Thursday) – Discussing impediments and new ideas
At this point, we have reached the halfway mark of the iteration. We host a 45-minute meeting following the main scrum to talk about any issues the team wants to raise. This mainly focuses on problems blocking our work, or new ideas for future iterations or releases.
Also, the Scrum Master sends a mid-iteration status email. This ensures that nothing is falling through the cracks and everyone knows the overall iteration progress. We find that scrums tend to focus on what people are working on, and not on what is left in the backlog; this can lead to lower-priority user stories being worked on while higher-importance ones remain overlooked.
Day 9 (Tuesday) – Backlog review for next iteration
The Scrum Master, the Product Owner and I get together to review which stories are likely not to be completed. This is normally 80% accurate and gives us a better idea of how many new stories can be added for the next iteration. Then we discuss story priorities and create a draft backlog for the next iteration. Although there are always changes during the planning session, this gives us a solid draft to start from.
Day 11/Next Day 1 (Thursday) – Demo
We have come full circle and are back at the demo and planning meeting, where a demo lead shows via Spreed (a screen-sharing tool) what has been achieved.
In my team, we spend most of our time working with system manufacturers to improve hardware support in Ubuntu. Apart from allowing users to install Ubuntu after they purchase their laptops, we also like to increase the number of computers that you can buy from the shops with Ubuntu pre-installed.
If you have ever had the chance to work within the logistics of a manufacturing line, you will understand the level of complexity and how far removed software developers are from the shop floor. As feedback is the best way to learn and improve, here is my request: please share your Ubuntu pre-installed story with us!
Have you ever bought a system with Ubuntu pre-installed? Where did you get it? What system was it? How did it go? Could you then upgrade to the next Ubuntu release?
I look forward to hearing your story!
(This post has been reproduced from goingagile.org.)
When working with Agile, make sure to define a long-term strategy that gives direction to your product backlog.
The Ubuntu Certification programme follows the beat of the six-monthly release cadence, and within the certification team we run a two-week iteration cadence. It is a continuous delivery machine! The danger is for your ambitions to get stuck in the quick rhythm.
Regardless of whether I am working with a product or a service team, I have found it important to set a clear vision to aim for. The constant cadence of Agile is normally riddled with changes in priorities. While this enables the team to remain flexible, I have found that it can be confusing for the individual: “Tell me again, why are we doing this?”
Having a clear vision or product road map doesn’t only benefit your team, but also your stakeholders. I often find that the lack of a shared vision creates mistrust – “This iteration could be the last one. Quick, I had better ask for everything I need at once! Everything is high priority!” Sound familiar?
Sharing a common set of principles and an aspiration to deliver great value is sometimes confused with the need to have a committed two-year plan. To remain competitive, I would rather stop second-guessing the future and build working practices that allow for change and make people comfortable working with the unknown.
The Certification team at Canonical has been going Agile for the last nine months, and Oneiric is the first release for which we are running full Scrum practices. We are a bit unique in that we are spread all over the world: two people in Montreal (Canada), one in Boston (USA), one in Raleigh (USA), three scattered over the United Kingdom, our Scrum Master in Germany, and our latest team member in Taipei (Taiwan). Running Scrum in this type of environment needs constant innovation. I am keeping track of our progress on my blog at victorpalau.net/tag/scrum/
Roughly every three months, we get together somewhere in the world. We just got back from the Ubuntu Rally in Dublin, where we decided to give our backlog some love!
We largely build our backlog at the Ubuntu Developer Summits and then we continue to add and remove items as we go.
Halfway through the project, and with over 100 items to complete before the end of October, we needed to step back and make sure that we were working on the right priorities and that nothing had fallen through the cracks. What better way to do this than a full planning poker session? Here is how it worked:
I was aware that data centers around the world were starting to be talked about as an environmental problem, but the statistic that data centers have the same carbon footprint as the aviation industry (about 2% of the global carbon footprint pie) really put things in perspective for me.
The Open Data Center Alliance “Carbon Footprint Values” document starts its executive summary with:
According to market research and consulting firm Pike Research, data centers around the world consumed 201.8 terawatt hours (TWh) in 2010 and energy expenditures reached $23.3 billion. That’s enough electricity to power 19 million average U.S. households. The good news is that, according to Pike Research, the adoption of cloud computing could lead to a 38% reduction in worldwide data center energy expenditures by 2020.
The prediction that cloud computing will lead to large savings in energy consumption can be justified by economies of scale. Today’s enterprise data centers average 20-30% computing power utilisation, while the same data center serving Infrastructure as a Service (IaaS) is expected to run at 80-90% occupancy. This, plus the opportunity for enterprises to transform a fixed cost of ownership into a flexible service subscription, will lead to a consolidation of data centers.
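To put that 38% figure in perspective, here is a quick back-of-the-envelope calculation. The input figures are Pike Research’s; the arithmetic (and the simplification of treating the 38% as applying to consumption as well as cost) is mine:

```shell
# Back-of-the-envelope: what a 38% reduction means against Pike
# Research's 2010 figures (201.8 TWh consumed, enough for 19M US homes).
SAVED_TWH=$(awk 'BEGIN { printf "%.1f", 201.8 * 0.38 }')
SAVED_HOMES=$(awk 'BEGIN { printf "%.1f", 19 * 0.38 }')
echo "Energy saved: ${SAVED_TWH} TWh per year"
echo "Equivalent to roughly ${SAVED_HOMES} million US households"
```

That is roughly the annual electricity of seven million homes – not a bad prize.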
Economies of scale will also allow large-scale data center providers to invest in purpose-built, more sustainable and cheaper-to-run buildings. Good examples of this are the server and data center specifications shared by Facebook via the Open Compute Project, and Google’s water-powered and water-cooled at-sea data centers.
As discussions of hefty fines for London by the European Union are currently taking place, sustainability is becoming less a matter of corporate responsibility and more one of legal compliance.
However, cloud computing is bringing to individuals applications that were only available to enterprises a few years ago. This will multiply the need for data centers across the globe beyond current demand. We need to go beyond finding cheaper ways to cool and power servers and start tackling the real problem: the servers themselves need to be exponentially more efficient.
Certification defines a generic level of functionality to be expected from hardware running an Ubuntu release. Part of the challenge is to identify which hardware components should be included in the test.
The aim is to cover all widely accepted components, while excluding fringe ones that may only be of interest to a small set of the user base.
In the past, it has been a bit hard to understand which components were tested for certification, and this has led to questions like “Why is the fingerprint reader not working on this certified system?”, when the answer is simply that certification does not test for fingerprint-reading functionality.
You can now see at a high level what certification includes for both servers and clients (click on the image to see the full list):
© 2010 Canonical Ltd. Ubuntu and Canonical are registered trademarks of Canonical Ltd.