Canonical Voices

matthieu-james

The new Ubuntu icons

During last month’s vUDS we showcased the latest design explorations for the new Ubuntu icon theme. Here is a summary of what we presented.

Our objectives

This project’s main goal is to create a single modern, high-resolution icon theme for desktop and touch devices that can adapt to various screen densities and reinforce the Ubuntu user experience. We want our icons to express our values and convey Ubuntu’s personality in a unique way.

We already had mobile icons for the applications and symbols but, because they evolved over time without strong guidelines, they did not form a consistent set. On the desktop, even though the style was clean and consistent, the icons looked dated and needed to be replaced too.

The previous version of the desktop, mobile and monochromatic icons

New icons

We’ve been working on this ongoing project for the past year. We’ve done extensive research on the subject, with a focus on learning how best to classify the icons, and we’ve gone through several design iterations and explorations.

So here is the latest iteration of the new icon set. As I’ve mentioned, these are all still subject to change as we’re constantly improving and refining the designs.

Latest application icons

Latest symbolic icons

Icons in context — one of the latest design explorations of the dash

Next steps

The goals for 14.04 are to provide a new icon theme for mobile and tablet, and to provide guidelines with templates to help people to design consistent icons for their apps. We’d like to eventually implement the new set on the desktop too.

We’ve had lots of good feedback so far, and we’d like to get even more, so please let us know your thoughts in the comments!

Read more
Christina Li

App Design Clinic #6

We have been running the app design clinic every two weeks to answer any questions from community designers and developers on the apps they are working on!

For this session we talked about the community-submitted convergence designs for the file manager and clock apps (thanks everyone!), as well as answering some questions about apps submitted by our Canonical engineers, such as:
- If your app has two equal actions, how do you provide entry points?
- What if I want to show more content, but a page stack is not appropriate?
- Where should ‘About’ & ‘Settings’ go? (Not in the tabs, please.)

If you missed it, or want to watch it again, here it is:

Please send your questions and screenshots to
design@canonical.com by 1pm UTC on Tuesdays to be included in the following Wednesday’s clinic.

Watch this space for our next App Design Clinic time.

Read more
Katie Taylor

App Design Clinic #5

Over the last few months we have been running the App Design Clinic and we want to thank you for all your submissions, contributions and feedback!
The Design Clinic happens every two weeks and the last one coincided with vUDS, so we included more information on general app design as well as answering questions.

Here’s what happened:

- A summary of topics from previous App Design Clinics
- A run-down of where to find the necessary things and resources to design an app
- And finally, how you can contribute and participate in the clinic

The next one happens tomorrow, 4th of December. Please send your questions and screenshots to
design@canonical.com by 1pm UTC on Tuesdays to be included in the following Wednesday’s clinic.

Read more
Christina Li

On 19-21 November we had our vUDS where we got to discuss and share with the community some of the design work we’ve been doing recently.

Our topics ranged from our design blog and convergence designs to the Juju GUI and icon designs!

If you missed any of our sessions, don’t worry. They are all below for you to check out!

Design Blog

Love our blog? How can we make it better? What topics would you like to see?

Responsive Design

Hear about our thoughts on converging our patterns, components and designs from phone to tablet to desktop.

App Design Clinic

Every two weeks, we gather to talk about app designs and patterns. If you are developing an app or have any questions on apps, let us know!

Designing a responsive website and web guide

We talked about the process of designing a responsive website and shared the current web style guide we have been using for the main Ubuntu.com site.

Research on Windows and Android usability

Juju GUI design evolution

User research has informed the way Juju GUI has changed over the last year. Here is the evolution of Juju GUI.

Designing icons for Ubuntu

We have been designing icons for Ubuntu phone, tablet and desktop. Check them out!

Let us know what you think, or send us suggestions on what you’d like to see from the Design team at the next vUDS!

Read more
Inayaili de León Persson

Latest from the web team — November 2013

Even though Ubuntu 13.10’s release is behind us, we always find ways to keep busy. Here are the highlights of the past four weeks.

In the last few weeks we’ve worked on:

And we’re currently working on:

  • Ubuntu Resources: we’re iterating on the current alpha release, improving the design and adding new features
  • Canonical website: we’re currently exploring design directions and finalising the content for the site
  • Juju GUI: we’re refining the bundle experience and interactions for the 14.04 release
  • Fenchurch: we’ve been improving deployment scripts and asset deployment
  • Live chat trial: we’ve been helping the sales team to test a live chat feature on www.ubuntu.com

We also welcomed a new member of the team: Felipe is our new user experience lead. And we’ve learned about Karl’s cage-fighting past.

Team lunch to welcome Felipe and Karl

Have you got any questions or suggestions for us? Would you like to hear about any of these projects and tasks in more detail? Let us know your thoughts in the comments.

Read more
matthieu-james

Juju ice-cream icon design

Who doesn’t like ice-cream? Here in the design team we sure do! In the last few weeks we’ve been preparing a special Juju demo for the OpenStack Summit in Hong Kong and we’ve created some very ‘tasty’ icons for it. We thought it would be nice to show you how those icons were created, so here’s a little insight on the design process.

The brief

We wanted to replace the normal Juju icons with something a little bit more special, in order to show people who visited the Ubuntu stand what kinds of things Juju can do. We decided to use the idea of an ice-cream with toppings and sauce, which you can build in the same way that you can build services in Juju.

The best part of this demo was that people would actually get the ice-cream they had ‘built’ in Juju in real life!

The Juju interface, with its default icons

Finding good concepts

The first thing I needed to do was to find good concepts to represent ice-creams and toppings in an icon format. Toppings were going to be especially tricky, as they can be very small and therefore hard to make out at small sizes.

I initially sketched and designed some ideas that were using a kind of flat look. This worked well for the ice-cream, but not so much for the toppings — I soon noticed they had to be semi-realistic to be recognisable.

Initial sketches and designs following a flat and more simplified look

In a second stage, I added perspective to the icons; it was important that all the icons kept the same perspective for consistency.

Another set of sketches with added perspective

The shape of the sauce bottles was also something that needed a bit of trial and error. The initial design looked too much like a ketchup bottle, so we decided to try a different approach.

Before and after shape of the sauce bottle

For the backgrounds, I chose to use vibrant colours for the ice-cream icons, to contrast with the ice-creams’ monochrome palette, but paler colours for the toppings, as these are already quite colourful.

The amount of detail added to the icons is just enough for what we needed to show and for them to be recognised. I’ve also added larger pieces to the side of the toppings, to make them easier to identify.

The Oreo topping icon, with a side of Oreos

Working out the detail

The Oreo pieces were created from a single biscuit, which I cut into 9 different parts and then distributed in different layers — I guess in a similar way to what happens in real life.

The 9 pieces used to create the icon

The clone tool in Inkscape came in handy: repeating the same small set of pieces made the final SVG file much lighter, and also made Inkscape faster.

The whole process took 4 days from brief to final icons, which is quite a tight deadline, but it was a really fun project to work on.

The final icon set

Read more
Inayaili de León Persson

The new Ubuntu Resources

Today we’ve launched the alpha version of our latest project: the Ubuntu Resources website.

This is our first responsive project to follow a mobile-first methodology, and we’re very excited to share it with everyone!

As you’ll be able to see, we’re not quite done with it yet, but we wanted to share what we’ve created so far, so we can get feedback and keep improving the design and expanding the features.

The new Ubuntu Resources on an Ubuntu-powered phone

A little bit of background

This project grew from the need to separate content like case studies, news, press releases and events from the core of the Canonical and Ubuntu sites — and it will eventually replace much of what currently sits at insights.ubuntu.com. As the site is designed for reading and engaging with longer pieces of content, we thought it would be the ideal place to explore mobile-first and responsive approaches. And we plan to use what we’ve learned from it to make www.ubuntu.com and our Web Style Guide responsive.

Scaling things down

We started the research phase taking a holistic view of the project, trying to understand what types of content and users we wanted to target. We realised that with limited time and resources we would have to divide the project into different releases, so that we could make sure each aspect of the site was given the attention it deserved.

The first and current release of the site — alpha — focuses solely on small screens. The main goal is to make all the content accessible; the visual style and features will keep progressing and being added as we go.

Initial wireframes across a variety of screen sizes

Reusing existing styles

One of the challenges in this project was deciding how we were going to integrate the existing Web Style Guide, which we’ve been using internally for a while now and will be made public on design.ubuntu.com soon.

Sneak peek of our Web Style Guide

We decided to use a minimal version of the style guide that kept the Ubuntu Resources’ style coherent with www.ubuntu.com and that we could improve on.

You’ll also notice small details that align with our phone design, like the grid, navigation selection and icons, and we’ll be adding even more in the upcoming releases.

What’s coming

Apart from working on the larger screen versions of the site, some of the things we will be looking into for the next iterations are:

  • the ability to subscribe to different types of content
  • more curated topic landing pages
  • content filtering and sorting
  • cleaner URLs
  • the way we handle PDFs and other file formats
  • more content like a press section

Go and have a look at the site and let us know your thoughts. We want to know what you like and what you think can be improved, or any other comments you might have — we’ve included a handy link to the feedback form at the bottom of every page. Enjoy!

Read more
Christina Li

November Brown Bag lunch

Some of us in the Design team have been gathering on a monthly basis to have lunch together and share things we find interesting.

Today, I’d like to share with you the Brown Bag lunch we had this week.

Vesa shared with us his interest in photography and showed us some of the shots he took over time.

Westminster at night by Vesa (Flickr)

I came across some inspiring research by the Helen Hamlyn Centre for Design at the Royal College of Art in London. The research focused on helping older people use mobile phones, rather than designing a simpler phone for them to use.

And our challenge of the month was to build the tallest paper tower! Each team had 20 pieces of paper and 6 minutes, with 2 rules:

1. You can only use paper to build your tower.
2. You can tear or fold the pieces of paper.

Well, I’m happy to report that Rachel, Vesa and Olga proudly won this challenge with their paper tower!


How would you build your tower in 6 minutes?

Read more
Tingting Zhao

In the previous post, we talked about how to design effective user testing tasks to evaluate the usability of an interface. This post continues the topic by highlighting a number of key strategies you may need to use when conducting formative user testing, whose main aim is to identify usability problems and propose design solutions, rather than to compare quantitative metrics (summative testing), e.g. task completion time and mouse clicks. It is unlikely that the prepared task script can be applied strictly, without any changes, since the testing situation tends to be dynamic and often unpredictable. To get useful data, you need to be able to adapt your task script flexibly, while also maintaining consistency.

Change task orders during testing

Normally, to avoid the order effect, the order in which tasks are issued should be randomised for each participant. The order effect refers to the way the order in which tasks are presented can affect the results: users may perform better in later tasks as their familiarity with the interface increases, or they may perform worse as they become more fatigued. However, as discussed in the previous post, the tasks are often contextual and dependent on each other, so you need to carefully consider which tasks can be shuffled. For example, it is good practice to mark dependent tasks on the script, so that you know these tasks should not be reordered or separated from each other. In other words, the dependent tasks must always be moved together. It is worth noting that randomising task orders may not always be possible, for example when the tasks are procedurally related, such as in a test focusing on a payment flow.

Sometimes you may need to change the task order based on the tasks’ levels of difficulty. This is useful in two scenarios: when you notice a participant appears to be slightly nervous before the testing has started, provide a simple first task to put him/her at ease; or when you observe a participant has failed to solve several tasks in a row, provide one or two easy tasks to reduce frustration and stress, and boost confidence.

Another type of order change is made in response to goals users express spontaneously that are associated with an upcoming task. For example, in one phone test, after a participant checked the battery level, s/he spontaneously expressed a desire to know if there was a way to switch off some running apps to save battery. In this case, we jumped to the task of closing running apps, rather than waiting until later. This makes the testing feel more natural.

Remove tasks during testing

There are typically two situations that require you to skip tasks:

  • Time restriction

  • Questions answered by previous tasks

Time restriction: user testing normally has a time limit, and participants are paid for a certain length of time. Ideally, all the tasks should be carried out by all the participants. However, sometimes participants take longer to solve tasks, or you may discover areas that require more time for investigation. In these cases, not all the tasks can be performed by a participant within the given time, so you need to be able to decide quickly which tasks should be abandoned for that specific participant. There are two ways to approach this:

  • Omit tasks that are less important: it is always useful to prioritise the tasks in terms of their importance – which areas have key questions that need to be answered and require feedback, and what could be left for the next round of testing if not covered this time?

  • Omit tasks that have already received abundant feedback: skip the tasks from which you have already gathered rich and useful information from other participants.

Questions answered by previous tasks: sometimes the questions associated with a specific task are answered while a participant is attempting to solve a previous task – in this case, you can skip that task.

In one of our phone tests, we asked a participant to send a text to a non-contact (a plumber). During the task-solving process, s/he decided to save the number to the contact book first and then send a text. In this case, we skipped the task of ‘saving a number to the contact book’.

However, sometimes you should not skip a task, even if it might seem repetitive. For example, if you want to test the learnability and memorability of a feature, having the participant perform the same task (with a slightly different description) a second time, after a certain amount of time has passed, can afford useful insights.

Add tasks during testing

There are three contexts in which you could consider adding tasks:

  • Where the user formulates new goals

  • Re-examinations

  • Giving the user a second chance

The added task must be relevant to the aim of the testing, and should only be included if the testing time permits.

User formulates new goals: you could add tasks based on user-formulated goals in the task-solving process.

For example, in one phone test, one participant wondered if s/he could customise the tiles on the Windows phone’s home screen. We made this an added task for her/him. Adding tasks based on new goals articulated by users follows their thought process and makes the testing more natural. It also provides opportunities for us to discover new information.

Re-examinations: sometimes a user may succeed in a task accidentally, without knowing how s/he did it. In this case, the same task (with a slightly changed description) can be added to re-assess the usability.

For example, in one phone test, we had this task: “You want to call your sister Lisa to say thank you for the phone”. One participant experienced great difficulties in performing this task, and only completed it after a long time and by accident. In this case, we added another task to re-evaluate the ease of making a phone call:

“Your call is cut off while you are talking to your sister, so now you want to call her again.”

Similarly, in the Gallery app testing, where participants managed to add a picture into a selected album accidentally, we asked them to add another picture into a different album.

Re-examination allows us to judge accurately the impact of a problem, as well as to understand the learnability of the interface – the extent to which users can detect and learn interaction patterns (even by accident) and apply the rules later.

Giving the user a second chance: in the majority of user tests, participants are using the evaluated interface for the first time. It can be very demanding for them to solve the tasks successfully on their first attempt. However, as the testing progresses, participants might discover more things, such as features and interaction patterns (although possibly by accident), and consequently their knowledge of the interface may increase. In this case, you could give them another chance to solve a task that they failed earlier in the test. Again, this helps you to test the learnability of the interface, as well as assess the impact of a problem.

For example, in a tablet test, one participant could not find the music scope earlier in the testing, but later s/he accidentally discovered the video scope. To test if s/he now understood the concept of dash scopes, we asked the participant to find the music scope again after several other tasks.

Change task descriptions (slightly) during testing

Information gathered from the brief pre-test interview and from participants’ verbal data during testing can often be used to modify the task description slightly to make the task more realistic to the user. This also gives the user the impression that you are an active listener and interested in their comments, which helps to build a rapport with them. The change should be minor and limited to the details of the scenario (not the aim of the task). It is important that the change does not compromise consistency with other participants’ task descriptions.

For example, in a tablet test, where we needed to evaluate the discoverability of the HUD in the context of photo editing, we had this task: “You want to do some advanced editing by adjusting the colour of the picture.” One participant commented that s/he often changed pictures to a ‘black and white’ effect. In response to this, we changed the task to “You mentioned that you often change a picture to black and white, and now you want to change this picture to ‘black and white’”. The change here does not alter the aim of the task, nor the requirements for solving it (in this case, access to the HUD), but it becomes more relatable to the participant.

Another example is from a phone test. We changed the task of “you want to go to Twitter” to “you want to go to Facebook” after learning that the participant used Facebook but not Twitter. If we had continued to ask this participant to find Twitter, the testing would have become artificial, which would have resulted in invalid data. The aim of the task is to evaluate the ease of navigation in finding an app, so changing Twitter to Facebook does not change the nature of the task.

Conclusions

This post outlines a number of the main strategies you can use to modify your task script to deal with typical situations that may occur in formative user testing. To sum up:

Change task orders: randomise tasks for each participant if possible, and move dependent tasks as a whole; consider the difficulty of the tasks and issue an easy task to start with if you feel a participant is nervous, or provide an easy task if a participant has failed several tasks in a row. Allow participants to perform a later task if they verbalise it as a goal or strategy for solving the current task.

Remove tasks: if time is running out with a particular participant, omit certain tasks. These could be tasks with low priority, tasks that have already received enough feedback from other participants, or tasks the participant has already covered while attempting a previous task.

Add tasks: if time permits, allow users to perform a new task if it is a user-initiated goal and is relevant to the testing; repeat a task (with slightly different wording and at an appropriate time) if the user succeeded in it accidentally, or failed it earlier, or if the aim is to test the learnability of the system.

Change task description: slightly amend the details of the task scenario (not the aim of the task) based on users’ verbal data to make it more relatable and realistic to the user. This will improve the reliability of the data.

If you have other ways to maneuver the tasks during the testing session, or have situations you are unsure about, feel free to share your experience and thoughts.

Read more
Jouni Helminen

App Design Clinic #4

App Design Clinic #4 focused on icons, with questions from Stuart Langridge including:

  • guidance on creating app icons (stylistically and in terms of file format and resolution)
  • tips on how to use action icons

The presentation deck link will be shared on the blog once it’s been checked by our icon designer, and we hope to have icon guidelines with downloadable templates and full API docs online within a month.

The next clinic will be held in conjunction with vUDS. Let’s make it a great one: please send any designs and/or questions to design@ubuntu.com.

Read more
Spencer Bygraves

A week in San Francisco

I recently attended my first cloud sprint meeting held in San Francisco, and it turned out to be a great experience. It’s been 10 years since I last visited, so as well as working hard, it was nice to have the opportunity to see the city again.

Whilst there we worked on the UX and visual design for two of our cloud products, which we’ll be able to share with you soon. It was also a great opportunity to spend time with colleagues from around the world, working together during the day and having a few beers in the evening.

In terms of design, we are working to extend the cloud visual language that is being established through the Juju GUI, with a view to having a consistent suite of cloud products.

A post with some cloud designs will follow soon. For now, here are some pictures from our week in San Francisco.

Watch this space!

Discussing Juju and collaborative coding

San Francisco

Read more
Peter Mahnke

So I am stretching the metaphor a bit, but I think it accurately explains my experience of the recent cloud sprint in San Francisco.

The week starts with some presentations and talks about where we are now and where we want to be from a company, marketplace and product perspective. This lasts about two hours, then all 115 of us are set free to figure out what we can do to help best achieve these visions. Things are more organised than at an unconference: there are tracks and rooms and sessions planned, but it is all very fluid. Each day reveals itself and the week gathers its own momentum.

Spencer, Ale and Luca looking at a wireframe

Some people are here to finish off some work and coordinate releases. Some people are trying to plan the next six-month cycle with team-mates they only see a few times a year. Some people have just joined the company and some people are trying to design for the next year or more. That’s us.

While most here are looking at April, we are brainstorming, paper prototyping, grabbing stakeholders, talking to users, meeting with developers and trying to build that shared vision for a set of products and where they might go in the future — inspiring, chaotic, impossible, crazy, amazing.

A set of wireframes and post-it notes

But we are also trying to finish things off from the last cycle, pay down some technical debt and polish up a few things. We are trying to listen to what else is happening; it all moves so fast. We also sign up to get at least four other smaller things done in the next month.

At the end of the week, a few things are finished. Even better, a few more big things are planned. Dozens of drawings, hundreds of post-it notes are photographed. We shake hands with friends and colleagues that we will only talk to online for a few months and head home to get building.

Read more
Anthony Dillon

I was recently asked to attend a cloud sprint in San Francisco as a front-end developer for the new Juju GUI product. I had the pleasure of finally meeting the people I have been collaborating with, and learning from, on the project.

Here is a collection of things I learnt during my week overseas.

Mocha testing

Mocha is a JavaScript test framework that runs in the browser and supports asynchronous testing. Previously I found it difficult to imagine a use case for it when developing a site, but I now know that any interactive element of a site could benefit from Mocha testing.

This is by no means a full tutorial or a complete feature list for Mocha, just my findings from a week with the UI engineering team.

Break down small elements of your app or website into logic tests

If you take a system like a user’s login and registration flow, it is much easier to test each function of the system. For example, if the user hits the sign-up button, you should test that the registration form is then visible to the user. Then work methodically through each step of the process, testing as many different inputs as you can think of.
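
As a rough sketch of that idea, here is what a Mocha test for the sign-up step above could look like. This isn’t code from the Juju GUI: the showSignupForm helper, the fake document and the Chai assertion library are all assumptions made purely for illustration.

```javascript
// A minimal sketch, assuming Mocha with the Chai assertion library.
// showSignupForm and the #register-form element are hypothetical stand-ins
// for whatever your app actually uses.
var expect = require('chai').expect;

// Hypothetical app logic: reveal the registration form when signup is hit.
function showSignupForm(doc) {
  doc.querySelector('#register-form').hidden = false;
}

describe('signup flow', function () {
  var doc;

  beforeEach(function () {
    // A tiny fake document is enough for this sketch; a real suite would
    // run against the browser DOM instead.
    var form = { hidden: true };
    doc = { querySelector: function () { return form; } };
  });

  it('shows the registration form when the signup button is hit', function () {
    showSignupForm(doc);
    expect(doc.querySelector('#register-form').hidden).to.equal(false);
  });
});
```

From there you can keep adding it() blocks for each step of the process and each input you want to cover.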

Saving your bacon

Testing undoubtedly slows down initial development, but it catches a lot of mistakes and flaws in the system before anything lands in the main code base. It also means you don’t have to manually re-check each part of the system by hand — you simply run the test suite and see the ticks roll in.

Speeds up bug squashing

Bug fixing becomes easier for both the reporter and the developer. If the reporter submits a test that fails because of the bug, the developer gets the full scope of the issue, and once the test passes both the developer and the reporter can be confident the problem no longer exists.
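
To make that concrete, a hedged example of what such a bug-report test could look like; formatPrice and the padding bug it describes are invented for illustration, not taken from a real project.

```javascript
// Hypothetical regression test: the reporter contributes the failing case,
// the developer fixes formatPrice until it passes, and the suite then keeps
// guarding against the same bug coming back.
var expect = require('chai').expect;

function formatPrice(pence) {
  // The buggy version returned '£0.5' for 50 pence; toFixed(2) pads the
  // amount to two decimal places as the reporter expects.
  return '£' + (pence / 100).toFixed(2);
}

describe('formatPrice', function () {
  it('pads sub-pound amounts to two decimal places', function () {
    expect(formatPrice(50)).to.equal('£0.50');
  });
});
```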

Linting

I had read a lot about linting in the past, but I had not needed to use it on any of the projects I had worked on to date. So I was very happy to use, and be taught, the linting performed by the UI engineering team.

Enforces a standard coding syntax

I was very impressed with the level of code standards it enforces. It requires all code to be written in a certain way, from indenting and commenting to unused variables. This means that anyone using the code can pick it up and read it as if it had been written by one person, when in fact many people may have contributed to it.
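
As a generic illustration (not the team’s actual rule set), this is the kind of thing a JavaScript linter will flag; the rules shown are common defaults rather than a specific project configuration.

```javascript
// This snippet runs, but a typical linter would complain about each of the
// commented lines below; the exact rules depend on the configuration.
function greet(name) {
  var unused = 42;                // flagged: declared but never used
      var message = 'Hi ' + name; // flagged: inconsistent indentation
  return message                  // flagged: missing semicolon
}

console.log(greet('world'));
```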

Code reviews

In my opinion code reviews should be performed on all front-end work to discourage sloppy code and encourage shared knowledge.

Markup

Markup should be very semantic. This can be a matter of opinion, but shared discussion will get the team to an agreed solution, which can then be reused by others in similar situations.

CSS

CSS can be difficult, as there are different ways to achieve a similar result, but with a code review the style used will become common practice within the team.

JavaScript

JavaScript is a perfect candidate, as different people have different methods of coding. A review will catch any sloppy code or shortcuts, and it makes sure your code is refactored to best practice the first time.

Conclusion

Test-driven development (TDD) does slow the development process down, but it enforces better output from the time you spend on the code and fewer bugs in the future.

If someone writes a failing test for behaviour your code is expected to support, working on the code until the test passes is a much easier way to demonstrate that the code now works, along with all the other tests for that function.

I truly believe in code reviews now. Previously I was sceptical about them. I used to think that “because my code is working” I didn’t need reviews and that they would slow me down. But a good reviewer will catch things like “it works, but didn’t you take a shortcut two classes ago which you meant to go back and refactor?”. We all want our code to be perfect and to learn from others on a daily basis. That is what code reviews give us.

Read more
Inayaili de León Persson

IKEA’s design process

Graham pointed out a recent Wall Street Journal article to me as I was going on about my recent kitchen renovation (yes, I’ve used IKEA units). It gives a glimpse into IKEA’s ‘painstaking’ and, for me, fascinating design process.

Photo by David

Even though, being IKEA, they can very much define how people will live, they go through long and careful research, which also has to be in line with their strict production processes. The symbiotic relationship between user, design and engineering, and the dedication to improving this relationship, reminded me of the design process that happens here at Canonical and Ubuntu.

Research Manager Mikael Ydholm leads a team that visits thousands of homes annually … and compiles reports from trend spotters and experts that look as far as a decade into the future.

I would love to learn more about IKEA’s design processes and their designers’ work, so if anyone knows of more in-depth articles, videos or books on it, please give me a shout.

Read more
Rosie Zhu

Since we released the initial demo of Ubuntu on phones, we’ve been looking at refining the whole Suru theme — the theme on Ubuntu for phones and tablets — and creating visual guidelines for it.

Two of the things that we evolved were the treatment of the indented style and the corner size of the Ubuntu shape — the squircle. We wanted to make sure these were consistent across the theme so that any designer and developer could follow the same guidelines.

The Ubuntu squircle shape

Explorations

There were lots of discussions about what kind of shadows we should use — blurred shadows, sharp shadows or a combination of the two — to represent the indented style, and we went through various iterations.

Variations on a grey background

Soon after we started looking into the indented style, we decided to look at the corner size of the shape at the same time, as they work together. Since the squircle is not an ordinary shape, it can’t just be scaled up and down as needed, so we arrived at four different corner sizes that can be used at the different sizes needed across the theme.

The four different corner sizes of the Ubuntu shape

One of the main goals for the shadows was to make sure they worked with different images inside the shape and on different backgrounds. We also needed to consider the pressed state of the shape, which has a bigger shadow inside.

Variations of the Maps icon on different backgrounds in the normal and pressed states

The shape is not always indented when it’s used (in popovers and notifications, for example), so we had to study these variations too.

Final styles

And finally after many iterations, discussions and reviews, here are the current styles of the indented and non-indented shapes.

The Telephony icon on different backgrounds

We’ve started to put these guidelines on design.ubuntu.com, where you can follow their evolution.

Read more
Inayaili de León Persson

Release month is always a busy one for the web team, and this time was no exception with the Ubuntu 13.10 release last week.

In the last few weeks we’ve worked on:

  • Ubuntu 13.10 release: we’ve updated www.ubuntu.com for the latest Ubuntu release
  • Updates to the new Ubuntu OpenStack cloud section: based on some really interesting feedback we got from Tingting’s research, we’ve updated the new pages to make them easier to understand
  • Canonical website: Carla has conducted several workshops and interviews with stakeholders and has defined key audiences and user journeys
  • Juju GUI: on-boarding is now ready to land in Juju soon
  • Fenchurch (our CMS): the demo services are fixed and our publishing speed has seen a 90% improvement!

And we’re currently working on:

  • Responsive mobile pilot: we’ve been squashing the most annoying bugs and it’s now almost ready for the public alpha release!
  • Canonical.com: with some of the research for the project already completed, Carla will now be working on creating the site’s information architecture and wireframing its key sections
  • Juju GUI: Alejandra, Luca, Spencer, Peter and Anthony are in a week-long sprint in San Francisco for some intense Juju-related work (lucky them!)
  • developer.ubuntu.com: we have been working with the Community team to update the site’s design to be more in line with www.ubuntu.com and the first iteration will be going live soon
  • Fenchurch: we are now working on a new download service

Release day at the Canonical office in London

Have you got any questions or suggestions for us? Would you like to hear about any of these projects and tasks in more detail? Add your thoughts in the comments.

Read more
Katie Taylor

App Design Clinic #3

Today’s clinic included:

  • A discussion of launcher placement
  • Brad Wells’ Bible app – general design guidance
  • Michael Zanetti’s uAuthenticator app – general design guidance, plus a discussion of naming and how names appear in the dash and in the app header

The next clinic is on Wednesday 6th November. We love discussing design, so please send any designs and/or questions to design@ubuntu.com.

Read more
Luca Paulina

Over the last year we have been working on the Juju GUI to reach a broader audience. Juju is a way of building complex cloud environments: it connects different services, allows complex configuration and makes it possible to scale out quickly and easily. Juju is offered as a command-line tool or as a GUI on the web.

The team

For the last 6 months a small dedicated team has been working together to push the design of the Juju GUI forward. The design team consists of 2 user experience designers, Alejandra and Luca, and 2 visual designers, Jamie and Spencer. The project has raised many questions, and one of them was what it is like to design a product you don’t use. In this blog post Jamie and Luca attempt to clarify our process.

No assumptions

Luca: As a user experience designer, part of my process is to create assumptions to further thought, design and development; these are later validated in interviews with stakeholders, in user testing or with the development team. An assumption is something that is generally accepted as being true without proof. I’ll never be a direct user of Juju, therefore creating assumptions about the type of audience the Juju GUI is designed for is an interesting challenge.

To help build assumptions, ideate and create cohesive user flows that will later be tested, I’ve had to run planned and impromptu workshops, ask questions, have daily hangouts with the development team, run week-long sprints, ask more questions and lock myself away in the Juju war room to immerse myself in the world of Juju.

The Juju war room

Jamie: From a visual perspective this digital product is unlike anything that I’ve worked on in the past. While some rules of typography, hierarchy and readability still apply to the design, I’ve found myself far more focused on subtle detailing and refinement than ever before. This is because users of the GUI want to complete tasks: they want to be able to deploy their environments as quickly and painlessly as possible. So the design job became about helping them do that without the GUI getting in the way. It is intended to lie lightly across the canvas, aiding users when they need it and not obstructing them when they don’t.

The Juju GUI

Extensive and continuous research

Luca: I’m always surprised by the sheer amount of complexity that the GUI entails. The varying needs of our core target audiences mean that we have to conduct a lot of research when we create user flows and ideas, and when we’re examining whether a feature is needed. Thankfully we have a great user research team which helps find users, conducts the testing and helps interpret the results.

I’ve found that with this particular product the interpretation of feedback has been key to making sure our designs resonate with our users. The feedback is catalogued in a document and shared out amongst the development teams to gain their insights and ideas as well. Solutions are then ideated, and the design team acts upon them, creating new designs.

Jamie: The user testing results and feedback from the community have been key to the development of the visual style for the GUI. We’ve been through numerous rounds of testing to get to this stage of design development, and each round of tests has moved the design forward. Once a round of testing has been completed, the team reviews the findings and creates design tasks to solve any issues highlighted by the testers. The users we’ve tested with have been high-level cloud architects and system administrators, so they are familiar with the types of tasks the GUI performs, just not with the way those tasks are performed in the GUI. Assumptions we’ve made about the way they would use the GUI have sometimes been mistaken, so the design really has been guided by the users.

Evolution of Juju’s interface

Constant validation from a multidisciplinary team

Luca: Throughout the project the need for validation of concepts and ideas has been incredibly important. The agile process we use allows us to create wireframes and designs quickly and get them in front of the dev team for their insight and feedback, and we’re lucky enough to have a near 24-hour working cycle (teams in Europe, North America and Australasia). Because of this, it’s not uncommon for a design to go through many iterations in a week. For example, the inspector wireframes (pictured below) went through 9 revisions in 10 working days; the complexity of the inspector design and experience was refined and finessed collaboratively with the development team, which has turned the inspector into an integral and very powerful part of the GUI.

Detailed wireframes for the inspector

Jamie: Working within an agile process has meant that design decisions have to be made quickly and collaboratively within the team. The design team in London is small, so we can share work internally and move designs on, sometimes multiple times a day. This means we’re able to keep up with a development cycle that releases every 2 weeks, and that users can see the design evolve far faster than if they had to wait for a yearly or biannual release of the product. As a designer it’s been hard seeing the product released when it’s not pixel-perfect, but we’re working hard to craft, fine-tune and round off its edges so that it will be a beautiful thing to use and interact with at each new release.

Visual iterations of the inspector

Questioning language and terminology

Luca: Juju is expanding into a new field: creating clouds by managing services, not machines. This means that there really isn’t a language framework we can rely on, and one thing that has become apparent over the last 6 months is the importance of terminology and language for developers. At the beginning of the project it was difficult and time-consuming to learn the established vocabulary associated with the cloud and Juju, which gave us a great reason to start questioning the words and terms used throughout the GUI. We uncovered words that were already established in other web services and words that didn’t connect with the user. Questioning these words and terms made it clear that if we (as non-users) didn’t understand them, users wouldn’t either, and it allowed us to finesse the language in the GUI into something more appropriate.

Good design principles and patterns

Jamie: The GUI is not just the work of the Cloud team. To harmonise the look of the products in the Canonical stable, we’ve worked closely with the design team developing the phone OS, looking for ways that the design patterns they have developed can be applied to the Juju GUI. We’ve also worked with the Web team to see where we can integrate elements from their UI library. The GUI is a product but it’s not a mobile OS, and equally we interact with it in a desktop web browser but it’s not a website, so it ultimately has to have its own look. But by pooling the collective design wisdom of the teams who have been crafting interactions in their specific fields, and by using patterns and guidelines already defined in this space, we can create an interface that is better than the sum of these parts but with its own clear voice.

Good design practices

Jamie: We like to sketch here. We sketch everything out before any work is done on screen, and it’s enormously useful for iterating quickly through the problems that users have and coming up with multiple solutions to them in a collaborative way. With a small team we can sketch our way through multiple problems towards multiple solutions, and then move into applications like Photoshop and Illustrator once we’ve got a clear direction for the UX. This fast way of working also allows us to keep pace with the development cycle and to add features to the GUI each time we do a release. Once a feature of the GUI is open to the world we’ll gather feedback, and then it’s back to the drawing board to refine it.

UX sketching during a recent sprint

Playing to our strengths

Luca: Most of the processes to provision, create and manage services in the cloud are currently carried out via the command line. A priority for us has been to think about how we can use visual language to provide a layer of information and understanding not readily available via the command line. As designers we understand that with colour, structure, layout and flow we can communicate the status of a system or process in a very powerful way. We have made it our goal to bring out the strengths of the GUI by exploring visual metaphors and relationships. We established that the command line is an input/output tool; the GUI doesn’t have that type of interaction and offers a more holistic approach, which we deliver through a clear hierarchy and concise user flows. Early on in the project we made it a principle not to compete with the command line but to embrace it: there are users out there who will use Juju just as a command-line tool, just through the GUI, or as a mix of both.

Playful icons help users navigate the GUI

Final thoughts

Pretty much everyone in the team has been involved in the conceptual stage of the project, and this has helped us create a cohesive product with some really powerful features. I’m sure there are a lot of designers out there working on designs for products that they won’t end up using. We wanted to take the time to highlight how we’ve approached this problem while working on the Juju GUI project. The coming months will see a redesign of the navigation bar, notifications, service blocks and relationship lines. We’ve given you a preview of some of these features in the visuals above.

Read more
Inayaili de León Persson

We might have been quiet, but we have been busy! Here’s a quick overview of what the web team has been up to recently.

In the past month we’ve worked on:

  • New juju.ubuntu.com website: we’ve revamped the information architecture, revisited the key journeys and updated the look to be more in line with www.ubuntu.com
  • Fenchurch (our CMS): we’ve worked on speeding up deployment and continuous testing
  • New Ubuntu OpenStack cloud section on www.ubuntu.com/cloud: we’ve launched a restructured cloud section, with links to more resources, clearer journeys and updated design
  • Juju GUI: we’ve launched the brand new service inspector

And we’re currently working on:

  • 13.10 release updates: the new Ubuntu release is upon us, and we’re getting the website ready to show it off
  • A completely new project that will be our mobile/responsive pilot: we’re updating our web patterns to a more future-friendly shape, investigating solutions to handle responsive images, and we’ve set up a (growing) mobile device testing suite — watch this space for more on this project
  • Fenchurch: we’re improving our internal demo servers and enhancing performance on the downloads page to help deal with release days!
  • Usability testing of the new cloud section: following the aforementioned launch, Tingting is helping us test these pages with their target audience — and we’ve already found loads of things we can improve!
  • A new canonical.com: we haven’t worked on Canonical’s main website in a while, so we’re looking into making it leaner and meaner. As a first stage, Carla has been conducting internal interviews and analysing the existing content
  • Juju GUI: we’re designing on-boarding and a new notification system, and we’re finalising designs for the masthead, service block and relationship lines

We’ve also learnt that Spencer’s favourite author is Paul Auster. And Tristram wrote a post on his blog about his first experience with Juju.

Spencer giving his 5×5 presentation at last week’s web team meeting

Have you got any questions or suggestions for us? Would you like to hear about any of these projects and tasks in more detail? Please let us know your thoughts in the comments.

Read more
Katie Taylor

Wednesday App Clinic – Update

Over the last few weeks, we’ve enjoyed running 2 Wednesday App Clinics. Thanks to all those who sent us their apps and questions. It’s been a fantastic response! It certainly has helped us to see how you’ve been using the design guides and the components, and I hope it has helped you.

The first clinic started with an introduction and then feedback on Brad Wells’ Blackjack app and Marcin Leśniowski’s Skydiving logbook app. The second clinic included Michael Zanetti’s GetMeWheels, Daniel Beck’s RamSamSam RSS reader and Szymon Waliczek’s uShopper shopping list apps.


If you would like feedback or to ask a particular question (and to see your app featured!) send a screenshot or link to design@canonical.com before 1pm UTC on Tuesday.

To watch previous clinics, go to the Ubuntu OnAir YouTube channel at http://www.youtube.com/UbuntuOnAir.

The clinics are on Wednesdays at 1pm UTC at http://ubuntuonair.com/. Join us (or watch later) to find out more.

Read more