Canonical Voices

Colin Ian King

Analysis of Phoronix Test Suite Benchmarks

I've recently been investigating a wide range of benchmarking tests to find suitable candidates for tracking down performance regressions in Ubuntu.  Over the past 3 weeks I have attempted to run the entire set of tests in the Phoronix Test Suite on a low-end Xeon server to determine a subset of reliable tests.

For benchmarks to be reliable they must show little variation in results when run multiple times.  They also need to run in a timely manner; waiting several days for a set of results is not very timely.

Linked here is a PDF of my set of results.  Some of the Phoronix tests are not listed as they either took way too long to complete or just didn't run successfully on the server.  Tests that have low variability in the results (that is, the standard deviation of the test runs is low compared to the average) are marked in green; tests with high variability are marked in red.

My testing shows that a large proportion of tests have quite large variability, greater than 5% (percentage standard deviation), so they are probably not trustworthy when comparing machines whose benchmark results differ only slightly.  For regression testing, I'm only going to consider tests with a variability of less than 2.5%, as this seems like a good way to filter out the more jittery test results.
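
To make the criterion concrete, the percentage standard deviation (standard deviation divided by the mean) for a set of run results can be computed with a small awk snippet; the five numbers below are made-up sample results for illustration, not figures from the report:

# compute mean, standard deviation and % standard deviation for a set of run results
echo "102.1 99.8 101.5 100.9 98.7" | awk '{
  for (i = 1; i <= NF; i++) { sum += $i; sumsq += $i * $i }
  mean = sum / NF
  sd = sqrt(sumsq / NF - mean * mean)
  printf "mean=%.2f stddev=%.2f %%stddev=%.2f\n", mean, sd, 100 * sd / mean
}'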

The bottom line is that some tests are just too variable to be deemed a solid benchmark.  It's always worth sanity checking tests before using them as a gold standard.

Read more
Colin Ian King

Linux I/O Schedulers

The Linux kernel I/O schedulers attempt to balance the need to get the best possible I/O performance while also trying to ensure the I/O requests are "fairly" shared among the I/O consumers.  There are several I/O schedulers in Linux, each trying to solve the I/O scheduling issues using different mechanisms/heuristics, and each has its own set of strengths and weaknesses.

For traditional spinning media it makes sense to try and order I/O operations so that they are close together to reduce read/write head movement and hence decrease latency.  However, this reordering means that some I/O requests may get delayed, and the usual solution is to schedule these delayed requests after a specific time.   Faster non-volatile memory devices can generally handle random I/O requests very easily and hence do not require reordering.

Balancing the fairness is also an interesting issue.  A greedy I/O consumer should not block other I/O consumers and there are various heuristics used to determine the fair sharing of I/O.  Generally, the more complex and "fairer" the solution the more compute is required, so selecting a very fair I/O scheduler with a fast I/O device and a slow CPU may not necessarily perform as well as a simpler I/O scheduler.

Finally, the types of I/O patterns on the I/O devices influence the I/O scheduler choice, for example, mixed random read/writes vs mainly sequential reads and occasional random writes.

Because of the mix of requirements, there is no such thing as a perfect all-round I/O scheduler.  The defaults are chosen to be a good choice for the general user; however, this may not match everyone's needs.  To clarify the choices, the Ubuntu Kernel Team has provided a Wiki page describing the options and how to select and tune the various I/O schedulers.  Caveat emptor applies: these are just guidelines and should be used as a starting point for finding the best I/O scheduler for your particular need.
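
As a minimal illustration (not part of the original article; sda is just an example device name, and the schedulers listed depend on your kernel), the scheduler in use can be inspected and changed per block device through sysfs:

cat /sys/block/sda/queue/scheduler
# prints something like: [mq-deadline] kyber bfq none   (the scheduler in brackets is active)
echo bfq | sudo tee /sys/block/sda/queue/scheduler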

Read more
Max

Upgrading my blog theme with the Vanilla CSS Framework

Recently I have spent more and more time thinking about the usability of my website and the cleanliness of its code.

This has brought me to another iteration in its design process.

Of course, also related to my employment at Canonical, I have decided to give our Vanilla Framework a try.

The first thing I had to update was my gulp task, so that it includes node_modules during the build.

//  CSS
// (gulp, sass from gulp-sass, sourcemaps from gulp-sourcemaps and the `source`
//  path variable are assumed to be defined earlier in the gulpfile)
gulp.task("css", function () {
  return gulp
    .src(`${source}/assets/css/styles.scss`)
    .pipe(sourcemaps.init())
    // includePaths lets sass resolve imports from node_modules
    .pipe(sass({
      includePaths: ['node_modules']
    }).on('error', sass.logError))
    .pipe(sourcemaps.write("."))
    .pipe(gulp.dest("assets/built/"));
});

Otherwise it wouldn't pick up on the imports. This is also described on the framework page.
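
For this to work, vanilla-framework has to be present under node_modules in the first place; assuming npm is used as the package manager, that is a one-off:

npm install --save vanilla-framework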

Then I included the basic settings from Vanilla in my styles file with:

@import "vanilla-framework/scss/settings";
@import "vanilla-framework/scss/base";
@include vf-base;

The @include is necessary because Vanilla provides mixins via the imported files. This way I can control exactly what I want on my blog, thus reducing bloat and data usage.

Just by doing this, and given its semantic HTML, the website already looks pretty nice.

One thing that I would still like to highlight, though, is the main content versus the header of the page. After browsing through the patterns at https://docs.vanillaframework.io/en/ for a bit, I decided that I only need the `p-card` pattern to achieve this.

After including this with @import "vanilla-framework/scss/patterns_card"; and using the needed mixins for the default card, the highlighted one and the content inside the card with

@include vf-p-card-default;
@include vf-p-card-highlighted;
@include vf-p-card-specific-content;

I just had to add classes to my articles on the index page and the content on the post page to achieve the following:

One thing that still bothers me here is that the content sits squarely on the left side. So I went ahead and adjusted some of the margins on the sides and on the header.

An additional thing I added (or actually removed) afterwards is the font. If you set the $font-base-family variable before including the Vanilla base mixin, it will replace the default Ubuntu font. As my goal is a very data-saving, content-focused website, I have replaced it with serif, sans-serif. This will use the fonts that are installed on the system, in that order.

So if, for any reason, no font with [serifs](https://en.wikipedia.org/wiki/Serif) is found, a font without serifs will be used instead.

Summary

In a few hours I was able to reach a design that is both clear and allows the user to focus on the content, which is written text. The Vanilla Framework, and the way it is set up, made it possible for me to focus on the things that are important to me and disregard the rest.

This way I can leverage all the design decisions around readability etc. that the Vanilla team has put into the framework and still have a bloat-free, brutalist website design to my liking that represents me on the Internet.

This is a great speed improvement and asset for someone like me who appreciates great design, but spends a lot of time in the backend.

Read more
K. Tsakalozos

MicroK8s is a local deployment of Kubernetes. Let's skip all the technical details and just accept that Kubernetes does not run natively on MacOS or Windows. You may be thinking "I have seen Kubernetes running on a MacOS laptop, what kind of sorcery was that?" It's simple: Kubernetes is running inside a VM. You might not see the VM, or it might not even be a full-blown virtual system, but some level of virtualisation is there. This is exactly what we will show here. We will set up a VM and install MicroK8s inside it. After the installation we will discuss how to use the in-VM Kubernetes.

A multipass VM on MacOS

Arguably the easiest way to get an Ubuntu VM on MacOS is with multipass. Head to the releases page and grab the latest package. Installing it is as simple as double-clicking on the .pkg file.

To start a VM with MicroK8s we run:

multipass launch --name microk8s-vm --mem 4G --disk 40G
multipass exec microk8s-vm -- sudo snap install microk8s --classic
multipass exec microk8s-vm -- sudo iptables -P FORWARD ACCEPT

Make sure you reserve enough resources to host your deployments; above, we got 4GB of RAM and 40GB of hard disk. We also make sure packets to/from the pod network interface can be forwarded to/from the default interface.

Our VM has an IP that you can check with:

> multipass list
Name          State     IPv4            Release
microk8s-vm   RUNNING   10.72.145.216   Ubuntu 18.04 LTS

Take a note of this IP since our services will become available there.

Other multipass commands you may find handy:

  • Get a shell inside the VM:
multipass shell microk8s-vm
  • Shutdown the VM:
multipass stop microk8s-vm
  • Delete and cleanup the VM:
multipass delete microk8s-vm 
multipass purge

Using MicroK8s

To run a command in the VM we can get a multipass shell with:

multipass shell microk8s-vm

To execute a command without getting a shell we can use multipass exec like so:

multipass exec microk8s-vm -- /snap/bin/microk8s.status

A third way to interact with MicroK8s is via the Kubernetes API server listening on port 8080 of the VM. We can use microk8s' kubeconfig file with a local installation of kubectl to access the in-VM Kubernetes. Here is how:

multipass exec microk8s-vm -- /snap/bin/microk8s.config > kubeconfig

Install kubectl on the host machine and then use the kubeconfig:

kubectl --kubeconfig=kubeconfig get all --all-namespaces
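
If kubectl is not yet present on the host, one straightforward option on MacOS (assuming Homebrew is available) is:

brew install kubectl
# then, using the kubeconfig we extracted from the VM:
kubectl --kubeconfig=kubeconfig get nodes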

Accessing in-VM services — Enabling addons

Let’s first enable dns and the dashboard. In the rest of this blog we will be showing different methods of accessing Grafana:

multipass exec microk8s-vm -- /snap/bin/microk8s.enable dns dashboard

We check the deployment progress with:

> multipass exec microk8s-vm -- /snap/bin/microk8s.kubectl get all --all-namespaces

After all services are running we can proceed to look at how to access the dashboard.

The Grafana of our dashboard

Accessing in-VM services — Use the Kubernetes API proxy

The API server is on port 8080 of our VM. Let's see what the proxy path looks like:

> multipass exec microk8s-vm -- /snap/bin/microk8s.kubectl cluster-info
...
Grafana is running at http://127.0.0.1:8080/api/v1/namespaces/kube-system/services/monitoring-grafana/proxy
...

By replacing 127.0.0.1 with the VM’s IP, 10.72.145.216 in this case, we can reach our service at:

http://10.72.145.216:8080/api/v1/namespaces/kube-system/services/monitoring-grafana/proxy

Accessing in-VM services — Setup a proxy

In a very similar fashion to what we just did above, we can ask Kubernetes to create a proxy for us. We need to request the proxy to be available to all interfaces and to accept connections from everywhere so that the host can reach it.

> multipass exec microk8s-vm -- /snap/bin/microk8s.kubectl proxy --address='0.0.0.0' --accept-hosts='.*'
Starting to serve on [::]:8001

Leave the terminal with the proxy open. Again, replacing 127.0.0.1 with the VM's IP, we reach the dashboard through:

http://10.72.145.216:8001/api/v1/namespaces/kube-system/services/monitoring-grafana/proxy

Make sure you go through the official docs on constructing the proxy paths.

Accessing in-VM services — Use a NodePort service

We can expose our service on a port on the VM and access it from there. This approach uses the NodePort service type. We start by spotting the deployment we want to expose:

> multipass exec microk8s-vm -- /snap/bin/microk8s.kubectl get deployment -n kube-system  | grep grafana
monitoring-influxdb-grafana-v4 1 1 1 1 22h

Then we create the NodePort service:

multipass exec microk8s-vm -- /snap/bin/microk8s.kubectl expose deployment.apps/monitoring-influxdb-grafana-v4 -n kube-system --type=NodePort

We now have a port for the Grafana service:

> multipass exec microk8s-vm -- /snap/bin/microk8s.kubectl get services -n kube-system  | grep NodePort
monitoring-influxdb-grafana-v4 NodePort 10.152.183.188 <none> 8083:32580/TCP,8086:32152/TCP,3000:32720/TCP 13m

Grafana is on port 3000, mapped here to 32720. This port is randomly selected so it may vary for you. In our case, the service is available at 10.72.145.216:32720.

Conclusions

MicroK8s on MacOS (or Windows) will need a VM to run. This is no different from any other local Kubernetes solution and it comes with some nice benefits. The VM gives you an extra layer of isolation. Instead of using your host directly and potentially exposing the Kubernetes services to the outside world, you have full control over what others can see. Of course, this isolation comes with some extra administrative overhead that may not be desirable for a dev environment. Give it a try and tell us what you think!

Links

CanonicalLtd/multipass


MicroK8s on MacOS was originally published in ITNEXT on Medium.

Read more
Colin Ian King

New features in Forkstat

Forkstat is a simple utility I wrote a while ago that can trace process activity using the rather useful Linux NETLINK_CONNECTOR API.   Recently I have added two extra features that may be of interest:

1.  Improved output using some UTF-8 glyphs.  These are used to show process parent/child relationships and various process events, such as termination, core dumping and renaming.   Use the new -g (glyph) option to enable this mode. For example:
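
Running forkstat with the new option is as simple as the following (it typically needs root privileges to receive the netlink connector process events):

sudo forkstat -g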


In the above example, the program "wobble" was started and forks off a child process.  The parent then renames itself to wibble (indicated by a turning arrow). The child then segfaults and generates a core dump (indicated by a skull and crossbones), triggering apport to investigate the crash.  After this, we observe NetworkManager creating a thread that runs for a very short period of time.   This kind of activity is normally impossible to spot while running conventional tools such as ps or top.

2. By default, forkstat will show the process name using the contents of /proc/$PID/cmdline.  The new -c option allows one to instead use the 16 character task "comm" field, and this can be helpful for spotting process name changes on PROC_EVENT_COMM events.

These are small changes, but I think they make forkstat more useful.  The updated forkstat will be available in Ubuntu 19.04 "Disco Dingo".

Read more

Snapcraft 3.0

The release notes for snapcraft 3.0 have been long overdue. For convenience I will reproduce them here too.

Presenting snapcraft 3.0

The arrival of snapcraft 3.0 brings fresh air into how snap development takes place! We took the learnings from the main pain points you had when creating snaps in the past several years, and we introduced those lessons into a brand new release - snapcraft 3.0!

Build Environments

As the cornerstone for behavioral change, we are introducing the concept of build environments.
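
As a minimal sketch of what this looks like in practice (assuming the snapcraft snap and a project that declares a base; see the full release notes for details):

sudo snap install snapcraft --classic
snapcraft    # with a base declared, the snap is built inside a clean, isolated build environment rather than directly on the host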

Read more
Colin Ian King

High-level tracing with bpftrace

Bpftrace is a new high-level tracing language for Linux using the extended Berkeley packet filter (eBPF).  It is a very powerful and flexible tracing front-end that enables systems to be analyzed much like DTrace.

The bpftrace tool is now installable as a snap. From the command line one can install it and enable it to use system tracing as follows:

sudo snap install bpftrace
sudo snap connect bpftrace:system-trace

To illustrate the power of bpftrace, here are some simple one-liners:

 # trace openat() system calls
sudo bpftrace -e 'tracepoint:syscalls:sys_enter_openat { printf("%d %s %s\n", pid, comm, str(args->filename)); }'
Attaching 1 probe...
1080 irqbalance /proc/interrupts
1080 irqbalance /proc/stat
2255 dmesg /etc/ld.so.cache
2255 dmesg /lib/x86_64-linux-gnu/libtinfo.so.5
2255 dmesg /lib/x86_64-linux-gnu/librt.so.1
2255 dmesg /lib/x86_64-linux-gnu/libc.so.6
2255 dmesg /lib/x86_64-linux-gnu/libpthread.so.0
2255 dmesg /usr/lib/locale/locale-archive
2255 dmesg /lib/terminfo/l/linux
2255 dmesg /home/king/.config/terminal-colors.d
2255 dmesg /etc/terminal-colors.d
2255 dmesg /dev/kmsg
2255 dmesg /usr/lib/x86_64-linux-gnu/gconv/gconv-modules.cache

 # count system calls using tracepoints:  
sudo bpftrace -e 'tracepoint:syscalls:sys_enter_* { @[probe] = count(); }'
@[tracepoint:syscalls:sys_enter_getsockname]: 1
@[tracepoint:syscalls:sys_enter_kill]: 1
@[tracepoint:syscalls:sys_enter_prctl]: 1
@[tracepoint:syscalls:sys_enter_epoll_wait]: 1
@[tracepoint:syscalls:sys_enter_signalfd4]: 2
@[tracepoint:syscalls:sys_enter_utimensat]: 2
@[tracepoint:syscalls:sys_enter_set_robust_list]: 2
@[tracepoint:syscalls:sys_enter_poll]: 2
@[tracepoint:syscalls:sys_enter_socket]: 3
@[tracepoint:syscalls:sys_enter_getrandom]: 3
@[tracepoint:syscalls:sys_enter_setsockopt]: 3
...

Note that it is recommended to use bpftrace with Linux 4.9 or higher.
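
A quick way to check which kernel you are running:

uname -r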

The bpftrace github project page has an excellent README guide with some worked examples and is a very good place to start.  There is also a very useful reference guide and one-liner tutorial too.

If you have any useful bpftrace one-liners, it would be great to share them. This is an amazingly powerful tool, and it would be interesting to see how it will be used.

Read more
Tim McNamara

Hello world!

Welcome to Canonical Voices. This is your first post. Edit or delete it, then start blogging!

Read more
Max

Impulse

Bug hunting on iOS 6.3.1

My SO and I recently got an old iPad 2, which we were planning to use for everyday tasks around the house: having it ready to research a recipe, get information about a topic or look up things to do together.

Unfortunately, the iOS version it came preinstalled with was 9.3.5, and on this later version of iOS there is quite a problem with responsiveness.
Probably due to the big alterations on the design side, the iPad feels slow and has noticeable lag when switching between applications, and even when using the web browser. The argument for this was that it never seemed that slow before, and the hardware should not have worn down.
So I decided to try to downgrade the iPad and compare a much older version, namely 6.3.1, to the 9.3.5 that was installed.

After doing so successfully via iTunes and an old [IPSW](https://en.wikipedia.org/wiki/IPSW) file, it was clear that the assumption was correct. Being back on the old version improved the performance significantly.
This only goes as far as the web, though. When visiting certain websites, errors started bringing user satisfaction back down again.

Here are a few screenshots of some broken styling:

And some websites were still a pleasure to use, here are the BBC and RA as examples:

Reflection

This made me want to go onto my own blog and start breaking it.
Given all the modern tech that I use for this theme, I would not be surprised to find something not working correctly, mostly due to the CSS Grid specification I am using for the styling.

And indeed it did not take long:
First of all, the header is broken, with the navigation links appearing horizontally instead of vertically.
Additionally, after some scrolling the header starts covering most of the content.

[Screenshot: broken header]
[Screenshot: header covering content]

My speculation is that the first bug is caused by the CSS Grid, while the latter could be related to the vh and vw units I am using.

Action

Since I am now in possession of a great testing device, I set out to fix this.
By changing the address of my development server to 0.0.0.0:8000, thus exposing it to my local network, and accessing it on my iPad via IP.OF.MY.LAPTOP:8000, I have a quick feedback loop to test out my ideas and refactorings.
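
As an aside, the laptop's address on the local network (the IP.OF.MY.LAPTOP placeholder above) can be looked up on a Linux host with, for example:

hostname -I    # lists the host's IP addresses; pick the one on your local network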

After looking at the stylings for the header, the position: fixed; top: 0; turned out to be the problem.
Since this was a problem that was not solved elegantly initially, I decided to throw it out completely and instead iterate on the website design.

Since design in general is an iterative process, I will take this as a major stepping stone in finding the right one for my blog.
For now the user (so you) will have to scroll back up to the top of the page in order to navigate. Given all the advanced technology that is on this site, and its relative simplicity, I will start the layout from scratch, this time using old methods and then adding modern techniques on top via cascading rules.

Summary

This rather random chain of events has again taught me, and hopefully some of you that are reading this, a lesson. While I always speak up for accessibility and its importance in our work, I wanted to write the theme for this blog using all the latest technologies, making my developer life comfortable and the code elegant.
This hypocrisy was made obvious just now, and I will have to backtrack and rework some parts. It also goes to show that some problems stem from architectural decisions rather than a specific bug. My assumptions about what could be breaking were not too far off, but as it turns out the problem has more depth.
While the CSS is easy to refactor and well decoupled, this could have become a nightmare if the project were bigger, since I have already accumulated tech debt and would just keep on growing it.
But as always, a mistake made now is a mistake avoided in the future.

Cover photo by Tim Goedhart on Unsplash

Read more
Max

Adding Comments

The landscape of software that helps with adding comments to a website does not shine so bright. A few big closed-source companies, such as Disqus or Facebook, offer an API to integrate comments. The problem with these services is that they own all the data and we have no idea what we expose the website visitor to.

While WordPress does come with comments baked into the platform, unfortunately Ghost and others do not. This article has a good overview of the problem in Ghost, and I would identify myself as the power user that wants a good product and respects users' privacy.
That is why I kept looking for solutions before seriously considering moving away from Ghost.

Looking at the comments (this is why we do need them) on the above article is where I found the solution that works for me: Commento.
Available on GitLab and with pretty good setup documentation, this commenting service offers functionality similar to Disqus while giving you all the benefits of hosting the software on your own server. Of course you can also let the company host it for you, at a really reasonable price, while still being able to see the codebase it is running on.

If you care about your users and would like to provide the people that visit your website with privacy and no third-party ads, then you should give Commento a try.

Btw, I have no connection to them whatsoever. I just really like their product and would love for others to discover it!

Updated privacy policy on this Blog

Since the commenting functionality needs to store data I have updated the Privacy Policy.

Read more
Max

Theme update

By chance I stumbled across an old device with an outdated browser and decided to go with an entirely new approach for the theme of this blog.
The motivation was support for this device, which led me to reanalyze the whole concept of only using modern techniques. An article on this will follow in the coming weeks.

The new design and codebase are much more functional and simple, focusing on content and on supporting as many devices as possible.
While the series on creating this theme is still an interesting read about the things I tried, it is now outdated and I would not recommend following it for the actual styling.
If you are interested in setting up the build system for a theme, then it is still relevant.

If you find any bugs, please contact me and I will try to get them fixed ASAP.

Read more
Max

Please note that this theme is not in use anymore. See this post for more info.

Ghost theme - Part 4: Finishing the Sidebar

On mobile layouts (less than 360px wide) the sidebar still overflows vertically. Let's fix that and implement the features decided on in Part 3.

First a small screenshot of the problem:

As you can see, the sidebar flows into the other content.
To fix this, let's remove some margins and change the nav to list its items horizontally.

This is all done with the following bit of CSS in the styles.css file, since we are changing the default layout.

.sidebar {
  grid-area: Sidebar;
  display: flex;
  flex-wrap: wrap;
  max-height: var(--sidebar-height);

  & .page-title {
      margin: 0px;
  }

  & .page-description {
      margin: 0px;
  }

  & .navigation {
    & .nav {
      display: flex;
      list-style: none;
      padding: 0px;
      margin: 0px;
    }
  }
}
@media screen and (min-width: 961px) {
    .sidebar {
        position: relative;
    }
}

This has actually become quite a lot to keep in the styles.css file.
Let's extract it into components/sidebar.css and import it.

All should look good now. Except when the content doesn't wrap!
Here is a screenshot of what I mean:

Everything is in a row. The fix is quite easy with flexbox. Let's just set the flex-direction to column. That should fix it. Right?

Almost. The flex-wrap: wrap from earlier now messes things up. So let's just get rid of it on the sidebar. It worked great when the direction was row, but now it wraps at the vertical ends, which is not what we want.

Now the sidebar looks pretty nice and our layout is almost done!

The features

Let's go on to the features:

  1. when scrolling the Title disappears
  2. when scrolling the Description disappears
  3. Only the Navigation is shown and the sidebar should shrink to its size but stay at the top

With position: sticky; the last one is pretty easy to do these days.
But to hide the other two we would need to restructure the HTML of our sidebar, since the position is relative to its parent, which means it would still go off the screen together with it.

So the last resort here is JavaScript together with position: fixed;. In general we should not use JavaScript for styling, but for this weird case let's make an exception and add a new class to the sidebar when it scrolls out of view.
Using that class we can then apply the correct styles to achieve our goal, and maybe add a nice CSS animation.

Let's code

To use JavaScript, create the file assets/js/helpers/styling.js.
The gotede build tools will take care of doing all the hard work of getting it onto the page, such as using babel and concatenating the JavaScript files you create.
Just take note that imports won't work. Since this is a theme for purely server-side rendered pages, that is totally fine. The JavaScript should be kept to a minimum here.

It should contain the following code, which will attach a new class to the Sidebar when we scroll down and then remove this class again once we scroll to the top:

function initScrollingListener() {
  const sidebar = document.getElementsByClassName("sidebar")[0];

  // Adds the "scrolled" class once we are no longer at the top of the page,
  // then swaps the scroll listener over to removeScrollClass.
  function addScrollClass() {
    if (window.pageYOffset > 0) {
      sidebar.classList.add("scrolled");
      window.removeEventListener("scroll", addScrollClass);
      window.addEventListener("scroll", removeScrollClass);
    }
  }

  // Removes the "scrolled" class again once we are back at the top,
  // then swaps the scroll listener back to addScrollClass.
  function removeScrollClass() {
    if (window.pageYOffset == 0) {
      sidebar.classList.remove("scrolled");
      window.removeEventListener("scroll", removeScrollClass);
      window.addEventListener("scroll", addScrollClass);
    }
  }

  if (window.pageYOffset == 0) {
    window.addEventListener("scroll", addScrollClass);
  } else {
    addScrollClass();
  }
}

initScrollingListener();

You should have some basic understanding of programming to really understand what is happening here but here is a small breakdown:

1. If pageYOffset is 0 we are at the top, so we add a listener that will call our addScrollClass function once we scroll. Otherwise we are not at the top and can call that function directly.
2. addScrollClass adds the class to the sidebar element when we are not at the top of the page and then removes the listener for calling itself on scrolling. Then it adds the listener that calls removeScrollClass when we scroll.
3. removeScrollClass does exactly the opposite of addScrollClass.

Here is a list of all the DOM functionalities that are used in this piece of code: window, document, getElementsByClassName, window.pageYOffset, Element.classList, addEventListener, removeEventListener

Maybe you noticed that we will call one of these functions every time we scroll. In this case it is a very small amount of code, but it still has to run on the CPU, which is why we should generally avoid these kinds of hacks.

Using the new class

So let's go and implement, in CSS, the things that should happen when we scroll.

.sidebar {
    /* ... */
    position: fixed;
    /* ... */
    &.scrolled {
        height: calc(var(--sidebar-height) / 1.5);
        width: var(--sidebar-width);

        & .page-title {
            display: none;
        }

        & .page-description {
            display: none;
        }
    }
}

This is all we need for now.
It should be inside the .sidebar block and hides both the description and the title when the scrolled class is added.

While it is not an optimal solution, since we use JavaScript for styling, it works in this case and does not put much load on the CPU since we only add two Listeners to our Window.

An Animation

To add some spiciness to our sidebar suddenly reducing in size, we can add a small animation when this happens.
Let's just make a dummy one for now where the background transitions through colors. We can fine-tune this later when we get to the real colors of the theme.

  &.scrolled {
    /*...*/
    animation: background-transition 3s;
    background-color: yellow;
    z-index: 1;
  }
  /*...*/
  
  @keyframes background-transition {
    0% {
        background-color: blue;
    }
    100% {
        background-color: yellow;
    }
  }

One more thing is left to do.
While we have a nice sidebar for the mobile layout, these rules will break the desktop design. Since we would otherwise have to overwrite all these rules in a media query, let's make it easy on ourselves and instead wrap them inside one, keeping the default on the other layouts.
To do this, put all the CSS we just created for the scrolled class inside @media screen and (max-width: 960px), and don't forget to wrap it inside the .sidebar class as well.

The final sidebar.css should look like this:

.sidebar {
  grid-area: Sidebar;
  display: flex;
  flex-direction: column;
  max-height: var(--sidebar-height);

  position: fixed;
  top: 0;

  & .page-title {
      margin: 0px;
  }

  & .page-description {
      margin: 0px;
  }

  & .navigation {
    & .nav {
      display: flex;
      list-style: none;
      padding: 0px;
      margin: 0px;
    }
  }
}

@media screen and (max-width: 960px) {
    .sidebar {
        &.scrolled {
            height: calc(var(--sidebar-height) / 1.5);
            width: var(--sidebar-width);
            animation: background-transition 3s;
            background-color: yellow;
            z-index: 1;

            & .page-title {
                display: none;
            }

            & .page-description {
                display: none;
            }
        }
    }
}

@media screen and (min-width: 961px) {
    .sidebar {
        position: relative;
    }
}

@keyframes background-transition {
    0% {
        background-color: blue;
    }
    100% {
        background-color: yellow;
    }
}

The end

Here is a GIF of the new behaviour:

Did you catch that image overflowing our page? A bug, how nice. I will fix it by next time. If you somehow followed along and want to do this as well, check out the class of such an element and try to find a setting that restricts its width ;)

Otherwise it looks like we are done here from a basic layout point of view.
Of course there is still much needed on the visual front, which is what we will start on in Part 6, where it is going to be all about typography and choosing a font.

Cover: https://unsplash.com/photos/SXihyA4oEJs

Read more