For some reason, I have a printer at home. I think I bought it for printing wet signature-requiring legal documents a few years back, and buying a printer was cheaper/easier than getting things remotely printed and posted back to me. It’s a cheap-ish Brother greyscale laser printer.
Whatever the reason, since having the printer I have become immensely popular with my family, as it turns out that no one else near us has one. Despite no one else owning one, it also turns out that people do need to use them from time to time – e.g. for printing out shipping labels, gift voucher codes for birthday cards, ID document scans for certification, etc.
TL;DR (can I see your setup?): see this note.
–
I’ve now been exclusively using aerc for my day-to-day email workflows for a few months. This has been my first proper foray into using terminal-based mail clients as I never fully got around to trying other ones, such as Mutt (and NeoMutt), but had recently read good things about aerc in various threads and wanted to give it a go. From what I read, it seemed to be modern and actively developed, with a good ecosystem, and with a focus on being user-friendly and extensible.
Whilst browsing my GitHub home feed a little while back (not something I’m in a habit of doing, generally), I stumbled upon the command line journaling tool `jrnl`. I thought it looked interesting, and so subsequently posted about it and had a good discussion on this and alternatives over on Mastodon. `jrnl` also gave Kev Quirk the idea to create his own journaling tool.
During the conversation, Tucker McKnight mentioned that he uses a tool called `nb` for journaling and note taking. It wasn’t something I’d heard of before, and so I was intrigued.
Many moons ago I would write web applications using technologies like PHP or Python to directly serve web content – often using templating engines to more easily handle data display. These apps would thus ship server-side rendered plain HTML, along with a mix of browser-native forms and AJAX (sometimes with jQuery) for some interactivity and more seamless data submission. I’d use additional lightly-sprinkled JavaScript for other UI niceties.
Over the past ten years or so, this all changed and my web apps – including those developed as part of my work and nearly all of my side projects – switched to using front-end frameworks, like React or Vue, to create single page apps (SPAs) that instead solely consume JSON APIs. The ease of adding front-end dependencies using `npm` or `yarn`, advancements in modular and “importable” JavaScript, and browser-native capabilities like the Fetch API’s `json()` function encouraged this further, and the ecosystem continued to grow ever larger. Front-end code would run entirely independently from the back-end, the two communicating (often) only via JSON, and with both “ends” developing their own disparate sets of logic to the extent where progressive web application front-ends could leverage service workers and run entirely offline.
Restic has been my go-to backup tool for many years, and I’ve used it for a wide range of workloads – including for managing filesystem, Postgres, Photoprism and Vaultwarden backups. At the point I was researching suitable backup options for these cases, and considering my personal requirements, I was often stumbling upon discussions comparing Restic to a similar tool – Borg backup – but I eventually settled on using Restic myself, due to its flexibility and the amount of data (measured in terabytes) I intended to back up.
A few months ago, the boy that my husband and I are in the process of adopting came to live with us for good. It was such an incredibly amazing day, full of a mixture of hugely strong emotions, that I’ll certainly never forget. Despite there being (potentially) still months to go before the legal side of things completes, I wanted to reflect on the journey we’ve taken so far in getting to this point. We’re far from unique in embarking on this journey, but I know everyone does have their own unique experiences, and I hope that talking about our particular ones might be some help to others going through the process, or potentially interesting to others.
It’s not too often that a book really makes you think – whether this is about yourself and your own position in the universe, or just about bizarre or interesting concepts. I’ve recently happened across a couple of books written by qntm that definitely do both of these things: Valuable Humans in Transit and Other Stories and There is No Antimemetics Division.
There is No Antimemetics Division is a science fiction dystopian novel(la) introducing (at least, to me) the concept of antimemes – ideas that can’t be shared or passed on. On the face of it, this doesn’t sound particularly distressing, but the book explores the implications of this alongside the setting of malicious (intentionally or otherwise) “entities” known as SCPs (“Secure, Contain, Protect”) that can manipulate and control thoughts, ideas (and even physical or electronic records) in order to prevent the spread of knowledge.
My personal website is generated using Hugo, which allows me to write nearly all of the actual content itself in plain markdown files.
I also maintain a Gemini capsule (hosted at gemini://wilw.capsule.town). For a while I’ve wanted to be able to add more content to this capsule, and to try and keep it updated more consistently over time. However, I don’t really have the capacity to duplicate the time taken to maintain the site (and its blog posts and notes) in order to do so.
Some of my personal projects are beginning to get larger and more complicated, often involving different front-ends and services that all need to be separately built and deployed. Managing all of these is taking away more of my personal time, and I’ve been on the look-out for a good CI/CD automation system for these projects. I primarily use Gitea as a git server, and have been struggling to find a system that suits my needs and works well with Gitea.
I’ve been a member of Goodreads since 2013. I follow a few of my friends and family on there, and whilst it was nice to see the types of things people were reading, I only really ever used the service as a way of logging what I had read. The other social aspects didn’t keep me coming back and I personally didn’t find the home feed interesting.
As I started to get more into the Fediverse back in 2021, I joined BookWyrm (as @wilw@bookwyrm.social). I was able to export my reads as a CSV from Goodreads and import them into BookWyrm, which I continued to use as a method for recording reads and listens. BookWyrm is excellent, easy-to-use, and less clunky than Goodreads. I can certainly recommend it if you’re looking to join or build a community around books.
I self-host a number of services - Nextcloud, FreshRSS, PhotoPrism etc. - at home on a Raspberry Pi. Attached to this Pi is a large SSD to hold the service data and configuration, and all of this is periodically backed-up via Restic to a remote site.
The SSD is simply formatted with `ext4`, and the directory containing all of these services and data is currently encrypted using fscrypt. I (mainly) want to encrypt the data in order to protect against a break-in and theft at my house, however likely or unlikely that is to occur.
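As a rough idea of what that setup involves, the commands below sketch the fscrypt workflow. The device and mount paths are illustrative, and `ext4` filesystems need the `encrypt` feature enabled first:

```shell
# Paths/devices below are placeholders, not my actual setup.
sudo tune2fs -O encrypt /dev/sda1    # enable ext4 encryption support
sudo fscrypt setup                   # one-off global fscrypt setup
sudo fscrypt setup /mnt/ssd          # prepare the mounted filesystem
fscrypt encrypt /mnt/ssd/services    # protect the services directory
fscrypt unlock /mnt/ssd/services     # unlock again after a reboot
```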
Every Mac user seems to have a different way of managing their open applications and windows.
Some people prefer to view each window in “full” mode, in which each window takes up the entire display and the user can cycle apps or use the dock to change the active window. Other people use full-screen mode and/or swipe between desktop spaces to find their apps, or a mixture of several approaches.
Back in 2021 I blogged about how and why I wanted to switch from Google Photos as a storage solution (and source of truth) for my life’s photo and video library. The post compared several solutions, such as Piwigo, Mega, and Nextcloud.
Since then I’ve tried several further options, starting with pCloud (as described in that post), Nextcloud backed by S3, and plain-old Apple Photos.
Update 2024-09-01:
I have since written a note on Tailscale Sidecars which provides a more elegant solution to this problem. I have left this post here for posterity.
Tailscale’s HTTPS feature is an excellent tool for adding TLS connections to web services exposed over the tailnet.
Although traffic over the tailnet is encrypted anyway due to the nature of Tailscale itself, some web-based services work better when served over HTTPS. Since the browser does not know that the underlying connection is secure, it may enforce limits on web services accessed in what it considers an insecure context.
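As a sketch of how this works: once HTTPS is enabled for the tailnet in the admin console, a certificate for a machine can be fetched on that machine with `tailscale cert` (the machine and tailnet names below are placeholders):

```shell
# Fetch a TLS certificate and key for this machine's tailnet name.
tailscale cert myhost.tail1234.ts.net
# The resulting .crt/.key pair can then be handed to a web server or
# reverse proxy to terminate TLS for the service.
```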
The Bear notes app has been my go-to notes app for Mac, iPhone, and iPad for some time now. It’s got a great UX, a customisable UI, and is one of those apps that feels like a (clichéd) “delight” to use.
Bear is written exclusively for Apple devices, and uses CloudKit to sync notes between devices via iCloud. In theory, this isn’t too much of a problem. However, I’ve recently found CloudKit-reliant apps to become a little unreliable.
For several years I’ve been using GatsbyJS to generate the static site content for this website. Gatsby is a great tool and produces blazing-fast websites through the use of an interesting combination of technologies.
In Gatsby, pages are simply React components, and developers can make use of the entire JavaScript and React ecosystems to craft their sites. Config files can be used to create pages that don’t “exist” in your filesystem (e.g. an index page for each tag used in a blog) and GraphQL queries are used to surface content and query data from across the website. Gatsby templates and standard React composition patterns allow for excellent re-use of components.
Making Tax Digital (MTD) is part of the UK Government’s plan for modernising the tax system for both businesses and individuals.
For years, HMRC (the Government’s tax department) has had an online tax system that is infamously complicated and slow to use and update such that even accomplishing simple tasks can be long and painful processes. Part of this is due to the laughably complicated UK tax system itself (rather than the fault of the technology), but some of it can certainly be attributed to the antiquated tooling.
I’ve worked out pretty much every day since March 2020. The only days I’ve missed were because of illness (Covid), or another reason that made it physically (e.g. my dislocated shoulder) or logistically (e.g. during travel) impossible.
These workouts have pretty much always been “at home”, or wherever I happened to be staying at the time. They started during the first wave of UK Covid lockdowns, when I realised I was not getting the exercise I was used to from walking the 3 kilometres to work and back each day.
Since getting a Magic Keyboard for my iPad Pro, I’ve been using the iPad for many areas of work for which before I would have needed a laptop.
In fact, last week I was able to use the iPad full-time when at work in our new office space, and I didn’t need to reach for my MacBook once. When I can, I prefer working on the iPad due to its flexibility, brilliant display, speed, battery life, and more.
Earlier this week I needed to make some changes and re-deploy an old Vue app. I hadn’t touched the codebase in over a year, and my experience with the rate of change in the front-end web space made me dread what would happen if I tried to re-awaken this thing.
Sure enough, after running a `yarn install` and launching the app using the scripts in `package.json`, a number of errors were displayed about Node/Webpack/Vue incompatibilities, and I didn’t really know where to start. I don’t use Vue on a daily basis these days, and so I don’t usually need to make an effort to keep fully up-to-date on its developments, but I knew I was several versions behind on `vue` and `vue-loader`, as well as all the `sass` and `babel` tooling. This wasn’t going to be a quick fix.
If you’re a current follower of this blog then you may already know that I’m a bit of a fan of using plain text accounting for managing finances.
I mainly use the Ledger text file format and CLI tool for bookkeeping and reporting on finances. This works great, and I can quickly and easily generate different kinds of reports using a range of simple Ledger commands.
For example, to generate a quick income/expenses balance sheet for a particular date range I can run `ledger balance income expense -b 2022/03/01 -e 2022/03/31`, which produces something along the lines of the following:
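The report below is illustrative only (made-up accounts and figures), but shows the general shape of a `ledger balance` report: amounts on the left, the account tree on the right, and a net total at the bottom:

```
           £-2,048.00  Income:Salary
            £1,024.00  Expenses
              £256.00    Groceries
              £768.00    Rent
--------------------
           £-1,024.00
```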
Whilst my days of binge drinking as a student are thankfully far in my past, alcohol is still an ongoing, yet much more minor, part of my life.
Like many millennials (and I’m sure it must be the same for other generations too), we got used to it as a mechanism for socialising – whether that’s meeting friends after work, going out for dinner with family, or spending time in the pub or at home with a significant other.
I often talk about self-hosting on this blog, and I’m certainly a big fan of being able to control my own data and systems wherever possible (and feasible). I’ve recently switched from using Nginx to Traefik as a reverse proxy for my server and for terminating TLS connections.
In this post I’ll talk a little about why and how I made this change.
I self-host a number of services, including Nextcloud for file storage and sync, Gitea for git syncing, FreshRSS for RSS feed aggregation and management, Monica for relationship organisation, and a few other things too.
If you’ve ever run your own Nextcloud before, you may have noticed screens like the following in your instance’s settings pages.
The messages advise a number of maintenance procedures to help ensure the smooth running of your instance. These could be to run database migrations or to update schemas in response to installing new apps.
Often these steps might involve running `occ` commands. `occ` is Nextcloud’s command-line interface, and is so-called because of its origins in ownCloud.
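For a flavour of what these look like, here are some common `occ` maintenance commands. They’re run from the Nextcloud installation directory as the web server user (often `www-data`, though that depends on your setup):

```shell
sudo -u www-data php occ maintenance:mode --on     # pause user access
sudo -u www-data php occ db:add-missing-indices    # fix missing DB indices
sudo -u www-data php occ upgrade                   # run pending migrations
sudo -u www-data php occ maintenance:mode --off    # resume normal service
```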
I recently signed the web0 manifesto, which embodies many of the values I consider to be important when it comes to technology - and the web in particular.
web0 is the decentralised web… web0 is web3 without all the corporate right-libertarian Silicon Valley bullshit.
Essentially, web0 is about empowering a decentralised web that:
In practice this could mean owning your own domain name and taking part by hosting a website or through getting involved in other communities, such as in the tildeverse. The key thing is that participants own and can control their own data and that things are accomplished without needing to rely on big-tech.
Twelve months ago - in January 2021 - I started my attempt at the #100DaysToOffload challenge. I had set myself a new year’s resolution to try to write more and, around the same time, I noticed the hashtag for the challenge circulating on Mastodon. It seemed like the perfect opportunity to fulfil my resolution. The challenge is to post 100 times on a personal blog during the space of one year.
🎉 This is post 100 in my attempt at the #100DaysToOffload challenge!
For a couple of years I have been writing mobile apps using the Flutter framework, having previously been a React Native advocate. Flutter is a great tool for writing applications that target multiple platforms and architectures from one code base - and not needing to write any JavaScript is definitely a bonus!
I use and recommend Firebase Cloud Messaging to handle push notifications in these applications. There’s also a great library for Flutter - Flutterfire - to handle the setup and receipt of these messages, along with the requesting of push permissions on iOS. The set-up takes away the pain of managing cross-platform notifications in Android and iOS applications.
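As a rough idea of the setup (commands taken from the FlutterFire tooling; exact steps change between versions, so treat this as a sketch):

```shell
dart pub global activate flutterfire_cli          # install the CLI
flutter pub add firebase_core firebase_messaging  # add the packages
flutterfire configure       # link the app to a Firebase project
```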
For as long as I’ve been using Matrix I’ve hosted my own homeserver on my own VPS and at my own domain.
I previously wrote about how I self-host my homeserver with the help of the Synapse project. Although this set-up is quite straightforward, it’s an extra system to maintain with all of the associated overheads.
One of the reasons I don’t host my own mail server is that I fear missed messages and silent bounces. I trust dedicated mail providers (particularly Fastmail) more than myself in providing a robust enough service to ensure things get through. Equally, if I am telling other people my Matrix handle, then I want to make sure that messages they send (and those that I send) actually get delivered without any problems.
Some people may remember my quest a few months back to find a good alternative to Google Photos for image storage and backup.
At the time, I talked about Piwigo, Mega and pCloud as potential candidates. I also briefly touched upon Nextcloud in that post - a service I use (and self-host) anyway for all of my other storage needs, but I did not consider it further due to the high cost of the associated block storage required to house a large number of images.
Dean Burnett’s The Idiot Brain is an interesting insight into why people think the way they do, personality, emotion, and the biology of the brain.
The author (who happens to live in the same city as me: Cardiff) covers a wide range of examples of human behaviour and relates them to brain function. Often these are based on defence mechanisms developed over the vast time of human evolution, and it’s amazing how our perception of fear and “uncertainty” can have an impact on other feelings and emotions too - such as embarrassment.
Most applications include some sort of outbound transactional email as part of their normal function. These email messages could be to support account-level features (such as password-resets) or to notify the user about activity relevant to them.
In the latter case, such emails might be read, and then archived or deleted by the user, without further direct action. They aren’t typically designed to be something one actions or replies to - they’re mostly there to bring you back into engaging with the platform.
I maintain a small number of projects in my spare time. The amount of time I get to work on and maintain these varies depending on my other workloads.
The projects were never designed to be a means of making additional income, and were usually created simply to solve a need that I (or somebody else I know) had!
By open-sourcing them I hope that others will read through the code, and even check for (and report) problems or potentially contribute. The projects are licensed quite liberally under the BSD license.
A while ago I posted about how I back-up my personal servers to Backblaze B2. That approach involved adding all of the files and directories into a single compressed archive and sending this up to an encrypted bucket on B2.
Whilst this does achieve my back-up goals (and I use Telegram to notify me each time it completes), it felt inelegant. Every time the back-up executed, the entirety of the back-up file - several gigabytes - would be built and transferred.
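An incremental tool like Restic avoids re-uploading everything on each run. A minimal sketch of backing up to a B2 bucket follows; the bucket name, path, and credentials are placeholders:

```shell
export B2_ACCOUNT_ID="..."     # B2 key ID (placeholder)
export B2_ACCOUNT_KEY="..."    # B2 application key (placeholder)
export RESTIC_PASSWORD="..."   # encrypts the repository client-side

restic -r b2:my-backups:server1 init         # one-off repository setup
restic -r b2:my-backups:server1 backup /srv  # only changed data is uploaded
restic -r b2:my-backups:server1 forget --keep-daily 7 --keep-weekly 4 --prune
```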
From the hills of Dusk’s End to the small alleys of Main Street, you feel drawn to the lights of this vibrant metropolis in an uncharted internet territory. The sign reads “Nightfall”.
– Nightfall City
The Nightfall City Gemini capsule (also available via the web) is an internet community in which people can engage with each other and write blog posts and other long-form content.
The community is divided into different districts of Nightfall City: Main Street, Dusk’s End, and Writer’s Lane. These districts allow members (or “citizens”) to post links to their blog posts, zines, or other online articles. From what I can see, people tend to participate in the district or community that best fits them as an individual.
I’ve really enjoyed my recent discovery of a couple of traditional-style webzines. Webzines (sometimes referred to as online magazines, or - in this instance - simply “zines”) are a way of distributing periodic content through the web.
I’m not referring to modern-day online media outlets, but those publications which are typically written by a small number of individuals (often “netizens”) and where the focus is not on advertisements, clickbait, or the mass production of content.
I feel that this book really resonated with my own thoughts around the importance of diversity in groups and teams.
Matthew Syed’s Rebel Ideas: The Power of Diverse Thinking is a book that examines how effectiveness and output can be dramatically altered through building teams that contain diverse thinkers.
The book considers a number of examples - from both the past and present - and compares and contrasts scenarios where teams expressing different proportions of diverse thinkers can change their performance; sometimes with life-threatening consequences.
I’ve recently been reminiscing about the “old” days of the web. They felt much more like expressions of personality and creativity.
These days, most people have social media accounts on mainstream services that act as their sole representation of themselves online. Whilst the content can be different, everyone’s own pages end up looking the same, with avatar images, feeds, and other components whose layouts and “look and feel” are controlled by the service - the creativity is lost and things become bland.
I’ve recently posted about our home, in which we’ve completed a few DIY projects, such as renovating the garden and building a small loft conversion (amongst other things!). Today I’m writing about a project we did on the bathroom in the house.
When we first moved into the house, which was an old student dwelling in disrepair, the first floor had a strange configuration containing a separate toilet and shower room. Both of these rooms were tiny, old, and damp. There was no space for a bath, the shower room had no window, and the extractor fan didn’t work.
If you run a service that accepts file uploads from users, and then subsequent re-download by other users (such as images), then your service is potentially at risk of becoming a system for distributing malware. Without safeguards in place, bad actors could potentially use your service to upload harmful files with the intention of them being downloaded by other users.
Services like Google Drive and some email providers will automatically scan files for malicious payloads, but if you - like many people - rely on more basic object storage for storing files for your apps, then there may be less default protection available.
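One simple safeguard (a hypothetical sketch, not a replacement for a real scanner like ClamAV) is to refuse to serve any upload whose hash appears on a blocklist of known-bad files. The file names below are illustrative:

```shell
# A user-uploaded file (contents are illustrative).
printf 'example payload' > upload.bin

# Hash the upload; a real service would store this at upload time.
sha256sum upload.bin | cut -d ' ' -f1 > digest.txt

# Pretend a threat-intelligence feed has flagged this exact file.
cp digest.txt blocklist.txt

# Refuse to serve the file if its digest matches a blocklisted one.
if grep -qxf blocklist.txt digest.txt; then
  echo blocked > verdict.txt
else
  echo allowed > verdict.txt
fi
```

Hash matching only catches byte-identical files, which is why services layer on proper content scanning as well.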
On Sunday I slipped and fell in the pouring rain. I landed hard on my side and ended up dislocating my shoulder.
I didn’t want to risk putting it back in myself (sensibly, as it turns out!), so I got a taxi to A&E (the emergency ward in UK NHS hospitals). I was lucky enough to be seen within a few minutes; I then had examinations and X-rays, and was given gas. When the doctor got round to me, he managed to push it back into position after a few minutes of moving it around - accompanied by a very loud click.
It’s been a weird 18 months. Before pandemic-initiated changes, our daily lives might have involved getting up and travelling on some form of commute (either by walking, public transport, car, or something else) to a place of work each morning, before reversing the process every evening.
From my perspective, the change to working remotely from home has a number of benefits. Although I do miss seeing people on a daily basis in the office, the flexibility of remote work certainly outweighs any downsides. This is a feeling also echoed by many across my team - especially those with children or other family life that they need to work around.
I’ve recently noticed (and read) more and more posts discussing *BSD systems. Creations like the new (and excellent) OpenBSD Webzine and blogs (such as Rubenerd’s and Solene’s) do a great job in raising awareness of the family of operating systems.
I’m pretty familiar and comfortable with Linux, having spent many years using it as a daily driver (I am back on macOS full-time right now). Whilst UNIX systems share a lot of similarities, I’ve never properly used a BSD system before.
Following on from my previous post about renovating our garden, I wanted to write an entry about another project we’ve recently completed.
Our home is a Victorian townhouse over three storeys, but the top floor has only one bedroom and a bathroom. The rest of that floor is essentially attic (or loft) space, accessible through a hatch in the ceiling of the middle floor.
As such, the loft is quite long, and far too big for us to use as-is. In fact, I’m a big believer that if you need to put something in the loft then you probably don’t actually need it at all. I’m pretty sure we only have Christmas decorations stored up there.
I often enjoy books that try to take a different view on known events. I don’t mean conspiracy theory - more around thinking laterally or “out of the box”. Such ways of thinking often inspire ideas that drive innovative change, and it’s important in order to counter “group think” or simply accepting what’s easiest.
One such book that tries to do this is Extraterrestrial by Avi Loeb.
Another podcast I frequently listen to likely needs no introduction of its own. The This Week in Tech (or just “TWiT”) network’s flagship podcast - also called TWiT - must be one of the longest-running tech podcasts.
The podcast series started back in 2005. It runs weekly, with episodes recorded live each Sunday evening and made available via podcast clients shortly afterwards.
It is hosted by Leo Laporte, who is joined by interesting and varied panelists from across the tech sector. Episodes feature light-hearted discussion of recent news and insights from the technology world.
Last week I gave a talk at the Bitcoin Association BSV Meet-up for Wales, hosted by Tramshed Tech in Cardiff.
Before learning about this meetup, I had not heard of BSV - either from a technology or currency perspective. However, as well as promoting an interesting project, the event welcomes showcases from technologists working across the blockchain space.
At Simply Do we have recently completed a project that aimed to leverage blockchain distributed ledger technology to help protect and manage IP assets in complex international and cross-domain supply chains. The project was a success and this was what I - along with my colleague, John - presented about.
A few years ago I was in the position of needing a solution to backup and sync dotfiles (configuration files for various pieces of software) across my machines.
Specifically, I had Mac computers and Linux servers, and needed a way to nicely keep these files up-to-date between them. For example, I may have spent some time crafting and tweaking files - such as my `.vimrc` and `.tmux.conf` - and needed a way of ensuring all of my devices could access the latest version of these files.
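One widely-used technique for this (not necessarily the approach I ultimately settled on) is a bare git repository with `$HOME` as the work tree:

```shell
# Track dotfiles in a bare repo; the alias keeps normal git usage intact.
git init --bare "$HOME/.dotfiles"
alias dots='git --git-dir=$HOME/.dotfiles --work-tree=$HOME'
dots config status.showUntrackedFiles no   # hide the rest of $HOME
dots add "$HOME/.vimrc" "$HOME/.tmux.conf"
dots commit -m "Track vim and tmux configs"
dots push origin main                      # remote name is a placeholder
```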
Having recently read The Secret Barrister, which I loved, I was recommended to also check out This is Going to Hurt by Adam Kay.
The book is similar to the Secret Barrister in that it’s a collection of insights and stories from a working professional - this time a hospital doctor. The book is subtitled Secret Diaries of a Junior Doctor.
The author tells the story of his experiences in completing medical school and beginning work in the UK National Health Service (NHS) system. NHS doctors generally follow a pre-defined pathway from “F1” through to consultant (or slightly different if a GP), and this book describes experiences of the author as he works his way through this process.
It’s been a few weeks since my last post about the Pinephone. Since then I have been playing further with a different graphical shell and have been trying out new applications.
In that previous post, I noted a few points that made the phone tricky to use as a daily-driver. However, it should be noted that this was (intentionally) based purely on the phone’s out-of-the-box configuration. I fully meant to continue exploring and to discover ways in which the device could become more of a useful daily use phone for me. This post forms part of that journey.
The Secret Barrister: Stories of the Law and How It’s Broken is an outstanding book. In my opinion it is easily the best book I have read in the past year - certainly the most interesting.
The book is written by an anonymous barrister working in the criminal justice system for England and Wales. They explain the current inadequacies of criminal justice through a mix of interesting real-life and often first-hand stories.
I’ve always been crap at learning languages. From an early age my parents would encourage me to learn French, and I picked up Spanish and German at around the GCSE level too (exams we take around the age of 16 in the UK). But things just didn’t really ever sink in.
Part of this would have definitely been down to an unappreciated childhood privilege of understanding English as a first language. As I moved more into the technology domain to follow my interests (and then later for education and work), I was lucky that everything I needed was also in English (from programming languages, technologies, documentation, and more) - of course driven primarily by the US big-tech sector.
For many developers, the notion of adding accessibility features - such as `alt` text attributes on web page images and integrations with host usability enhancements, such as screen-zoom and text-to-speech - might feel like a chore; especially for those still in the startup or “do things that don’t scale” phase.
It should no longer be about “adding” accessibility features any more than one these days “adds” a mobile-friendly version of their site (long-surpassed by responsive design and mobile-first principles) or even “adds” a button to perform a specific task. Accessibility features are a core part of any product, and should factor into the software development process right through from requirements engineering to planning, design, implementation, and testing.
I was performing a standard system upgrade on an Arch server this morning and received the following messages (maintainer details redacted):
```
$ sudo pacman -Syyu
... # Download of packages
(159/159) checking keys in keyring       [######################] 100%
(159/159) checking package integrity     [######################] 100%
error: fail2ban: signature from "... <...>" is unknown trust
:: File /var/cache/pacman/pkg/fail2ban-0.11.2-2-any.pkg.tar.zst is corrupted (invalid or corrupted package (PGP signature)).
Do you want to delete it? [Y/n] Y
error: failed to commit transaction (invalid or corrupted package)
Errors occurred, no packages were upgraded.
```
I followed advice in the forums and tried refreshing and repopulating the keys, clearing the Pacman cache, and a combination of these things. I still kept getting the same problem each time I tried to upgrade.
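For reference, the usual forum advice amounts to something along these lines (a sketch from memory; none of it resolved the issue in my case):

```shell
sudo pacman-key --init               # (re)initialise the local keyring
sudo pacman-key --populate archlinux # reload the distribution's keys
sudo pacman-key --refresh-keys       # slow: contacts key servers
sudo pacman -Sc                      # clear the cached packages
sudo pacman -Sy archlinux-keyring    # update the keyring package itself
sudo pacman -Syyu                    # then retry the upgrade
```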
I recently wrote about reviewing my Twitter usage, with the aim of discovering any constructive takeaways I get from the platform that warrant me keeping it installed as an app on my phone.
The upshot is that I have now removed it. I didn’t delete my account, as there is still enough value in visiting it less regularly (such as in my computer’s web browser), but by removing the easy shortcut from my phone I have noticeably reduced the amount of time I spend doomscrolling.
Having recently read Project Hail Mary - and rated it highly - Goodreads suggested I try Columbus Day by Craig Alanson.
This is the first book in the Expeditionary Force series - one that I hadn’t yet heard of at the time.
Although it’s sci-fi, the book is set in the modern-day. Earth gets invaded by far more technologically advanced aliens and humanity suddenly finds itself playing along as the lowest echelon in a war involving many different levels of alien capability. The mostly-powerless humans need to work out which side they should be fighting on.
I self-host several services on various servers - for both some professional and personal uses.
I use automated backup scripts to periodically sync data to Backblaze (which I recently posted about). However, once they were set up, I would often worry about whether they were working properly. To verify, I’d have to log into Backblaze and check when the latest backups came through.
Although I trusted the process, this became a bit of a pain and more and more of a constant worry. The script might crash, run out of storage space, or anything else, and I wouldn’t know about it unless I actually checked.
TL;DR: I’m starting a Twitter diary to log interesting findings, and to measure its value to me.
Twitter is pretty much the last bastion of mainstream centralised social media that I use (aside from messaging services like Whatsapp and Telegram).
Although I primarily use Mastodon for my everyday social networking - which is more focused on the things I am actually interested in - I always kept Twitter around too as an app on my phone. This is because every time I try to remove it, I quickly feel as though I must be missing out on something: it always felt as though some news or useful announcement would go unnoticed.
Another project I try to maintain (when I can!) is SSO Tools.
This is a simple web service that aims to help developers test their own services’ single sign-on (SSO) functionality. The motivation behind the project was that many commercial offerings were too expensive for solo developers, or just far too complex for simple testing.
SSO Tools aims to provide a simple interface, with functionality that allows for registering identity providers (IdPs), test IdP users, and service providers (SPs). It is targeted at developers looking to quickly, yet robustly, test and iterate on their SSO setup in their applications.
Another of my favourite podcasts is Darknet Diaries.
Created and presented by the excellent Jack Rhysider, Darknet Diaries releases new episodes fortnightly. Each episode contains a true story from the “dark side of the internet” and includes content related to cybercrime, hacking (in the information security sense), dodgy government activity, and much more.
Typically, most episodes involve guests that Jack interviews in order to tell a story. Although it does occasionally stray into some technical detail, the vast majority of episodes are totally accessible to everyone.
Many services - including web and mobile apps - allow their users to upload imagery. This could be to let users upload an avatar image or perhaps create a gallery of image files.
Either way, many photos contain a degree of sensitive metadata as part of their EXIF data. For example, if you take photos using your phone, it is likely that the camera application will embed metadata into the image file it creates. This could include the geocoordinates of the position from where the photo was taken, the make and model of the camera device, and lots of other data (exposure time, focus, balances, etc.).
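As a quick sketch of just how accessible this metadata is - here using the Pillow library, which is an assumption on my part (your stack may well use something different):

```python
from PIL import Image
from PIL.ExifTags import TAGS

def read_exif(image):
    """Map numeric EXIF tag IDs to their human-readable names."""
    return {TAGS.get(tag_id, tag_id): value
            for tag_id, value in image.getexif().items()}

# read_exif(Image.open("photo.jpg")) might reveal entries such as
# "Make", "Model", "DateTime", and (via GPSInfo) location data.
```

Services accepting uploads often strip these tags server-side for exactly this reason.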
Adding theming and the choice between “light” and “dark” modes to your website can enhance your site’s accessibility and make it feel more consistent with the host operating system’s own theme.
With JavaScript (and React in particular) this is easy to do, as I’ll explain in this post. We’ll use a combination of our own JavaScript code, the Zustand state management library, and the styled components package to achieve the following:
For several years I’ve been a user of Goodreads. It’s a very popular platform, and I primarily use it for keeping track of the books I’ve read, for receiving suggestions about new books, and for keeping up with what some of my friends are reading.
It’s a good service (though sometimes a little slow) - the website and mobile app are nice to use. However, as with any closed system, it’s always a worry of mine to think about what might happen if the service were to disappear or if I were to get locked out for some reason.
About 18 months ago we bought a new home. The house is an 1880s (ish) Victorian building, and many of its original features - such as tile floors, cornice, and fireplaces - had been retained, which is great.
It had previously been a student-style (HMO - house of multiple occupation) house, in which nearly all available space had been converted into bedrooms. As you might imagine, the fixtures and fittings weren’t in great condition, with old and worn carpets and wallpaper throughout.
I don’t tend to talk much about the projects I’m working on, but thought this would be a good opportunity to write a post about one such project - Treadl.
Treadl is a web app (and more recently and less popularly a mobile app too). It enables weavers to create and store their weaving patterns and projects online. This could be simply for personal use, or for sharing projects with others as a portfolio.
Back in April, I bought a Pinephone. I used the phone quite consistently for the first few weeks and I meant to write an update here a couple of months back, but work (and other things) got in the way a bit.
So, here is my delayed “first few weeks with a Pinephone” update.
As mentioned, I initially aimed simply to use the phone in its out-of-the-box state (i.e. Manjaro Linux with KDE Plasma Mobile) - not as a daily driver, but more as a means of measuring the phone’s base-case usability - though with the hope of eventually being able to use such a device more full-time.
Some people have complex development processes and flows - making use of tools such as heavy editors and IDEs, Docker for running and building locally in development, or even developing entirely remotely over SSH connections. Other people use simpler combinations of tools.
I thought I’d write briefly about what I use on a daily basis. I have a relatively simple development tech stack:
Terminal.app (the application that ships with my Mac, since this works best for me)
I also use a small number of Vim plugins - installed via Vundle - to add nice quality-of-life features to my editor:
About nine months ago - at the end of November last year - we adopted a dog. Although I’ve always grown up with and around dogs owned by parents and siblings, I’ve never been a huge “dog person” myself. However, it is very easy to get attached very quickly!
The dog we adopted is an English working cocker spaniel. As you might imagine, he is extremely energetic. He’s a nightmare on (and sometimes off) his lead, and his recall isn’t fantastic yet due to us needing to re-train him after his previous owner.
Providing code snippets on your website or blog can be a great way to convey meaning for technical concepts.
Using the HTML pre tag can help provide structure to your listings, in terms of spacing and indentation, but highlighting keywords - as most people do in their code text editors - also vastly helps improve readability.
For example, consider the below JavaScript snippet.
class Greeter {
  greet(name) {
    console.log(`Hello, ${name}!`);
  }
}

const greeter = new Greeter();
greeter.greet('Will');
The exact same listing is displayed below with some simple syntax highlighting. The structure and meaning of the code becomes much easier to understand.
The Gemini protocol has gathered even more momentum in the few months since I last posted about it.
Its popularity is largely driven by its privacy-focused and content-oriented design. It doesn’t allow for bloated sites or resource-hungry client-side scripting. It’s a means for simply and easily accessing content that is useful to you - either by hosting a capsule yourself or by joining an existing community.
In this post I am introducing Capsule.Town - a way in which I can try and give back to the FOSS community.
I listen to a number of podcasts each week. One of these is ATP (Accidental Tech Podcast).
This is one of my favourite weekly podcasts. It’s humorous and full of cutting-edge discussion from the tech world, and I always look forward to new episodes.
The episodes are primarily Apple focused, which is fine for me since I’m a big user of Apple products. Some episodes are more technical than others - discussing programming and development approaches - whilst others are focused more on user-facing items.
Many web apps have support for uploading video files. Whether it’s a media-focused platform (such as a video sharing service) or just offering users a chance to add vlogs to their profile - videos are a powerful mechanism for distributing ideas.
For services providing image upload functionality, it is relatively simple to build in processes that extract smaller versions of the files (e.g. thumbnails) to be used as image previews. This allows other users to see roughly what an image is about before opening a larger version. It also enables more interesting, responsive, and attractive interfaces - since the smaller images can be loaded more quickly.
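Such a step can be sketched in a few lines - here with the Pillow library, which is an assumption on my part (ImageMagick and friends are equally common):

```python
from PIL import Image

def make_thumbnail(src_path, dest_path, max_size=(256, 256)):
    """Save a downscaled copy of an image, preserving its aspect ratio."""
    with Image.open(src_path) as img:
        img.thumbnail(max_size)  # resizes in place, and never upscales
        img.save(dest_path)
```

`thumbnail()` preserves aspect ratio and never enlarges the source, which is usually exactly what you want for previews.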
Every now and again it’s nice to dive back into a young adult book. I recently read The Night Circus by Erin Morgenstern.
The book is a sort of dark romantic/fantasy mashup. It’s about a travelling circus, those who perform in it, and those who run it.
Le Cirque des Rêves is not just any circus, however. It’s only open at night, and always closes before dawn. What appears to be clever trickery may actually be much more behind the scenes, and the garish colours found in other circuses have been replaced by a simple black and white colour scheme.
I enjoy reading my RSS feeds across my devices - whether it’s on my phone when out and about, my Mac in between bouts of work, or my iPad when in downtime.
Being able to sync feeds across these devices is important to me, both so I can maintain a single collection of feeds and to ensure that I can keep track of read/unread articles as I switch devices.
There are lots of web-based clients available, but using Reeder - a native app - gives a far nicer reading experience. There are lots of other clients for other types of devices too.
Recently my colleague was talking to me about the concept of the “5AM Club”, as defined in the book by Robin Sharma.
The “Club” is focused around starting your day early, with defined time slots for exercising and thinking.
There’s a great video here that summarises it all in about eight minutes.
The rough idea is to get up strictly at 5AM, and then spend 20 minutes exercising, 20 minutes reflecting, and then a final 20 minutes growing. By 6AM you are then energised and invigorated to start your day more effectively and successfully.
Image processing and resizing is a common task in many types of applications. This is made even more important by modern phones that take photos several megabytes in size.
For example, if you offer an application that allows people to choose an avatar image, you won’t want to render the full multi-MB size image each time it’s shown. This is extra data for your users to download each time (which costs them both time and money, and can give a poor sluggish experience) and also means you need to fork out more cash for the additional bandwidth usage. If you have lots of users, then this time/money saving can be amplified significantly.
Andy Weir has become renowned over the past decade for his science fiction novels. The Martian (and its movie) was hugely enjoyable and successful. I wasn’t so keen on Artemis, but still enjoyed the excitement of the story.
I thought his latest book - Project Hail Mary - was fantastic.
The story opens with a lone astronaut waking up in a spaceship that he has no memory of. He doesn’t know where he is, who he is, or how he got there. Although he works out that he is of pivotal importance to the survival of the human race, the story cleverly keeps you guessing about what might come next right to the end.
In user-facing software, loading indicators are extremely important to let your users know that something is happening. This is the same no matter whether your software is a CLI program, a GUI application, a web app - or anything else.
Without such indicators, users of your software may become frustrated, or assume the program has crashed, and try to close it or leave.
Generally speaking, developers should try to keep long-running tasks to a minimum (or even offload them to a cron-job or an asynchronous server-side worker). However, in some cases this is not possible. For example, in a cloud-based file storage solution, in which uploads and downloads are a core and direct user-facing feature, the user must wait until a bunch of files finish uploading - though any post-processing can of course still occur in the background afterwards.
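For a CLI program, even a hand-rolled indicator goes a long way; a minimal illustrative sketch:

```python
import sys

def progress_bar(done, total, width=30):
    """Render a textual progress bar, e.g. "[###-------] 30%"."""
    filled = int(width * done / total)
    return "[" + "#" * filled + "-" * (width - filled) + f"] {done * 100 // total}%"

def report(done, total):
    # "\r" moves the cursor back to the line start so the bar redraws in place
    sys.stdout.write("\r" + progress_bar(done, total))
    sys.stdout.flush()
```

Calling `report()` after each completed unit of work keeps the user informed without flooding the terminal with output.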
It’s common knowledge that part of Google’s business model is to use the data it knows about you, your searches, and browsing patterns in order to more effectively serve ads.
Many people feel uncomfortable with this and so there is a strong movement to adopt more privacy-focused options, such as DuckDuckGo. This was my position, too. For a few years I’ve been a solid DuckDuckGo user, and it was my default on Mac and mobile devices.
Wales Tech Week is an annual event held by Technology Connected. The 2021 event is running this week, aiming to bring technologists together from a wide range of businesses and organisations across Wales.
Today, I was a member of a panel discussing blockchain - “Welsh Businesses Bringing Blockchain to Life”. I was speaking alongside experts from other companies working in the blockchain and crypto space, and an academic focused on applying the technology to government functions.
Anxious People is a book about an attempted bank robbery in a Swedish town (not Stockholm!). It is written by Fredrik Backman.
The story involves a would-be bank robber arriving unexpectedly at an open apartment viewing whilst trying to run away, and taking the prospective buyers hostage in the process. It is mostly split between being set at the apartment itself and the police station in which the hostages are separately interviewed after the event. It is told primarily from the perspectives of the bank robber, the hostages, and the police officers.
IDEs and richly-featured text editors - such as VS Code and Sublime Text - support many great features. One of these is the notion of projects or workspaces.
Such workspaces let you save your project’s development configuration to disk - things like the project directory, open files, editor layout, integrated terminal commands, and more. Often, each project can have its own workspace, too.
If you use workspaces then you don’t need to go through the tedious process of setting everything back up again each time you switch project, re-open your editor, or reboot your computer.
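In VS Code, for instance, a workspace is just a small JSON file saved as e.g. `myproject.code-workspace` (a minimal sketch - the settings themselves are up to you):

```json
{
  "folders": [
    { "path": "." }
  ],
  "settings": {
    "editor.tabSize": 2
  }
}
```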
Recently I’ve noticed that some of the RSS feeds I subscribe to have become more and more restrictive. A post might contain just a title, or perhaps a short snippet or introductory paragraph, with the expectation that I then proceed to follow the link to visit the website itself in order to view the post in full.
I suppose in many ways that this is similar to distributing podcasts via RSS: the feed contains the podcast title, description, and other metadata, and then a link to download the podcast episode file itself. But this is because podcasts are in audio or video format and cannot be reasonably embedded directly into an XML file.
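For instance, a podcast item in a feed typically references its media file via an `enclosure` element (illustrative values):

```xml
<item>
  <title>Episode 42: Example</title>
  <description>A short episode summary.</description>
  <enclosure url="https://example.com/ep42.mp3"
             length="12345678"
             type="audio/mpeg" />
  <pubDate>Mon, 01 Mar 2021 09:00:00 GMT</pubDate>
</item>
```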
For his birthday a few years back, I bought my (now-)husband a beehive and a honeybee nucleus.
Some might see this as a strange gift, especially given that we live close to the city centre. It was certainly a surprise for him, but given his love for animals and science I knew he would like it.
We were lucky enough to have a relatively large garden, given our location, of around 20 metres in length. Since we didn’t really use the end of the garden much, it was a good location for a hive - though other people have successfully kept bees in much smaller areas and on rooftops.
Just a quick post to say that I recently got married! By coincidence the event was three years to the day after our engagement.
It was a lovely day - great weather and really nice to see those that could attend. Hopefully we’ll get a chance to go away later in the year if/when things start opening up again 😁
In my earlier years I was fairly into gaming. I was definitely only ever a “casual gamer” in the scheme of things today, but I would play at least a small amount of something most days.
When I was young it was mainly those games based on Nintendo platforms - Super Mario, Mariokart, Super Smash Bros, etc. These were great with friends and were the kind of games (along with their various sequels) that we could play over again and for many years to come. Pokemon was also a big hit for me, which would continue on through the consoles.
The Classic Collection of H.G. Wells novels contains five well-known stories: The War of the Worlds, The First Men in the Moon, The Time Machine, The Invisible Man, and The Island of Doctor Moreau.
Despite the fame of these novels, I had never read any of them until I recently listened to them via the audiobook version, which was excellently narrated by the likes of David Tennant, Hugh Bonneville, and others.
Someone non-technical recently asked me the question, “what actually is a server?”. They knew it was just a type of computer that runs somewhere that can be accessible over the internet, but they were interested in how they differ from “normal” computers.
The conversation moved on to how these computers can make several different functions available at the same time over the network, which brought us on to the topic of services and network ports.
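The concept can be sketched in a few lines of Python: each service claims its own numbered port on the machine, and clients address it by host plus port (a toy echo service, purely for illustration):

```python
import socket
import threading

def start_echo_service(port=0):
    """Bind a tiny TCP service to a port; returns the port it claimed."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", port))  # port 0 asks the OS to pick a free one
    server.listen(1)

    def serve():
        conn, _ = server.accept()
        with conn:
            conn.sendall(conn.recv(1024))  # echo back whatever arrives
        server.close()

    threading.Thread(target=serve, daemon=True).start()
    return server.getsockname()[1]
```

A web server would claim port 80 or 443 in the same way, a mail server port 25, and so on - which is how one machine offers several functions at once.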
For a couple of years now I have been using a self-hosted Nextcloud as a replacement for iCloud and Google Drive. I won’t go into the details as to why (especially given the additional upkeep and other overheads required), as this has been covered before - but mainly it’s about maintaining control over my data.
I use a cloud VPS to host my Nextcloud instance - rented from Linode, whom I can certainly recommend if you’re looking for a good VPS provider - and since starting my Nextcloud journey I have begun hosting a number of additional services on the same server. For example, FreshRSS (which I consume using Reeder), Monica, Gitea, a Matrix server, and more.
The UK went into its first proper COVID-induced lockdown back around March last year. At that time, our company locked its office doors and we all began working from home. About 14 months later, we’re all still working remotely and will continue to do so for the foreseeable future.
Before we closed the office, I used to walk across my city - Cardiff - to get to work. It’s about a 3km walk, which took me around 30 minutes each way. I enjoyed it - I could stop for coffee on the way through, and the distance meant I could take different routes on different days if I wanted a change of scene.
In this post I will talk a little about how I handle my digital notes and to-do lists. In the spirit of my last post on data sovereignty, the focus will be on self-hosted approaches.
It feels odd that the first task many new technical frameworks guide users through, by way of a tutorial, is a simple to-do list; yet finding great production-ready examples of such software can be challenging.
The term ‘data sovereignty’ is something we hear much more about these days. Increasingly I’ve also heard it being mentioned in different contexts.
We’ve seen it most in the world of enterprise SaaS, particularly among UK-based public sector organisations amid post-Brexit data flow policies. Organisations are getting stricter about the geographic location of their users’ data: whereas before most would be happy as long as the data was stored somewhere within the EU, many now require it to be stored onshore within the UK.
I listen to a number of podcasts each week. One of these is Go Time.
The Go Time podcast releases episodes every Thursday. Its format is mostly comprised of panel discussions and interviews with founders and specialists in the community about the Go programming language. Episodes are usually between 60 and 90 minutes long.
I don’t program in Go a lot myself these days, though I do have one or two older projects written in the language. However, I feel that the content is often broadly relevant even for non-full-time gophers like myself.
As you may know, I recently purchased the beta edition of the Pinephone. It arrived last week in the Pinephone Beta Edition box shown below.
As mentioned in my previous post on the subject, I bought the phone for purely experimental purposes, to get involved in the community, and to be a part of the freedom and Linux-on-phone movement.
I fully understand that the device is not yet really considered ready for everyday, reliable production use (especially when compared to my current iPhone 11 Pro Max). However, the Pinephone is less than 20% of the price of my iPhone, and comes with the freedom to do so much more - without the restrictions of Apple’s “walled garden”.
This is a bit of a vanity post, but back in December I was lucky enough to be included in the 2020 WalesOnline “35 Under 35”.
This list aims to present the “best young businessmen in Wales” for the year. It was definitely an honour to be included and it’s great to see the efforts from the whole team at Simply Do reflected. We’re still only at the beginning of our journey and so we have an exciting few years ahead!
I was recently asked whether Steve Jobs was someone that inspired me. It’s a difficult question, I find; he’s definitely an inspiring person in the sense of his work ethic, the products he envisages, and his way of understanding the needs of the target customer better than they know it themselves.
As a person, however, I find his personality and the way he treats others less inspiring. I try to be empathetic to others and take into account the emotional and psychological position of someone else when interacting with them. In a professional workplace this (hopefully) contributes towards creating a space that enables people to grow and develop whilst also emboldening colleagues to put forward their own thoughts and opinions in a more risk-free environment.
The Giver of Stars by Jojo Moyes tells the story of a young English woman - Alice - who marries an American man and moves to a small town in Kentucky in the late 1930s.
Not long after arriving in Kentucky Alice realises she may have made a mistake when it comes to her new husband. However, the real story focuses around a job Alice gets working with the local library.
As is the case with many countries, all businesses in the UK must report the state of their financial accounts to the relevant inland revenue service at their year-end (in the UK, this is HMRC).
This is also the case if you are a freelancer or sole trader (or if you’ve made other untaxed income - e.g. from investments). In these cases, this is called your Self Assessment. Self Assessments are pretty straightforward, and can usually be completed online by the individual themselves - as long as they have kept good accounts and know their numbers.
Back in November I started an Invisalign course to help straighten my teeth. Invisalign works like traditional braces, but is instead formed from transparent teeth “trays” that others can only really notice up-close. Given my personal situation, this seemed like a better approach than the traditional metal braces.
In all honesty, my teeth weren’t that bad to begin with but - like many people - as I got older I was beginning to notice a little more “crowding” (where teeth bunch together and start to move out of place). Invisalign was something I had wanted to try for a while, and whilst the UK was in lockdown and I couldn’t see anyone anyway, it felt like a good time to go ahead with it.
I don’t use Facebook often. In fact, I only have an account currently because our company uses the “Login with Facebook” functionality in order to offer an additional single sign-on option for some customers.
I logged-in today as we needed to update some of the app’s configuration on the Facebook Developer portal, and I went via the Facebook homepage feed to get there. A couple of “Suggested for you” posts that showed near the top of my feed were unusual and caught my eye.
Like many people, I own and manage multiple email accounts - for example, some for work, some for home, and some for specific projects. I used to solely use web-based email clients (such as Gmail or Fastmail’s web apps) for each of my accounts. However, the number of tabs I needed to keep open grew to the point where things became unmanageable - both in needing to check multiple tabs several times per day, and in the frustration when the browser would restart or I’d otherwise lose my tab setup.
The HTTP standard is an expressive system for network-based computer-to-computer interaction. It’s a relatively old standard - it started life as HTTP/1.0 in 1996, and the HTTP/1.1 standard was formally specified in 1999. HTTP/2 (2015) introduced efficiencies around how data is transmitted between computers, and the still-in-draft HTTP/3 builds further on these concepts.
I won’t go into the nuts and bolts of it, but - essentially - for most applications and APIs, the developer-facing concepts haven’t really changed since HTTP/1.1. By this version, we had all the useful methods required to build powerful and flexible APIs.
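To illustrate, the standard methods map neatly onto resource operations; a toy in-memory dispatcher (not real HTTP - just the semantics, with illustrative names):

```python
def handle_request(store, method, key, body=None):
    """Dispatch an HTTP-style method against a simple key/value store."""
    if method == "GET":
        return store.get(key)          # read a resource
    if method in ("POST", "PUT"):
        store[key] = body              # create or replace it
        return body
    if method == "DELETE":
        return store.pop(key, None)    # remove it
    raise ValueError(f"Unsupported method: {method}")
```

All of these verbs were already present by HTTP/1.1, which is why the developer-facing surface of most APIs has changed so little since.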
Earlier this week I ordered a PinePhone, which recently became available as a Beta Edition.
I’ve been excitedly following the progress of the PinePhone for some time now. I’ve joined various Matrix rooms, subscribed to blogs, and started listening to the PineTalk podcast. The phone is a hackable device that runs plain old Linux - not an Android variant - and thus helps users escape from the grasp of the Google and Apple ecosystems.
The Great Alone by Kristin Hannah is a book set out in the Alaskan wild. It tells the story of a young family that move in order to live off-the-grid after the father returns from being a prisoner of war in the Vietnam War.
The book mostly focuses on the viewpoint of the daughter, Leni, who is thirteen years old when she moves with her mother and father. The story tells how Leni adapts and grows into her new Alaskan life over the years, whilst at the same time trying to navigate some of the perils at home in her family cabin. Leni and her family meet and grow close to different members of the local community, in which there are a variety of views regarding the types of people that should be allowed to come to Alaska.
Centralised communication services, such as Telegram, Signal, and Whatsapp, offer convenient means to chat to friends and family using your personal devices. However, these services also come with a number of pitfalls that are worth considering. For example:
There are, of course, other factors on both sides that you may want to consider. It can be hard to move away from these services - after all, there’s no point using a system that no-one else you need to talk to uses.
This post contains some of my thoughts on the book Blood, Sweat, and Pixels by Jason Schreier.
This book contains a number of stories about how some of the most well-known (and other less well-known) video games are made. The book’s subtitle, “The Triumphant, Turbulent Stories Behind How Video Games Are Made”, sums it up pretty well.
Working in the software industry myself, I often hear about the notion of “crunch time” - a term we’ve borrowed from the game development industry for times when critical updates, fixes, or deadlines are pressing. However, reflecting on the stories in this book makes me realise that the “crunches” we suffer are nothing compared to the crunch and stress experienced by game developers in small teams and large development studios alike.
Although I was still somewhere between being of single-digit age and a young teen back in the ’90s and early ’00s, I still fondly remember discovering and becoming a small part of the flourishing community of personal, themed, and hobby websites that connected the web.
We were even given basic server space in school, and the wider internet was thriving with GeoCities while communities grew around services like Neopets. Every day, after school, we’d go home and continue our playground conversations over MSN Messenger (after waiting for the dial-up modem to complete its connection, of course). The internet felt small, personal (even if you didn’t use your real name or identity), and exciting.
I recently finished reading The Hunt for Red October by Tom Clancy.
This genre of novel (sort of military thriller fiction) is not usual for me and this is the first Clancy book I have read. That being said, the book has been on my “to-read” list for a fair amount of time and so I am glad I got round to reading it.
Like many people, these days I try to live a minimal life when it comes to possessions. Having more stuff means a greater level of responsibility is required to look after it. I love the principles involved in “owning less”.
Although I am in a very different situation to Pieter Levels, I find the ideas behind his 100 Thing Challenge (and other related pieces) to be inspiring.
RSS has had a bit of a resurgence for personal websites and blogs in recent years, especially with the growing adoption of Small Web and IndieWeb ideologies.
Many static site generators - including Hugo, Jekyll, and Eleventy - can easily support the automatic generation of RSS feeds at build time (either directly, or through plugins).
The same is true for Gatsby - the framework currently used to build this static website - and the good news is that setting up one feed, or multiple ones for different categories, only takes a few minutes.
Python’s Flask framework is an easy and excellent tool for writing web applications. Its in-built features and ecosystem of supporting packages let you create extensible web APIs, handle data and form submissions, render HTML, handle websockets, set-up secure account-management, and much more.
It’s no wonder the framework is used by individuals and small teams, all the way through to large enterprise applications. A very simple, yet still viable, Flask app with a couple of endpoints looks as follows.
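Something along these lines (my own sketch of such an app - the endpoints are purely illustrative):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/")
def index():
    # A plain HTML/text response
    return "Hello, world!"

@app.route("/greet", methods=["POST"])
def greet():
    # Handle a JSON form/data submission and respond with JSON
    name = request.get_json().get("name", "stranger")
    return jsonify(message=f"Hello, {name}!")

if __name__ == "__main__":
    app.run()
```

Running the module starts a development server; from there, extensions handle databases, authentication, websockets, and the rest.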
By now I’m sure everyone has heard the horror stories about people seemingly randomly losing access to their Google accounts. Often the account closures are reported to have been accompanied by vague automated notifications from Google complaining that the account holder violated their terms in some way, but without any specific details, offer of appeal, or process to resolve the “issues” and reinstate the accounts.
As such, these events usually mark the end of the road for the victims’ presence and data on Google platforms - including Gmail, Drive, Photos, YouTube - without having any option to extract the data out first. This could be years’ worth of documents, family photos, emails, Google Play purchases, and much more (ever used “Sign in with Google” on another service, for example?).
A few months ago I stumbled across this article: Beyond Cyberpunk: Towards a Solarpunk Future. It was posted on the excellent blog Tales from the Dork Web, by Steve Lord, which I can certainly recommend subscribing to.
I had never heard of the term “Solarpunk” before, but I read up more about it and the more I researched the more intrigued I became. Essentially it is defined as - more or less - the opposite to the Cyberpunk subculture, and I think we’re at a bit of a fork in the road from which either future could become a reality.
This month marks a year from when I decided to (mostly - see below) stop answering my phone. This was not because I wanted to be antisocial (quite the opposite), but because it’s become the wrong form of communication for me.
Like many people, I am inundated with sales-y and spammy phonecalls. I have had the same mobile phone number since 2001 (that’s 20 years this year), which I am sort of proud of and would prefer to keep. However, careless (or malicious) entities over the years (and more than likely mistakes also made by my younger self) have meant that my number and name are now in the databases of many different types of agents - from insurance/legal company sales teams through to dodgy Bitcoin spam companies.
Last week I read The Midnight Library by Matt Haig. The book won the 2020 Goodreads Choice Award for Fiction.
“Set” in Bedford, England, the story starts by introducing the main character - Nora Seed - who feels completely down. She is depressed and thinks that she has nothing further to contribute to her own life or to the lives of the few people around her.
For many small or personal services running on a VPS in the cloud, administration is often done by connecting directly to the server via SSH. Such servers should be hardened with firewalls, an sshd config that denies root and password-based login, fail2ban, and other protective services and practices.
Linode has some great getting-started guides on the essentials of securing your server.
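For instance, the root- and password-login hardening mentioned above maps to a few `sshd_config` directives (a sketch - adjust to your own setup):

```
# /etc/ssh/sshd_config (excerpt)
PermitRootLogin no           # deny direct root login
PasswordAuthentication no    # keys only
PubkeyAuthentication yes
```

Remember to reload sshd after editing, and to verify key-based login works from a second session before closing your current one.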
In more complex production scenarios heightened security can be achieved by isolating application (webapp, API, database, etc.) servers from external internet traffic. This is usually done by placing these “sensitive/protected” servers in a private subnet, without direct internet-facing network interfaces. This means that the server is not reachable from the outside world.
Many people no longer feel comfortable using Facebook. Whether you were never a member to begin with, have deleted your account, or have simply tried to use the service less, it’s no surprise given the way they operate across their family of products (including Instagram and WhatsApp) when it comes to your data and your time.
This is a huge subject in its own right, and it’s up to everyone to make up their own minds about their own stance. It’s been widely discussed pretty much everywhere, and there are loads of resources available on this handy website if you’re interested in understanding more about what goes on behind the scenes on these platforms.
Shapes and patterns can be leveraged in user interfaces to guide your users, draw attention to content, lend weight or emphasis, or just for aesthetics and decoration.
Layout and styling on the web is typically handled using CSS, however mastering CSS to the level where you can confidently take advantage of more advanced features is definitely not easy. I’ve been developing for the web almost full-time for a decade and I’m still pretty crap when it comes to doing complex stuff with CSS.
React state management is what gives the library its reactiveness. It’s what makes it so easy to build performant data-driven applications that dynamically update based on the underlying data. In this example the app would automatically update the calculation result as the user types in the input boxes:
import React, { useState } from 'react';

function MultiplicationCalculator() {
  // Fall back to 0 so clearing an input doesn't produce NaN.
  const [number1, setNumber1] = useState(0);
  const [number2, setNumber2] = useState(0);
  return ( <>
    <input value={number1} onChange={e => setNumber1(parseInt(e.target.value, 10) || 0)} />
    <input value={number2} onChange={e => setNumber2(parseInt(e.target.value, 10) || 0)} />
    <p>The result is {number1 * number2}.</p>
  </> );
}
Many people would consider RSS - Really Simple Syndication - to be a relic of the past. However I think it has been making a comeback.
RSS is a mechanism by which people can automatically receive updates from individual websites, similar to how you might follow another user on a social networking service. Software known as RSS readers can be used to subscribe to RSS feeds in order to receive these updates. As new content (e.g. a blog post) is published to an RSS-enabled website, its feed is updated and your RSS reader will show the new post the next time it refreshes. Many RSS readers have an interface similar to an email client, with read/unread states, folders, favourites, and more.
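Under the hood, a feed is just an XML document that your reader polls; a minimal RSS 2.0 sketch (with invented URLs) looks like this:

```xml
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>https://example.com</link>
    <description>Posts from an example blog</description>
    <item>
      <title>A new post</title>
      <link>https://example.com/a-new-post</link>
      <pubDate>Mon, 01 Feb 2021 10:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

Each new post becomes another `<item>`, which is why any static site generator can produce a feed with no server-side logic at all.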
A few months ago I discovered Blogging for Devs - I think through Product Hunt when it made it to #1 Product of the Day back in August last year.
At the time blogging was something I had been thinking about quite a lot. I actively followed several other blogs - both from people I know and from others in the tech community - and it was clear that, in addition to producing content that was interesting to read by others, writing was something these bloggers actually enjoyed and found valuable too for their own learning and engagement with the community.
If you need a database for your next project, why not first consider if SQLite might be a good option? And I don’t mean just for getting an MVP off the ground or for small personal systems; I mean for “real” production workloads.
Many people will be quick to jump on this with chimes of “it’s not designed for production”, but I think it depends on what is actually meant by “production”. Sure, it’s not the right choice for every scenario - it wouldn’t work well for distributed workloads or for services expected to receive a very high volume of traffic - but it has been used successfully in many real-world cases.
Recently I finished reading Dirty Little Secrets. This is the first book I have read by Jo Spain and the first time I have known of the author.
The book first appears as though it’s a typical murder mystery set in a relatively wealthy gated community in Ireland - however the intricacies of the characters and narrative quickly made it hard to put down. The story begins with the discovery of the long-dead body of the woman who lives at number 4 and continues with the involvement of the detectives as they investigate the strange incident.
If you’ve visited my geminispace (gemini://wilw.capsule.town) you’ll have noticed that I’ve recently been on a mission to decentralise the every-day tools and services I use, and will understand the reasons why. This post will likely become part of a series of posts in which I talk about taking control and responsibility for my own data.
One of the changes I’ve made more recently is to move many of my own personal projects (including the source for this site) over to a self-hosted Gitea service. I chose Gitea personally, but there are many other self-hosted solutions available (see this post for examples and comparisons).
I know that I’ve been a bit crap at updating my blog properly and consistently over the past few years. One of my new year’s resolutions this year is to get into the habit of writing more, and so #100DaysToOffload seems a good opportunity to challenge myself to make sure I do.
The guidelines for and the ideas behind the challenge are on the challenge’s website. There aren’t really any rules; the essential message is to “Just. Write.”. So I’ll do my best before the end of 2021, and given that I’ve already published two posts this year, I’ll count this one as number 3.
Over the past few months I have been trying to use centralised “big tech” social media platforms less and instead immerse myself into the more community-driven “fediverse” of decentralised services that are connected (“federated”) using common protocols (e.g. ActivityPub). If you like, you can follow me on Mastodon (@wilw@fosstodon.org, recently migrated over from my old mastodon.social account) and Pixelfed (@wilw@pixelfed.social).
I’ve loved spending my time on these platforms - mainly due to the lack of noise and fuss, and more of a focus on sharing relevant content and interesting interactions with likeminded people (though of course this does depend on the instance you join).
Building apps on serverless architecture has been a game-changer for me and for developers everywhere, enabling small dev teams to cheaply build and scale services from MVP through to enterprise deployment.
Taking advantage of serverless solutions - such as AWS Lambda, Google Cloud Functions, and Cloudflare Workers - means fewer resources are spent on traditional dev-ops and deployment and, especially when combined with tools like the Serverless Framework and its rich ecosystem of plugins, you can instead use the time to better develop your products. Let the provider worry about deploying your code, keeping your services highly available, and scaling them to meet the needs of huge audiences.
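As a sketch of why there is so little left to operate, a whole “service” can be a single function - here a hypothetical Node.js Lambda-style handler, assuming an API Gateway proxy-style event shape:

```javascript
// Hypothetical Lambda-style handler; the event shape assumes an
// API Gateway proxy integration.
const handler = async (event) => {
  const name = (event.queryStringParameters || {}).name || 'world';
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message: `Hello, ${name}!` }),
  };
};
// In a real deployment this function would be exported as the
// module's handler and wired up by the provider.
```

There is no server process, port binding, or load balancer in your code - the provider invokes the function per request and scales the number of concurrent invocations for you.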
If you write React web apps that interface with a backend web API then definitely consider trying React Query.
The library makes use of modern React patterns, such as hooks, to keep code concise and readable. It can mean keeping API calls directly inside your normal component code rather than setting up your own client-side API interface modules.
React Query will also cache resolved data under unique “query keys”, so you can keep UI transitions fast with cached data without needing to rely on Redux.
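A sketch of the hook in use (the endpoint and query key here are invented; this assumes the v3-style `useQuery(key, fetcher)` signature):

```jsx
import { useQuery } from 'react-query';

function Todos() {
  // 'todos' is the query key under which resolved data is cached.
  const { data, isLoading, error } = useQuery('todos', () =>
    fetch('/api/todos').then((res) => res.json())
  );

  if (isLoading) return <p>Loading…</p>;
  if (error) return <p>Something went wrong.</p>;
  return (
    <ul>
      {data.map((todo) => <li key={todo.id}>{todo.title}</li>)}
    </ul>
  );
}
```

Any other component using the same query key gets the cached data immediately while a background refetch keeps it fresh.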
This short post introduces a useful JavaScript operator to help make your one-liners even more concise.
The operator - nullish coalescing (`??`) - was formally added in the 11th edition of ECMAScript (ES2020). It is a logical operator that selectively returns the result of one of two expressions (or operands), based on whether the first resolves to a “nullish” value. A nullish value in JavaScript is one that is `null` or `undefined`.
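A quick sketch of the behaviour, and how it differs from `||`:

```javascript
// ?? falls back only when the left-hand side is null or undefined;
// || also falls back on any falsy value (0, '', false, NaN).
const a = null ?? 'fallback';      // 'fallback' (null is nullish)
const b = undefined ?? 'fallback'; // 'fallback' (undefined is nullish)
const c = 0 ?? 42;                 // 0 (falsy, but NOT nullish)
const d = 0 || 42;                 // 42 (|| treats 0 as falsy)
```

This makes `??` the safer choice for defaults where `0`, `''`, or `false` are legitimate values.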
JavaScript has lots of handy tools for creating concise code and one-liners. One such tool is the optional chaining operator.
The optional chaining operator is useful for addressing an attribute of a deeply-nested object in which you cannot be fully certain that the successive levels of the object are valid at run-time.
For example, consider the following object.
const person = {
name: 'Harry',
occupation: 'student',
enrolmentInformation: {
contactDetails: {
email: 'harry@hogwarts.ac.uk',
address: {
firstLine: '4 Privet Drive',
postCode: 'GU3 4GH'
}
}
}
};
In order to safely (i.e. if you cannot guarantee each object level at run-time) read the nested `postCode` attribute, you could do so like this, using the logical AND operator:
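Both approaches, using the `person` object above (repeated here so the snippet stands alone):

```javascript
const person = {
  name: 'Harry',
  occupation: 'student',
  enrolmentInformation: {
    contactDetails: {
      email: 'harry@hogwarts.ac.uk',
      address: { firstLine: '4 Privet Drive', postCode: 'GU3 4GH' },
    },
  },
};

// Logical AND: short-circuits at the first missing level.
const viaAnd = person && person.enrolmentInformation &&
  person.enrolmentInformation.contactDetails &&
  person.enrolmentInformation.contactDetails.address &&
  person.enrolmentInformation.contactDetails.address.postCode;

// Optional chaining: the same safety, far more concisely.
const viaChaining = person?.enrolmentInformation?.contactDetails?.address?.postCode;
```

Both evaluate to `'GU3 4GH'` here; if any intermediate level were missing, `?.` would yield `undefined` instead of throwing.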
I recently stumbled across an article on Hacker News discussing the pros of basic personal accounting using GnuCash - a free and open-source desktop accounting program. The article was interesting as the data geek in me resonated with the notion of being able to query the information in useful ways, particularly after having used the system for enough time to accumulate enough financial data.
The comments on the article’s post also mentioned another tool, Ledger. Whilst GnuCash allows users to input transactional and account information as well as reports, Ledger’s focus is only on the reports - a key feature of this CLI tool is that the actual bookkeeping is made directly (or through other tools) into a text file, which Ledger only reads from and never otherwise touches. Both programs work on the principle of double-entry bookkeeping, but some of the key positives of Ledger are its speed (even when working with several decades’ worth of financial data) and its innate ability to be combined with other useful UNIX tools - both for data input and, if necessary (Ledger’s own reporting outputs are very powerful), output.
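For illustration, the text file Ledger reads is just a list of double-entry transactions; a sketch with made-up accounts and amounts:

```
2021/01/05 Grocery Store
    Expenses:Food              £23.50
    Assets:Bank:Current       £-23.50
```

Running `ledger -f journal.dat balance` then reports per-account totals, and because the journal is plain text it plays nicely with grep, awk, version control, and whatever else you already use.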
This note documents the set-up of a k8s cluster from scratch, including ingress and load-balanced TLS support for web applications. It’s mainly for myself to revisit and reference later on. The result of this note is not (quite) production-grade, and additional features (e.g. firewalls/logging/backups) should be enabled to improve its robustness.
Several cloud providers offer managed k8s services (including Amazon EKS, GKE, Digital Ocean, etc.). Whilst these would be recommended for sensitive or production workloads, I wanted to create my own provider-independent cluster in order to understand the ins and outs.
ZEIT’s Now service is great for deploying apps and APIs that are able to make use of serverless execution models, and I use it for many of my projects (including this website, at the time of writing).
I recently needed to deploy a backend written in Go and kept running into problems when trying to read data from the HTTP request body. The client-side app I was developing to communicate with the backend is also written in Go, and everything seemed to work fine when running the backend locally (using `now dev`), but the exact same requests failed when running it in production: the request body was readable in development, but came through empty in production.
A previous note about Philips Hue bulbs got me thinking that the API exposed by the bridge might be used to warn if the house lights are left on too late at night, or even if they get turned on at unexpected times - potentially for security.
I put together a simple program that periodically checks the status of known Hue bulbs late at night. If any bulbs are discovered to be powered on during such times then an email notification is sent. It runs as a `systemd` service on a Raspberry Pi.
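The core check can be sketched in a few lines of Python (the bridge address and API username below are placeholders; the bridge’s `/lights` endpoint returns a JSON object keyed by bulb ID):

```python
# Sketch: report which Hue bulbs are currently on.
# The bridge IP and API username are placeholders.
import json
from urllib.request import urlopen

BRIDGE_URL = "http://192.168.1.2/api/your-api-username/lights"

def lights_on(lights: dict) -> list:
    """Return the names of bulbs whose state is 'on'."""
    return [info["name"] for info in lights.values() if info["state"]["on"]]

def check_bridge(url: str = BRIDGE_URL) -> list:
    # The bridge returns {"1": {"name": ..., "state": {"on": true, ...}}, ...}
    with urlopen(url) as resp:
        return lights_on(json.load(resp))
```

A systemd timer (or cron) can then call this during the late-night window and trigger the email step when the returned list is non-empty.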
I have recently posted about CENode and how it might be used in IoT systems.
Since CENode is partially designed to communicate directly with humans (particularly those out and about or “in the field”) it makes sense for inputs and queries to be provided via voice in addition to or instead of a text interface. Whilst this has been explored in the browser (including in the previous Philips Hue control demo), it made sense to also try to leverage the Alexa voice service to interact with a CENode instance.
In a previous note I discussed CENode and briefly mentioned its potential for use in interacting with the Internet of Things. I thought I’d add a practical example of how it might be used for this and for ’tasking’ other systems.
I have a few Philips Hue bulbs at home, and the Hue Bridge that enables interaction with the bulbs exposes a nice RESTful API. My aim was to get CENode to use this API to control my lights.
Whilst working on the ITA Project - a collaborative research programme between the UK MoD and the US Army Research Laboratory - over the last few years, one of my primary areas has been to research around controlled natural languages, and working with Cardiff University and IBM UK’s Emerging Technology team to develop CENode.
As part of the project - before I joined - researchers at IBM developed the CEStore, which aims to provide tools for working with ITA Controlled English. Controlled English (CE) is a subset of the English language which is structured in a way that attempts to remove ambiguity from statements, enabling machines to understand ‘English’ inputs.
I haven’t written a post since summer 2015. It’s now March 2017 and I thought I’d write an update very briefly covering the last couple of years.
I finished researching and lecturing full-time in the summer of 2015. It felt like the end of an era; I’d spent around a third of my life at the School of Computer Science and Informatics at Cardiff University, and had experienced time there as an undergraduate through to postgrad and on to full-time staff. However, I felt it was time to move on and to try something new, although I was really pleased to be able to continue working with them on a more casual part-time basis - something that continues to this day.
I recently blogged about Nintendo Hotspot data and mentioned it could be more usefully consumable in a native mobile app.
As such, I wrote a small Android app for retrieving this data and displaying it on a Google Map. The app shows nearby hotspots, allows users to also search for other non-local places, and shows information on the venue hosting the zone.
The app is available on the Play Store and its source is published on GitHub.
Since getting a DS, StreetPass has become quite addictive. It’s actually pretty fun checking the device after walking through town or using public transport to see a list of Miis representing the people you’ve been near recently, and the minigames (such as StreetPass Quest) that require you to ‘meet’ people in order to advance also make it more involved. Essentially the more you’re out and about, the further you can progress - this is further accentuated through Play Coins, which can be used to help ‘buy’ your way forward and are earned for every 100 steps taken whilst holding the device.
A couple of years ago I wrote a blog post about wrapping some of Weka’s classification functionality to allow it to be used programmatically in Python programs. A small project I’m currently working on at home is around taking some of the later research from my PhD work to see if it can be expressed and used as a simple web-app.
I began development in Go as I hadn’t yet spent much time working with the language. The research work involves using a Bayesian network classifier to help infer a tweet’s interestingness, and while Go machine-learning toolkits do exist, I wanted to use my existing models that were serialized in Java by Weka.
As is the case with many people, all music I listen to on my PC these days plays from the web through a browser. I’m a heavy user of Google Play Music and SoundCloud, and using Chrome to handle everything means playlists and libraries (and the way I use them through extensions) sync up properly everywhere I need them.
On OS X I use BeardedSpice to map the keyboard media controls to browser-based music players, and the volume keys adjust the system volume as they should. Using i3 (and other lightweight window managers) can make you realise what you take for granted when using more fully-fledged arrangements, but it doesn’t take long to achieve the same functionality on such systems.
This week I begin lecturing a module for Cardiff School of Computer Science and Informatics’ postgraduate MSc course in Advanced Computer Science.
The module is called Web and Social Computing, with the main aim being to introduce students to the concepts of social computing and web-based systems. The course will include both theory and practical sessions in order to allow them to enhance their knowledge derived from literature with the practice of key concepts. We’ll also have lots of guest lectures from experts in specific areas to help reinforce the importance of this domain.
Yesterday, I gave a talk about my experiences with developing and using RESTful APIs, with the goal of providing tips for structuring such interfaces so that they work in a useful and sensible way.
I went back to first principles, with overviews of basic HTTP messages as part of the request-response cycle and using sensible status codes in HTTP responses. I discussed the benefits of ‘collection-oriented’ endpoint URLs to identify resources that can be accessed and modified and the use of HTTP methods to describe what to do with these resources.
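As an illustration of that collection-oriented style (the resource name is invented), nouns identify resources and HTTP methods describe the action:

```
GET    /articles        # list the collection
POST   /articles        # create a new article
GET    /articles/42     # retrieve a single resource
PUT    /articles/42     # replace it
PATCH  /articles/42     # partially update it
DELETE /articles/42     # remove it
```

Pairing these with appropriate status codes (201 on create, 404 for a missing resource, and so on) keeps the interface predictable for clients.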
This weekend I took part in the NHS Hack Day. The idea of the event is to bring healthcare professionals together with technology enthusiasts in order to build stuff that is useful for those within the NHS and for those that use it. It was organised by AnneMarie Cunningham, who did a great job in making the whole thing run smoothly!
This was our team! The image is released under a Creative Commons BY-NC 2.0 license by Paul Clarke.
I recently received confirmation of my completed PhD! I submitted my thesis in May 2014, passed my viva in September and returned my final corrections in December.
I was examined internally by Dr Pete Burnap and also by Dr Jeremy Pitt of Imperial College London.
The whole PhD was an amazing experience, even during the more stressful moments. I learnt a huge amount across many domains and I cannot thank my supervisors, Dr Stuart Allen and Prof Roger Whitaker, enough for their fantastic support and guidance throughout.
Today I gave an internal talk at the School of Computer Science & Informatics about open-source contribution.
The talk described some of the disadvantages of the ways in which hobbyists and the non-professional sector publicly publish their code. A lot of the time these projects do not receive much visibility or use from others.
Public contribution is important to the open-source community, which is driven largely by volunteers and enthusiasts, so the point of the talk was to try and encourage people to share expert knowledge through contributing documentation (wikis, forums, articles, etc.), maintaining and adopting packages, and getting more widely involved.
I recently wrote a new article for Heroku’s Dev Center on carrying out asynchronous direct-to-S3 uploads using Node.js.
The article is based heavily on the previous Python version, where the only major change is the method for signing the AWS request. This method was outlined in an earlier blog post.
The article is available here and there is also a companion code repository for the example it describes.
Last week, I was invited to give a seminar to the Agents and Intelligent Systems group in the Department of Informatics at King’s College London.
I gave an overview of my PhD research conducted over the past two or three years, from my initial research into retweet behaviours and propagation characteristics through to studies on the properties exhibited by Twitter’s social graph and the effects that the interconnection of users have on message dissemination.
A while ago I wrote an article for Heroku’s Dev Center on carrying out direct uploads to S3 using a Python app for signing the PUT request. Specifically, the article focussed on Flask but the concept is also applicable to most other Python web frameworks.
I’ve recently had to implement something similar, but this time as part of a Node.js application. Since the only difference between the two approaches is literally just the endpoint used to return a signed request URL, I thought I’d post an update on how the endpoint could be constructed in Node.
Last week I visited Karlsruhe, in Germany, to give a presentation accompanying a recently-accepted paper. The paper, “Inferring the Interesting Tweets in Your Network”, was in the proceedings of the Workshop on Analyzing Social Media for the Benefit of Society (Society 2.0), which was part of the Third International Conference on Social Computing and its Applications (SCA).
Although I only attended the first workshop day, there was a variety of interesting talks on social media and crowdsourcing. My own talk went well and there was some useful feedback from the attendees.
In my last post I discussed methods for streaming music to different zones in the house. More specifically I wanted to be able to play music from one location and then listen to it in other rooms at the same time and in sync.
After researching various methods, I decided to go with using a compressed MP3 stream over RTP. Other techniques introduced too much latency, did not provide the flexibility I required, or simply did not fulfill the requirements (e.g. not multiroom, only working with certain applications and non-simultaneous playback).
For a while now, I have been looking for a reliable way to manage zoned music-playing around the house. The general idea is that I’d like to be able to play music from a central point and have it streamed over the network to a selection of receivers, which could be remotely turned on and off when required, but still allow for multiple receivers to play simultaneously.
Apple’s AirPlay has supported this for a while now, but requires the purchasing of AirPlay compatible hardware, which is expensive. It’s also very iTunes-based - which is something that I do not use.
I recently spent a week in France as part of a holiday with some of my family. Renting houses for a couple of weeks in France or Italy each summer has almost become a bit of a tradition, and it’s good to have a relax and a catch-up for a few days. They have been the first proper few days (other than the decking-building adventure back in March) I have had away from University in 2013, so I felt it was well-deserved!
Last week I released a new version of the tides Android app I’m currently developing.
The idea of the application was initially to simply display the tidal times and patterns for the Gower Peninsula, and that this should be possible without a data connection. Though, as the time has gone by, I keep finding more and more things that can be added!
The latest update saw the introduction of 5-day surf forecasts for four Gower locations - Llangennith, Langland, Caswell Bay, and Hunts Bay. All the surf data comes from Magic Seaweed’s API (which I talked about last time).
Back in March, I emailed Magic Seaweed to ask them if they had a public API for their surf forecast data. They responded that they didn’t at the time, but that it was certainly on their to-do list. I am interested in the marine data for my Gower Tides application.
Yesterday, I visited their website to have a look at the surf reports and some photos, when I noticed the presence of a Developer link in the footer of the site. It linked to pages about their new API, with an overview describing exactly what I wanted.
I today issued a full upgrade of the server at flyingsparx.net, which is hosted by Digital Ocean. By default, on Arch, this will upgrade every currently-installed package (where there is a counterpart in the official repositories), including the Linux kernel and the kernel headers.
Digital Ocean maintain their own kernel versions and do not currently allow kernel switching, which is something I completely forgot. I rebooted the machine and tried re-connecting, but SSH couldn’t find the host. Digital Ocean’s website provides a console for connecting to the instance (or ‘droplet’) through VNC, which I used, through which I discovered that none of the network interfaces (except the loopback) were being brought up. I tried everything I could think of to fix this, but without being able to connect the droplet to the Internet, I was unable to download any other packages.
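In hindsight (and as an assumption on my part rather than something the post describes), one way to avoid this is to pin the kernel packages in `/etc/pacman.conf` so that a full `-Syu` skips them:

```
# /etc/pacman.conf (excerpt)
IgnorePkg = linux linux-headers
```

With the provider-managed kernel left alone, the rest of the system can be upgraded safely.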
Over the last few months, I’ve started to use Weka more and more. Weka is a toolkit, written in Java, that I use to create models with which to make classifications on data sets.
It features a wide variety of different machine learning algorithms (although I’ve used the logistic regressions and Bayesian networks most) which can be trained on data in order to make classifications (or ‘predictions’) for sets of instances.
This is just a quick post to mention that I have made the source for the Gower Tides app on Google Play public.
The source repository is available on GitHub. From the repository I have excluded:
The Heroku Dev Center is a repository of guides and articles to provide support for those writing applications to be run on the Heroku platform.
I recently contributed an article for carrying out Direct to S3 File Uploads in Python, as I have previously used a very similar approach to interface with Amazon’s Simple Storage Service in one of my apps running on Heroku.
The approach discussed in the article focuses on avoiding as much server-side processing as possible, with the aim of preventing the app’s web dynos from becoming too tied up and unable to respond to further requests. This is done by using client-side JavaScript to asynchronously carry out the upload directly to S3 from the web browser. The only necessary server-side processing involves the generation of a temporarily-signed (using existing AWS credentials) request, which is returned to the browser in order to allow the JavaScript to successfully make the final `PUT` request.
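As a sketch of that signing step, here is the legacy AWS Signature Version 2 query-string scheme that was current at the time (the credentials and bucket/key names below are fake):

```python
# Build a temporarily-signed S3 PUT URL (legacy Signature Version 2).
import base64
import hashlib
import hmac
from urllib.parse import quote

def sign_s3_put(access_key, secret_key, bucket, key, expires, content_type=""):
    # String-to-sign for a query-string-authenticated PUT request:
    # HTTP-Verb, Content-MD5, Content-Type, Expires, CanonicalizedResource.
    string_to_sign = f"PUT\n\n{content_type}\n{expires}\n/{bucket}/{key}"
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(),
                      hashlib.sha1).digest()
    signature = quote(base64.b64encode(digest).decode(), safe="")
    return (f"https://{bucket}.s3.amazonaws.com/{key}"
            f"?AWSAccessKeyId={access_key}&Expires={expires}"
            f"&Signature={signature}")
```

The browser-side JavaScript simply `PUT`s the file to the returned URL before the `Expires` timestamp passes; the secret key never leaves the server.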
Last weekend I went to CFHack Open Sauce Hackathon. I worked in a team with Chris, Ross and Matt.
We started work on eartub.es, which is a web application for suggesting movies based on their sound tracks. We had several ideas for requirements we wanted to meet but, due to the nature of hackathons, we didn’t do nearly as much as what we thought we would!
My hosting for my website has nearly expired, so I have been looking for renewal options.
These days I tend to need to use servers for more than simple web-hosting, and most do not provide the flexibility that a VPS would. Having (mostly) full control over a properly-maintained virtual cloud server is so much more convenient, and allows you to do tonnes of stuff beyond simple web hosting.
I have some applications deployed on Heroku, which is definitely useful and easy for this purpose, but I decided to complement this for my needs by buying a ‘droplet’ from Digital Ocean.
I’ve been having trouble connecting to Eduroam, at least reliably and persistently, without heavy desktop environments or complicated network managers. Eduroam is the wireless networking service used by many Universities in Europe, and whilst it would probably work fine using the tools provided by heavier DEs, I wanted something that could just run quickly and independently.
Many approaches require editing loads of config files (especially true for `netcfg`), which would need altering again after things like password changes. The approach I used (for Arch Linux) is actually really simple and involves the user-contributed `wicd-eduroam` package available in the Arch User Repository.
Next week I, along with others in a team, am taking part in Cardiff Open Sauce Hackathon.
If you’re in the area and feel like joining in for the weekend then sign up at the link above.
The hackathon is a two-day event in which teams work to ‘hack together’ smallish projects, which will be open-sourced at the end of the weekend. Whilst we have a few ideas already for potential projects, if anyone has any cool ideas for something relatively quick, but useful, to make, then please let me know!
I wanted a way in which users can seamlessly upload images for use in the Heroku application discussed in previous posts.
Ideally, the image would be uploaded through AJAX as part of a data-entry form, without refreshing the page or otherwise disrupting the user’s experience. As far as I know, barebones jQuery does not support AJAX uploads, but this handy plugin does.
I styled the file input nicely (in a similar way to this guy) and added the JS so that the upload is sent properly (and to the appropriate URL) when a change is detected to the input (i.e. the user does not need to click the ‘upload’ button to start the upload).
I managed to turn about two tonnes of material into something vaguely resembling ‘decking’ in my back garden this weekend. It makes the area look much nicer, but whether it actually stays up is a completely different matter.
A few posts back, I talked about the development of an Android app for tide predictions for South Wales. This app is now on Google Play.
If you live in South Wales and are vaguely interested in tides/weather, then you should probably download it :)
The main advantage is that the app does not need any data connection to display the tidal data, which is useful in areas with low signal. In future, I hope to add further features, such as a more accurate tide graph (using a proper ‘wave’), surf reports, and just general UI updates.
I’ve taken to writing most of my recent presentations in plain HTML (rather than using third-party software or services). I used JavaScript to handle the appearance and ordering of slides.
I bundled the JS into a single script, `js/scriptslide.js`, which can be configured using the `js/config.js` script.
There is a GitHub repo for the code, along with example usage and instructions.
Most configuration can be done using the `js/config.js` script, which supports many features including:
Each January the School of Computer Science hosts a poster day in order for the research students to demonstrate their current work to other research students, research staff and undergraduates. The event lets members of the department see what other research is being done outside of their own group and gives researchers an opportunity to defend their research ideas.
This year, I focused on my current research area: inferring how interesting a tweet is by comparing simulated retweet patterns against the tweet's actual propagation behaviour on Twitter. The poster highlights recent work in the build-up to this, gives a general overview of how the research works, and finishes with where I want to take this research in the future.
I’ve always been interested in the development of smartphone apps, but have never really had the opportunity to actually have a go. Whilst I’m generally OK with development on platforms I feel comfortable with, I’ve always considered there to be little point in developing an application for wider use unless you first have a clear idea of the direction you want to take it.
My Dad is a keen surfer and has a watch which tells the tide changes as well as the time. It shows the next event (i.e. low- or high-tide) and the time until that event, but he always complains about how inaccurate it is and how it never correctly predicts the tide schedule for the places he likes to surf.
I gave a seminar on my current research phase.
I summarised my work over the past few months; in particular, the work on the network structure of Twitter, the way in which tweets propagate through different network types, and the implications of this. I discussed the importance of precision and recall as metrics for determining a timeline's quality and how this is altered through retweeting in different network types.
I concluded by talking about my next area of research: how I may use the model from the previous experimentation to determine whether a tweet is particularly interesting based on its features. Essentially, this boils down to showing that tweets are significantly interesting (or uninteresting) by comparing them to the retweet behaviour predicted by the model.
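For the timeline-quality metrics mentioned above, precision and recall have their standard definitions; a toy computation (illustrative only, not the experimental code) looks like:

```javascript
// Toy precision/recall for a timeline (illustrative; not the research
// code). 'retrieved' is the list of tweet IDs shown on a timeline,
// 'relevant' the IDs the user actually finds interesting.
function precisionRecall(retrieved, relevant) {
  const relevantSet = new Set(relevant);
  const truePositives = retrieved.filter(id => relevantSet.has(id)).length;
  return {
    // fraction of shown tweets that are relevant
    precision: retrieved.length ? truePositives / retrieved.length : 0,
    // fraction of relevant tweets that were shown
    recall: relevant.length ? truePositives / relevant.length : 0
  };
}
```

Retweeting changes what appears on a timeline, so it shifts both numbers: widely retweeted but uninteresting tweets hurt precision, while retweets that surface interesting tweets from outside the follow graph improve recall.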
We recently held our DigiSocial Hackathon. This was a collaboration between the Schools of Computer Science and Social Sciences and was organised by myself and a few others.
The website for the event is hosted here.
The idea of the event was to try and encourage further ties between the different Schools of the University. The University Graduate College (UGC) provides the funding for these events, which must be applied for, in the hope that good projects or results come out of them.