Hello and welcome to thoughtwisps! This is a personal collection of notes and thoughts on software engineering, machine
learning and the technology industry and community. For my professional website, please see
race-conditions.
Thank you for visiting!
21 Mar 2018
Last night, David Winterbottom (software engineer at Octopus Energy) tweeted some thoughts about writing code. Among them was advice to avoid overzealously DRYing ("don't repeat yourself") code. Coincidentally, just a few days ago, a co-worker and I had a similar debate over some proposed changes in a pull request. I had suggested refactoring some repeated functionality into a reusable method, but received some pushback - which made me start thinking about why I so eagerly reach for DRY.
For those who don’t write software regularly, DRY (don’t repeat yourself) is an acronym used to remind software engineers to avoid code duplication. Code duplication is bad because it usually decreases the velocity at which one is able to make changes in a codebase. Instead of changing the logic in one particular method, you have to make the same change in multiple places - and remember/find all of the places where the functionality is duplicated! Even with modern IDEs and grep, duplicated code can sometimes elude the developer and, lo and behold, you have a confusing bug on your hands. You’re damn sure you made the change and still the damn program is behaving according to the old logic!
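As a toy sketch of what this looks like in practice (all the names here are invented for the example):

```python
# Duplicated version: the same formatting logic lives in two places.
# A change to the rules has to be made - and remembered! - in both.
def format_customer_name(first: str, last: str) -> str:
    return f"{last.strip().upper()}, {first.strip().title()}"

def format_employee_name(first: str, last: str) -> str:
    return f"{last.strip().upper()}, {first.strip().title()}"

# DRY version: one function, one place to change.
def format_name(first: str, last: str) -> str:
    return f"{last.strip().upper()}, {first.strip().title()}"
```

If the formatting rules change and you only fix `format_customer_name`, employees quietly keep the old behaviour - exactly the confusing bug described above.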
When I was first starting out a few years ago, the company hired a trainer to do a three-day course on software design principles. This is where I was first introduced to delightful concepts such as DRY, SOLID and my favourite mouthful, YAGNI - which sounds like the name of some exotic animal but actually stands for You Ain’t Gonna Need It (just based on the acronyms, you can clearly see that software engineering is a highly regulated and standardised profession - that is a post for another day). I feel that those three days permanently burned DRY into my muscle memory.
After graduating from this three-day course, I went on to write and refactor and produce lots of bugfixes (and bugs…) and have always been an overzealous devotee of DRYing. More than three lines of duplication? Well, let’s bring out those refactoring tools we have (kudos to the modern IDE) and pull this out into a method. Voila! Deleting lines of code and replacing them with a call to a function never felt so good. DRY is the low-hanging fruit of refactoring - the kind that gives you the immediate warm and fuzzies.
What I understood from the discussion is that blindly applying DRY to every single piece of code means accepting the tradeoff of an additional level of indirection (which is what the extra function call is) without considering whether it’s appropriate for that codebase. While abstracting away common functionality usually means making faster changes, it can also make it harder for the next developer to navigate the codebase. Always consider the tradeoffs.
20 Mar 2018
The better part of the day was spent in what one can only describe as end-to-end JSON message tracing hell.
I am currently working on a platform where several individual components communicate via JSON messages. These messages are produced, ingested, reprocessed and placed back onto messaging streams for downstream consumers. Each component runs as an independent service.
Early in the week, we detected several problems in one of the end-to-end flows of data and started trying to figure out what could be done to unb0rk the process.
The resolution of the bug is not terribly interesting (mostly some fixing in a few places in the Python glue code that keeps the whole thing together), but as an experience, debugging an end to end flow in a microservice-y type of environment is definitely something everyone should do at least once (perhaps even if your stack is a monolith). You quickly learn to appreciate the importance of writing good (informative, not spammy) log messages and having a log aggregator that allows you to filter by regex, keyword and date range. I can’t even imagine what it’s like to trace a single event across 10s or 100s of microservices.
11 Mar 2018
This post was finished on the 28th December, 2019.
There is something about Twitter that, one cannot deny, makes it impossible to look away. After all, 280 characters seems almost designed to generate posts that lack the space to deal with subtleties and nuances. We take the grey and make it into black and white. More fire for the outrage machine.
I’m old enough to remember what the web was like before Twitter, Facebook and Tumblr came along. It wasn’t a palace of polished Bootstrapped interfaces, but a bazaar of homemade CSS and HTML. We made websites on Geocities, mostly by studying the source code of good-looking websites we found by surfing hyperlinks (do people do this anymore?). It also wasn’t so damn addictive. There was no place I would’ve gladly spent 2+ hours scrolling and clicking. Now, there does not seem to be a shortage of sites that are happy to hijack and monetise the shit out of your attention. It turns out our brains are addicted to novelty, to tiny morsels of low-calorie information that we can react to with minimal effort.
I wrote this piece in the aftermath of a massive Twitter pile-on on a famous figure in the US tech scene who
wrote an article filled with somewhat controversial advice for women-in-tech.
My kneejerk reaction was somewhere between “this engineer is one of those fuck-you-got-mine’s (a phrase that was actively
used in the women-in-tech community a few years ago to describe women who seemed to have “made it” in the tech world
and were unwilling to throw the ladder back down)” and “this advice actually makes sense in the imperfect world we
live in”.
In either case, tech Twitter went into a meltdown of sorts after the publication of this person’s post.
All of a sudden, person after person dunked on the writer and started subtweet threads after subtweet threads.
After spending some time looking at the aftermath, I found a disturbing emotion within myself.
A part of me was logging into Twitter and scouring the inflammatory threads just to experience a,
well, a kind of dopamine rush or outrage euphoria - a kind of personification of the popcorn-eating gif
that is sometimes posted when an internet dramageddon is about to go down.
I realised that my lizard brain’s affinity for petty gossip has been amplified by this
website. In fact, this website has molded my lizard brain to think in terms of the medium.
I find myself frequently daydreaming in tweets these days: a thought comes into my head
and suddenly my fingers are itching to tweet it out, my brain frantically shifting around words
to make sure it fits into 280 characters. And when there is a dramageddon going down on the website,
the lizard brain can’t help but scroll on and on in glee trying to dig up more and more
tweets and subtweets piling on the person. Bring out the popcorn, but don’t expect it
to satiate you.
10 Mar 2018
I had a bad interaction with someone I consider a role model, so I guess this is salty Sunday thoughts, but here goes anyway.
Goodbye for now, women-in-tech Twitter community. One blog post and a whole community is getting torn apart by
280 character hot takes. When did we lose the ability to talk about things like adults? Why can we not take criticism and disagreement and use it to become better as a community?
I guess when we figured out that snide subtweets generate more engagement than direct conversation.
It’s been a good run.
03 Mar 2018
Two people, two senior leaders in technology whose work I follow and respect, recently had an inflamed public exchange on Twitter, which made my heart sink. The exchange was about a piece of advice that one of them had published for women in technology - ‘women in tech’ as the movement is usually known. I read the post, not once, but several times. I didn’t agree with all of it. I thought some of it was problematic, the other part practical. It seemed like advice that actually acknowledges that the real working world is not always a nice, fair, just place and I appreciated it for its candor. But advice is just that: advice. It’s not a how-to manual or a guaranteed way to succeed in an industry. It’s often based on the author’s own experiences or the experiences of those she has worked with or mentored. It won’t work for everyone and every life situation.
Although the media is infatuated with the narrative of the genius, Harvard drop-out techie founder, who loves hoodies, raw water and waxing lyrical about the upcoming tech brotopia, there are other stories out there, other ways of being in tech and being successful. There is no one guaranteed way to make it as a ‘woman in tech’- three words which are problematic, because we are people before we are a gender, especially before we are a society’s idea of how our gender should act and behave in the world. This is not to say that gender does not affect our lives - it does, sometimes with extremely devastating consequences.
I suppose what I’m trying to say is that every woman’s tech career will be different and there is no advice that will work for everyone.
I wanted to share some of these half-baked thoughts on the platform and initially retweeted a few posts and commented, but after a while I gave up and deleted the lot. Twitter is ideal for half-baked steaming hot takes, but this is one fire I don’t want to throw more kindling on.
I don’t know what value I can add beyond restating what other people have already said.
It reminds me of the time another prominent developer (who happens to be a woman) criticised some women-in-tech groups. There was lots of unnecessary hand-wringing and hot takes. But also a few well-measured responses. If we as ‘women in tech’ can’t take a critical look at our movement and its aims and can’t tolerate scrutiny from people who don’t agree with our goals, then what are we really doing?
Also, I’m just tired of talking about this. Actions speak louder than tweets or words.
So that’s what I’m going to focus on.