thoughtwisps One commit at a time

Hello and welcome to thoughtwisps! This is a personal collection of notes and thoughts on software engineering, machine learning and the technology industry and community. For my professional website, please see race-conditions. Thank you for visiting!

exposure

Author’s note: This post was finished and published on the 19th of June, 2018.

for free

I have an unconventional friend. Unconventional, because he does not subscribe to the live-to-work culture that seems rampant among the glass-and-steel towers, finely pressed suits and freshly laundered shirts. We talk over the phone (or rather one of those apps - if this makes you think I was born middle aged, you’re probably right about that one), about life and politics and work. I sometimes forget how bizarre people outside of the tech bubble find our daily rituals of seeking exposure: conferencing, CFPs, meetups, showcasing open source contributions, endless hot takes on Twitter and shitting on the latest JavaScript library on Hacker News.

I’m going to give a talk, I tell him. Slides, demoing stuff on my computer, that kind of thing. I explain the technology I’m going to be talking about, how it’s used, what problems it solves. We talk. I tell him how much it’s starting to stress me out, this talk. How these things always seem like good ideas with months to go, but morph into nightmarish monsters when the talk date starts approaching. And how for some reason we keep signing up to do this more and more.

But you get paid to do this, right? he asks. No, most people don’t get paid to speak at conferences, I tell him.

Networking, hallway track, learning about cool new tech, meeting your heroes, or at least seeing them on stage, that’s why, I explain while he digests. I’ve become so accustomed to doing things for exposure: meetups, talks, tutorials, workshops, I never even thought about compensation. Everyone does these things. Twitter is bursting with conference hashtags and livetweeted talks - enough to induce a permanent state of conference FOMO. LinkedIn is full of posts celebrating this or that meetup. CFPs, conference parties, photos of happy people celebrating speaker socks - and here I am, in my pajamas, trying to figure out why the ef can’t I center this div using CSS.

Well actually: Conference talking is voluntary - at least on the surface. No one is filling out that CFP for me. And yet, when everyone is on the dance floor dancing, you feel you have to join in. When conference speaking becomes the norm, the way to become visible to potential hiring managers and future collaborators, it becomes a necessity, not a nice to have. A bit like having a GitHub profile. An active GitHub profile, mind you.

the good and the sad

Even though I actively hate myself on the eve of every talk I have ever given, the conversations and memories from conferences do usually make it worth the trouble! Some of my best memories of the tech industry are from meetups and conferences. I was a Django Girls Coventry mentor at PyCon 2015, met lots of amazing people and got to teach Python to a group of four women. One of the attendees later went on to become a full-time software engineer. Stories like these are what make interacting with the community so rewarding!

However, there are community experiences that are not great. My list is long and our collective list even longer. I won't go into them here or likely ever. One incident still stays with me. Not because it was harassment. No, on the contrary, it was a comment that I initially took as a snide remark, but upon later reflection I realised the profound sadness of it. A lady came up to me and said 'Of course you have time to do all this (referring to community organising) - you don't have a family!' I was a bit taken aback and perhaps angry. True, I did not (and still don't) have a family, no husband, no children, no dependants relying on me to be there for them after work. But did that really mean I had no other issues or problems to take care of? The other reason her comment hit a sore spot was that my personal life had taken a backseat and eventually nearly disappeared in the years post-university, and I was starting to feel a nagging sense of 'is this it?'. The person who so desperately wanted to be consumed by work three years ago no longer found endless context switching, battling for meeting rooms in open offices and panic-frenzy-debugging of production fires fulfilling. How dare she think I haven't sacrificed anything, I thought in anger.

I later realised that the comment had not been implying anything of the sort, nor was it malicious. It had just been an exasperated reaction to the constant demand for free work and out-of-hours studying just to keep up - for exposure, for the chance to secure a role. This system favours people like me and imposes a heavy penalty on people who take up the bulk of caretaking duties in the household.

Even though this happened a few years ago, the comment keeps resurfacing. The last time I revisited this memory was during the PyLadies Gender Pay Gap panel. A member of the audience noted that women (in heterosexual couples) are still more likely to be the ones taking care of the household chores, and thus have less time to develop the markers that we in the tech industry equate with Being Good at The Job (TM): an active GitHub profile, a trail of conference talks, open source contributions to high-profile projects, community organising with tech meetups (bizarrely, diversity and inclusion related meetups rarely seem to count in these conversations, even though the leadership skills required to manoeuvre the hate-filled waters are on par with, if not greater than, those required for traditional meetups), and a record of experimenting with new tech on side projects. Exposure is, in many ways, a privilege. It also takes a toll.

crags, precipices and other perils of exposed heights

Every hillwalk usually has an exposure rating: one for a light bimble, five for a fuck-why-did-i-sign-up-to-do-this. The higher the exposure rating, the more likely you will be required to scramble and try not to look down. The views will take your breath away, though.

An analogous thing exists in the tech industry. The more prominent someone (usually a woman or femme-presenting person) becomes in a given programming language or community, the more likely they are to become a target for harassment and online and in-person hate. Several prominent contributors to open source receive death and rape threats on the regular and have to deal with racial slurs. Just being different in a space of homogeneity and doing one's work seems to be enough justification for these vile attacks. The milder forms of this are various kinds of well-actuallys, the most notorious of which is no doubt what Liz Fong-Jones - SRE at Google - has called questistatements. It may come as a surprise to people outside the tech conference circuit, but bizarrely, during Q&A (which, as the name would suggest, is a chance for the audience to ask questions of the speaker) many in the audience who avail themselves of this opportunity use it to prove that they know more about the topic than the speaker. This always makes for rather awkward viewing and a demoralising experience for the speaker. Exposure, crucial for many as a way to enter and advance in the industry, carries with it plenty of toxic waste.

This is not to say that all exposure is bad. Rather, it is the requirement to be 'out there' all the time - tweeting, blogging, pushing code to GitHub and conferencing - in order to get a foot in the door that routinely disadvantages people who cannot afford expensive conference tickets or need to perform other duties outside of work time. Recently, a fellow community member told me that she had been recruited and offered a position at a cloud computing firm, in part due to her extensive contributions to the local data science community. This is great and should be celebrated! But we should not expect people to spend 90% of their waking hours acquiring exposure for the sake of getting into the hiring pipeline. She is a skilled data scientist and has the education and projects to prove it. Why all these extra hoops just to get interviewed?

re-evaluating success, life etc.

I am at a meetup.

I used to attend meetups religiously, on a schedule. My Sunday and Saturday evenings were spent zealously browsing the pages of Meetup.com, signing up for tech meetup groups and pencilling them into my planner for the week. Then, if no production systems were on fire that night, I'd promptly make it by 6.30pm to the hosting company and hide in my chair reading a book until the talk began. You see, I'm not very good at striking up conversations with complete strangers, and this is usually exacerbated if I am the only woman in the room. Anyway, regardless of how socially awkward the whole situation was, I made a point of attending, because let's face it, I was and still am largely a newbie. There is always something new you could be learning. I'd decided that I was like a sponge: I had to absorb as much as possible from the technical community around me.

I certainly learnt a lot. About meetups and technical communities and about technologies. I saw some amazing talks, for example a talk on high-performance Java by Martin Thompson. The organizers of meetups deserve a lot of credit: it’s hard, sometimes thankless labour and you have to deal with everything from late pizza delivery to angry people who have been waitlisted.

The meetup ends and instead of braving the saunalike conditions on the Central line, I decide to walk through Holborn to St Paul's and then onwards to Monument. The streets are empty this late. A few Very Important (TM) finance/law people hail a cab. A guy with one of those square Deliveroo containers exits a restaurant and gets on his bike. The glass and steel buildings quietly observe, their mirror-like walls reflecting a distorted image of passers-by. Closer to Monument you can hear the laughter and the clink of half-empty pints being lowered onto the pavement while their owners smoke a few cigarettes and laugh about whether or not Paul knows what a load of bollocks these new requirements are and how he'll never get promoted. I'd be hard pressed to imagine these people - dressed in the City Banker Boy uniform - leaving their desks at 6pm to learn about a new and very complicated way to structure, slice and dice debt.

Why has the tech industry developed these complicated rituals to prove one's worth? The reasons are surely numerous and probably not far off from the reasons we perform whiteboard interviews - a topic for another time.

I’ve been playing the exposure game for a while, but I’m not sure I want to anymore. I want to be good at writing code and sometimes the constant need to stay on top of the tweet cycle takes time away from that. I want to be good at the basics, not learn yet another layer of abstraction that I won’t be able to debug when production hell breaks loose. So, today I’m taking some steps towards that. I’ve deleted all of the content I’ve posted on Twitter. I’ve reconsidered some CFPs that I had started. I’m not sure what will happen, but for the next six months (until December), I want to stay focused on the things that matter most to me.

sunlit silence

I've slept neither well nor poorly. Just enough to get up at 7am (two hours later than my ideal), not enough to avoid a minor niggling headache that will vanish with a bit of sugar and caffeine. Although I've rarely managed to sustain this habit for longer than a few months (or perhaps a few weeks), I've always loved waking up early, ridiculously early perhaps. 6am, 5am, 4am. More so in the summer than in the depths of winter. In the early hours of the morning, there is more time. It moves slowly, a lazy cat stretching out and yawning on the lawn. Only a few cars, a few sleepy commuters leaning their tired heads on the double decker's windows.

In London, there is never enough time, and even when you feel like you are keeping up, you're always swimming in the rapids of millions of other people making the daily pilgrimage to their desks and then back home. By 9am, the whirlpool of suits and skirts and angry sighs and passive-aggressive 'excuse me, please' (translation: get out of my way, you daft nincompoop, can't you see I'm very important and very busy) has already sucked you in.

But before that, the city is yours, mine, no one's. The litter trucks pick up the remnants of yesterday. The song of the birds is clear and beautiful. In only a few hours, it will drown in the roar of the city.

The late spring light is beautiful too. The sun comes up behind the bend of the Thames, behind the Gherkin and the Cheesegrater and the Walkie Talkie, and infuses the air with pale yellow, the colour of silence and sipping tea in the quiet. It cascades down and frolics on the rippling skin of the Thames.

The peak highlight of the early morning social scene in London (apart from awkwardly avoiding the eyes of your fellow Tube commuters) is the coffee queue. This is the precarious moment - the one between being fairly upset about that man or woman whose backpack quarrelled violently with your face between London Bridge and Bermondsey, and the shot of dopamine that hits the brain once the barista delivers a paper cup of sugar and caffeine into your hands.

In the coffee queue today, I'm reading Erling Kagge's Silence in the Age of Noise. It is a thin volume, but consume it too quickly and you might defeat its purpose. This is what I did when I first bought it - sat down and read it all at once. Now I'm going back and reading it slowly. I've been working my way through a few pages in which Kagge writes about our constant need for novelty and news. It speaks to me because I, like many, am a notification addict. I can spend hours scrolling, clicking and browsing just because there is a nonzero probability that someone somewhere will hit a Like and deliver a little shot of dopamine into my brain.

Days can pass in this notification induced frenzy and at the end of such days nothing much of importance has been achieved, except for perhaps feeding data into a machine learning model that will eventually tell a business analyst somewhere how likely I am to buy a product and what kinds of advertisements they should target at me. And yet, and yet I find it hard, or nearly impossible not to engage in this. ‘The more we are inundated, the more we wish to be distracted’, Kagge writes. The more time I’ve spent scrolling my timeline, the more I want to keep scrolling it. In the end, I’ve achieved nothing meaningful, perhaps a few laughs and a spike of outrage when someone outside of my echo bubble is retweeted onto my timeline.

This idea of constantly being connected, of constantly being entertained and sad and happy and angry and outraged at little tidbits and memes and hot takes is exhausting. Constantly being connected to a stream of communication masquerades as work - but does it really have value?

“When you’ve invested a lot of time in being accessible and keeping up with what’s happening, it’s easy to conclude that it all has a certain value, even if what you have done might not be that important,” writes Kagge. “In a way, silence is the opposite of all of this,” he continues.

Once I get to my desk, log in, open up the email client, the sunlit silence of the morning is over. I wish it had lasted longer.

event-driven brainware

I started surfing the web (do people say this these days?) in the early 00s, probably closer to 2003-2004. This was back in the day when a few MBs of email storage was considered a luxury, most people had slow af dial-up connections and there was more than one search engine on the market. Various feeds, aggregators and recommender systems did not exist and people actually remembered and wrote down URLs and typed them into the URL bar.

Finding information required planning and, as our school teachers advised us, libraries were usually one's best bet for good information. This was the time when Wikipedia was still nascent and a frequent target of shitposting by my classmates.

Over the last few years, something has happened. Instead of having to hunt for golden nuggets of information, one is usually hit by a flood: YouTube videos, blog posts, articles, tweets, news feeds and article aggregators like HN. The problem is not scarcity, it's the flood, and so pretty much every modern citizen of the internet has become a filter and a curator instead of a researcher.

The appearance of information has also changed. Instead of being presented as barebones HTML with minimal markup, most articles and blog posts now come with all of the bells and whistles of late-2010s bootstrapified UIs that we've come to expect. This makes it hard to differentiate between good information and information that simply looks good without having much substance.

Something about the way I consume information has changed as well. I noticed that I have a lot less patience to engage with long, slow and steady technical texts. Within 140 characters (or 280 now that @jack has decided to increase our collective attention span), I start to drift. My brain craves novelty.

I rarely plan what information I am going to consume and why I need to consume it. Instead, I plug into tech Twitter and activate the brainware that's been conditioned by (no doubt) well-intentioned engagement designers to be high on novelty. I consume bits of information, sometimes clicking on the posts, but rarely actually engaging with the material. Within a few sentences, I find myself craving to return to the timeline of the newsfeed and surf for better waves. My brainware has slowly but surely transformed itself into an event-driven system that reacts to whatever pops up in front of my eyes, but rarely makes an effort to master the material. This is part of the reason I eventually stopped frequenting Hacker News - after a while, I realised that even though I was browsing lots of articles, I rarely if ever retained anything.

everything as code

Sometimes the abstraction of an abstraction of an abstraction, with yet another layer of abstraction on top, is headache-inducing, and you just wish there were a bash script somewhere in this repo that would tell you the exact sequence of commands you need to run to get the infrastructure for this app working. Today was one of those days. But let me go back to the beginning.

For the past three years, which is pretty much the bulk of my whole career in software from university graduation to today, I have been a permanent occupier of Layer 7 (shoutout to the OSI model diagrams they showed us in CISSP training!). As a citizen of this layer, the code I type up deals with user features and relies on abstractions. Files to me are objects with attributes and methods - not bytes and buffers or memory pages. Once my file handling code exits the context manager, case closed, finito, data written to disk (which is of course a convenient abstraction pipe dream and not at all what actually happens). Once the feature is tested and shipped and the CI/CD system signs off with a green light of approval, the other six layers that make up the abstraction layer cake are usually the problem of another team (read Charity Majors' blog if you want to find out why 'I just ship it, someone else runs it' is generally a bad strategy) or another member of my team.
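To make the pipe dream concrete, here is a minimal sketch (the file name is invented for illustration): leaving a `with` block closes the file and flushes Python's userspace buffer, but the bytes may still be sitting in the kernel's page cache. Actually insisting that they reach the disk takes an explicit `flush` plus `os.fsync`:

```python
import os
import tempfile

# The Layer 7 view: the context manager "writes the file" for us.
path = os.path.join(tempfile.gettempdir(), "layer7_demo.txt")
with open(path, "w") as f:
    f.write("hello, abstraction\n")
# At this point the file is closed and Python's buffer is flushed,
# but the OS page cache may still hold the data. For durability,
# you have to ask the kernel explicitly:
with open(path, "a") as f:
    f.write("durably yours\n")
    f.flush()              # userspace buffer -> kernel
    os.fsync(f.fileno())   # kernel page cache -> storage device
```

Most application code never bothers with the `fsync`, which is exactly why the abstraction feels like "case closed, finito" right up until a power cut teaches you otherwise.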

Last October I stumbled on a cool thing called Kubernetes and started poking around the infrastructure tech space: Docker, Terraform and friends, which led to me accepting a consulting job with a team building out some serious infrastructure on AWS. All of a sudden, this poor Layer 7 lurker was thrust into a world of bash scripts and infrastructure as code. If you haven't heard of it before, IaC is a 'movement' (for lack of a better word) to define infrastructure through configuration files that can be stored centrally and versioned, instead of the good ol' way of manually provisioning and setting up servers or running custom bash scripts. For one, this makes it easier to tear down and spin up new infrastructure really quickly. In particular, if you are managing large-scale infras in a cloud provider, you do not want to spend your day clicking around the console or ssh'ing into every single instance to run package manager commands and installation scripts. Second, if anything does go wrong, you can use version control to quickly check out a previous version of your infra, run the deployment commands and restore it to a previous state. With manual intervention, it's much harder to roll back changes, especially when production is on fire.

All of these are great benefits, in particular if the scale of infra you are dealing with forces the 'my servers are cattle, not pets' paradigm. But increased abstraction in the form of tools with specialised configuration languages (e.g. HCL for Terraform) can sometimes mean an initial hit to velocity, especially if someone new to the tool is making changes. I came face to face with this reality while trying to untangle a Jenkins pipeline that was generating Terraform tfvars files to be used in downstream jobs. Although I had previously used Terraform to provision new infrastructure components, working with someone else's infra code and associated scripts can be tedious. In particular, untangling what is happening sometimes depends on one's knowledge of all the knobs and levers in the tool. Another problem is that it is rarely possible to test these changes locally. A third problem is (as Rick Altherr mentioned in a reply to one of my tweets) that small syntactic issues can quickly lead to hours of headscratching and painful debugging by trial and error. So good linting and formatting tools are a must (hi, from the developer who has spent hours trying to untangle tabs-vs-spaces issues in YAML and hunting down an errant comma in a JMESPath expression).

I suppose at the end of the day, we cannot have our abstraction cake and eat it.

overzealous DRYing

Last night, David Winterbottom (software engineer at Octopus Energy) tweeted some thoughts about writing code. Among them was advice to avoid overzealously DRYing (don't repeat yourself) code. Coincidentally, just a few days ago, a co-worker and I had a similar debate over some proposed changes in a pull request. I had suggested refactoring some repeated functionality into a reusable method, but received some pushback - which made me start thinking about why I so eagerly reach for the DRY.

For those who don't write software regularly, DRY (don't repeat yourself) is an acronym used to remind software engineers to avoid code duplication. Code duplication is bad because it usually decreases the velocity at which one is able to make changes in a codebase. Instead of making a change to the logic in one particular method, you have to make the same change in multiple places and remember/find all of the places where the functionality is duplicated! Even with modern IDEs and grep, duplicated code can sometimes elude the developer and, lo and behold, you have a confusing bug on your hands. You're damn sure you made the change and still the damn program is behaving according to the old logic!

When I was first starting out a few years ago, the company hired a trainer to do a three day course on software design principles. This is where I was first introduced to delightful concepts such as DRY, SOLID and my favourite mouthful YAGNI - which sounds like the name of some exotic animal - but actually stands for You Ain’t Gonna Need It (just based on the acronyms, you can clearly see that software engineering is a highly regulated and standardised profession - that is a post for another day). I feel that those three days permanently burned DRY into my muscle memory.

After graduating from this three day course, I went on to write and refactor and produce lots of bugfixes (and bugs…) and have always been an overzealous devotee of DRYing. More than three lines of duplication - well, let's bring out those refactoring tools we have thanks to the modern IDE and pull this out into a method. Voila! Deleting lines of code and replacing them with a call to a function never felt so good. DRY is the low-hanging fruit of refactoring - the kind that gives you the immediate warm and fuzzies.

What I understood from the discussion is that blindly applying DRY to every single piece of code means accepting the tradeoff of an additional level of indirection (which is what the extra function call is) without considering whether it's appropriate for that codebase. While abstracting away common functionality usually means faster changes, it can also make it harder for the next developer to navigate the codebase. Always consider the tradeoffs.
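To sketch the overzealous side of the coin (all names invented for illustration): two formatting jobs that merely look similar get merged into one helper behind boolean flags, and the 'deduplicated' version is harder to read and change than the duplication ever was.

```python
# Over-DRYed: two superficially similar jobs forced through one helper.
def render(value, *, as_currency=False, as_percent=False):
    # The flags are the smell: every caller must know the helper's
    # internals, and each new format adds another branch here.
    if as_currency:
        return f"${value:,.2f}"
    if as_percent:
        return f"{value:.1%}"
    return str(value)

# The 'duplicated' version: each function reads on its own, and the
# two formats are free to evolve independently.
def render_currency(value):
    return f"${value:,.2f}"

def render_percent(value):
    return f"{value:.1%}"
```

The two versions produce identical output today; the difference only shows up later, when one format needs to change and the flag-driven helper has to grow yet another branch.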