Saturday, 23 April 2016

My 2016 Spring Tour and every kind of office.

Something unusual happened to me recently. I was asked to deliver a talk to a meet-up in Sheffield.

This is odd for a number of reasons; while I occasionally speak at conferences, I'm at the level where that typically happens only because I work hard putting myself forward for the role. Like most speakers, I submit talks to a large number of Calls for Proposals (CFPs) each year, and like most speakers, I have most of them rejected. Because of this process of proposal and rejection, I was genuinely flattered that, on the strength of a couple of lightning talks, I was invited to present.

Another unusual thing about this is that Sheffield is over a hundred and eighty miles from my home. Meet-ups are small community events which serve their local area (which is why they're typically named after the place where they happen). For this reason a meet-up usually takes up one evening: travel, event, pub and all. It was clear that speaking at the other end of the country was going to be a heavier undertaking.

When Eve, organiser of the Sheffield Ruby User Group (ShRUG), approached me at this year's Bath Ruby conference to speak, I was very quick to point out this issue. One thing you quickly learn about Eve is that she is extraordinarily good at organising things. Tekin, organiser of the North West Ruby User Group (NWRUG), was contacted, and the bare bones of a tour of the North of England were set out.

This was pretty much all one corridor conversation during the Bath conference. I was intrigued by the idea of a speaking tour, and flattered to be asked, but I'll admit that I wasn't convinced, at that stage, that it was going to happen.

Needless to say, the tour happened, and it happened at very short notice. Within two weeks I had been booked for talks at ShRUG and NWRUG, and Eve (Super Organiser of ShRUG) had found a sponsor, in the form of Sky Technology in Leeds (who, by the way, are hiring). In order to justify the expenses from Sky Technology, I was asked to present the same talk to them, at their offices, in a lunchtime slot.

So there it was, Monday in Sheffield, Tuesday in Leeds, and Manchester on Thursday.

Given that it was all put together with two weeks' notice, I was unable to take the week out of work, so working remotely was a necessity. This meant that as I travelled about the country, working space had to be found, and, unusually, it meant that in one week I worked in most of the environments where developers commonly work (except the type I usually work in).

So this post is about my tour, but it's also about these different ways to work, and how I found working in them. So, as a control in this experiment in working environments, let's talk about how I've been working for the last half-decade.

Office 0: My company, our office.


I've been working in the same offices since September 2011. It's a simple office unit in Guildford, Surrey, and each day I meet the same people (this is a benefit, as I get on with the rest of my company; it's a pretty friendly team). It's made up of rooms which are shared by small groups, as opposed to an open hall or individual offices. It's usually nice and quiet, with up to 4 developers in the room I work in, and it's understood (partly through a few years of making the issue known) that a conversation in this space can disrupt the developers' concentration and set our work back considerably.

We do a lot of our communication online; work is doled out via a ticketing system (we currently use JIRA), our morning meetings take place over Google Hangouts, and a lot of our work-related chit-chat happens via an instant messaging client (currently HipChat, formerly Skype). This facilitates working in multiple rooms (we're spread across two floors), but it also supports remote workers. Some developers are in the office some days but not others, and we have contractors abroad and in other parts of the country.

(This habit of working in teams with remote workers made the tour possible; I already had all the tools to work remotely, and the people in the office who might need to talk to me knew how to get in touch remotely.)

Working in the local team has some perks. We make tea and coffee in batches, and announce that it's ready over HipChat, creating a social bond over hot beverages. From time to time we go out for a drink, or a meal in the evening (covered by the company).

Office 1: Co-working space

So, on arrival in Sheffield I was reminded of two places where I've spent a lot of my life: Darwen, in Lancashire, where I grew up, and Manchester, where I was living and working when I first learned to code. It's got a slightly grimy, "grim up North" feel to it: the buildings are pretty run down, the community speaks with a broad accent all their own, and the whole place is a little rough and ready. Once you get past the grime, you realise that there are some beautiful buildings from an earlier age, when the steel industry brought unprecedented prosperity to this part of the North.

Of course, having grown up in Darwen, which follows a similar pattern, I know that once you get used to the residents, you'll find they're hidden gems of their own, with the local wit, community and sense of humour that comes from these post-industrial towns.

The co-working space (also known as a hot-desking office) is a concept I was already pretty familiar with. A common pattern for freelancers and remote workers who want to get out of the house is to work in coffee shops (essentially renting their working space by buying coffee); the co-working space is exactly the opposite: people rent their desk, wi-fi and so on, and drinks are typically provided socially.

This has the added benefit that the workers, and sometimes dedicated managers of the space, can craft a working environment that's suited to the people who use it. It's quieter than a coffee shop, and there can be smaller social spaces and meeting rooms for use as and when they are required. It's also cheaper than renting and maintaining your own office (as my company does), which is crucial for keeping costs down in small companies (often a "one man band", where one worker does everything: the coding, the accounting, finding business, filling out tenders, whatever's required).

I spent Monday working in one of these spaces for the first time (at Union Street in Sheffield), and I was pleasantly surprised by just how easy it was to transpose my day-to-day work into a new place and work remotely. The space was nice and quiet (I had been a little concerned that a space where various people were working on various projects for various companies would cause distractions throughout the day), and the freelancers working there were friendly and considerate.

Most of the people I spoke to also worked from home, which is cheaper again than hiring a desk for the day. The fact is it's quite hard to work on your own, in your own living space. It seems that spending a few days a week at the co-working space and a few days at home is a common pattern, to balance costs with the risk of becoming a "digital hermit", disconnected from the world outside your own home. More on that story later.

I was only there for one day, but I can also see that this would result in various combinations of people being there on different days.  This would lead to an interesting and ever-changing community, although perhaps with less of a feeling of companionship or stability.

Talk 1: Sheffield Ruby



(I've got quite a lot to talk about in this post, with my experience of different working environments and of spending some time with the Ruby community in Northern England, so I'm skipping over the contents of my talk.)

So my first talk was delivered at the Union Street co-working space, to the Sheffield Ruby User Group. It was a friendly bunch, showing some real interest in programming with Ruby and camaraderie within the local community.

Afterwards, after a couple of tries to find a pub that was serving food, we wound up in one which happened to be holding a pub quiz. We competed (with the team name "Sky Leeds is Hiring"), and after a lot of laughs and fun, actually won! To the tune of four English pounds each. If you don't count events and their sponsors covering some of my travel and accommodation costs, this £4 is the most I've ever been paid for public speaking!



I liked Sheffield, and I liked ShRUG, but, as is the nature of a tour, it was soon time to move on.

Office 2: Open-Plan



On Tuesday I woke up in Sheffield and made a bee-line north through Yorkshire to the city of Leeds, or more specifically Leeds Dock. Just outside the city centre, it's a modern renewal of the city's largely unused docklands: it's pretty, it's fashionable, it's got shopping and, intriguingly, a museum of weaponry.

It's here that the Leeds offices of Sky Technology have their home, on the first two floors of three large, dock-front buildings, beneath tower blocks of fresh, modern apartments.

Inside the offices themselves, Sky Technology, which must employ at least 400 people in this location, subscribes to the open-plan philosophy of office arrangement. Each office is a large hall, arranged on the ground floor with a high ceiling, with more office space above on a mezzanine level. Beneath the mezzanine are the irregular spaces, meeting rooms, toilets, kitchen and social spaces.

I'll admit that I was a little unsure about working in this kind of environment. I know that conversations in the space where I'm working can be a distraction, and any programmer will tell you that a distraction sets you back much further than the time it takes to deal with it, because you then need to pick your work back up where you left it and remember what you were thinking about, and that takes time.

I was actually pleasantly surprised by my first experience in an open-plan office. The high ceilings kept the actual noise level down, so the many and various conversations going on came across as a kind of hushed murmur. I found it pretty easy to ignore the chatter going on around me and buckle down to my own work. I can also see that this would make it easier to catch up with colleagues, ask questions, and otherwise get in touch as required. Although... when you distract a developer...

This particular open-plan office had one unusual feature. At the head of each bank of desks was one monitor showing some information about the state of the project, presumably a project manager gets to decide what takes place on that one, and another monitor displaying Sky News, on mute. It's almost as if the businesses are connected.

Talk 2: The Sky at Lunchtime




So, this particular talk was essentially a favour to our corporate sponsor, Sky Technology, who had covered some of my costs for transport and accommodation for the week. I was in their offices in order to reprise my talk for some of their staff as an extra bit of lunch-time entertainment (possibly even education) for them.

The event was catered, so this was my first experience of "performing to knives and forks", in a bustling canteen, against the sound of the lunch time rush. It was a little harder to engage this audience, half way through their working day, with so many distractions (not least of which was the free food they were applying themselves to). Nevertheless, the talk was well received, I had some very positive feedback from it, and this event facilitated the rest of the tour. 

By the way, Sky Technology are hiring...


Office 3: Working from home (occupied)



Wednesday was a different working environment again: I'd travelled over to Lancashire to stay at my parents' house near Oldham. Now my working practice was the most common one for people working outside the office: staying at home and setting up wherever there's a free surface. As it happens, this was in their conservatory, looking out on the garden on a bright, sunny day.

On Wednesday the house was full, and there was a general buzz of activity going on around me. My parents, now retired, were outside working in the garden; my brother, currently staying there as well, was in and out of the room, occasionally stopping to chat; and their dog was occasionally camped around my ankles, hoping for some attention.

Now, this is fairly common. Sometimes people work from home because they have children or older relatives to look after, and need to be present during the day. Sometimes people take a trip, stay with friends, and, like me on this journey, have to do some work in and amongst the reason they took the trip.

This was pretty pleasant as a working environment. I didn't feel particularly alone or isolated, the space I was in was quiet enough to concentrate in, and I was able to take breaks to catch up with my family.

It does take some discipline to work from home, mine or someone else's. There are always plenty of distractions, if you look for them, and of course nobody from work is looking over your shoulder. I've worked from home before (usually when I've been ill and have been trying not to infect my colleagues), but this was a different experience. Overall, due to some fortunate circumstances, it was a very pleasant working day and, it should be noted, I got a lot of work done in this time.

So I can highly recommend working from home, in a conservatory, on a nice sunny day, with family going about their business around you. Of course, in slightly different circumstances, or if I'd been less disciplined about getting the job done, this could be very unpleasant, or disastrous for my job.

Wednesday was a good day, Thursday was a little different.

Office 4: Working from home (alone)




On Thursday I was still at my parents' home in Lancashire, still in the conservatory, and it was still sunny, with one simple difference: my family were away from home for most of the day.

This is the 'digital hermit' scenario mentioned above. I didn't leave the house for most of the day, as I only had to go through two doors to get to work. I had not just quiet, but silence to work in. Okay, I played music to fill that up, but the feel of this day was still basically solitary. I got work done, of course, and didn't succumb to the temptation to take an extended lunch break in front of the TV, or just take a nap in the afternoon.

I can completely see the issues with solitary working, tho. I can see that this kind of code-only time could lead to a sort of 'cabin fever' for remote workers, and I can see why people choose to spend some of their days in co-working spaces, to get out of the house and socialise a little as they go about their work. This was okay for a day, but I don't really know if I could handle a life where I don't have much reason to leave the house or go anywhere.

So my word of warning, if you're considering working at home, would be this: beware of cabin fever, make sure you see daylight and walk around outside, and take any chance you can to socialise. Or, of course, go and rent a desk at your nearest co-working space.

Talk 3: North West Ruby User Group



So now we get to the end of my tour of the North of England, and it really was a fitting end to this experience: the North West Ruby User Group (NWRUG) in Manchester. I actually learned to code in Manchester, or more specifically Hulme, in the south of the city, some years ago. I had been a regular member of NWRUG over half a decade ago, and my first talk about technology was actually at an event in the same venue, MadLab. So this was very much the "home town gig" of my tour.

I was not disappointed by the local community where I'd started out. NWRUG was a friendly bunch, relaxed, and very willing to laugh with me. The lighter, comic opening to my talk was very well received, and this friendly, approachable atmosphere extended to the pub afterwards (no quiz this time).

-----------------------

So, this whole thing took place at ridiculously short notice. I was called upon to write a talk with only two weeks' notice, in and amongst a product release, so it wasn't the easiest talk preparation I've done so far.

However, and I really do need to be clear about this, the week has been a very positive experience for me. I've had the chance to share a bit of my experience of coding and learning to code, catch up with some of the Ruby community further from where I live, and get some first-hand experience of doing my job out of the office.

This has all been made possible with the help of some members of the UK Ruby community. I've already mentioned Eve, but also her co-organiser James. The venue for ShRUG, and a great supporter of that group, was the Union Street co-working space; that event, and my time at the Sky Technology offices, was sponsored by Sky Technology, who, by the way, are hiring. Thursday's talk at NWRUG was at MadLab, arranged by Tekin, who helped make it all happen, and sponsored by Createk, who are also hiring. Thanks so much to these people and organisations for helping me along on a truly unique week of experiences.

Any questions? Comments? Feel free to catch up with me on Twitter.

Tuesday, 15 March 2016

Bath Ruby 2016 - a retrospective

So, last week, I headed out to the Bath Ruby Conference in, um, Bath. It's the second year in a row I've attended this event, and it's been well worth a visit both times, combining some top quality talks, a beautiful venue, and the all-important community aspect.

The more conferences I visit, the more value I find in chatting to people, new and known, who share the experience of writing code and who work in the same community. Now, this part of tech events can be highly individual, coloured by who you happen to speak to, the time they're having and all kinds of social oddities (I'm Autistic, remember; people can be baffling to me sometimes), but event organisers can do a lot to facilitate this mingling. So how did Bath measure up in the 'corridor track'?

The venue was a really nice social space, comprising the main conference hall, seating around 500 people, a tea room, and, between the two, the grand octagon, where the sponsors were set out. (Freebies galore! Huzzah!)

It was also a very beautiful reconstructed Georgian building, somewhat grander than the Ruby community are accustomed to, and honestly fantastic surroundings for the day.

The schedule was resplendent with breaks: each talk (and each set of lightning talks) was followed by one, with coffee and snacks provided in another hall. Each time the room stood up to stretch their many legs, I met someone different and had a different conversation. This worked well for the social aspect.



On to the talks. We started off the day with Xavier Riley running us through Sonic Pi. I already know a little bit about Sonic Pi; as some of my previous posts will attest, I've talked about it rather a lot here. It's always good to learn from actual people, tho, and Xavier took it right from the beginning, the first part of the talk just taking us through the basics of the API and making music with computers.

After this section, it turned into a kind of whirlwind tour of some of the more impressive things that Sonic Pi is capable of, including a few crowd pleasers, such as the Mario theme music. It was difficult to follow the actual code at this stage, tho, and my only criticism of the talk is that this section leaned a little too heavily on that pattern.


It was good to see what it can do, tho, and the last part of the talk was some good, solid live coding, well pitched to go just over the audience's heads, as it were. One problem with live coding is that, to produce something impressive under pressure, people write code which isn't readable, partly defeating the object of showing people code as it comes together. This was built to be readable, but still sound good, encouraging people to read along.

As the talk finished, there was a break, then a set of lightning talks: open slots where people can have a go at speaking in front of an assembled audience for a short time. For people like me who enjoy public speaking, lightning talks are a great opportunity to try out ideas and keep in practice. Bath had three sets of lightning talks this year, providing ample opportunities and allowing about 15 people to try their hand. It was a good mix of experienced speakers, first-timers and in-betweeners like me, but more about that later.


Next up was Coraline Ada Ehmke, talking about the graph-based database Neo4j. Most of the databases which web developers work with are what's known as "relational databases", which are built around a firmly structured set of data. This graph database appears to be built around graph theory, which focuses on how entities (known as 'nodes') are associated with one another.

We learned how a graph database treats these associations, known as edges, as first-class citizens. I got a little lost in the details here, but the practical upshot is that it performs far better when working through multiple links. This makes things like semantic analysis and understanding networks much quicker and easier to do.

As I mentioned, most of the work web developers do is with relational databases, but it was good to be introduced to a completely different way of solving some of the same problems. We never know when an idea we've only met in passing might turn out to be the best tool to fix a new problem.


Next up, Courtney Ervin was talking about Open Source Software, and particularly how we can approach contributing to the open source ecosystem. She touched on a number of issues it throws up, mostly based around the fact that most programmers learn in and around Open Source projects, and feel obliged to contribute back into that community once they're able to.

Of course, the point is that not everyone can work for free as well as working for money, and we need to manage our time. Courtney told us that there are many ways to contribute to the community besides writing actual code, and that there are many possible benefits to contributing besides altruism. The main take-away is that we need to balance these benefits against the negative effects of overwork and under-appreciation.

She put the issues out there, and made it very clear that we're not actually obliged to contribute when it's having a negative effect on us; it has to work both ways.

Another talk, another break, another set of lightning talks. There was one speaker in this set who had specific significance to me, chatting about code review and repair in the guise of a wonky limerick.



but let's not dwell on him too long...

Then we broke up for lunch. The conference wasn't able to provide lunch, so instead there was what must have been a slightly terrifying tide of people heading into Bath, where there's a fantastic array of places to eat and historic buildings to see.



Next up, Janet Crawford, a neuroscientist from Stanford University, was talking about unconscious gender bias. This is becoming a very familiar message at technology events, but the sad truth of the matter is that the technology industry (as well as engineering and science) is still male-dominated, and it doesn't need to be that way. It's a subject that needs exploring at these events, something that we should all be aware of.

Janet's talk focused on the ways the human brain forms shortcuts, and how this helps us to understand the world around us. It's essentially a good bit of brain design, but it does mean we compartmentalise people in the same way, so when we see that technology is mostly male-dominated, we make associations between men and coders. The trouble is that it follows that we assume only men can code.

I have met a number of talented female coders who prove that this simply isn't true.

However, with an industry which is mostly male, and which sadly assumes this is because of innate ability, it's hard for women to get into the industry. It's a vicious circle.

Janet stated this case, along with some history of this kind of bias and the neurological basis for it, in a concise, thorough and, importantly, compassionate way. Familiar, but well delivered; a valuable perspective on one of the biggest problems in the technology industry.


Next up was Zach Holman. He's spoken at a number of events I've been to, and he's always been billed as 'the guy from GitHub'. Then, last year, he was fired from that organisation, and this talk was about being fired from GitHub.

This is a topic which is rarely spoken about at tech conferences, which is, in retrospect, surprising. There are often talks about finding work, hiring coders, career progression and starting a business, but the termination of a job is rarely mentioned. It's hard to tell whether it's a taboo subject or not, but it's understandable that people don't like to talk about having been fired.

Zach, in my experience, has always been an entertaining speaker, and this talk was no exception, with humour injected into each part of the narrative. Even so, it was clear that the experience of being fired was still pretty fresh and raw; the talk showed a fair amount of vulnerability and introspection.

It was also very well rounded, all things considered, looking at the perspective of someone being fired, but also colleagues and employers.

The only real issue I found here, and Zach did warn us, is that it was rooted, unsurprisingly, in American employment law. It's difficult to tell just how much of this would apply to a similar situation in the UK.



The keynote for the day was Ruby core team member and former TechJAM panelist, Aaron Patterson, talking about how method calls are constructed in Ruby.

Aaron typically spends a fair portion of his talks being funny, and this was no exception, falling back on a favoured topic: the (somewhat exaggerated) transatlantic culture gap. We heard about freedom, waffles and a fairly famous cat.

Down to technical details, and this year's most technical talk went well down the rabbit-hole of code. We looked deep inside the internals of the Ruby language, and at how it works at a low level.

I got very lost in the details here; this is a level at which we rarely have to work, so most coders don't have much of a framework for understanding it. However, again, it's good to be introduced to things related to the work you do, even if you never use that knowledge.


The whole thing was followed by an after party, which was good fun. The sponsors laid on some nice food and a free bar, and most of the conference hung around to eat, drink, chat and enjoy ourselves. A great, standard conference after party, with an added bit of video jockeying from Ruth John, another TechJam panelist, who was in her element mixing audio-reactive video with clips from '80s cartoons.


So, overall, a fun, social, interesting day of talks and chatter; a great event for techies to go and directly engage with the wider Ruby community. Great work from Simon and the rest of the team for making it happen for a second year running.

--------------

Addendum: Apart from my little selfie snap at the start, the photography for this post was taken by Adam Butler who also presented a lightning talk. You can see his Bath album, with many more photos from the day, here.

Wednesday, 17 February 2016

Sail with me into the Pi (part 3)

We're talking about Sonic Pi, and we're talking about Sail by AWOLNATION. In part 2 we put together a second part, a pattern for playing separate parts together using the `in_thread` method, and we looked at reducing repetition by defining methods for code that's used more than once.

Part three is going to be a lot like part two, we'll take another step in building the music up and do a bit more abstraction.

First things first, we're going to add one more to our collection of synthesisers. It looks like this:
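Something like this; the exact synth for the new riff part is a guess here (:pluck gives a short, plucked-string sound that suits what's coming):

@vamp_synth = :tri    # from part 1
@bass_synth = :saw    # from part 2
@riff_synth = :pluck  # new; the name and the choice of synth are guesses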


Actually, there's a bit of repetition here: a collection of variables which all end in '_synth'. You could even call it a collection of synths. There's a data structure in Ruby for a labelled collection of values: it's called a hash. For our synths, it looks a bit like this:
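Keeping the same guess for the riff synth as above:

@synths = {
  vamp: :tri,
  bass: :saw,
  riff: :pluck
}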


We can get one of these values out of our hash with square brackets. It looks like this:
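For example, to get the vamp synth back out:

@synths[:vamp]
# => :tri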


This is another example of a refactor which, as you may remember from part 1, is a change to our code which doesn't change what it does, but changes something about how it works. But this is only the start of the refactor: we've changed how we structure our data, so now we need to change the uses of that data. Wherever we see `@vamp_synth` we need to change it to `@synths[:vamp]`. Fortunately, we've reduced enough repetition in our code that there are only two places we need to do that:


How do we know that our refactor was successful? Simple: we run the code and check that it does the same thing. Fortunately Sonic Pi makes it quite enjoyable to do this; we just tap alt + r and listen. It sounds the same, so now we can move on.

The new part we're going to introduce is a riff; if you listen to the track, it's played on a plucked cello. It comes in immediately after those bass hits are finished, so there'll be a sleep of 2 beats to coincide with that. You'll also notice that it's often repeating the same note three times in a row, three times in one beat.

We can decide from this that we need two methods before we start worrying about what the actual notes are. We need one method to play one of these plucking notes, and one to pluck the same note three times in a single beat. Here they are:
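In text form, they look roughly like this (the line numbers mentioned below refer to where these sit in my full buffer, so they won't line up with this snippet):

def riff_pluck(note, length)
  use_synth @synths[:riff]
  play note, attack: 0, decay: 0, sustain: 0, release: 1.triplet
  sleep length
end

def riff_triplet(note)
  3.times { riff_pluck(note, 1.triplet) }
end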


Let's see: I'm defining the plucking method on line 35. I'm grabbing the :riff synth from our synths hash and changing to that synth (line 36), then I'm playing the note it's been given on line 37. You'll notice that I've set attack, decay and sustain to 0, skipping the first three parts of the ADSR curve and going right to the release. The note will be short: one triplet. Then we'll wait for the length that's been passed in before proceeding (line 38).

Line 41 defines our triplet method. We know it should be playing a note three times, and you can see that the code says `3.times`; that's pretty self-explanatory, but what about the next part?

Next, on line 42, we see some curly brackets. This is another way to define a block in Ruby: where we saw do and end earlier on, this is doing the same thing, so the code between { and } will be run 3 times. Inside the block we're calling riff_pluck, which we've just defined, passing in the note that's been given to riff_triplet and saying to wait 1 triplet before the next pluck.

So now we can start to write it. The riffs to go along with the two parts we've already written, in full, look like this:



There's a lot of that; I can't even fit it all on one screen (thus, I can't fit it all in one screen shot). You can probably see that I've already used all of the techniques we've discussed so far to shorten the code. Each use of the riff_triplet method saves us two more lines, and I've used the `times` method again to repeat those two bars that don't change.

Oh, and it sounds like this:


I can't cut this down any more, as we need each line we see, but I can tidy it up a bit and make it easier to navigate. The obvious place to split this sequence up is wherever we see that sleep for 2 beats. We're going to make each of these parts a method of its own, named after the first note in it. Each part of this is what's known as a riff: a pattern based on the notes of a chord. So, splitting that sequence up at each sleep, it looks like this:


And now that we've put each riff away in its own method, the sequence looks like this (I've put it away in a _loop method like the other two parts):


This looks a lot tidier. We haven't got rid of that code, but we have put it away in methods which describe what they are, and the sound is, of course, exactly the same.

Now, we learned in part 2 what we need to get more than one part to play together, and at the right time: we call the methods on successive lines, the parts sleep through the sections where they're not playing, and each part is wrapped in an in_thread loop. We've met all of these requirements, so we can add it to our existing code like so:
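Assuming the riff's _loop method is called riff_loop, the three parts just get set off on successive lines (each _loop method wraps its own body in in_thread, so none of them holds up the others):

vamp_loop
bass_loop
riff_loop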


And it sounds like this:


We're starting to get the feel of the song now. You can grab the full code at this stage here.

Next time we're going to look into a bit more of Sonic Pi: how to play samples (sound files), how to play them differently, and how to play some drums.

Thanks for reading, as ever, feel free to say hi on Twitter at @MarmiteJunction.

Friday, 12 February 2016

Sail with me into the Pi (part 2)

We're still talking about Sail by AWOLNATION, and we're still talking about Sonic Pi. If you haven't read it yet, you might want to take a look at part 1.

In part 1 we wrote a method to collapse the code to play the opening vamp from Sail. So what's next? Well, if you listen to the song, you'll hear a second sound coming in at about the 17-second mark. It's big, low-pitched and declarative. This sound cuts in, shouting about its presence and, here's the clever part, it spends more than half of its time silent. It's overpowering for the 2 beats it plays at a time, but then leaves space for the other sounds.

So how do we make it? Well, we start with some analysis. The notes each time are what's known to musicians as an open fifth, or to rock guitarists as a power chord. We know it plays for two beats at a time, and we know it's a rough sound. So first, let's define a synth, at the top where we defined the vamp synth.
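That's a single line:

@bass_synth = :saw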


I've chosen a saw wave because it sounds rough, powerful and a tiny bit like a distorted guitar. Next, we need to apply the lesson from part 1 and write a method to do what the bass part will do: namely, use the @bass_synth variable, construct a power chord, and play it for 2 beats. Let's have a look at the code:
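In text form, it's roughly this (the line numbers discussed below refer to where it sits in my full buffer, and the octave of the E flat in the last line is a guess):

def bass_hit(root)
  use_synth @bass_synth
  root_note = note(root)
  notes = [root_note, root_note + 7, root_note + 12]
  play notes, sustain: 2.beats, release: 2.triplets
  sleep 2.beats
end

bass_hit :eb2   # the usage down on line 33: a low E flat as the root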


Remember from part 1: def means we are defining a method, bass_hit is the name of the method, and it has one argument, named root.

On line 25, we are telling the 'play' method to use the variable we named @bass_synth to choose what kind of sound to make.

Line 26 is using the method called 'note' to convert from a named note to the MIDI note number for that pitch, and setting that to a variable named root_note. I've discussed MIDI notes in earlier posts, but the important thing to know here is that it's a number, and each whole number represents one note on a keyboard.

On line 27 we are building an array, or list of notes. The first is the root_note we set on line 26. The second is 7 notes higher, otherwise known as a fifth. Remember that a power chord is also called an open fifth? This is why. The third one is 12 notes higher, what's known as an octave. Basically, the point where it's the same note, but higher up. We've built the notes for a power chord, based on whichever note we pass in as the 'root' argument.

Line 28 plays the array of notes we just put together. Just like in part 1, we're going to set some of the ADSR arguments: we've told it to sustain the note for 2 beats, then fade out pretty quickly, in 2 triplets.

Line 29 will wait for those 2 beats for the note to end before proceeding.

Down on line 33, we're using it. I'm passing in a low E flat as the root, and we hear a big, low pitched power chord. It sounds like this...


We got there quicker this time; all we need to do now is write the rest of the chord sequence for the bass hits.
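As a rough sketch of its shape (these aren't the exact roots and gaps from the song, just the hit-then-space pattern):

bass_hit :eb2   # plays for 2 beats (the sleep is inside bass_hit)
sleep 6.beats   # then leave space before the next hit
bass_hit :eb2
sleep 6.beats
bass_hit :db2
sleep 6.beats
bass_hit :gb2
sleep 6.beats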


It works: we're leaving all that lovely empty space using the sleep method. And if we have a listen, we hear the chord sequence, marked out in these big power chords.



Now, the next step is obvious, surely? We know these bass hits, and the vamp from part 1 are two parts of the same song, and they will play together. So the first thought might be to just put both bits of code down on the same sheet and listen, right?


(You might have noticed, if you're sharp-eyed, that I've taken out the first vamp method call. More on why I did that later.)


Now, this doesn't work the way you might think. We'll hear the whole sequence of bass_hits, then the entire sequence of vamps. Fortunately, Sonic Pi has a simple way to handle that, something called a thread.

Now, a thread is like another program, started by the program we're writing. The difference is that our program won't wait for the code in the thread to finish before proceeding to the next line. We can put each of these two parts into a thread of its own with in_thread.
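The shape of it is something like this (only the start of each sequence is shown here):

in_thread do
  # ...the whole bass_hit sequence goes in here...
  bass_hit :eb2
end

in_thread do
  # ...and the whole vamp sequence goes in here...
  vamp([:eb4, :gb4, :eb3], 6)
end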


So now when we press alt + r to run our code, we hear the two together. They sound a bit like this.


So we've got to the end of this section. Well, almost: I'm going to wrap each of the two parts in a method of its own. It won't take this code away, or make it shorter, but it will hide it away a little, so we can concentrate on the sequence of the song a bit more.
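Something like this, with the in_thread sitting inside each new method so that calling it doesn't hold up the rest of the code (again, only the start of each sequence is shown):

def vamp_loop
  in_thread do
    vamp([:eb4, :gb4, :eb3], 6)
    # ...the rest of the vamp sequence...
  end
end

def bass_loop
  in_thread do
    bass_hit :eb2
    # ...the rest of the bass sequence...
  end
end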



Okay, so let's look back at the song again for a moment. It starts with the vamp, which goes round its sequence once, then it's joined by the bass.

So all we need to do is set off the vamp_loop method, wait 8 bars for it to finish, then set off both the bass loop and the vamp loop. Oh, and we're going to put that first 'vamp' call back, which just sits before the repeating part. It looks like this:
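Roughly like this:

vamp([:Eb4, :Ab4], 2)   # the one-off vamp before the repeating part
vamp_loop
sleep 8.bars            # the vamp's sequence is 8 bars long
bass_loop
vamp_loop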


And sounds like this:


If you'd like to try it yourself, you can see the code to-date here.

Well, that's it for today. We're starting to hear a bit of that slow build-up, but there are more parts to add, more song to play, and we're going to look at more ways of making this code easier to read, understand and change.

As ever, feel free to say hi, I'm known as @MarmiteJunction on twitter.

Thursday, 11 February 2016

Sail with me into the Pi (part 1)

Prerequisite: This is a tutorial about Sonic Pi, a live-coding music platform using the Ruby programming language. If you want to code along, first go to the Sonic Pi Website and follow their instructions to download it.

Okay, a bit of background. From time to time I come across a song which becomes an 'earworm', getting stuck in my head and struggling to get out. One such song is Sail by AWOLNATION. I've sat and learned it on the guitar, beat-boxed that slow, immersive groove, sequenced it on my Network Ensemble project, and last Friday it was the turn of Sonic Pi.

Now, Sonic Pi, in short, is awesome. At a lower level it's pretty familiar to me: it works in the same way as Network Ensemble and Text to Music. When running code, Sonic Pi pushes messages to a sound engine (theirs is written in SuperCollider) via a network port. The exciting part is that, where I've written messages as and when I need them, they've made a very nice Ruby API for sound control, including a number of synthesisers, effects and samples, all ready for people to dive right into using.

The really cool part is that Sonic Pi is being used in schools, using music as instant feedback for young people learning about code. Personally, I know that if I'd had this kind of introduction to code earlier on, it would have been a strong start in programming, and brought me much earlier into the trade that has become my life.

Okay, enough background. I used Sonic Pi to create a version of Sail, and I'd like to show you how I did it. The first thing I did was add something to the language: some methods which I'd like to have available.

Yes, I know I'm starting with something pretty complicated, but trust me, this will make the rest much easier to understand. Here's the code:

module NumberMethods
  def beat; self; end
  def beats; self; end
  def triplet; self.to_f / 3; end
  def triplets; self.to_f / 3; end
  def bar; self * 4; end
  def bars; self * 4; end
end
::Numeric.send(:include, NumberMethods)

So what's happening here is that I'm creating a Ruby module containing some methods, then I'm including that module in the Numeric class. This means that I now have methods named beat, triplet, bar etc. whenever I use a number in my code. It's fairly simple code: beat and beats return the number itself, unchanged, triplet and triplets return a third of that number, and bar and bars return 4 times the number.

This is because all of the time operations in Sonic Pi are based on beats. So while `sleep 1` in plain Ruby will wait for 1 second before continuing, in Sonic Pi it will wait for one beat, an amount of time defined by the Beats Per Minute (or BPM). You can use the method `use_bpm` to change the speed, so `sleep 1` will wait for 1 second if you first call `use_bpm 60`, or for half a second if you first call `use_bpm 120`. The song Sail has 4 beats in the bar, and divides its beats into 3, known in music as a triplet, so that's why I've given myself these methods.

Just to prove that this works, I can write some code using my new methods and the Ruby `puts` method, then run it (alt + r) to see that the expected changes have been made to those numbers.
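For example, something like:

puts 1.beat      # prints 1
puts 1.bar       # prints 4
puts 2.triplets  # prints 0.6666666666666666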


In the log (on the right of the Sonic Pi window) we can see that 1.beat is, as expected, 1, 1.bar is 4 beats (the number 4), and 2.triplets is two thirds (0.66666666...). This means that instead of worrying about our note values, or repeatedly writing the code to divide 1 by 3, we can use these names, known to most musicians, in our code. By making this small change to the way numbers work in Ruby, we have made our code more Domain Specific, meaning we can write our code in a way that's more like the way we would talk to people who have some knowledge in the domain of music.

Okay, we've named our note values; let's have a little look at the notes. At the start of the song, we hear what's sometimes called a vamp: the current chord is played in a way which sets out the rhythmic feel of the piece.

If we analyse this a little further, we hear that it's playing a chord (a collection of notes, played together) on each beat, for the first 2 thirds of each beat, laying out the 'triplet' feel to the song. So what I really want to make the music do is play a combination of notes a given number of times, with this triplet feel.

Once you've got used to writing code, you can usually write what might be described as Speculative Code or, as I like to think of it, Fantasy Code. We write the code we would like to work, then see if we can make it work. We can imagine a line of code which says "vamp Eb and Ab for 2 beats". It would look a little bit like this:

vamp([:Eb4, :Ab4], 2)

And there we have the first two beats of the song, right? Well, not quite. There's no method called 'vamp' in Sonic Pi, so we'll have to write one. It looks like this:
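In text form, it's roughly this (the line numbers discussed below refer to where it sits in my full buffer):

def vamp(notes, length)
  use_synth :tri
  length.times do
    play notes, attack: 0.01, sustain: 1.triplet, release: 1.triplet
    sleep 1.beat
  end
end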


Let's break this down a little. Line 12 uses the def keyword, so we are defining a method. This means that when we use the word 'vamp' in future, Ruby will look to this definition to decide what to do. Inside the brackets we are naming two arguments, notes and length, which will be expected when we call the `vamp` method, to tell it how, exactly, to vamp on that occasion.

Line 13 tells Sonic Pi to use the synthesizer named :tri, so when we call `play` on line 15, it will play those notes on the :tri synthesiser.

Line 14 uses the do keyword to start a block, so whatever number is passed in for the 'length' argument, it will run the code between do and the end on line 17 that many times.

Line 15 shows us the first bit of Sonic Pi sound making. The `play` method will play the notes we give it on a synthesiser. We're just passing in the notes that have been set as the 'notes' argument, then there's some more named arguments for the `play` method.

These all belong to the volume envelope, commonly described as an ADSR envelope. We don't want it to take any noticeable time to increase the volume of this note from silence, so I've set the attack argument to 0.01, a negligible amount. Now, I want this vamp to play for the first 2 triplets of each beat, so I'm setting sustain to 1 triplet (it plays at full volume for the first triplet) and release to the next triplet (it fades out over the second), leaving the third triplet silent. This is a strong way to introduce that triplet 'feel' to the song. Notice I've used the `triplet` method we introduced earlier, instead of dividing 1 by 3 repeatedly.

An important thing to remember about the `play` method is that it will not pause your code until the sound has finished. If we were to call `play` again immediately afterwards, the second note would play at the same time as the first. For this reason, we need to tell it to wait for 1 beat before playing again. This is what the code on line 16 says.

Line 18 uses the end keyword to finish defining the `vamp` method. We're done.

Now to use it, on line 20 I set the beats per minute to 120, simply deciding how fast I want this to be played.

Then, on line 21, we've got the fantasy code I wrote earlier, hoping we could make it work, and now it does. A quick alt + r and success! It sounds a little like the first two notes of the song.


Okay, so that seems like pretty hard work for one chord, played twice. But that's the beauty of it: we've written a method that we can use to play this vamp on any chord for any amount of time. So we can drop in the rest of the opening sequence with a few more calls to the vamp method.
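They're the same calls that appear in the full listing at the end of this post:

vamp([:Eb4, :Ab4], 2)
vamp([:eb4, :gb4, :eb3], 6)
vamp([:eb4, :ab4, :eb3], 2)
vamp([:eb4, :gb4, :eb3], 6)
vamp([:eb4, :ab4, :eb3], 2)
vamp([:db4, :f4,  :db3], 8)
vamp([:gb4, :db5, :gb3], 4)
vamp([:db4, :f4,  :db3], 4)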


So the vamp method has paid off: we've said "play this for this amount of time" 8 times instead of re-writing the specific instructions on lines 15 and 16 over and over again. Without moving the repeated code into the `vamp` method, the instructions in these 8 lines would take 34 lines of code, almost all of it identical. With all that near-identical code it is very difficult to read it and work out what, exactly, it is going to do. So we've made it shorter and easier to read.

It sounds like this:

We've got the introduction, the first 8 bars, sounding OK. There are just two little changes I'm going to make. Firstly, the synthesiser the vamp is played on is hard-coded inside the `vamp` method, so when this grows bigger we'll have to find the method in order to change that information. So I'm going to define it in a variable, earlier on.
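The change looks like this (the same as in the full listing at the end, just without the reverb we're about to add):

@vamp_synth = :tri

def vamp(notes, length)
  use_synth @vamp_synth   # was: use_synth :tri
  length.times do
    play notes, attack: 0.01, sustain: 1.triplet, release: 1.triplet
    sleep 1.beat
  end
end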


This is a change to the code which won't change what it actually does, known as a refactor. We've done exactly the same thing, slightly differently. This does open up a few possibilities for future changes, however. For one, when there's a lot more code, we can define all of the synths up at the top, so we could change what kind of sound each part will make without having to first dig up the method it's using. The second is that we could change the synth dynamically, in another part of the code, but more on that later.

The second change I'd like to make is one to the way it sounds. We've added our vamp, and it sounds ok, but we can make it sound better. We're going to use an effect. We do this by putting our play method inside a `with_fx` block, like so:
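Roughly like this (we'll add some arguments to the reverb in a moment):

def vamp(notes, length)
  use_synth @vamp_synth
  length.times do
    with_fx :reverb do
      play notes, attack: 0.01, sustain: 1.triplet, release: 1.triplet
      sleep 1.beat
    end
  end
end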


Now, Sonic Pi provides some very clear and full help for all of its effects. You just click Help at the top-right of the Sonic Pi window.

Then select the Fx tab, at the bottom, then select the effect we're using, in this case, reverb.


So, armed with this information, we can add a few arguments to change the amount of reverb that'll be applied to our vamp:


That's it for today; it sounds good. Try it yourself. Here's all the code in this tutorial; you can just copy and paste it into Sonic Pi (gotcha: Sonic Pi uses alt + v to paste, not ctrl + v).


module NumberMethods
  def beat; self; end
  def beats; self; end
  def triplet; self.to_f / 3; end
  def triplets; self.to_f / 3; end
  def bar; self * 4; end
  def bars; self * 4; end
end
::Numeric.send(:include, NumberMethods)

@vamp_synth = :tri

def vamp(notes, length)
  use_synth @vamp_synth
  length.times do
    with_fx :reverb, damp: 0.8, room: 0.4 do
      play notes, attack: 0.01, sustain: 1.triplets, release: 1.triplet
      sleep 1.beat
    end
  end
end

use_bpm 120
vamp([:Eb4, :Ab4], 2)
vamp([:eb4, :gb4, :eb3], 6)
vamp([:eb4, :ab4, :eb3], 2)
vamp([:eb4, :gb4, :eb3], 6)
vamp([:eb4, :ab4, :eb3], 2)
vamp([:db4, :f4,  :db3], 8)
vamp([:gb4, :db5, :gb3], 4)
vamp([:db4, :f4,  :db3], 4)


I'm done for now, but keep an eye on this site, there'll be more of this tutorial. Next time, we're going to look at some of those heavy bass sounds and get the two parts playing together.

Thanks for reading, feel free to get in touch on twitter, where I'm called @MarmiteJunction, or drop a comment here. I'm always up for chatting code.

Sunday, 23 August 2015

Reading USB controllers in Ruby (or What to do when you don't know what to do)

Disclaimer: I'm writing a blog post on this subject because I couldn't find a more useful tutorial online. The truth is that I have twice worked out how to read a USB device in Ruby, but I do not have a good understanding of the USB standard. I am sharing both how I reached a working code-base and the code I wrote, but there will be people out there who understand this better. If you're one of them, please write a simpler tutorial so this one won't be needed any more.

Our story starts at the Brighton Ruby conference in 2015, where I presented a technology-themed version of the popular panel game Just A Minute. I had put together a simple system in Pure Data to keep track of the scores, topics and the timer. Long story short, this system let me down: it crashed half way through the session and I had to re-construct the scores on the fly.

In retrospect, I found that Pure Data was not the right tool for the job, so I set about rebuilding the same system in Ruby, with the Gosu library.

One of the reasons I had chosen Pd in the first place was the simplicity of using USB HID devices (you can learn about that in my earlier tutorial). So, half way through this process, I ran up against the fact that it is not quite so simple in Ruby.

Firstly, I started with libusb, a standard library for accessing USB devices, via its Ruby gem.

I'm using the controllers from the trivia game Buzz, which look like they should be a particularly simple USB device: no output, no continuous controllers, just 20 buttons. Should be simple, right?

The first order of business was to find out what the USB device was. There's none of Pd's index-based HID identifiers; instead I had to use the Linux command 'lsusb' to find them. The output from this includes my laptop's keyboard and mouse, as well as what appear to be some internal USB hubs. It looks a little like this:

ajfaraday@ajf-samsung:~$ lsusb
Bus 002 Device 005: ID 0cf3:3004 Atheros Communications, Inc. 
Bus 002 Device 003: ID 054c:0002 Sony Corp. Standard HUB
Bus 002 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 003: ID 2232:1029  
Bus 001 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub

To this day, I have no idea what bus 1, device 3 is.

Okay, so we can't see the word Buzz, or PlayStation, or HID. The only clue is that one device is made by Sony. The only way I can see to confirm this is to unplug the USB device and re-run lsusb. Sure enough, the Sony Corp. Standard HUB no longer appears, so I know this is the device I'm looking for.

Bus 002 Device 006: ID 054c:0002 Sony Corp. Standard HUB

It looks like this isn't enough information to actually use it, tho. I'm going to need more. The help page for lsusb tells us we can use the -s flag to choose a specific device, and the -v flag to get verbose information.

ajfaraday@ajf-samsung:~$ lsusb -h
Usage: lsusb [options]...
List USB devices
  -v, --verbose
      Increase verbosity (show descriptors)
  -s [[bus]:][devnum]
      Show only devices with specified device and/or
      bus numbers (in decimal)

So we need to get the bus and devnum (device) from our previous lsusb command. Here's the full information for my Buzz controllers.
ajfaraday@ajf-samsung:~$ lsusb -s 002:006 -v

Bus 002 Device 006: ID 054c:0002 Sony Corp. Standard HUB
Couldn't open device, some information will be missing
Device Descriptor:
  bLength                18
  bDescriptorType         1
  bcdUSB               2.00
  bDeviceClass            0 (Defined at Interface level)
  bDeviceSubClass         0 
  bDeviceProtocol         0 
  bMaxPacketSize0         8
  idVendor           0x054c Sony Corp.
  idProduct          0x0002 Standard HUB
  bcdDevice           11.01
  iManufacturer           3 
  iProduct                1 
  iSerial                 0 
  bNumConfigurations      1
  Configuration Descriptor:
    bLength                 9
    bDescriptorType         2
    wTotalLength           34
    bNumInterfaces          1
    bConfigurationValue     1
    iConfiguration          0 
    bmAttributes         0x80
      (Bus Powered)
    MaxPower              100mA
    Interface Descriptor:
      bLength                 9
      bDescriptorType         4
      bInterfaceNumber        0
      bAlternateSetting       0
      bNumEndpoints           1
      bInterfaceClass         3 Human Interface Device
      bInterfaceSubClass      0 No Subclass
      bInterfaceProtocol      0 None
      iInterface              0 
        HID Device Descriptor:
          bLength                 9
          bDescriptorType        33
          bcdHID               1.11
          bCountryCode           33 US
          bNumDescriptors         1
          bDescriptorType        34 Report
          wDescriptorLength      78
         Report Descriptors: 
           ** UNAVAILABLE **
      Endpoint Descriptor:
        bLength                 7
        bDescriptorType         5
        bEndpointAddress     0x81  EP 1 IN
        bmAttributes            3
          Transfer Type            Interrupt
          Synch Type               None
          Usage Type               Data
        wMaxPacketSize     0x0008  1x 8 bytes
        bInterval              10

Okay, there's a load of extra information here that we don't need. To cut it down a bit: we're going to need idVendor, idProduct and the Endpoint Descriptor.

Back to libusb: the README for the library quickly gives me a bit of example code, which I've changed to include the information for my Buzz controllers.
require "libusb"

usb = LIBUSB::Context.new
device = usb.devices(:idVendor => 0x054c, :idProduct => 0x0002).first
device.open_interface(0) do |handle|
  handle.control_transfer(:bmRequestType => 0x40, :bRequest => 0xa0, :wValue => 0xe600, :wIndex => 0x0000, :dataOut => 1.chr)
end

But as soon as I try running this code I see this error.
/var/lib/gems/1.9.1/gems/libusb-0.5.0/lib/libusb/constants.rb:62:in `raise_error': 
LIBUSB::ERROR_BUSY in libusb_claim_interface (LIBUSB::ERROR_BUSY)

The stack trace points us at line 5, 'device.open_interface'. I did a lot of googling on this subject, and got a lot of confusing answers. I'm skipping a lot of trial and error here, but the answer seems to be that something else, possibly the operating system, is already reading the USB port. So to use it with my Ruby app, I need to use detach_kernel_driver, which looks a little like this:
  def reset_device_access
    usb_context = LIBUSB::Context.new
    device = usb_context.devices(
      idVendor: 0x054c, idProduct: 0x0002
    ).first
    handle = device.open
    handle.detach_kernel_driver(0)
    handle.close
  rescue => er
    puts er.message
    # nothing needs doing here
  end

There are a couple of things here. Firstly, I'm finding the USB context, then finding the device, as in the example. I then need to open a device handle, run detach_kernel_driver, then close the handle. Only, if there is no kernel driver to detach, this throws an error. The rescue block here is something of a hack: I can't find a way to detect whether I need to run detach_kernel_driver or not, so I simply catch the error but don't halt the code. If it throws an error, then it didn't need to be done. This shows my ignorance of the deeper, darker parts of libusb, but it works.

I don't like this, but sometimes this is something we need to do, pending a better understanding of what we're working with.

So, I can actually get to my device; that's a start. However, I wanted to use my device in and amongst some other code, so I didn't want it to be limited to a single code block. I had to go digging around in the libusb API documentation, and found that instead of using 'open_interface', which opens the interface, uses it within a block, and then closes it at the end of the block, I could instead use the 'open' method. My new code looks a bit like this:

require 'libusb'

class Controller
  def initialize
    @usb_context = LIBUSB::Context.new
    # Find the Buzz controllers by their vendor and product ids
    @device = @usb_context.devices(
      idVendor: 0x054c, idProduct: 0x0002
    ).first
    reset_device_access # the method defined above also lives in this class
    @handle = @device.open
  end
end

So, I've wrapped it in a class, run my reset method to ensure the device is free, and saved the device handle as an instance variable, @handle. The next thing we need is a method to read that handle. For this we need the endpoint data from back in our lsusb output, specifically the attributes named bEndpointAddress, bLength and bInterval. Again, after a lot of trial and error, I found it easier to use interrupt_transfer, which is specific to input endpoints (the information coming back from a USB device). This method also goes in the Controller class:

  def raw_data
    data = @handle.interrupt_transfer(
      :endpoint => 0x81,
      :dataIn => 0x0005,
      :timeout => 10
    )
    data.bytes
  end
Again, I had a great deal of difficulty reading the result of interrupt_transfer; the code above shows where I wound up. The data variable is a string, but it's not human readable. Printed straight out, it looks something like this:

(several lines of mostly unprintable characters, with the occasional stray '0' or space)
Each line of that output is a single return value from the raw_data method while I'm pushing some buttons on the Buzz controllers. There's no understanding this. A lot of trial and error later, I discovered that this is actually an array of numbers, each one a byte (a number made up of 8 binary bits, which translates to a value between 0 and 255). So, with the method defined above, I wrote a script to watch what happens when I push the buttons...
require 'libusb'
require './controller.rb'

c = Controller.new

loop do
  begin 
    puts c.raw_data.inspect
  rescue
  end
  sleep 0.01 
end  

Oops, there's another one of those unhandled rescue blocks. This is bad practice! Usually you would either stop the program completely on an error, or guard earlier in the code against the situations which will cause an error to be thrown. Only I can't find out whether it'll work or not without just trying it, so this will have to do for now.
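If you'd like something slightly less blunt than catching absolutely everything, the errors in question turn out to be transfer timeouts (more on that in a moment), and the gem seems to raise them as LIBUSB::ERROR_TIMEOUT. Assuming that's right for your version of the gem, an untested sketch of a more targeted rescue would be:

require 'libusb'
require './controller.rb'

c = Controller.new

loop do
  begin
    puts c.raw_data.inspect
  rescue LIBUSB::ERROR_TIMEOUT
    # nothing was pushed during this polling window, so there's nothing to print
    # (the exception class name is an assumption based on the error messages I've seen)
  end
  sleep 0.01
end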

What I've found by probing in this way is that when a button is pushed between one call of the raw_data method and the next, there is no error, but when no buttons are pushed, it throws the error 'error TRANSFER_TIMED_OUT'. So I just ignore the timeouts and use the data from when there is a change, which means I only see output from the raw_data method when buttons are pushed. Here's what happens when I push a button at random:
[127, 127, 0, 0, 240]
[127, 127, 0, 4, 240]
[127, 127, 0, 0, 240]
[127, 127, 0, 4, 240]
[127, 127, 0, 0, 240]
Okay, so the first thing I've noticed is that the first two numbers (bytes 0 and 1) don't seem to change at all. I have no idea why, but I can easily isolate the input I was looking for: the fourth item in the list (byte 3) goes up by 4 when the button is pushed.

Now, 4 is a power of two, and a quick check of the other buttons proved there's a definite pattern: pushing any button on the controllers will increase byte 2 or byte 3 by a power of two (1, 2, 4, 8, 16, 32, 64 or 128). The long and the short of it is this: as I mentioned before, each byte is made up of 8 binary bits, each bit representing one of the numbers listed above, and the values of the bits which are set add up to give the byte's value.

If I index into an integer in Ruby, it gives me the bit at that position in the number's binary representation. For instance:
# bit 0 represents 1, so for the number 1, this bit is set.
1[0]
# => 1
# but if the number doesn't include a 1 in its binary makeup, this is 0.
2[0]
# => 0
# This means that if we add up some powers of two, the bits for those numbers are 1:
n = 4 + 32
n[2] # the bit for 4
# => 1
n[5] # the bit for 32
# => 1

So, applying this to the data from the controller, I changed the script to watch a single bit:
c = Controller.new

loop do
  begin 
    puts c.raw_data[3][0] == 1
  rescue
  end
  sleep 0.01 
end  
So, raw_data[3] is byte 3 of our raw data, and raw_data[3][0] is bit 0 of byte 3. If this bit is a 1 instead of a 0, that button is pushed. By pushing each of the buzzer buttons (the ones I'm interested in) in turn, I found out the following:

# buzzer | byte  | value | bit
# 1      |  2    |  1    |  0
# 2      |  2    |  32   |  5
# 3      |  3    |  4    |  2
# 4      |  3    |  128  |  7
That is, for buzzer 1 we see byte 2 increase by 1, which is bit 0 (the one on the far right), and so on. So, to check all 4 buttons, I narrowed this down to an array recording which byte and bit to check for each button, which I've stored as a constant on my Controller class.

class Controller

  BUZZ_BITS = [
    [2, 0],
    [2, 5],
    [3, 2],
    [3, 7]
  ]

  def check
    data = raw_data
    BUZZ_BITS.each_with_index do |lookup, i|
      if data[lookup[0]][lookup[1]] == 1
        puts "buzzer #{i + 1} pushed"
      end
    end
  rescue
    # no input, just ignore the error
  end
  ...

This is really the end-point for this tutorial. What I've done, in the end, is store which byte and bit I'm looking for for each button; at each call of the check method I grab the raw data, iterate through those bits, and print the number of any button which is pushed (with a + 1 so people aren't confused by the zero index).
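To make that concrete, here's roughly how I'd drive the class from a script (assuming, as before, that the class lives in controller.rb):

require 'libusb'
require './controller.rb'

c = Controller.new

# poll the controllers and report any buzzer presses
loop do
  c.check
  sleep 0.01
end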

Okay, so this is pretty confusing, but it works, honest. You can see the full example code at www.github.com/ajfaraday/usb-controller-demo and see it in the wild at www.github.com/ajfaraday/ruby-jam.

Thanks for bearing with me on this one, I really hope you found it helpful. Any comments, questions? Feel free to get me on twitter at @MarmiteJunction

Monday, 16 March 2015

An open letter to recruitment consultants, on their relationship with developers.

To: The incumbents of the IT recruitment industry

Recruitment consultants are a reality: it can be difficult for companies to find the right candidate, and for developers, particularly those who work a series of short-term contracts, to find their next position. In this climate recruiters are regularly in contact with tech companies looking for employees and with programmers looking for jobs, attempting to find matches between them.

But it's not always a match made in heaven. Often developers are not looking for work, and many happily hold a single job for a number of years. However, my skills as a Ruby on Rails developer are currently in a lot of demand, so sometimes as many as five different agents, often more than one from the same agency, will get in contact and take up some of my valuable time attempting to tempt me into a new position.

Many people in the computer industry share the experience of being frustrated by over-zealous recruitment agents, often getting in touch during working hours, advertising unrelated jobs and using hyperbole to exaggerate the desirability of positions. In more extreme cases, recruiters can be patronising, unduly persistent or completely dishonest to both potential employers and employees.

I appreciate that recruitment is a goal-driven sector, and there is pressure to perform, but making more contact is not necessarily the best way to fill jobs, or to find them.

-----------------

Allow me to give one example of the mistakes which recruiters make:

My boss works hard. He is a director of the company and has played a large part in making it the success it is. He's a Rails developer, but also handles database and system administration, as well as liaising with customers, and he regularly works to capacity.

One of my colleagues walks through the door with the phone: someone, giving their name but not the reason for their call, has asked to speak to the head of Rails development. The boss answers politely, and quickly becomes annoyed: "No, I'm a director of the company, I can't just leave to do another job."

The recruiter has learned the piece of information they phoned to ask about, taking around 30 seconds of time they are uniquely placed to know the value of. They've also told a director of the company that they want to poach developers from what is a small, tight-knit team; of course he's annoyed.

The correct response would be to give it up as a bad job, politely withdraw and cross his name off the list, permanently.

The next thing I hear is a raised voice "No, we are not currently hiring. Good bye!". It's hard to slam a cordless phone, but my boss had a jolly good try at pushing the red button with attitude.

Having phoned the office number in the hope of getting through to a senior developer, told the boss that they're attempting to poach his developers, and persisted in a conversation which was clearly over, this recruiter then decided to switch modes from "we need developers" to "we've got developers for you" without missing a beat. Besides the devious method of getting in touch, there must be an untruth in there somewhere, or at least speculation: they're either lying to the developer in him, or to the director.

This having taken place, is it any surprise the boss didn't want to enter into a business relationship with the individual on the phone?

----------------

Here's another example, in which a lengthy and complicated bug fix was interrupted by a 30-second phone call which put my own thought processes back by at least half an hour.

The phone rings, undisclosed number, I step out of the room and answer, already expecting recruitment, PPI claims or some form of "get rich quick" scheme.

The voice comes through, "Hello I'm (name) from (company name, it was an acronym, unhelpful)"

I reply, "Sorry, I don't know that company, what do you do?"

"We're a company of IT specialists."

"But what do you do?"

"I'm looking for a developer for a job in..."

Enough of my time has been wasted, I raise the tone of my voice a little "I'm not looking for work at the moment, thanks."

Again, the conversation has clearly ended, but the killer instinct which recruitment agents all seem to develop kicked in, if a little uncertainly...

"But what if we could..." he pauses, "Offer you more money". There was a noticeable rising, uncertain tone to this last word. As if he'd only just realised that the word money might not be a magic bullet in his fight.

"No thanks", and I make another spirited attempt at slamming my mobile phone.

I couldn't stop thinking about the fact that this total stranger, who wants to start an actual business relationship with me, seriously thought he was going to tempt me away from my current position with a pay cut.

This fact, on its own, was enough to distract me for some time, but in truth any interruption is disruptive.

----------------------------

I don't just want to shout in the direction of recruiters, but to offer some genuine constructive advice on how to avoid alienating developers who aren't interested.

There are three principles I'd like to suggest you bear in mind when contacting developers.

Concision

Recruiters will often provide a lot of extraneous information about a position: for instance, that the company has investment, inflated perks (office furniture and powerful work computers are not perks, they're necessities), or who their clients are. Developers are busy; we don't have a lot of time to read emails in detail, especially when we haven't decided to change jobs at all.

We are not likely to be tempted away by the fact that you've taken more time to fill out some of the unimportant details of a job, to exaggerate the benefits, and quite possibly to hide some negative points.

Mostly, we'd like to find out who the job is for, so we can do our own research, and bypass all this text.

Okay, so there's a business case for keeping that piece of information secret, although it's an odd way to start a real business relationship: by openly showing a lack of trust in potential clients.

Here's what developers actually want to know:
  • What's the job? Don't just say it "may be relevant to you": what does the successful applicant actually have to do?
  • Where is it?
  • Is it a contract? (This isn't always a good thing.) How long will it last?
  • How much does it pay? We're not completely money obsessed, but it's a good guide to the level a job is at, whether we can go for it, and whether it's worth the considerable disruption of changing jobs.
How about a bullet point list of this information, instead of paragraphs and paragraphs of nebulous, imprecise information?

Relevance

A JavaScript developer will rarely be interested in a Java position.

A new developer will not be interested in a highly responsible job which doesn't provide some amount of training and employee development. (Senior developers don't happen without being juniors first).

If a CV has not been updated for years, this is probably because its owner is not looking for work.

Recruiters tend towards a scatter-shot approach, the theory being that recruitment emails have a certain hit rate, so a higher volume of emails means the same proportion of a larger number of people. But keyword matching doesn't always turn up candidates who fit the bill.

I have personally blocked a number of recruiters from contacting my email address when I get more than one email from them a week. I have been known to reply tersely when I get multiple emails concerning the same position from multiple recruiters belonging to the same agency.

Try to gain a little more understanding of the industry you are working in, be a little quicker to take names off the list, or at the very least wait a year before getting back in touch.

Remove hyperbole and patronisation

Developers are intelligent, hard-working people who understand that our work affects the public image, productivity and stress levels of our clients. We are professionals; we've worked hard to develop our skills and we keep improving them to the benefit of our employers.

We do not need to have our egos stroked in order to begin a serious business relationship with an agent. We do not need to be called ninjas, rockstars or jedi to pique our interest. In around half a decade there will be many programmers who actually are twelve years old; presently, however, very few are.

Please don't tell me the job you're representing is an exciting opportunity. Near enough every email from a recruiter starts this way. It's a job, I can decide for myself whether or not it's an exciting prospect for me.

Any other hyperbole says much more about the author than the subject; facts really are more important.

--------------------

So how would I like to be contacted?

Dear Mr Faraday

I'm currently looking for a Ruby developer for my client, a digital advertising agency based in Rotherham. It's a 6 month contract, which may become permanent, with a £150 day rate.

Let me know if you're interested and I'll send over a job spec.

Kind Regards

Ms Joan Q Recruiter - Doohickey IT

As a prospect, that's really all I want to know. It's not hard to ask for more, and initiate a dialogue, but it's very off-putting to be fed lots of information outright.

Just let me know who you are and what you're representing to me.

Oh, and you'll rarely get a good response by phoning my mobile during work hours. Interrupting my job to try and take me away from it is not a considerate thing to do. It's like an estate agent saying "It looks like you have a home, would you like a home?"

--------------------

In conclusion, think about who you're talking to, and how this unsolicited contact will be received.

Thanks!

Andrew Faraday