Jimmy Breck-McKye

Developing opinions

A case for self-teaching software engineering

If you are a non-CS grad seeking a career in software engineering, I recommend you try to self-teach as much as possible. Bootcamps have their place and books provide a great head start, but only by learning independently can you really get to grips with what the job entails.

The reason is simple: this is a job about independent problem-solving and self-directed research. It requires the ability to learn obscure new technologies on the fly and spend a lot of time working without guidance. The sooner you embrace working this way, the better you will get at it, and the clearer an idea you’ll have of what the work is really like.

Of course, everyone needs guidance at the beginning. You can’t self-teach until you have some kind of map of the territory, some idea of where to go and even what questions to ask. I’m not advocating some masochistic experience of teaching yourself programming with nothing but a text editor and a hostile compiler. Everyone needs a start, and you should never eschew teaching resources completely. But not many resources will teach you the art of asking good questions, and even fewer will show you how to effectively ask them of yourself.

Because there are a lot of times you’ll be on your own. Your job as a developer will be to invent new products using existing technologies that solve all the well-known problems, such that what remains is highly unusual and unique to your project. Most of these problems won’t have direct analogues to anything you’ve encountered in your textbooks or courses, and they won’t have been seen before by anyone else, either. Whilst you will get support as a junior developer, there will come a time when figuring it out becomes your job alone. You can use resources and ask for advice, but you are essentially working unguided.

At this point, you will need to dig into your reserves of patience and tolerate the frustration of several dead ends and countless near-successes. And I think only teaching yourself programming really comes close to that as preparation. Both require similar kinds of deduction and the ability to disassemble a complex problem into simpler parts. Both rely very heavily on debugging and reverse-engineering. Both require you to feel intensely comfortable working without any indication of how close you are to a solution. They’re directly transferable skills. More importantly - in some ways - they both require you to stay creative and tenacious in the face of uncertain odds, and get used to bouncing back from failure.

That’s not to say things like computer science degrees are poor preparation for software engineering. Far from it. If you have the opportunity to learn computing at university, seize it. It will teach you new ways to think, give you foundations for building some very difficult programs, and can open opportunities to work on cutting-edge projects that won’t be available to non-CS grads. It’ll also give you some historical context on the technology itself, which can be quite illuminating.

It won’t really teach you how to build software, though. Only writing software unguided, with trial and error, can do that.

Nor will a coding bootcamp. These schools are great for crowning hobbyist experience with more formal certification, or teaching veteran programmers how to retool themselves in more modern technologies. But they are typically very shallow and won’t serve you well once you need to transition into new technologies.

I’ve worked with several bootcamp grads in my career. They were energetic, intelligent and eager to learn. I enjoyed working with each of them. But - they knew absolutely nothing about computers, and it stymied their effectiveness. They had to invest a great deal of their own time afterwards getting to grips with those technical fundamentals. So even successful bootcamp grads cannot escape some degree of self-teaching.

Neither can successful developers generally. Programming is highly sensitive to fads and trends, with the tools of your trade reinvented on an almost yearly basis. Very rare is the employer who will pay you to learn these new technologies on their clock (why would they? It makes you more employable), so you must - you guessed it - teach yourself. Usually with spotty documentation and few online resources - because by the time those resources really exist, you’ve in a way already missed the boat.

You see, there’s the rub. To a certain degree, what makes you valuable as a developer - and why you’re paid so much - is your ability to help your employer stay ahead in the market by leveraging new technologies before their competitors can. The best way to do that is by knowing the real state of the art. But if your personal state of the art - the edge of your abilities, the ceiling of your skill - is bound to whatever has been packaged into online courses and video series, then to some extent your skills will always be behind the curve, already out of date. You’ll know a lot of ‘core technologies’ that are the bedrock of most projects, but few of the niche technologies that really make projects new and different, the ones that really let you add value.

Be wary of that. At the point a technology becomes a standard, someone out there is already looking at a way to package what you know into a library or a product they can commoditise. Any solved problem can be automated or abstracted away somehow, so that what’s left is a continuous frontier of the new and the unsolved. Either you embrace the novelty or you put yourself at risk of becoming unemployable. As generous as developer salaries can be - when you have skills the market desires - that same market can turn on you, quicker and with sharper teeth than you think.

Don’t throw away your textbooks or unsubscribe from your video courses. But try to spend the majority of your time working on personal projects and learning programming independently. I guarantee that the sooner you start, the more dividends it will pay.
