In a perfect world, a full-stack developer is one who can, in theory, create a usable end product with minimal input or support.
It seems simple enough, but not everyone agrees on what makes a “full-stack developer.” Some don’t think the term should be used. Some adamantly defend the need for it. Some don’t even believe full-stack developers exist!
The strife over what constitutes a “full-stack developer” doesn’t just lead to online arguments (though it does incite a… lot… of… them). It also spurs misalignment in how to assess, attract, and hire full-stack candidates.
And while it’d be easier to put this dispute on the back burner, the term “full-stack developer” isn’t going anywhere. Not only has the job seen 198% growth in demand in the last year, but it’s also the most popular occupation for developers across the world (per our 2018 Developer Skills Report). In other words, it’s not a disconnect teams can afford to ignore.
In this piece, we’ll help clarify why there’s confusion around the oddly political job title, explain each side of the debate, and help you align with your hiring team to ensure you identify the skills you really need for your team.
To better understand the current disagreement over the term “full-stack developer,” we’ll have to unpack how the argument started.
The term “full-stack developer” hasn’t actually been in vogue for long at all. One of the earliest known mentions cropped up in 2008, and the first Google queries for a “full-stack developer” didn’t appear until 2010. The term has only grown in popularity since:
But if this role has existed in some form since pre-internet times, why didn’t this search term gain popularity until the early 2010s? It turns out, answering that question requires a little tech history.
Per the history of the term, “full-stack developer” originally gained popularity in the mid-2000s, when simpler, more streamlined technologies meant that many developers could feasibly execute a project end to end. The approach was a total 180 from the siloed responsibilities that defined the previous era of the late 1990s and early 2000s.
But as time wore on, a shift back to more complicated technologies and more layered stacks in the early 2010s once again encouraged stratification of roles. Server-side and client-side work grew increasingly separate, contributing to the popularization of the terms “front-end developer” and “back-end developer.”
In response, during the same era, “full-stack developer” regained popularity as a way to distinguish developers who didn’t identify with the front-end/back-end specialist binary. Instead, they defined themselves as a third type of developer: one who could tackle both front-end and back-end responsibilities.
But of course, not everyone in the tech community agreed with that interpretation. And while it’s hard to pinpoint exactly when this debate surfaced, most credit two events with catalyzing it: first, a 2010 piece from former Facebook engineer Carlos Bueno on the definition of full stack, and second, a claim that Facebook “only hire[d] ‘Full Stack’ developers” in 2012, as heard by Laurence Gellert at OSCON.
The result? A heated terminology debate that’s still alive and well almost a decade later.
The anti-full-stack developer camp is perhaps the most dominant voice in the argument over what does (or doesn’t) constitute full-stack. In short, their argument hinges on the idea that a full-stack developer is someone with “the ability to easily navigate the back-end and front-end with a senior level of expertise.”
While there’s some variety within this view, this camp believes that full-stack developers should be able to:
And while this group concedes that many developers can do some work that spans both disciplines, they feel very few can do both well. In short: they allege that a true full-stack developer is a unicorn, and that too many people bill themselves as full-stack developers without having full-stack qualifications.
This camp’s gripe with the “full-stack developer” term can be summed up in the following points:
This argument alleges that true full-stack developers are few and far between. Instead, they’re more apt to believe that self-identified full-stack developers are actually front-end developers with some back-end capability, or vice versa.
On the other hand, the pro-full-stack developer camp argues for a broader interpretation of the term. They refute the idea that a full-stack developer has to be an authority in every layer of the stack; instead, a full-stack developer needs working knowledge of the entire stack, with true expertise in only a few layers.
Their definition of “full-stack” opts for a less restrictive set of requirements, describing someone who can:
In other words: from this view, a full-stack developer doesn’t have to be an all-around expert in every layer of the stack. Instead, here, “full-stack developer” describes an effective and seasoned generalist: someone with a wide knowledge base, a solid specialty, and the willingness to admit when they’re out of their depth.
Their defense of the term “full-stack developer” is rooted in a few key thoughts:
This argument alleges that full-stack developers don’t replace, but rather build on and complement, the work of front-end and back-end specialists. Their core value lies in their ability to understand and work on the full breadth of a project, and to bring a more global technical knowledge to everything they touch.
In a nutshell, the philosophy can be summed up in this quote from The Pragmatic Programmer:
“The more things you know, the more valuable you are…the more technologies you are comfortable with, the better you will adjust to change.”
Amidst the debate, there’s some good news for those of us trying to decipher a definition for the term: both sides do align on one thing. That is, they expect at least a basic understanding of all layers of a given stack. Their primary difference lies in how much expertise they expect a full-stack developer to have in each layer.
Inching closer to alignment, a paper recently published by the Association for Information Systems (AIS) analyzed the five most referenced and visited definitions of “full-stack developer” in an attempt to distill a common interpretation.
Here’s the universal definition they proposed based on that research:
Full stack development is a methodology which addresses all stack layers and in doing so creates a complete, implementable solution to business requirements. Full stack developers have broad experience among all stack layers and expertise in a few layers. They should be able to render a minimum viable product in a given stack.
– “Towards a Consensus Definition of Full-Stack Development,” Shropshire et al., 2018
While this definition is still bound to stir opinions from extreme viewpoints on either end, it’s certainly a step toward aligning on what, exactly, a full-stack developer is. From here on out, it’s up to the technical community to adopt a working definition, and tech hiring teams especially. After all, they write the job descriptions.
With that in mind, here’s what recruiters and hiring managers can do to navigate the discussion:
Which camp do you and your team fall in? Let us know what you think in the comments.
Your definition of full-stack won’t always align with your candidate’s. Standardize your process with coding challenges.