David King got his start as a professional programmer working odd jobs. He took on small software projects, set up networks, that sort of thing. For fun in his spare time he’d contribute to the open-source operating system FreeBSD—a pastime many developers consider the most thankless job ever. People started to notice. Eventually, King landed a gig with Reddit, the biggest social news site on the web. Now he’s one of six engineers at Hipmunk, a travel site with good buzz and $5 million in funding. He works with his friends, makes a good living, has equity. By all accounts, King is in the midst of an impressive career. He’s a successful developer. And, like many of his peers nowadays, he did it all without a college degree.
While there are a few high-level computer-science concepts that require a college education to master, King says, 90 percent of developers won’t use that knowledge in their day jobs. And yet a diploma is still the first thing recruiters at most large companies look for when hiring a programmer. “It can be very difficult to prove yourself to the people you want to work for without a degree,” King says. “You aren’t even given a chance.”
That process is fine for most industries—a Harvard-educated accountant is a lot more likely to be a good hire than a self-taught one. But programming isn’t accounting. It requires creative thinkers and problem solvers, people unlikely to thrive in the confines of a college classroom. So why do hiring managers apply traditional methods to a nontraditional job?
As programmers become the backbone of the business world and the tech industry embarks on a bubble-driven hiring blitz, that thinking is going to have to change. In many places, it already has.
It’s a good time to be a developer. Businesses from Fortune 500s to neighborhood bars need a digital presence to compete. Add that to a ravenous market for mobile app development and a booming startup scene and you have the makings of a new tech bubble. There just aren’t enough programmers to go around. The country’s jobless rate may be hovering around 10 percent, but a recent study by Dice, a technology and engineering career website, put the number of available tech jobs at more than 84,000.
The demand starts at the top. For years, the big Silicon Valley companies have been locked in an escalating battle for the world’s top talent. Salary, of course, is the big lure. Google, The New York Times reported this year, has increased starting salaries for recent college grads by nearly $20,000, hoping to fend off startups. In case paychecks aren’t enough, companies use perks to differentiate themselves. Facebook will do your laundry, Google offers free haircuts, and game developer Zynga covers pet insurance.
While the industry is growing exponentially, it’s also becoming highly segmented. Developers are building careers on platforms and technology that didn’t exist a few years ago. Having a 9-to-5 job is no longer a requirement. While it’s not easy, savvy developers are increasingly enticed by solo paths: building their own iPhone apps, starting freelance businesses, or founding startups that cater to new segments of the web. In the same Times story, a recruiter for the venture capital firm Andreessen Horowitz said one-third of the engineers she attempts to recruit ask for funding to start their own businesses instead.
Developers have an attractive menu of options. They can enjoy the fat paychecks and quirky perks at a big company, they can get their hands dirty building something new at a startup, or they can strike out on their own. But what’s good news for developers poses a challenge for established companies that fall somewhere in the middle. They’re having such a hard time hiring talent that some have stopped trying. Last year, when the analytics firm Webtrends couldn’t find enough developers to build Facebook apps, it acquired the app-development firm Transpond. It was easier to buy a company than hire coders.
Even amid such an intense hiring squeeze, companies haven’t yet taken the professional-sports approach and started drafting coding superstars straight out of high school. Most haven’t even changed their college-degree requirements. Companies that pride themselves on being innovative remain hung up on the diploma. Which is doubly surprising given that bypassing college isn’t a new idea in the tech world—it’s just newly relevant. The last time the diploma debate was this heated? At the height of the previous bubble.
In 1998, Forbes published a story called “The Tyranny of the Diploma,” in which author Brigid McMenamin highlighted the fact that Bill Gates had completed only three years at Harvard before dropping out to start Microsoft. The article sparked more than 200 comments on Slashdot, one of the original user-curated tech news sites. They couldn’t agree on the best course—to school or not to school. Many commenters argued that, while a formal education doesn’t teach practical skills, recruiters don’t know how to filter applicants without looking first for a degree. The site’s founder, Rob Malda (aka CmdrTaco), called his college education a four-and-a-half-year “time suck.” He wrote, “Education can’t really keep pace with ‘modern’ technology. Sure, learning theory and getting some practice never hurts, but if you’re already a geek, is it a waste of time?”
The conversation hasn’t changed much over the years, at least not when it comes to getting a job at the entrenched Silicon Valley giants. Google’s hiring policy remains famously strict: Applicants must have a degree and a high GPA, regardless of how many years it’s been since they graduated. At Netflix, however, the line between graduates and nongraduates is getting hazier.
Jeremy Edberg, a hiring manager and lead site engineer, says that while his company doesn’t require job applicants to have a degree, the skills needed to manage Netflix’s data-heavy video delivery are better learned in school. Still, he adds, those jobs are the exceptions. And only the top programs—MIT, Berkeley, Stanford—are teaching the skills they demand.
“There are two kinds of degree. One where you learn about theory and one where you just learn how to write a programming language,” Edberg says. “That [second] one’s not that useful. You’re just learning syntax, and any coder can pick that up quickly. But a good education is based in theory and principle.”
Edberg didn’t need either kind to get his first tech job. He dropped out of Berkeley in 1999 at the height of the boom to join a tech company where a friend worked. (Laid off after the bubble burst, Edberg went back to school and majored in cognitive science.) His next stop was at eBay, where he worked until 2007, at which point he decided to attend Paul Graham’s Startup School—a daylong seminar for engineers and entrepreneurs focused on building businesses, and one of the few areas of “education” that programmers can agree on. He soon became Reddit’s first employee.
Edberg’s story isn’t that unusual in the startup world. Staffs are small, so factors other than a degree take precedence: Will the applicant get along with the rest of the staff? Can she play several roles at once? Self-taught coders are often quick learners with a broad knowledge base and the ability to adapt—exactly what founders are looking for.
“Résumés are important,” says Christopher Slowe, who handles much of the hiring at Hipmunk, where King works. “Projects and past work are more important to the hiring process. We ask for a tic-tac-toe program that uses a web server as the ‘first glance’ at a potential hire.” Slowe says that in web development, unless you’re in a specialized role like Edberg’s at Netflix, no skill is more important than problem solving. Every issue is new, and each needs to be resolved quickly.
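An exercise like Slowe’s can be done in a few dozen lines. The sketch below is purely illustrative—Hipmunk’s actual prompt isn’t public, and every name here (`TicTacToe`, `move`, `winner`) is invented for the example. It keeps the game logic in pure functions and puts a thin HTTP layer on top, which is roughly the kind of clean problem decomposition a reviewer would be looking for:

```python
# Hypothetical sketch of a tic-tac-toe-over-a-web-server exercise.
# The board is a 9-character string; GET /?pos=4&player=X makes a move.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

WINS = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
        (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
        (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return 'X' or 'O' if a line is complete, else None."""
    for a, b, c in WINS:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def move(board, pos, player):
    """Return a new board with player's mark at pos; reject occupied squares."""
    if board[pos] != ' ':
        raise ValueError('square already taken')
    return board[:pos] + player + board[pos + 1:]

class TicTacToe(BaseHTTPRequestHandler):
    board = ' ' * 9  # one shared game; restart the server to reset

    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        if 'pos' in query:
            TicTacToe.board = move(TicTacToe.board,
                                   int(query['pos'][0]),
                                   query.get('player', ['X'])[0])
        body = json.dumps({'board': TicTacToe.board,
                           'winner': winner(TicTacToe.board)})
        self.send_response(200)
        self.send_header('Content-Type', 'application/json')
        self.end_headers()
        self.wfile.write(body.encode())

# To play: HTTPServer(('localhost', 8000), TicTacToe).serve_forever()
# then visit http://localhost:8000/?pos=4&player=X
```

The point of such a screen isn’t the game itself; it’s seeing whether a candidate separates state, rules, and transport cleanly.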
Nathan Manousos, a freelance developer who dropped out of college, says that he has rarely encountered a startup looking to hire people with computer-science degrees. “There’s a sense that if somebody didn’t go to school they must be really passionate,” says Manousos, who’s also hired developers. “If I see somebody who’s really good and didn’t study it in school, I think they must really care about what they’re doing.”
In college, Manousos spent his time teaching himself programming skills he knew would be important to a career. “Meanwhile I was failing a lot of math classes. There’s a lot of knowledge that you gain in a computer-science program about theory and low-level things and data structures and algorithms, which are nice to know but aren’t really that applicable day to day.”
While startups have long valued showing your work over proving your pedigree, the idea seems to be trickling up. Midsize companies, feeling the squeeze between the behemoths and the startups, are beginning to ask applicants to submit completed projects along with their résumés.
ClickFox, a consumer analytics firm, employs about 40 developers on its research-and-development team.
The hiring manager who oversees the group, Tom Wheeler, says he doesn’t worry about degrees. He’s much more concerned that the applicant is aggressive and has the right type of personality. “I care about what they have built, how they built it, and how they work together,” he says.
Wheeler himself left college before graduating. “When I went to school there was no pure computer-science degree. I just wanted to write software and I dove right into the business world.” At the time his decision was unusual. Now, he says, the industry has become so broad and diverse that it’s getting easier for folks like him to find a company where they fit.
“If you’re a very bright individual and you’re good at self-starting, you don’t need to go to college,” Wheeler says. Nor do many programmers want to.
“Developers are a different breed of people. They don’t understand the way the rest of the world thinks. And that’s why they’re successful without getting a degree.”
Ultimately, the developer job market is a disjointed place, with different employers requiring different experiences for the exact same work. That’s a recipe for a lot of very confused undergrads. Students know that their education won’t be particularly applicable in the real world, unless they want a job that relies heavily on theory. But leaving school breaks with tradition and societal expectations. It’ll upset the parents.
Katie Zhu, a computer-science major at Northwestern, decided to stay in school. “It’s easy to think about the shortcomings of it,” Zhu says. “But there’s still a big overarching security. I think that’s invaluable.” Still, she makes sure to work on projects in her free time so she can have something to show employers once she graduates. “They spend a lot of time in college teaching you how to write a résumé but not how to make a GitHub file,” she says. GitHub is a social coding site that lets programmers upload their projects and collaborate on solving problems. Many developers point to the advent of open-source software as a turning point in self-education; it allowed programmers to pick through each other’s code and discover new ways to attack problems.
“There’s more of a void in terms of the university creating the opportunity for you to do projects,” Zhu says. “You have to do that in your own time. The school could make that more part of the dialog.”
If Jim O’Neill had his way, Katie Zhu wouldn’t have gone to college in the first place. O’Neill runs the Thiel Foundation, which awards grants to individuals who start companies that spur scientific research and technological innovation. In September 2010, the organization announced its newest fund, 20 Under 20, which will give 20 entrepreneurs under the age of 20 the money to start their own businesses. The only catch: recipients have to drop out of school and concentrate on their endeavors full time.
Funded by Peter Thiel, a cofounder of PayPal and an early investor in Facebook, the foundation believes that college ultimately prevents many innovators from reaching their full potential. (Like Gates, Facebook’s Mark Zuckerberg also dropped out of Harvard.) “Not only is college a potential barrier, it also does not necessarily have the benefits that lots of people assume,” says O’Neill. “For one thing, you don’t need college to be a good coder. But most people take the time and expense to go to college, and it’s not a rational decision. It’s more of a default. People don’t think about the cost benefits and choices. They do what their friends are doing and what their parents expect.”
The Thiel Foundation hopes that by giving young people a chance to bypass college and actually try out their ideas, it can prove college isn’t the only route to starting a career. “We think employers should stop requiring a college degree. It’s vague and means so many different things,” O’Neill says. No two computer-science programs are the same, he argues. “[A diploma] doesn’t provide information.”
Expect more companies to start judging applicants by how they spent their post-high school years, not where. After all, hiring managers and developers agree that, ultimately, programming comes down to critical thinking. The way to truly know what developers are capable of is to look at what they have produced. How do they solve problems? What projects have they completed on their own?
Perhaps most importantly, were they willing to take a risk and dedicate themselves to writing code? If someone has the drive and passion to tackle difficult tasks without the support of an institution, hiring managers and recruiters should take notice. Just ask Hipmunk’s David King. “Being self-taught isn’t easy,” he says.