Showing posts with label life-optimization. Show all posts

Thursday, October 19, 2023

This one weird trick^H^H^H^H^H deep technique for writing an actually good resume

Resume-writing is a game.

There are two players.

There's you, trying to condense your whole life into one page in a way that presents you as the most impressive candidate possible.

And then there's the reviewer, trying to decode that page into a real person they can assess.

There’s been a lot written about writing resumes, and also about getting awards and grants, another variation of this game. My favorite piece is Steve Yegge’s Ten Tips for a (Slightly) Less Awful Resume. 15 years later, it’s still relevant. Turns out the ways people communicate about competence don’t change very fast.

But you don’t want a slightly less awful resume.

You want an actually good resume.

A few months ago, I finally shared my #1 piece of interviewing advice. I think it’s time I did likewise for this earlier part of job searching.

But there’s something you need to know about first.

Posers

Back at school at Carnegie Mellon, people were who they said they were. Classes were hard, and the number of CS majors was kept small. If someone told you they were a good coder, it meant they had stayed alive through labs that slaughtered their peers. Boast falsely, and there’s a fair chance someone in the room saw you spend 90 minutes on a programming final that took them 15.

But something that shocked me when I first moved to Silicon Valley was how often the “good programmers” aren’t. For the first time in my life, I met posers.

There’s the guy volunteering at a Stanford lab while claiming the lab’s project was actually his company. The person who was employee #9 at a $100M startup but told his housemates he was the founder. The intern from the East Coast who drew an audience with his tales of being a machine-learning researcher while brushing aside questions about his actual role in the research. The woman who introduced herself for years as the CEO of some startup that never seemed to do anything. And then there are the countless Google and Facebook employees who thought getting a job at a billion-dollar company somehow made them the world’s best.

I think Silicon Valley tends to attract the extreme in this regard, and they tend to show up at parties frequented by ambitious young people. The rest of the world lies somewhere in-between that and my famously unpretentious alma mater.

But still, they’re out there. And some of them send in resumes.

Like the guy applying for an enterprise sales job whose resume proudly listed “Business Sales — Apple, Cupertino, CA.” His resume spoke about all the companies he worked with, but soon we realized that he wasn’t putting together complex deals from Apple HQ; he was just handing out MacBooks at a local Apple Store. (And not in Cupertino; around that time I learned the hard way that you can get MacBook t-shirts but not repairs at Apple’s only store in Cupertino.)

Then there’s the guy finishing up a master’s degree who talked his way into a first-round software-engineering interview. He regaled me with the story of his past internship, where he acted as both a software engineer and a project manager. Then we got to the coding part, where he happily churned out code full of unmatched braces and variables that didn’t exist. We later learned from his former employer that they put him on project management only after giving up on him producing useful code. And I learned not to take “graduate courses in machine learning and data science from _____ State University” as a meaningful signal, no matter how cool the project sounded in one sentence.

I saw it too when reviewing applications for the Thiel Fellowship. I distinctly remember an applicant who boasted about being “one of the only people able to program futuristic technology like Google Glass” and speaking in front of huge crowds at tech conferences despite not seeming to have accomplished anything of note. But he also claimed to have over 100,000 Twitter followers, and…that was true! That’s when I learned that Twitter followers can be bought. Now I see boasts about follower counts as a red flag without a clear reason for the follows. (I Googled this guy recently. He’s now claiming that Steve Jobs came to him for advice when he was 15.)

Real Evidence of Competence

All of this motivates my #1 piece of resume-writing advice.

So, here it is:

You have an incompetent evil twin who is trying to pass themselves off as you. You must say things they can’t.

That is, you must say things where “this person is competent” is a very likely explanation for you saying that, and “this person is overinflating themselves” is a very unlikely explanation.

There’s a certain law where, if you hear an ad for a game boast “Explore over 10 levels and fight with dozens of weapons,” then there are exactly 11 levels and 24 weapons. Likewise, jaded reviewers will interpret your resume as the weakest thing consistent with the text. So if you write “Helped launch new features with millions of users,” then the default assumption is that you took notes in the meetings and maybe built a few unit tests. But if you write “Sole developer of the Foobar feature, which is used by 500,000 people weekly,” then your note-taking non-coding doppelganger can’t compete, and all that’s left is to evaluate how impressive the Foobar feature actually is.

There’s a mathematical way to state this advice that I find illuminating.

  • Let P(t|c) represent “the probability this text was written given the person is competent.”
  • Let P(t|~c) represent “the probability this text was written given the person is not competent.”
  • You want to maximize P(t|c)/P(t|~c).

There’s a technical term for this: “Bayesian evidence of competence.”

So many of the mistakes people make in resume writing come from focusing on writing something that sounds like what an impressive and competent person would say — optimizing P(t|c) — without a corresponding focus on writing something that a poser couldn’t say without blatantly lying.
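As a toy sketch of the ratio view (all the probabilities here are made-up subjective estimates, purely for illustration), you can score the two example resume lines from above:

```python
# Toy illustration of the likelihood-ratio view of resume lines.
# The probabilities are made-up subjective estimates, not real data:
# (P(text | competent), P(text | poser)) for each claim.
claims = {
    "Helped launch new features with millions of users": (0.60, 0.50),
    "Sole developer of the Foobar feature, used by 500,000 people weekly": (0.40, 0.02),
}

for text, (p_given_competent, p_given_poser) in claims.items():
    ratio = p_given_competent / p_given_poser  # Bayesian evidence of competence
    print(f"{ratio:5.1f}x  {text}")
```

The vague claim is barely more likely to come from a competent candidate than from a poser, so it moves the reviewer almost not at all; the specific claim is the one doing the work.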

There are parts of the tech and business worlds where expertise is hard to acquire, where the choices are endless and subtle, and their consequences are years off. Software architecture can be like that.

But whether to pass a resume to the next stage is a binary decision, and even a beginning interviewer can quickly review hundreds of resumes and get rapid feedback on how well they predict interview performance, if not actual job performance.

That means that the potential interviewer reading your resume will almost certainly be very good at playing their side of the game.

But they want you to win. So just a few tips can make you much better at playing yours.

Just remember: the real game is not to write an impressive resume.

It’s to be an impressive person.

Friday, February 23, 2018

The Practice is not the Performance: Why project-based learning fails

Last night, I encountered an old post by Zach Holman where he pushes the idea that traditional school-based CS is useless; project-based learning is the way to go. I’ve heard this idea repeatedly over the last 10 years, and know at least one person who’s started an education company with that premise.

I don’t want to debate the current way universities do things (I found my undergrad quite useful, thank you), but I do want to dispel the idea that everything would be better if only we switched to project-based learning. The opposite is closer to true. Project-based learning is not a superior new idea. Project-based learning is the most inefficient form of learning that still works.

To see why, we actually have to sit down and think about the learning process. I’ve learned a number of models of learning over the course of my teaching training, but the one I’ve found most useful is taught by the Center for Applied Rationality. It goes like this:

  1. Break down a skill into small components.
  2. Drill each component rapidly, with immediate feedback.
  3. Integrate the pieces together into larger components; drill these similarly.

In that light, project-based learning definitely has something going for it: if you do a project, you will practice all the skills needed to do projects.

There’s something missing, though: it completely gives up on trying to think about what components it’s trying to teach.

Here’s how project-based learning might play out: After finishing his first two Android apps, Bob decides he wants to learn network programming, and generally get better at working on a software team. He and Alice pair up and decide to build a chat app for sending cat gifs, with a distributed replicated backend. They decide to make a spreadsheet of tasks and claim them. Alice also wants to learn network programming, and she swoops in and takes most of the server components; Bob gets left with most of the client work. Over the next three weeks, Bob spends a lot of time building the GUI for the app, which he already knows how to do, and only a couple hours implementing the client protocol. They start writing tests because the teacher said they had to. Bob has trouble writing the tests, and realizes a couple of ways he could have structured his code differently to make testing easier. He also discovers that two of his tests were too fragile, and needed to be changed when he updated the code.

It’s now one month later. What has he gotten from the experience? He's learned he needed to be more proactive about taking on tasks that challenge him and grow his skills. He’s learned a smidgeon about network programming, and a couple ideas about how to write better tests. These are good lessons, sure, but expensive. Isn’t there a way for Bob to learn more in that month?

False Temptations

What are the common arguments in favor of project-based learning? Here are two of the main ones.

Real skills

The first big reason for project-based learning is that it teaches real skills used in industry. Why do many schools teach much of their curriculum in Haskell, which isn’t in the TIOBE top 10, or even in SML or OCaml, which aren’t in the top 50? Wouldn’t they serve their graduates better by teaching Node and React?

The first counterargument is that industrial technologies come and go. Proponents acknowledge this, sure, but still call CS departments “out of date” for not following trends. What really drove home the futility of this argument for me was this essay by software-engineering pioneer Mary Shaw. Had she followed that advice in the 60s, she points out, her students would have spent their time studying JCL, the language used to schedule jobs on IBM mainframes.

The second and bigger counterargument: learning concepts is much more important than learning applications, and the best environment to learn a concept is rarely the one in industrial demand.

People often ask me what’s the best language to learn to study software design. I ask them what’s the best instrument to learn to study music theory. Everyone answers piano. In piano, you can see the chords in a way that you can’t in, say, trombone. We see something similar in other domains. In The Art of Learning, Josh Waitzkin recounts how, unlike others, he started studying chess from the fundamentals, in situations with few pieces on the board. He ultimately beat competitors who studied many times harder. In The Art of Game Design: A Book of Lenses, Jesse Schell advocates looking past modern video games and instead studying the concepts in board games, dice games, playground games.

So, for programming, we need to (1) figure out the core concepts to teach, and (2) pick languages that make the concepts readily available. C and Java --- indeed, all top languages until Swift arrived --- lack elegant ways of expressing the idea of a sum, a value that can be one of many alternatives. Thus, when explaining why adding a new alternative sometimes breaks existing code and sometimes doesn’t, I find myself having to explain using its clumsy manifestations as unions and subclasses. The first time I read Bob Harper’s explanation of why they chose SML for the CMU undergraduate curriculum, I thought he was just rationalizing snobbery. Now, I quite agree.

More like real jobs

The second big argument for project-based learning is that it more closely resembles what students will actually do on the job. This, in turn, is based on the idea that the best way to practice an activity is to do it.

This is false. “The only way to improve at X is to do it” is the advice you give when you actually have no idea how to improve. When you do know, you isolate subskills and drill them. Martial artists punch bags and do kata, and fight with extra constraints like no dodging. Musicians play scales, and practice a single measure over and over. Mathematicians rederive theorems from the book. And to condition a shot-putter, the best way is not to put weights in their hands and have them mimic the throwing motion, but rather to train the body’s ability to produce power, using squats and lots of heavy exercises that don’t even resemble shot-putting.1

Drilling programming

So what am I advocating? I’m advocating that you actually think about what you’re trying to teach, and design drills for it. The first drills should focus on one thing at a time.

So, for Bob trying to learn network programming and the general software engineering skills of being on a team project, here’s my 5-minute attempt to come up with an alternative way to teach these skills:

  1. Writing just the networking component of a larger system.
  2. Being asked to write the test cases for a small program given to you. The program is deliberately designed to tempt you into all the classic mistakes of test-writing.
  3. A simulation where you “coordinate” with the TAs to build something, committing pseudocode to version control. They troll you by deliberately misunderstanding things you say and seeing if their misunderstanding can go undetected.
  4. You and several teammates are assigned a small project. You’re asked to divide it amongst yourselves into unrealistically small units of work, and write test cases for each other’s components. (Integrates the skills of 2 and 3.)

These total much less time than the chat app, and teach much more. If students find they’re not optimal for learning, I as the instructor have much more room for experimenting. And if the students do choose to do a full project afterwards, they’ll be much more prepared.

I don’t teach network programming and haven’t tested these specific ideas. But, in my specialty of software design and code quality, I use exercises built on similar principles all the time. So I may teach a client some of the pitfalls of a naive understanding of information hiding, and then show them code from a real project that has that problem and ask them how they’d solve it. Or I’ll ask them to give just the type definitions and method signatures for a Minesweeper game; if they violate any design principles, then I can give them a feature request they can’t handle, or show why the implementation will be prone to bugs.

Is it better than just assigning projects? That’s the wrong question to ask, because project-based learning is incredibly easy to beat. My clients are mostly working professional software engineers; they’re already doing “project-based learning” every day. On my website, I claim:

Programmers learn by making bad design decisions, working on the codebase for a year, and then wishing they could go back and do things differently. I can give you that experience in an hour.

Does this sound like a bold statement about my teaching prowess? It’s not. In fact, piano teachers put that claim to shame. You can spend hundreds of hours practicing a piece using too many muscles on every key press. If your body awareness isn’t great, you might not find out until your hand cramps up right before your performance. A couple seconds to catch that mistake, a couple minutes to tell you a story of that happening to others, and the piano teacher’s just saved you months.

(As an aside, this is why I believe in finding a competent private coach for every skill you really care about improving in.)

Replacing a traditional CS education with a “software engineering BFA,” like Spolsky and Atwood suggest, is no longer a hypothetical exercise. We’ve tried it. And now dev bootcamps are going bankrupt. Instead of accepting them as substitutes for a traditional degree, recruiters are calling bootcamps jokes. Olin College of Engineering is famous for its project-based curriculum, but one student reports that she learned much more from traditional classes.

It’s time to stop looking for panaceas and shortcuts and realize that deliberate learning and deliberate practice --- as a separate activity from the everyday doing --- is the only way to mastery. As famed gymnastics coach Chris Sommer puts it, the fastest way to learn is to do things the slow way. Studying the fundamentals may seem like a distraction keeping you from getting your hands dirty making a Rails app using the Google Maps and Twilio APIs, but when you do get there, you’ll find there is less to learn if you’ve already compressed the knowledge into concepts.

Shameless Plug

My Advanced Software Design Web Course starts next week. It’s based on a lot of the learning principles I mentioned in this post, starting each concept with isolated drills and progressing to case studies from real software, and comes with personalized feedback from me on every assignment.

Disclaimer:

No, the sum total of knowledge about CS education is not to be found within this post. Yes, I do have some formal training in education; yes, other people have a lot more. Yes, there are a lot of things I didn’t bring up. Yes, the situation with bootcamps is more complicated than a simple referendum on project-based learning. The simple “turbocharging training” model of learning I gave is not a theory of everything. Yes, you need to run into problems in context, find motivation, try things out of order, and even eventually do a full project on a team without guidance. I believe realistic projects do have a place in education, but they still must be coupled with the principles of rapid feedback, and they are a poor substitute for learning the concepts piece-by-piece.

Acknowledgments

Thanks to Elliott Jin for comments on earlier drafts of this post.


1 When I was searching for a personal trainer, I asked about this to help screen candidates.

Saturday, June 4, 2016

The Partial Control Fallacy

Around the time I started grad school, I applied for a few prestigious fellowships. Winning is determined by several factors. Some fellowships are decided on the application alone, while others add a follow-up interview, but the applications all get scored on a rubric that looks roughly like this:

  • 50%: Past research
  • 30%: Letters of recommendation
  • 10%: Transcript
  • 10%: Personal Essays

Naturally, I proceeded to pour massive amounts of time into the essays, letting it consume much of my free time for the month of October.

Getting that Fellowship will really help me have a successful graduate career. Writing better essays will help me get the Fellowship. Therefore, to the extent that I care about having a successful graduate career, I should be willing to work hard on those essays.

But if the real goal is a successful graduate career, then at some point shouldn’t I put those essays down and do something else, like reading a few papers or practicing public speaking?

This, I dub the Partial Control Fallacy. It’s where, if there’s some outcome you want, and you only control a couple factors that affect that outcome, you decide how much to try to improve those factors as if you were actually improving the entire outcome. It’s closely connected to the 80/20 principle: it’s when you only have control over that last 20%, but you pretend it’s the whole thing and work on it accordingly. It’s when the 80/20 principle would suggest doing nothing at all.

Here are some more examples:

  • Trying to get any competitive award that’s judged mostly by your past. The best college application is stellar grades and some good awards, the best resume is a great network and lots of success stories, and the best pitch to VCs is a rock-solid business.
  • Thinking really hard about what to say to that cute guy or girl across the room. Most of what happens is determined before you open your mouth by what they’re looking for and whether they’re attracted to you.
  • Worrying about small optimizations when writing code, like avoiding copying small objects. Most of good performance comes from the high-level design of the system.

I think I’ve been guilty of all three of these at one point or another. I don’t want to think about how much time I spent on my Thiel Fellowship application and preparing for my Y Combinator interview. Meanwhile, most people who get into either don’t spend much time at all.

In parallel computing, there’s a concept called Amdahl’s law. If your program takes t steps to run, and you can make s of those steps faster by a factor of f (say, by splitting them across multiple processors), then the new running time is t-s+s/f, for a speedup of t/(t-s+s/f). Therefore, if you optimize those s steps extremely hard and split them across infinite cores, the best speedup you’ll get is t/(t-s).
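A minimal sketch of the formula (the function name is mine, not a standard library’s):

```python
def amdahl_speedup(t: float, s: float, f: float) -> float:
    """Overall speedup from accelerating s of t total steps by a factor of f."""
    return t / (t - s + s / f)


# Accelerating half the work by 2x gives only a ~1.33x overall speedup...
print(amdahl_speedup(100, 50, 2))
# ...and even an enormous f can't push past the t/(t-s) ceiling of 2x.
print(amdahl_speedup(100, 50, 1e9))
```

The ceiling is the whole point: no matter how hard you optimize the part you control, the untouched part bounds the result.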

Applying that to the above, you can see that, even if I worked infinitely hard on my essays, I could only make my application 11% better than not writing them at all. (At least to the extent that the score really does follow that rubric, even if I submit a blank essay.)

Sometimes, that outcome is all you care about, in which case you’re perfectly justified in trying to eke out every advantage you can get. If you’re in a massively competitive field, like sports or finance, where there’s a really big difference between being #1 and being #2 at some narrow thing, then, by all means, go get that last 1%. Wake up early, get that 7th computer monitor, rinse your cottage cheese. But if you’re putting this kind of effort into something because it’s your terminal goal — well, you’re not doing this for anything else, are you?

I think the solution to this fallacy is always to think past the immediate goal. Instead of asking “How can I get this Fellowship,” ask “How can I improve my research career.” When you see the road ahead of you as just a path to your larger mission, something that once seemed like your only hope now becomes one option among many.


Thanks to Nancy Hua, Melody Guan, and Ryan Alweiss for comments on earlier versions of this post.

Tuesday, October 13, 2015

The Prototype Stereotype

It’s a sunny day in Santa Cruz, and Alice is showing Bob her new app:

Alice: This is Frosttly, my new on-demand delivery service for cupcakes. 

Bob: Cool! How does it work?

Alice: It’s based on the Uber and Google Maps APIs. Whenever you press this button, it texts one of our Cupcake Delivery Specialists your current location and summons an Uber so they can bring you cupcakes. 

Bob: Wait; there’s got to be more to it than that. 

Alice: Naturally, this is just a prototype. The final version will integrate directly into the order system for a specialized cupcake kitchen, and will feature sophisticated order tracking and highly optimized routing of deliveries. It’s fully functioning though. Go ahead; try it. 

Bob presses the button, and hears Alice’s phone vibrate in the other room.

Bob: Wow, it works! I can’t wait to get my cupcakes.

If you’ve ever been to a hackathon, you might have seen plenty of conversations like this. Elsewhere, Billy is showing Alyssa his prototype of a new dessert: he poured confectioners sugar over an antique dinnerware set to create something light and fluffy with a rustic feel (the final version will also have chocolate and goji berries). Billy’s concoction is much closer to a final product than Alice’s cupcake service without cupcakes, and may have even taken more work than her handful of lines of code. Yet still I’d be surprised if I saw anyone calling a pile of sugar a prototype dessert. What makes one a prototype but not the other? The resolution is simple: there’s no such thing as a prototype app.

A few years ago, my working definition of a prototype was simple: you build enough of your plan to get a sense of what the final version looks like and show you could build the whole thing. The prototype of my app for playing arbitrary card games was a screen where you could drag-and-drop rectangles. The prototype of my game mod used hand-crafted assembly to make changes. This breaks down when you stop to think: what aspect exactly are you trying to figure out?

Some people walk into Jesse Schell’s Game Design class expecting an easy time, and are shocked to find themselves pulling multiple all-nighters for a class where getting a 100% on everything is only enough for a B. But those that persevere find themselves with new worldviews on everything from sleep to applied probability theory, and learn why there’s no such thing as a prototype app. As Schell states in his book “The Art of Game Design,” a prototype is defined not by the product it steps towards, but by the question it’s intended to answer. And depending on the question, the form of the prototype can be very surprising.

So, what’s a prototype for Tetris? You mean: something to figure out how the blocks mechanic works in practice? Get a friend to cut some shapes out of paper and start sliding them down a grid. It might not make for the best Tetris experience, but it takes about 15 minutes to get going, and it’s enough to start getting a feel for how the shapes work together.

The team for the game “Prince of Persia: Sands of Time” wanted to figure out how the acrobatics in the game would work. Their prototype was just a few animations of different moves, plus a bit of imagination.

Designing the gameplay for a new first-person shooter? Try flashlight tag. Designing the atmosphere? The prototype is also called “concept art.”

And for hardware? Nintendo showed us the difference between design and technological prototypes in 2005 when they unveiled the slick look of the Wii while having modified GameCubes running behind their demo booths.

It gets even more interesting once you leave the realm of games. Jeff Hawkins of Palm prototyped the user experience of the Palm Pilot by carrying around a block of wood in his shirt pocket. He would frequently take it out to “check his schedule” or “look up a contact.” Whenever someone suggested a new feature in a meeting, he would take it out and ask them where it would fit. Meanwhile, I told a friend at an assistive robotics startup that, while their current project does serve the purpose of a technological prototype, he could build a prototype to test the value proposition much faster by simply going to his grandparents’ house and pretending to be a robot.

Prototypes and MVPs

Like everything else in Lean Startup, the idea of a “minimum viable product” has been passed around the Valley in a game of telephone until its meaning is perhaps less than that of the three words stuck together.

A minimum viable product is a very special kind of prototype, one that tests the two key factors behind a startup’s success, what Eric Ries calls the value hypothesis and the growth hypothesis. Typically, this constrains the MVP to more resemble the actual product, but not necessarily. In 2008, Dropbox hit a key milestone when they released a video demonstrating their product. Tens of thousands joined the waiting list. So actually, the minimum viable product was the video itself: it proved that lots of people (growth hypothesis) want what they’re building (value hypothesis).

Similarly, the MVP doesn’t need to work internally at all like the final product. Lean Startup contains a couple examples of this: the ingredient-delivery and meal-planning service “Food on the Table” started with the CEO making deliveries to one woman, and didn’t even try to add automation until forced to. Alice’s app may not be a technical prototype of anything other than the ability to send texts, but if she can do enough behind-the-curtain work to get real users and see how they respond, it’s enough for a perfectly fine MVP.

Unpacking the Confusion

Prototypes for design questions, engineering challenges, usability, market. Demos for users, investors, the press. Why do people seem to want to combine them all into one mythical “prototype”? To show they can build it? Surely they realize that, for a typical web app, the answer to “Can I make this?” is usually “Yes,” if not “Yes, but why?” And why is it weird to prototype an RPG’s combat system with pen and paper, when many of them are just digitized versions of physical predecessors?

I think it’s simply a case of a more general phenomenon. The best way to learn to play a song may first involve drills with nary a bar from the final piece. The charity that makes you feel great and does a lot of good may actually be two charities. When you learn to identify what things you want, it’s often best to get them separately. So let’s lay to rest the idea of a prototype app. Forget about prototyping your product as a whole. Find the underlying questions, and answer them.


Thanks to Jonathan Paulson, Amy Quispe, Nancy Hua, Michael Poon, and Melody Guan for comments on earlier drafts of this post.

Monday, August 17, 2015

Sources of Power

You’re in high school, trying to get into a good college. You know what you must do: do well in classes, score highly on the SAT, and be active in extracurriculars — and do it better than everyone else.

Actually, I have a different suggestion: train with a friend for the USA Biology Olympiad, score highly in the first two rounds of exams, and qualify for the national training camp and then the national team.

Only a handful of people can follow that strategy. But anyone who could play the standard high-school achievement game and have a good shot at getting into MIT or Stanford could instead play the Olympiad game and have a great one. The USABO is disproportionately high-utility compared to how competitive it is. It comes with a free trip to a national training camp where you receive intense training in biology and bond with a few dozen other top high-schoolers. There are vast swaths of America, including lots of high schoolers studying hard for the SAT, who have never heard of USABO. And yet there are communities where training for Olympiads is such a well-known option that it barely counts as a strategy.

My friend who did this and got into all her colleges didn’t do so by playing the standard high school game better than her competitors, but by stumbling into a different game entirely. In doing so, she could do things the others couldn’t. I think this is a fairly common pattern: a lot of what’s involved in making it to the top of anything is not being better at things than other people, but outright being able to do things they can’t. In business, they call it a “competitive advantage.” Peter Thiel calls it a “secret.” For personal life, I like to call it a “source of power.”

The “other people can’t” is the big part. As a source of power percolates into society, it loses its power as an advantage, although whether you should stop doing it depends on whether its value is external or innate. As an example of the former, 200,000 people compete in the American math Olympiad qualifier rounds each year, so training for the math Olympiad is not such a good move for most people. It’s prestigious, but only in proportion to how competitive it is. In economic terms, the free lunch has been eaten. Meanwhile, when Benjamin Franklin was working in London, saving up to open his own print shop, he found it easier than most to be frugal due to his insight that strong beer does not grant physical strength. They nicknamed him the “water American.” Nowadays, his insight is common knowledge, but that doesn’t make it less effective. Instead, it becomes the new bar.

As a warning, I found when writing this that a lot of examples of sources of power I used or wanted to use would strike a lot of people as weird, but it would take a lot of space to justify them. This is inevitable in retrospect: if it’s considered normal, it’s no longer a source of power. I also noticed while writing this that a lot of my examples focus around high school or college. I think that’s largely because life tends to diverge afterwards, and the examples become much more niche.

Discovering Sources of Power

How can you learn about new opportunities before other people suck them dry? How can you find ways of being better before they become background knowledge? While CEOs often spend much of their time looking for a leg up on the competition, I think there are enough sources of power, and few enough people looking for them, that simply trying is enough. In fact, sources of power are sufficiently exploitable that there are many algorithms for finding them with high success probability.

Often, they’re hidden in plain sight, waiting for anyone to read. For a basic example, right now a degree in computer science is a ticket to a decent life. Right now, the meta-skill of “study things that are valuable and will be in demand” is sufficiently uncommon that you can raise your expected earnings (or, dare I say, life outcome) significantly just by following it. Look at the distribution of college majors if you’re not sure. I think the same further applies to specializing in hot-but-difficult subfields like natural language processing or security. Right now, CMU’s Plaid Parliament of Pwning is winning tens of thousands of dollars from application security competitions every year, while only a handful of other American universities have a team at all. More broadly, this idea also applies to entering STEM in general.

This is basically staying ahead of the demand curve. When it comes to personal skills, demand is slow to propagate, and you can gain a lot simply by being faster. Since knowing that you can raise your earnings by becoming a programmer or moving to North Dakota doesn’t cause everyone to instantly become a programmer or move to North Dakota, it will remain exploitable for anyone who wants to for quite some time.

It’s interesting to think about getting even further ahead of the demand curve by making a big bet and training in what will be hot. This involves predicting the future. But, in life as in stocks, the winner is often not someone who knows what will happen, but someone who figured out slightly more than everyone else. And, for predicting the future, not many are trying. Along these lines, I was surprised when I learned that several prominent companies in the mobile space — in particular AdMob, acquired by Google for $750 million, and Flurry, whose software runs in over 100,000 apps — were actually founded in 2005 or 2006, before the iPhone’s announcement ushered in the modern mobile era. I think a lot of people knew the mobile revolution was coming, although perhaps not that it would be so fast. The people bold enough to actually act on that prediction were in a very good spot when it happened. I’m hoping to do something similar for program analysis.

From Who You Know to What You Know

A lot of the above could perhaps be summarized as “find ways to be effective.” The interesting part is how people find sources of power. One way is to invent one yourself, whether by finding a loophole, noticing a trend, or doing science. While there are a few areas where investigation clearly yields disproportionately better capabilities, this can run out of steam pretty fast. It may be possible to gain vitality by eating better, but the reward curve of doing nutrition research probably more resembles climbing the corporate ladder. Most often, the way to find a source of power is to hear of it from other people.

I think it’s a pretty simple effect. People with similar interests like to cluster, but people who also really care about improving will cluster further. They might be able to invent one or two sources of power on their own, but then they share it with the people around them — who also have a secret or two. The effect compounds, with the benefits from sharing ideas dwarfing the loss of exclusivity. Just as IBM found their above-average testers became dozens of times better when grouped together, what you get are communities that collectively have and share the best ways of doing things. So the way to get good at something is simply to find the right community and join it.

So, for example, you’ve probably heard weight-loss advice from everyone from talk show hosts to your neighbor. This suggests that you can do better by talking to bodybuilders, who can control exactly which day they’ll hit their goal.

But this strategy can be easier said than done. The problem is not the joining: these communities are rarely exclusionary. The problem is the finding: every community wants to seem like them, so the genuine ones get drowned out in the noise.

In 9th grade, I attended a local programming contest. I spent a morning running floppy disks to the judges, and left with a full belly and $500 — I had won by a sizable margin. I immediately went home and Googled for more high school programming contests. I found a pen-and-paper competition in which you answer multiple-choice questions about the BASIC programming language.

I often wonder how my life would have been different if I had instead discovered the USA Computing Olympiad.

For another example, it’s well-known that to get stronger you need to push your muscles to their limits — and specifically their strength limits, rather than their endurance limits. There are vast swaths of the Internet where everyone understands what implications this has for training, with plenty shouting it at the top of their lungs. But if you look around for advice on “how to get fit,” you’re perhaps more likely to find advice to do lots of crunches, or warnings that weights might make you look like a steroid junkie. One journalist described crossing this gap as “I somehow bumbled my way into a parallel universe of American fitness, one in which men know exactly how to get strong.”

Passing It On

As we’ve seen, while sources of power with intrinsic value may merely descend from insight to platitude over time, the externally-valued have a shelf life. Perhaps the big warning from this is for those wishing to help others be successful, especially parents. As Paul Graham wrote, parents are like generals always fighting the last war. I remember seeing a teenager on CollegeConfidential complaining that their parents wanted them to stay home all summer and study for the SAT. Perhaps that would be a rational choice in their home countries, where college admissions were and still are based on grueling exams. Yet here the most advanced standardized math test for college admissions is the Math SAT Subject Test, where it’s possible to miss 7/50 questions and still get a perfect score. Meanwhile, my own mother had occasional aspirations of being a “white tiger,” and would often cajole me during my hacking sessions to “stop playing Java” and go study for the SAT.

This realization — that all these secrets and sources of power I’ve spent so much effort finding might backfire when I try to pass them on — is what scares me. I imagine becoming a parent telling my children to train for Olympiads, not knowing that it has become advice about as good as spending a summer studying for the SAT.

I think the defense is to recognize the phenomenon but go a meta-level up. Why are people at the top of one field often very good at another? Is it merely grit and intelligence? Just as there’s a meta-skill of finding sources of power, I think there’s a skill of finding and recognizing the people with the genuine secrets, versus the posers and people out to get your money. I believe there’s a way to recognize genuine competence that transcends fields (related concept). That’s a topic in its own right.

So, find your sources of power, but pass on the meta-skill of finding them. To get your children into college, help them find the new secrets.

And, of course, that’s assuming college admissions are still worth obsessing over.


Thanks to Jonathan Paulson, Amy Quispe, Jessica Su, and Nancy Hua for comments on earlier drafts of this post.