>For all the company knows, I (and countless others) could be the best developers they'd ever meet. And yet we're automatically excluded from even applying.
The game is about probability, not possibility. The assumption is:
P(qualified | degree) > P(qualified | no degree)
That's probably reasonable, since schools do filter out at least some idiots. To find a candidate, one must interview roughly 1/P candidates. If interview costs are high (e.g., face time is precious), you want as many easy filters as you can get.
This is especially true if you don't trust HR to properly find the "diamonds in the rough" (I certainly wouldn't).
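The 1/P figure can be sanity-checked with a quick simulation. The base rates below are invented, purely to illustrate why a filter that raises P(qualified) cuts interview costs:

```python
import random

def interviews_until_hire(p_qualified, trials=100_000, seed=42):
    """Average number of interviews needed to find one qualified candidate,
    when each interviewee is independently qualified with probability p_qualified."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        count = 1
        while rng.random() >= p_qualified:
            count += 1
        total += count
    return total / trials

# Hypothetical base rates, purely illustrative:
print(interviews_until_hire(0.20))  # ~5.0, i.e. 1/0.20
print(interviews_until_hire(0.10))  # ~10.0, i.e. 1/0.10
```

Each hire is a geometric trial, so the expected number of interviews is exactly 1/P; doubling P(qualified) via a cheap filter halves the face time spent per hire.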
OK, I can understand it from that perspective: it's not that so much weight is put on a degree, but rather that it's used for lack of other filters when there are lots of applicants.
Doesn't sound very accurate, but I can see the reason why.
For example, the P(A | B) notation is something you learn in a university probability class. A good feel for elementary probability theory makes you much better at modeling, understanding, and predicting the world, and not just as a programmer. And there are lots of other classes like that in a university curriculum (although the overall percentage of really useful classes is, in my estimate, only 25-50%). Before you say that you could learn that at home: well, normal people who want to learn such things choose to go to school, and corporate HR is usually not interested in deviants. My personal opinion: I also have friends who are very smart and great programmers and who dropped out of school, and I think that particular choice was not very smart on their part. They could have just finished it and would be in a much better position. E.g., they wouldn't be reliant on personal recommendations when applying for jobs; also, how do you ask for a raise without a degree --- where are you going to go instead? Etc.
"normal people who want to learn such things choose to go to school"
That's a load of crap. Normal people who want to learn such things pick up a book and read about it. Only very young people consider "going to school" a viable option. Normal people have jobs, kids, and responsibilities and don't have time to go to school every time they pick up a new interest.
Normal people who want to learn such things pick up a book and read about it.
I would be very surprised to find very many "normal people" who have the drive to learn, say, probability theory or statistics to any reasonable depth by self-study. Of course such people exist, but they are neither typical nor common.
I said normal people who want. Now it's true that normal people don't generally want to learn probability theory, but if they do, they'll turn to books and self teaching long before school unless they're extremely young and still think of school as life.
Once you've been in the real world for a while, the idea of going back to school is generally the last option, not the first. The only thing school has to offer over self-directed learning is very smart people to learn from, but if you look around, you can find those people in the real world as well. Mentors tend to show up just when you're sincerely looking for them.
they'll turn to books and self teaching long before school unless they're extremely young and still think of school as life.
On the contrary, many people take classes and attend conferences, workshops and seminars at many different ages. You could learn cooking by self study, but cooking classes continue to be popular, despite being more expensive.
The only thing school has to offer over self directed learning is very smart people to learn from
Hardly the only thing: a lot of things are easier to learn when you're in a group of intelligent people all interested in approximately the same thing, and all trying to learn the same subject matter (I'm thinking mostly of graduate seminar classes here, not 300-student undergrad lectures).
Well, you said that "normal people" would read a book "long before" going to school, which I think is not true in many domains.
My second point was not just that you can learn from your peers, but that there is a social environment at a school that is important, and not readily replicated in self-study. You don't learn much from your peers at a typical cooking class, but they still contribute to why many people choose to take classes rather than self-study.
The stats on college attendance would seem to be against you since about 70% of the population doesn't do it or doesn't finish a 4 year degree. Those are normal people. The 30% that do attend college and finish with at least a Bachelor's are the minority.
Yes, school is a social environment for learning, and that has value, but it's hardly the only way to obtain that. Outside of school, there are local clubs and hobby groups, forums and blogs such as this one on the Internet, alternate paths such as the military, etc. There are all kinds of ways to be around people who enjoy the same things you do, or are trying to learn the same things you are, without going to school.
By saying that's a benefit of school, you're implying that it's not obtainable elsewhere. School just isn't that important if you really want to learn; it's more valuable for those who lack motivation and need someone to make them learn.
My apologies for the U.S. cultural assumption. See http://www.census.gov/prod/2004pubs/p20-550.pdf for what it's like here. I also presumed we're discussing those who finish college, not those who just enter it. What percentage actually finish college in the UK?
The OECD stats look quite interesting. They break things down into Type A[1] and Type B[2] programs. Superficially, Type A are 3+ year courses, Type B are 2+ year courses. Results for the UK and US are here[3][xls]. Executive summary: for the 3-4 year Type A graduates, in the UK, 97% graduate. The OECD average is 67%. This report also has data for the US.
For actual numbers of people in the UK with degrees, there's this page[4] from the National Statistics. For people of Working Age, 18 to 60-ish, in the UK, 16% have degrees, and another 8.5% have "Higher education qualifications". 15% have no qualifications. It would be handy to have data for the 25-35 age range, since not many 18 year olds have degrees.
Normal is not easily defined. Average is. You are confusing the average person with the average hacker, which is most likely the average type of person in your circle of acquaintances.
I have suggested self-teaching to many non-hacker types and they are completely opposed to it. Even those going into programming.
Not related to the original question (I agree with the above answer), but in my experience HR departments/managers are more likely to read a CV not sent through the usual channels. The easiest way to bypass them is simply sending an email. The most efficient: leave it in person. I don't know why, but from (limited) anecdotal evidence it seems to work.
That assumption will only allow you to pick based on a relative standard, not an absolute one. Also, a degree as an eligibility criterion makes no sense. If you want to hire well, then you should be willing to meet any relevant candidate. So in effect, with a degree being a REQUIREMENT, you are actually shrinking the pool of people who will apply, and hence reducing your probability, not increasing it.
The most desirable companies get so many resumes that they need to filter out most of them. I used to work for a popular search engine before Google came into the picture. As an engineer I had to interview people every week. HR handed me lots of resumes from people with CS degrees from top universities for phone screening. They didn't bother with the rest unless someone was heavily recommended by a trusted source.
Agreed completely. But how do you get more information cheaply?
Remember, all you know is P(qualified | resume says autodidactic) = P(qualified | autodidactic) * P(autodidactic | resume says autodidactic). The second term can easily kill you. Do you trust everything people put on their resume?
On the other hand, P(degree | resume says degree) is probably close to 1, since a degree is so easy for an HR person to check.
Look, I've got nothing against hiring good people who have no degree, and I certainly don't think a degree proves much (my current students prove that conclusively). If you know you have such a good person, hire them, ignore the degree.
I'm just pointing out that the game is to find the good people. And that's a bit tricky to do.
[edit: by the way, in the interest of clarity, what you want to optimize is not candidate quality, but (candidate quality - search costs). Don't ignore the search-costs term.]
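Plugging made-up numbers into the decomposition above shows how the unverifiable second term can dominate (every figure here is hypothetical, chosen only to illustrate the arithmetic):

```python
# All numbers are hypothetical, purely to illustrate the decomposition.
p_qualified_given_autodidact = 0.5   # suppose real autodidacts are often good
p_autodidact_given_claim     = 0.3   # but suppose most resume claims are inflated

# P(qualified | resume says autodidactic)
p_claim = p_qualified_given_autodidact * p_autodidact_given_claim
print(p_claim)  # 0.15 -- the unverifiable second term drags the signal down

# A degree, by contrast, is cheap for HR to verify:
p_qualified_given_degree = 0.25
p_degree_given_claim     = 0.98
print(p_qualified_given_degree * p_degree_given_claim)  # 0.245
```

On these invented numbers, the verifiable-but-weaker signal (a degree) ends up carrying more information than the stronger-but-unverifiable one.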
In the programming industry, it's not really that hard to filter out bad candidates. You shouldn't waste time talking to any programmer who isn't willing to submit to some simple programming test in lieu of a resume. Resumes are crap and have no place in hiring programmers. Code is all that matters: either they can do it or they can't, and if they can, they'll accept your challenge.
Nothing makes for a better interview than doing a code review and critique of the candidate's own code. Something simple but telling, like: write a little slot machine that lets you bet money, spin the wheel, and win or lose, with the game ending when you run out of money.
Anyone who refuses such a test isn't a programmer you want anyway, and you can tell an enormous amount about a potential candidate simply by looking at the quality of the code submitted. Don't bother interviewing anyone whose code you can't stand. You can see everything you need to know about their skills in that code.
You wouldn't hire a graphic designer who refused to submit samples of his designs, nor should you hire a programmer who refuses to submit samples of his code. Merely by having such a test, you'll weed out all the fakers because they won't bother with it, they'll submit their fake resumes elsewhere.
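For the curious, the slot machine exercise described above might come out something like this minimal sketch. The symbols and payout table are invented for illustration; the exercise as stated leaves them open:

```python
import random

SYMBOLS = ["cherry", "lemon", "bell", "seven"]

def spin(rng):
    """Spin three reels, each showing one of the symbols."""
    return [rng.choice(SYMBOLS) for _ in range(3)]

def payout(reels, bet):
    """Invented payout table: three of a kind pays 5x, a pair returns the bet."""
    if reels[0] == reels[1] == reels[2]:
        return bet * 5
    if len(set(reels)) == 2:
        return bet
    return 0

def play(balance, bet, seed=None):
    """Bet, spin, win or lose; the game ends when you can't cover the bet.
    Returns the number of spins survived."""
    rng = random.Random(seed)
    spins = 0
    while balance >= bet:
        balance -= bet
        balance += payout(spin(rng), bet)
        spins += 1
    return spins

print(play(balance=100, bet=5, seed=1))  # number of spins before going broke
```

Even on something this small there's plenty to critique in review: separation of the payout table from the game loop, seeding for reproducibility, and whether the house edge guarantees the game actually ends.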
Many good candidates could be put off by such a test. More importantly, even if they agree to your test and send you some code, you need an engineer's time to evaluate the sample, and if you're going to invest an engineer's time, you might as well do a full phone screen. Recruiters are by and large non-technical people. They would love to be able to ask some technical questions that could help them predict whether a candidate would pass the interviews or not. I've actually been asked by a recruiter friend of mine how to do it. But if you think about it, this is not so easy if you're not a programmer yourself.
You are absolutely right that it would be madness to hire a programmer without having seen him code. Companies are well aware of this, and they will surely ask you to write a lot of code during the interviews. But that comes at a much later stage, as it costs them much more money than checking your resume for a degree.
I'm not saying that I like the way things are, but that's just life. If it's any consolation, in most other industries the requirement of having a degree is much stricter than in the case of software engineers.
I disagree; any candidate worth having would enjoy such a test. Any programmer who is put off by being asked to program needs to find another profession, period. You should not ask someone to program during an interview; you should review the code they've submitted and make them explain it, all of it: their design choices, their idioms, naming conventions, etc. If they can't discuss code they just wrote prior to the interview, you don't want them. But many good people don't perform well under the pressure of an interview, so you won't get an accurate feel for them if they program at the interview.
No one but a programmer is qualified to evaluate another programmer. HR and recruiters have no place here; they'll just waste time and effort pretending to be useful, and they aren't.
By doing the test you've already filtered out the wannabes, so you don't need phone interviews. The submitted samples should go straight to a qualified programmer; it'll take him only a few minutes to toss out any bad submissions, and he'll quickly know who's worth actually interviewing and who's not.
One issue to be aware of is that it's not just the corporation interviewing multiple candidates. It's also the candidate interviewing multiple companies. Putting a large burden on each candidate may not scale if every company does it. I agree that it's important to assess actual coding ability but I don't know if a programming test is the best way to go about it.
Any programmer who considers such a trivial test a large burden, you don't want. There's no other way to assess programming ability than programming; it's really that simple. If you hire an unknown programmer without making him submit a code sample, then you deserve what you get, because you're gambling and likely to lose. The whole point is to find good programmers, and good programmers enjoy programming; they'll happily write a trivial program to prove it. Anyone who balks at such a test is not someone you want to hire, period.
Good code is good code; I don't agree that it takes a vastly different skill set to work on different-sized programs. During the interview you can simply ask questions such as "How would you provide the ability to configure various strategies for different user interfaces for this slot machine? Say a web UI, a command-line UI, and a native app UI?" or "How might you make the randomizer for spinning the slots configurable?" The discussions these open up will tell you a lot about their knowledge of making flexible software.
Again, the test is just to weed out the bad candidates and let you interview only worthy people; everything else you'll find out in the actual interview. Interviews will be rare, since few people will submit code you'll find acceptable at all. The vast majority of people applying for programming jobs can't program. It's sad, but true.
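A configurable randomizer, one of the follow-up questions above, is a natural strategy-pattern discussion. A minimal sketch, with class names invented for illustration:

```python
import random

class FairRandomizer:
    """Production behavior: uniform choice over the reel symbols."""
    def __init__(self, seed=None):
        self._rng = random.Random(seed)
    def pick(self, symbols):
        return self._rng.choice(symbols)

class RiggedRandomizer:
    """Deterministic stand-in, handy for unit-testing the payout logic."""
    def __init__(self, fixed_symbol):
        self._fixed = fixed_symbol
    def pick(self, symbols):
        return self._fixed

def spin(randomizer, symbols=("cherry", "lemon", "bell", "seven")):
    """The slot machine only depends on the randomizer's pick() interface."""
    return [randomizer.pick(symbols) for _ in range(3)]

print(spin(RiggedRandomizer("bell")))  # ['bell', 'bell', 'bell']
print(spin(FairRandomizer(seed=7)))    # three pseudo-random symbols
```

A candidate who reaches for injection like this, rather than hard-coding `random` calls throughout, is showing exactly the flexibility the interview questions are probing for.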
There is a set of skills and body of experience that applies to working on large bodies of code. In fact, there are different skill sets for working on bodies of bad code and good code.
The language syntax may be the same, but programming approaches might be very different. I've seen parts of a Smalltalk program where it was coded like spaghetti copy-paste Fortran, and it's actually hopeless to try and fully understand the semantics and still meet your deadline. But I was able to use a few tricks to prove that certain modifications wouldn't alter other functionality so I could get my work done. In the same program, there was well factored code with a nice object model, where it would behoove you to understand the part you're working on in a more conventional way.
On reflection, I think you may be right that good code is good code. Size of the system matters most when the code is bad.
There are lots of other cheap filters that companies do not use. For example, "Have you read Ayn Rand?" This question may well have better predictive power than a degree. And it's not hard to verify if the person is lying about whether they've read some at all.
Or, "how many blog posts do you have about programming?" Again this is easy to verify.
Who is Ayn Rand? An utterly dreadful writer and philosopher who believed that the axiom of reflexivity of identity had substantive ontological, social, moral and political consequences. It's no accident that Objectivists sought to justify their belief in the "virtue of selfishness" in the law of reflexivity; they have no use for altruism. Rand's philosophy glorifies the sociopath homo economicus, whose sole objective in life is to maximize his expected utility.
However, results in evolutionary game theory show that a society of self-seeking, self-regarding agents will generally face conditions that ultimately lead to its collapse.
Gary Cooper's goofy speech in The Fountainhead ( see http://www.youtube.com/watch?v=Zc7oZ9yWqO4 ) typifies Rand's attitudes. Among other preposterous propositions, Cooper is made to utter the nonsense that great inventions are uniformly the work of sole inventors, selfishly and reflexively seeking their own interests. This is ahistorical; see Against Intellectual Monopoly by Michele Boldrin and David K. Levine ( http://www.dklevine.com/general/intellectual/againstnew.htm ) for the history of inventions such as the steam engine, radio, telephone, and so on. In each case, ideas were in the air, and there were a number of people who came up with similar inventions more or less at the same time.
Cooper argues the basic notions of intellectual monopoly, which are that intellectual property is essentially indistinguishable from tangible property, and that all copies of ideas "belong" to their creator. These arguments come straight from the RIAA legal playbook. I'm surprised that any culture of hackers would want to subscribe to notions more commonly associated with corporate monopolists.
I suspect the point is that having heard of and bothered to read one of her books, whether you liked or agreed with it or not, probably implies various things about you.
A) You socialize(d) with people who read things that aren't sold in the grocery store.
B) You not only know how to read, but most likely voluntarily read a 700+ page book in your spare time in order to learn/see what it was about/etc...
C) If you can speak about what was in the book, and what you thought about it, you can follow the plot of a 700+ page book, you can understand the points the author was making, perhaps you can intuit the not-very-subtle philosophical and societal messages she was delivering, and you can discuss how you agree or disagree with those messages.
It's no IQ test, but frankly it's probably a much better question than "Do you have a degree?".
At least I'd rather work with people who have read a book like that and have an opinion on its content and the author's points (even if they hated the book/points/etc.) than the average CS degree graduate.
I'd rather work with a smart, well read person who likes to think than someone who hasn't read much.
Sure asking about Ayn Rand isn't really an intelligence test, but it's not a bad start. Smart people can learn about data structures and algorithms. Slow people who've managed to get a CS degree from some random college may have learned enough to pass, but there's no demonstration of smarts there.
I'm hiring based on people being smart, self-motivated, willing to learn, willing to think, and people I can hold a conversation with. Teaching someone like that how data structures work is a lot easier than teaching a degree holder how to be someone I want to work with and someone I can trust to be on the ball as new technologies come out.
Ayn Rand seems to be (significantly) more popular here than in the general population, judging from a dozen or so times I've seen her come up in comment threads. Isn't that more relevant than your personal, negative opinion of her philosophy?
So, out of curiosity, do these "results in evolutionary game theory" have a source?
"Isn't that more relevant than your personal, negative opinion of her philosophy?"
No, because it's not only a personal opinion, but a statement that Rand's philosophy of rational self-interest is logically invalid ('rational' does not imply 'self-interest'; basing it on the reflexivity of identity could fairly be called desperate) and scientifically incorrect. Rand's philosophy is incompatible with findings of reciprocal altruism in evolutionary biology and experimental game theory.
"So, out of curiosity, do these "results in evolutionary game theory" have a source?"
As you must be aware, many smart people disagree with you about Rand, and can back it up with something better than a youtube video. And anyway, you said her philosophy contains certain flaws. That doesn't imply it's not valuable and useful overall, so even if I concede your points, it's not very important. The reasons I like Rand have nothing to do with "axiom of reflexivity of identity" or that other stuff you said.
edit: no online sources? I don't normally pay $35+ because a hostile, anonymous internet commenter said something would refute someone I respect but didn't want to explain the ideas himself.
As for $35, it pains me to mention libraries...it was a source, with a link so that you could see something about the book.
I can't say I know of a single public intellectual or professional philosopher who takes Rand seriously. I do know of a well-regarded mathematical logician who does, but this is an aberration.
The degree question is also silly, but has non-zero predictive capability. So do these. Certainly they rule out plenty of good programmers, but the issue is: do they leave plenty of good programmers, and a higher quality pool of remaining applicants?
If good programmers read Rand or write blogs at a higher rate than bad programmers, then it works, even if it's frequently wrong about individual people.
The difference is, if companies started using the filters you suggest, then the candidates would soon catch up, and everyone would have a programming blog and have read the CliffsNotes for "Atlas Shrugged." So it's not, as they say, an evolutionarily stable strategy. That's why no one does it on a larger scale, I guess.
For a demonstration of this principle, consider people with a degree in Computer Science. Has the average one gotten better in the last 30 years, stayed about the same, or gotten worse?
My feeling is it has gotten much worse as people have sought degrees solely for the purpose of using them to get jobs in the field, precisely as you suggest would happen with blogs and CliffsNotes.
However, may I point out that blogging is very different from reading the cliff's notes of a book? A hiring manager can read your blog. If they simply check that you have a blog, well, whoop-de-doo. I have a blog, so that clearly proves nothing.
However, if the hiring manager reads your blog, they can deduce a great deal about what you pretend to think and how you communicate it. So it is a small example of your work, much as posting source code is a small example of your work.
They are generic and poor filters. For example, "do you read person X?" If person X isn't a household name (e.g., Bill Gates), then it says nothing about them other than that no one ever introduced them to that person. I'm willing to bet there are a significant number of good C++ programmers who don't care or know who Bjarne Stroustrup is. I had to look up the spelling of his name (although I don't consider myself a good C++ programmer).
The ability to write blogs is irrelevant here too, as what we most likely care about is the ability to program. Hence, the programming test.
Take the group of programmers who have written more than 30 blog posts about programming. X% of them are good hires, and 100-X% are bad hires.
Now take the group of programmers who have written less than 30 blog posts about programming. Y% of them are good hires, and 100-Y% are bad hires.
Is X > Y, or X == Y, or X < Y?
That is the issue.
And that is the way degrees are used (when used rationally). The claim with degrees is that X is greater than Y, not that most good programmers have degrees or anything like that. That may be so. Then someone defended using degrees by saying there is a lack of alternative tests that are sufficiently cheap. That's not true. There are lots of cheap tests, and I have suggested two for which I believe X > Y is likely, and which, if studied, might turn out to have a higher X than the degree test.
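Given historical hiring data, checking whether X > Y for a proposed filter is only a few lines. The field names and toy records below are made up purely to show the shape of the comparison:

```python
def filter_quality(candidates, passes_filter):
    """Compare P(good hire | passes filter) vs P(good hire | fails filter).

    `candidates` is a list of dicts with a boolean 'good_hire' key;
    `passes_filter` is a predicate over one candidate record.
    """
    passed = [c for c in candidates if passes_filter(c)]
    failed = [c for c in candidates if not passes_filter(c)]
    x = sum(c["good_hire"] for c in passed) / len(passed)
    y = sum(c["good_hire"] for c in failed) / len(failed)
    return x, y

# Toy data in which blog posts happen to correlate with good hires:
candidates = [
    {"blog_posts": 40, "good_hire": True},
    {"blog_posts": 35, "good_hire": False},
    {"blog_posts": 2,  "good_hire": True},
    {"blog_posts": 0,  "good_hire": False},
    {"blog_posts": 1,  "good_hire": False},
    {"blog_posts": 50, "good_hire": True},
]
x, y = filter_quality(candidates, lambda c: c["blog_posts"] > 30)
print(x, y)  # x > y on this toy data, so the filter helps here
```

The same harness works for any cheap filter (degree, blog, Rand) once you define the predicate; the hard part, as the thread notes, is getting trustworthy outcome data.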
to a degree I did. Maybe not to the degree you desired.
My point was that even though it filters out many of the worst programmers, it would also filter out all of the really great programmers. I'd rather have my search take longer and cost more to get those great programmers on my team than end up with merely good programmers.
Provided your filters (1) increase the probability that the remaining applicants are qualified and (2) do not reduce the applicant pool too much, then it is probably a good idea (even if unconventional).
Note however that criterion (2) is important. If you use the filter "is named Linus Torvalds", you would certainly increase the probability that any given applicant is good. Then again, your applicant pool will drop to either 0 or 1.
You're certainly right about abstraction being useful. My first thought was that the simple-English equivalent of "autodidactic" here would be "self-taught", being wary of unnecessary abstractions, though the subtler meaning is closer to "someone who will teach themselves".
Simply saying "self-taught" wouldn't have had the right connotation: instead of "clever and driven", it would have been, "but what have they missed?" and we would have missed out on this interesting off-topic sub-thread!
I thought there was a difference where self taught would be "I did teach myself this thing" and autodidactic would be "I have the personality trait of continually teaching myself new things." Only one of the dictionaries I checked made this fine grained (and relevant) distinction explicit though.
That actually happens a fair amount with non-native speakers, since our uncommon words are sometimes very similar to their common words. For example, an Italian recently called me ascetic, and I had to look it up.
I love how English is quite happy to absorb new words, even if the English speakers only know a subset!
Back in school, we were warned about "faux amis", words which sound like they'd be right to an English speaker, but very much out of place to a native French speaker. I think in English we'd just accept any and all alternative meanings and let context sort it out, who's up for some creative reading?
Using that word may not be pretentious, but -- worse -- it is bad form. An intellectual might object to the word's use in this context simply because it is bad writing.
The goal of writing is to communicate. Unnecessarily using obscure words clouds communication. In this case, I don't think using "autodidact" was necessary. It did not add to brevity or cadence or coherence. It was a mistake.
The only thing worse than an anti-intellectual is a bad intellectual.
Ironically (or pseudo-ironically?), the very difference in connotation between self-taught and autodidactic is pretentiousness. Perhaps not quite pretentiousness, but more of a difference in social class -- a self-described autodidact is indicating that even though they are self-taught, they are not "blue collar" self-taught.
And it's not that self-taught always means lower class, it's just that autodidact always means higher class. There must be a better word in English for the type of distinction in words I am describing here, maybe someone can enlighten me...
Um, fuck you. Really, I mean it. My mom, a single mother, worked full time doing hard physical labor in a bakery. Her back is still sore from all the hard work she did to raise me and my brother. We lived in cheap housing, paycheck to paycheck for years, and went to pretty crappy public schools. I don't see how you get anything about economic class from the word I used.
We were poor growing up. I liked to read. I learned big words when I read, and I use them when I communicate. End of story.
I feel like I'm in junior high getting bullied for being a nerd. I'm so sorry for using a big word. In the future I'll try and limit myself to a fifth grade reading level (maybe reaching rhetorical, which is a fancy word for a skillful kind of speech, heights of seventh or eighth grade level) so that the idiotic (for you mouth-breathers, idiotic is a big cloudy multisyllabic way of saying dumb that fails to add brevity, or cadence or coherence) Techcrunch half of this forum can follow along without having to use a dictionary to find a word they don't know.
"Autodidact" is obscure? WTF? Where is this coming from? Did everyone come here right after watching the American Gladiator contest? Did we get an influx of Time Magazine readers?
For me, the goal of writing here is to practice at playing with ideas. If, in playing with ideas I use words that are (maybe, a little) obscure, and I fail to communicate to people who seem to me like idiots, well, I'm fine with that. Great, actually. If only the failure to communicate ran both ways.
I'm done being autodidactic. I'm going to put down my books and go watch some soap operas and sitcoms until I figure out how the idiots here think.
It's not at all pretentious to use a well-defined word whose meaning is precisely appropriate. To anyone who needs convincing of this I recommend a large stack of Christopher Hitchens.
What's pretentious is using big words to impress others. Most people who do this actually end up not using such words correctly, since nuances of meaning are hardly their primary concern.
Edit: what I really object to is your implied criterion that anything beyond what "most people" would say must be pretentious. Lord help us if we're supposed to go by "most people".
I think Richard Feynman would disagree with you, and his plain spoken manner is a symbol of how to be smart without being pretentious.
Using uncommon words appropriately doesn't make them any less pretentious unless you really don't know they're uncommon, and Christopher Hitchens, whom I very much like, is hardly someone associated with being unpretentious. Hitchens certainly doesn't suffer fools well.
So anything beyond what's common is pretentious? As I read the definition of the word, that's just wrong. On the other hand, looking things up in a dictionary to find out what they mean is so far from common that, by your definition, I'm being pretentious just for doing so.
Edit: your invocation of the name of Feynman strikes me as gratuitous (oops, I'm being pretentious there) and a textbook example of appeal-to-authority (or rather, it would be one, if you actually were quoting him or had the right to speak for him).
Edit 2: what on earth does "suffering fools well" have to do with being pretentious or not?
I wish there was a voice recognition program for smartphones that kept a running log of uncommon words within earshot. You could then click on the log to go to a definition. (Which would be pre-fetched and cached.)
Well, it's not really worth arguing about, but my appeal to authority was in response to an appeal to authority (Hitchens). If you don't agree the word is pretentious, fine, but I, and many others I presume (since I didn't start this thread; someone else called it pretentious first), do find that people who use uncommon words are being pretentious and trying to sound impressive with their vocabulary. Given a choice to say something plainly or with fancy words, the unpretentious choice is plain talk.
As for the reference to suffering fools, I was merely pointing out why I think Hitchens can be pretentious, but it wasn't really relevant, I agree.
Maybe ostentatious is a more accurate word, but it is a synonym of pretentious. In any case, I really don't care that much, so I won't bother arguing further, if you disagree with me, then let's just agree to disagree and move on to something more productive than arguing about words.
Ok, but I can't resist adding one more! First, it seems we do agree on what's "pretentious": not using uncommon words as such, but doing so to sound impressive with one's vocabulary. Also, I brought up Hitchens not to invoke any kind of authority but rather as rich source material for the inventive and pitch-perfect and endlessly entertaining (and, yes, unpretentious) usage of all kinds of words. It seems we agree on most of that :)
Lastly, I'd like to add something to this conversation that isn't merely being critical. The thing about less common words is that, often, they are not exactly synonymous with more common alternatives. There are often nuances that, consciously or otherwise, add to the meaning of what's being said. While thinking of this I remembered a brief post from Language Log a while back that I thought was brilliant. It takes several examples of forms that have been claimed to be interchangeable (and thus superfluous), and susses out real distinctions between them:
There must be a few people here who find language as interesting as I do, and would enjoy parsing out the examples they give. I'd post it as an item to HN but can't think of a title that could possibly convey the point. Oh well, maybe I'll just post it anyway...
It's pretentious to say that it's pretentious; in fact, any use of the word pretentious is extremely pretentious, including this one. The use/mention distinction also breaks down for the word 'pretentious': it's not even possible to mention the word without being fatuously pretentious. This is especially true if the criterion for pretentiousness is whether "most people" would know what it is.
True, but it's getting "more information" that is costly. If someone else can do it for you, so much the better. Personal recommendations are the best source (if you know and trust the recommender).