‘Awful’ new kind of job interview
HOW many vacuum cleaners are made in a year? Can you swim faster in water or in syrup? How would you weigh your head?
Google's "brainteaser" job interview questions, so famous they inspired a book titled Are You Smart Enough To Work At Google?, were ditched in 2013 after the search giant admitted they were a "complete waste of time".
Nevertheless, brainteasers and the broader category of "unstructured" job interviews are becoming increasingly common among employers - despite a complete lack of evidence supporting their use.
Writing in the UK Telegraph, clinical psychologist Linda Blair said that candidates once had a "fairly clear idea about the questions you'd be asked - your education, previous experience, reasons for wanting that particular role".
"Not so now," she said.
"The 'unstructured' job interview is becoming increasingly popular. In this format, there are no standardised, predetermined questions; rather it's more like an open-ended, free-flowing conversation led by the interviewer, who decides what to ask as the interview progresses.
"Employers say they prefer this approach to the structured interview because they believe they get to know candidates better. Although that may be true, there's no evidence that unstructured interviews predict job potential."
She points to research by Yale University professor Jason Dana, who wrote in The New York Times last year that interviewers "typically form strong but unwarranted impressions about interviewees, often revealing more about themselves than the candidates".
Prof Dana said at best candidates hired in this way were no better or worse than those hired using test scores or other standardised methodologies. At worst, unstructured interviews can be "harmful, undercutting the impact of other, more valuable information about interviewees".
In 2013, he and his colleagues devised a series of experiments to figure out why, despite decades of evidence showing they have "little validity", unstructured interviews continue to grow in popularity.
"In one experiment, we had student subjects interview other students and then predict their grade point averages for the following semester," he said.
Another group of students was asked to predict the performance of a student they did not meet, based only on that student's course schedule and past GPA.
"In the end, our subjects' GPA predictions were significantly more accurate for the students they did not meet," he said. "The interviews had been counter-productive."
Even more worrying, some of the interview subjects had been told to give "random" answers. Prof Dana said that "strikingly, not one interviewer reported noticing that he or she was conducting a random interview".
"More striking still, the students who conducted random interviews rated the degree to which they 'got to know' the interviewee slightly higher on average than those who conducted honest interviews," he said.
The final twist to the experiment? The researchers explained what they had done and their findings to another group of students, then asked them to rank the information they would want when making a GPA prediction - an honest interview, a random interview, or no interview.
"They most often ranked no interview last," he said.
"So great is people's confidence in their ability to glean valuable information from a face-to-face conversation that they feel they can do so even if they know they are not being dealt with squarely. But they are wrong."
In the 2013 paper, the researchers warned of the twin perils of "sensemaking" and "dilution" - making sense of "virtually anything the interviewee says", and poor information weakening the impact of quality information.
"Because of both of these powerful cognitive biases, interviewers probably overvalue unstructured interviews," they wrote. "Our simple recommendation for those who make screening decisions is not to use them."