Surely, the most unanticipated result of my 2006 article “Homo aspergerus: Evolution Stumbles Forward” (here) was commenters asserting that, despite my claim, I didn’t really have Asperger’s Syndrome. I am sure those individuals meant well; on the face of it, it is daft and needlessly self-damaging to announce to the world that one suffers from a form of mental illness, so sympathetic observers might naturally be inspired to protest that it couldn’t possibly be true. However, if you think about it, doing something daft and needlessly self-damaging could itself be regarded as a sign of Asperger’s Syndrome, as the perpetrator obviously lacks the social skills to recognize the potentially harmful effects of his statements; and I also happen to believe, not unreasonably, that I know myself better than other people do. Still, to buttress my credentials to address this subject, I did obtain, from a friend of my wife, a copy of the survey once used in California public high schools to diagnose students with Asperger’s Syndrome, and my results indicated a high probability that I do indeed have Asperger’s Syndrome. And that’s as much evidence as I’ll ever be able to present – because, as another aspect of my condition, I absolutely loathe the idea of meeting with a stranger and babbling on about my personal problems solely to enable a psychologist to provide an authoritative diagnosis.

Another complication, of course, is that the psychological community no longer recognizes Asperger’s Syndrome as a distinct disorder; instead, officially, I must now identify myself as having an “autism spectrum disorder,” with a Ph.D. and numerous publications indicating that I am on the high end of that spectrum. In some respects, this was an intelligent decision, since Asperger’s Syndrome was never defined precisely, and I certainly never displayed any of its most extreme symptoms: as a child, I never acted out in public, and if you mention a subject, I will not drone on about it for hours and hours. On the other hand, replacing “Asperger’s Syndrome” with a term including “autism,” with its extremely negative connotations, could be regarded as yet another way to stigmatize individuals who do not exhibit some of the debilitating traits usually associated with that disorder.

Indeed, the reclassification might seem to undermine the argument of my original article, that Asperger’s Syndrome is not a malady but rather a blessing – the distinguishing characteristic of a new, and better, type of human who is inclined to avoid social situations and prefers the sort of isolation that often leads to innovative new ideas and major advances. Any rational society would naturally seek to identify and reward the people who are most likely to improve whatever group or organization they are part of, I argued, so Homo aspergerus should eventually come to dominate the world’s institutions, and ultimately its population as a whole. Yet, in revisiting this topic a decade later, I must address the obvious fact that, though evolution moves slowly, one might have expected to observe in that interval at least some small signs that such revolutionary developments were actually occurring, as awkward, antisocial individuals began slowly and steadily advancing to higher and higher positions due to their natural superiority. Instead, members of my purported new superrace have done little if anything to improve their positions, as comfortably sociable people seemingly remain firmly in control of our major institutions and the less sociable people that they employ.

From the perspective of evolutionary biology, though, this is hardly surprising. Perhaps Homo neanderthalensis was destined to be supplanted by the superior Homo sapiens, but one can be sure that they did not go quickly, and they did not go quietly; rather, they struggled to survive in every way they could for as long as they could. And today’s Homo sapiens, though lacking the beneficial traits of Homo aspergerus, are similarly struggling to maintain their positions in a changing world that is making them obsolete. To that end, it seems that they have developed and refined two powerful weapons, which I have decried at length as part of my website’s series of essays on “Unknown Menaces to Civilization”: the job interview, and the meeting.

To be sure, the basic idea behind the job interview is reasonable enough; yes, before businesses hire people, they should first have a conversation with them. But when I first began seeking jobs, interviews took the form of a single session that was no more than one hour long; and for that period of time, even people without great social skills might manage, by means of careful preparation and great effort, to present themselves reasonably well, and hence, if they also had strong credentials, they might obtain the desired position. But somehow, the idea emerged that a single brief interview was not enough; rather, one needed to extend the interview process into a long series of separate interviews with all of the groups or individuals who might, by some contorted logic, be deemed “stakeholders” in the hiring decision. On some university campuses, the resulting day-long interview has now been supplanted by the two-day interview, during which a woman seeking an administrative position might have to endure over a dozen meetings with administrators, the president, staff members, faculty members, lecturers, undergraduate students, graduate students, alumni, janitors, gardeners … once any group gets added to the list, it will always be included. And the only purported respite comes when the prospective administrator is offered a free lunch or dinner with campus employees – which, of course, constitutes yet another interview in disguise.

No matter how qualified they may be, individuals with Asperger’s Syndrome could not possibly survive such an unending series of social encounters; inevitably, they will express irritation in response to a silly question, offer an inexpediently honest answer, confess to their ignorance about a subject of supposed importance, or, completely exhausted by the process, fall back on curt replies while praying for the ordeal to end. And any lapses of this kind will doom their chances of getting the job. In fact, only one sort of person could possibly pass such a series of interviews with flying colors: in the past, they might have been called “smooth talkers”; today, other, less euphonious expressions might come to mind. In sum, the extended interview process has ceased to function as a way to identify the person best suited for a position; instead, it identifies only the person with the best gift of gab – who may, or may not, also be the most qualified candidate.

And this is completely understandable, if seen from the perspective of current administrators who also acquired their posts primarily because of their gift of gab: quite naturally, they would like to work with people like themselves, and hence they plan an interminable series of interviews that bolsters the chances of similar candidates. Granted, one cannot ascertain the true motives behind these extended interviews, but knowingly or not, their creators have implemented a system to exclude Homo aspergerus from any organization’s higher echelons.

This is not a problem if the smooth talkers they hire are actually competent, but as indicated, that is not necessarily the case. And if unqualified individuals do obtain jobs that they can’t really handle, how can they hold on to their positions? In most cases, they can rely on the work of their less sociable subordinates, who typically know what they are doing; then, when something is done well, these bosses can take full credit for it, and when something is done poorly, they can assign all the blame to the lousy employees that they have been provided with. Yet it is also necessary for these talentless bosses to maintain the illusion that they are actually involved in doing the work that they are responsible for, and the best way to achieve that goal is to employ a second mechanism of dominance: the meeting.

Like job interviews, meetings were originally based on a reasonable premise: yes, it might be good if everybody got together every so often and talked about what they are doing. But as long as people working on the same project are regularly communicating by means of memos or emails – the methods that Homo aspergerus prefers – there is no real need for periodic personal meetings, as is well demonstrated by the productive work of numerous academic researchers who collaborate while living on different continents. But smooth talkers are in their element at meetings, since they can always talk a good game, and they can pretend to be contributing to the work of their subordinates by issuing redundant commands or offering unhelpful advice; meanwhile, their listeners remain silent, failing to impress anyone and laying the groundwork for another mediocre performance evaluation. It is little wonder that the typical daily schedules of many contemporary executives now consist primarily of one meeting after another – unnecessary interruptions to the work of capable underlings that may be arranged primarily to create the impression that their conveners are actually competent, hard-working administrators.

These survival strategies, though, can only work for so long; if an organization is being managed by an empty suit, serious problems will eventually emerge, and subordinates can absorb the blame only so many times. We can now account for one typical characteristic of the modern business: the low-level or mid-level employees who perform most of its essential tasks may hold their jobs for decades, while the executives running the organization may be replaced every few years. The usual explanation is that the people at the top are constantly under tremendous pressure, given their awesome responsibilities, so that a single mistake might lead to their downfall, while underlings escape such relentless scrutiny. My alternate theory is that the socially awkward subordinates are typically competent, and hence keep their jobs, while their smooth-talking superiors are occasionally incompetent, and hence tend to lose theirs. The problem is that whenever inept administrators are forced out, companies invariably set up another twelve-hour interview process that may result in hiring another empty suit.

All of this can explain why Homo aspergerus has not been rising up the corporate ladder – but what about the largest corporations of them all, our local, state, and national governments? As a parallel trend, we notice that political campaigns have been getting longer and longer, and such campaigns might be largely defined as extended “interviews” of prospective officeholders by reporters and voters – or, perhaps, as an endless series of “meetings” with various constituents. Today, presidential campaigns routinely last for two full years, and as of July 2017, scores of people have already launched their 2018 campaigns to become congresspersons, senators, or governors. And, as is the case with job interviews and meetings, it is easy to argue that longer and longer campaigns with more and more events are producing increasingly inferior victors, as only smooth talkers, who may or may not be the most qualified candidates, can endure and thrive during such a grueling process of social interaction. For many, the best evidence for the negative effects of extended campaigns would be the current occupant of the Oval Office, but people of a different political persuasion might point to other recent officeholders as well. It is routinely speculated that our two greatest presidents, George Washington and Abraham Lincoln, suffered from Asperger’s Syndrome, but their successful campaigns required only limited personal contact with journalists and voters; it’s hard to see how they could be elected today, unable to make a good impression on television or to avoid committing a fatal “gaffe,” or why they would even voluntarily submit themselves to the ordeal of a two-year campaign. (In my own case, I permanently abandoned my childhood dreams of becoming president when I realized that it would involve incessant social activity.) In sum, like the business world, the political world has crafted mechanisms that give an advantage to smooth-talking empty suits.

Little can be done, perhaps, to alter political campaigns so as to improve the chances of well-qualified but gauche candidates, but socially challenged job-seekers, frustrated by their inability to obtain the positions they deserve, might – like one commenter on my original article – angrily rail against the “charming idiots” who so often rule the roost and launch campaigns to combat discrimination based on the presence or absence of charm. Further, if a position that involves sitting in a cubicle all day and working on a computer incongruously lists “strong interpersonal skills” as a job requirement, a rejected candidate lacking those skills could file a lawsuit claiming illegal discrimination on the basis of disability; and the terminological switch to “autism spectrum disorder” could prove advantageous to Homo aspergerus, since no one would deny that autism is a disability. But as I indicated before, I do not believe that individuals who are fortunate enough to have Asperger’s Syndrome should bewail their failures in a system that is stacked against them; instead, they should do what they do best – think outside the box – and figure out their own ways to triumph over the world. And in ways that do not always attract attention, I believe that many of my compatriots are now doing exactly that, constructing and thriving within their own niches in the societal machine.

In my own case, after a lifetime of employment in the academic world, some might describe me as a failure, since I never became a full-time faculty member or senior administrator. (On one occasion, I was even told by a superior that I could not be promoted because the person’s colleagues thought I was “strange.”) But it would be silly for me to complain that I was never able to become some university’s Jubilation T. Cornpone Distinguished Professor of Literature or Assistant Vice Chancellor for Student Affairs – since, by any rational measure, I have been far more successful than most of the people who have garnered such titles. My books are shelved in university libraries all over the world, and I get to annoy millions of people with my film reviews. Other oddballs with disparate talents – inventors, bloggers, poets, artists, musicians – are also contriving to have an impact on the world without following any of the traditional pathways to success. While often receiving little attention, they are arguably working, slowly but surely, to undermine and eventually supplant the institutions that reject and suppress them, playing the role of small, pesky mammals in a world of doomed corporate dinosaurs.

The most conspicuous vanguards of this nascent new order are those individuals who start their own businesses to compete with the existing businesses that do not value their input and ideas. After all, you don’t have to do well in an interview to hire yourself, and if you’ve devised a way to do something better – which a socially isolated person is most likely to do – you have an excellent chance of prospering beyond anyone’s expectations. Silicon Valley is filled with highly profitable companies founded by a few geeks who made millions from their technological innovations, while companies like Sears are floundering because their glib executives can’t figure out how to attract customers. And if these upstart companies can themselves avoid becoming stultified bureaucracies, and keep seeking out and promoting capable but unsociable employees, a new business model might emerge that minimizes the advantage smooth talkers now enjoy. Furthermore, extremely successful nerds might even have the money and name recognition to effectively market themselves as political candidates – a theory that might be tested in 2020 if, as rumored, Mark Zuckerberg decides to run for president.

It is also possible that in the future, both businesses and governments will come under the control of superior artificial intelligences – but this in itself might be interpreted as a different sort of triumph for Homo aspergerus, for two reasons. First, artificial intelligences have not yet demonstrated a flair for quirky originality, but they certainly display the Asperger’s trait of awkwardness in social situations (recall HAL 9000’s inept efforts in 2001: A Space Odyssey [1968] to persuade Dave Bowman to stop turning him off, and real-world instances of inapt responses from Siri and Cortana), and some striking examples of computer-generated poetry and artwork indicate that they may someday be able to think outside the box just as well as an isolated human being. Our new mechanical masters, in other words, may exhibit their own signs of Asperger’s Syndrome. Second, if any individuals are going to prosper in a world dominated by machines, it will surely be those with Asperger’s Syndrome, since – as I noted before – they generally prefer to associate with calm, predictable machines instead of bizarre, incomprehensible human beings.

However, since this essay is being posted at a science fiction website, I must now move away from these general observations and speculations to again explore the extent to which Asperger’s Syndrome is related to science fiction. To modify my previous argument, I must report that, today, the honest answer is “not very much” – though that was not always the case.

*****

When science fiction emerged as a genre in the 1920s and 1930s, it naturally appealed to a certain sort of adolescent male: he was good at math and science, but he wasn’t attractive, and he wasn’t athletic; at his high school, he was perceived as a loser, in contrast to the class presidents and star quarterbacks who ruled the social scene, and he seemed destined to forever be a loser, never able to win friends and influence people. One solution to these outsiders’ plight – to effectively become one of the cool people – was offered by the classic Charles Atlas advertisements that I actually recall seeing in childhood comic books, wherein a muscular man kicks sand in the face of a scrawny nerd: pay your money, the ads promised a presumed readership of humiliated scrawny nerds, and Charles Atlas will transform you into a muscular man who can kick sand in other guys’ faces. But there was a reason why these boys weren’t already muscular – they enjoyed their slide rules and science labs, and they didn’t want to waste their time lifting weights or running laps.

The science fiction stories of their day offered a more appealing solution: an envisioned future in which smart scientists would rule the world. This is almost literally the case in Hugo Gernsback’s classic novel Ralph 124C 41+: A Romance of the Year 2660 (1911-1912, 1925), wherein the hero becomes, solely because of his scientific acumen, one of the most influential and respected individuals in the entire world. But even the more adventurous protagonists of space operas were always characterized, first and foremost, by their intelligence. The heroes of E. E. “Doc” Smith’s and John W. Campbell, Jr.’s space epics were brilliant scientists who built their own spaceships and invented their own weapons, and even the younger protagonists of Robert A. Heinlein’s later juveniles were always brainy kids, capable of helping their uncle launch a rocket or of captaining a starship. The message couldn’t be clearer: in the future, you won’t have to be charming or athletic in order to succeed; you will only have to be smart. So, while sitting at home without a date to the prom, young nerds could console themselves with the thought that someday, those class presidents and star quarterbacks dancing the night away would be under their thumbs, in a new, more enlightened society that properly valued their intellectual abilities.

However, while this message resonated with young people who identified themselves as budding geniuses, it wasn’t a good strategy for attracting a larger audience of people who needed heroes of a less intellectual nature to identify with. Hence, science fiction began to change in order to make itself more popular. We observe a transitional stage in the original Star Trek (1966-1969). Now, Captain James T. Kirk was not a dummy, and he could display a modicum of intelligence in episodes like “Arena” (1967), wherein he employs some basic scientific knowledge to devise an effective weapon on a barren planet. But he was for the most part a concession to popular taste, the sort of hero who usually prevailed only because he could beat up the bad guys and seduce the pretty women. Still, in many cases, he was absolutely dependent on his more capable subordinates: scientist Mr. Spock, engineer Montgomery Scott, and physician Dr. Leonard McCoy. At times, when the Enterprise seemed doomed, all Kirk could do was urge Spock, Scott, or McCoy to work faster to come up with a solution; and thus it was these men, in the science fiction tradition, who emerged as the true heroes.

The situation is entirely different in Star Wars (1977) and its numerous sequels. True, scientists are occasionally observed in the Star Wars universe, but they are always figures in the background. The major heroes may have courage and spunk, they may be attuned to the “Force,” they may be wizards at wielding light-sabers or firing guns, but they are never intellectuals; if Luke Skywalker’s light-saber broke, he wouldn’t know how to fix it, and Han Solo may have the practical know-how to keep his decrepit Millennium Falcon operational, but he couldn’t build a better starship. These are heroes for the masses, and hence they are massively popular; but there isn’t much in these stories to appeal to the socially inept geeks who were drawn to science fiction in the 1930s. Even more decisively than Star Trek, then, works in the Star Wars franchise primarily celebrate all of the traditional heroic virtues of strength, conviction, and bravery, and their influence on the written literature has been immeasurable; indeed, if asked for a brief description of the sorts of science fiction that now fill the shelves of Barnes & Noble, “adaptations and imitations of Star Trek and Star Wars” would work well enough.

Bluntly, science fiction has become a literature for muscular guys who kick sand in scrawny kids’ faces.

That is an observation, I hasten to add, not a complaint. After all, since Campbell famously proclaimed science fiction to be the one form of literature that accepted the inevitability of change, it hardly behooves anyone familiar with the genre to complain that science fiction has itself changed. In the 1990s, there were innumerable laments about the ways that science fiction was becoming dominated by repetitive sequels and series, and calls for campaigns to drive such ephemera away and nurture science fiction that was true to the genre’s innovative (and implicitly intellectual) roots; today, we recognize that these efforts to resist change were naïve and futile. The reality is simply stated: up until the 1960s or so, science fiction was very appealing to Homo aspergerus; now, for the most part, it is not, and today’s representatives of the species must seek out other forms of palatable entertainment.

As to what those forms of entertainment might be: there are clues to be garnered by attending a science fiction convention, largely dominated by aging fans who were attracted to science fiction as youths and retain some affection for classic authors and the increasingly beleaguered contemporary authors who are still endeavoring to follow in their predecessors’ footsteps instead of churning out space operas about plucky space pilots battling sinister aliens. Looking around at all the pot bellies, walkers, and wheelchairs at these conventions, one might reasonably ask: but where are the young people? And in my experience, there are two places where they can be found: the anime room, and the gaming room.

I cannot claim any great familiarity with anime, but it is my impression that its young heroes tend to be more brainy than brawny, and that it features stories that involve and require scientific knowledge. For example, in one film I have watched several times, Mamoru Oshii’s Patlabor 1 (1989), an embittered scientist devises an amazingly convoluted scheme to transform Japan’s human-controlled robotic laborers into killing machines, requiring some brilliant youths to deduce what he has done and find an equally ingenious way to prevent a catastrophe. In this situation, a Luke Skywalker or Han Solo would have been useless, but Smith’s Kimball Kinnison could have saved the day, aligning Oshii’s story with the space operas of the 1930s.

As for video games, they do require a certain amount of physical skill in expertly manipulating the controls, but their primary requirement is a great deal of thought: a player must figure out how to get over an obstacle, or how to defeat an enemy, and in competitive games, the best players must constantly devise new strategies to gain an advantage over similarly inventive adversaries. Some players even strive to improve games by hacking into the software and changing their rules. Casual gaming may be commonplace, but the men (and occasional women) who become dedicated to perfecting their play are clearly attracted by the intellectual challenges that these games pose, and success in that milieu represents another validation of the idea that the future will belong to the smart, not the strong.

In sum, the anime room and the gaming room are offering the young attendees at science fiction conventions (probably dragged there by their parents) the sorts of messages that science fiction was offering me, and other marginalized youths, from the 1930s to the 1960s; and while I cannot relive my youth now, I strongly suspect that, if I were a young person today, I would have little interest in the exploits of Luke Skywalker and Han Solo, or Honor Harrington and Miles Vorkosigan for that matter, but instead would be obsessed with watching anime and playing video games, activities that are now much better suited for Homo aspergerus.

*****

Overall, then, my thoughts about the condition that has defined my life have changed in two respects. First, I am less optimistic about the rapid emergence of Homo aspergerus as the world’s dominant race, recognizing that their antecedents will fiercely battle against their ascendance, though I discern signs of their incipient victory outside of traditional venues. Second, I am less inclined to describe science fiction as the species’ characteristic form of expression, accepting that the genre has irrevocably changed in ways that will force Homo aspergerus to seek out other sorts of works to provide them with sustenance and support.

But one important thing has not changed: I continue to believe – perhaps egotistically, perhaps vaingloriously – that my compatriots and I represent the true future of the human race. People today are generally spending far too much time connected to the world by their laptops, tablets, and smartphones, and hence they are being inexorably drawn into a mentality of groupthink; and though they sometimes have two or more different forms of groupthink to choose from (hating Trump, loving Trump), their thoughts are increasingly identical to the thoughts of vast numbers of people, and hence have no real value. Today, we face vast and significant problems that demand new and innovative solutions, and I’m sorry, but you’re not going to find those solutions by visiting MoveOn.org or Breitbart.com. Instead, we need people who don’t spend most of their time connected to the world, and we need to celebrate – and listen to – those people, even if they don’t maintain eye contact or seem strange. No, I myself don’t have any wonderful ideas about how to save the world – having devoted my life to understanding science fiction, not the real world – but if anybody does, I’m sure it will be people like me. But will anybody ever hire them?


Gary Westfahl has published 25 books about science fiction and fantasy, including Science Fiction Quotations: From the Inner Mind to the Outer Limits (2005), The Spacesuit Film: A History, 1918-1969 (2012), and A Sense-of-Wonderful Century: Explorations of Science Fiction and Fantasy Films (2012); excerpts from these and his other books are available at his World of Westfahl website. He has also published hundreds of articles, reviews, and contributions to reference books. His most recent books are the three-volume A Day in a Working Life: 300 Trades and Professions through History (2015) and An Alien Abroad: Science Fiction Columns from Interzone (2016), now available from Wildside Press; his forthcoming books include Arthur C. Clarke and a collection of essays from the Eaton Conferences on Science Fiction and Fantasy Literature.