Zeroing in on Zero Dark Thirty

One of the major controversies surrounding Kathryn Bigelow’s multi-Oscar-nominated film Zero Dark Thirty is that it appears to condone torture. The defence is that the film-makers were simply depicting the torture without (necessarily) actually condoning it. Its Wikipedia entry gives a detailed summary of the various positions, so I am just going with a personal reaction. Just as Quentin Tarantino defended his ‘N-word’-effusive film Django Unchained as simply depicting rather than celebrating the offensiveness of that word, Bigelow defends the torture as simply depicting what actually happened without necessarily rationalizing it. However, once you put aside the whole media controversy and watch the film, it becomes clear that Bigelow makes her protagonist Maya simply the good guy in a bad (but necessary) place. So, when the already overly long movie finally comes to an end, Maya gets to have a nice good cry for a nice long time, and then everything becomes just fine and dandy, as my late grandmother would archly say. And the good side wins. What a relief!

But is ‘relief’ the right word?

 

First there was the very disturbing Abu Ghraib-style torture scene at the beginning of the film, very Reservoir Dogs, Tarantino-style in its grit, grime and blood, set against the backdrop of an industrial warehouse. Incidentally, note here that one of the biggest criticisms of torture as a means of extracting information is that if you apply enough pain, or subject the person to enough trauma (sleep deprivation, loud music blasted for 96 hours, hands strung half-way up for maximum fatigue), the person is liable to say anything simply to make the pain stop. That is what happens to this victim as he is first tortured, and then shoved into a pint-sized ‘box’ for extended periods of time—he simply mutters Monday, Tuesday… hoping that any named day of the week will somehow save him. What if he doesn’t know the required information? What if he is innocent? Even giving the torturer the benefit of the doubt—granting that the captured suspect does know something—is that an acceptable way to get information out of a human being?

History is filled with stories of this kind of torture—the Spanish Inquisition, or (closer to home in place and time) the witch trials of Salem, Massachusetts, for example. More than anything else, torture seems to satisfy a sadistic need of the torturer to inflict pain (or perhaps a punitive desire to exact revenge), under the guise of extracting information. The film’s opening sequence, depicting September 11, 2001—shot entirely in darkness with only voices and sounds—represents the horror of what the bad guys did. And yes, it was heart-breakingly, powerfully vivid in the absolute darkness of the attack within the collapsing towers. The rest of the movie (as Bigelow has made it) is the story of what happens when you mess with Uncle Sam. Let that be a lesson to you proto-terrorists! Ka-boom! Yee-haw! Yippee-ki-yay! And all them American war cries of victory to cosmeticize the upcoming massacre.

I had been following the Abu Ghraib and Guantanamo Bay tortures in the news since that sickening day that we now call 9/11, so I was interested to see the perspective in this film. The stance is very simple, as Maya tells the victim: “YOU are doing this to yourself; if YOU tell us what we want, then YOU can stop this torture.” This is obviously blaming the victim and completely absolving the torturer. It is so simplistic, and so chillingly rationalized (think of the similar ‘simply following orders’ defence that featured prominently in the Nazi war-criminal trials at Nuremberg, the mutilation and genocide in Rwanda, the ‘ethnic cleansing’ of the Armenians, and every other brutally evil abuse of power). And in every one of these cases, the torturer has some sort of self-justifying moral high ground. In this movie, the justification is that the CIA tortures its victims to get information that will prevent future terrorist attacks. Oh? Really?

 

The most troubling assumption, of course, is that everyone caught and tortured is actually guilty, so it is only a matter of heaping on more torture and trauma until the victim babbles out something that the intelligence officer can file a report about to HQ. Brilliantly played by Jessica Chastain, Maya has her best moment in the passionate reprimand she belts out to her careerist boss in Pakistan (her neck veins bulging in self-righteousness), a reprimand that could apply equally well to Maya herself, except that in her case she has only one item in her career portfolio—UBL, Usama bin Ladin. So, come hell or high water, she is going to get her UBL! At least that was the sentiment Seth MacFarlane picked up on in his Oscar-hosting comment about the movie being a “celebration of every woman’s innate ability to never ever let anything go”[1], which earned him a media and public roasting.

Then there was the blatant assassination of all the people who lived in UBL’s supposed ‘house’—husbands and wives ruthlessly gunned down in their homes in front of their children. That chilled my heart more than anything else, for it made me wonder how many such brutal murders are committed by commando squads on their raids of suspect houses. How many silent victims are killed without even the slightest consideration of whether they are indeed guilty? If you lived in a dangerous land with your family, and one night some masked strangers came firing at your door, would you open the door and invite them in for a cup of tea, or would you pull out your gun and fire away at the doorway? If you do that in this film, you get killed in front of your kids. Even the coyly whispered “Usama,” designed to elicit a laugh at that tense moment, points to the mocking stance of the whole attack—and the attitude of the murderous soldiers, equipped with their mind-bogglingly superior firepower, as they riddled with bullets the bodies of the men and women sleepily rising from their beds. This is euphemistically called “collateral damage,” which means the massacre of innocent civilians in the process of acquiring military targets.

Then there is the laborious in-between part, which shows Maya’s quest for UBL amidst paperwork and politicking. Watching thousands of DVDs of ‘interrogation’ has numbed her to the sheer atrocity of all those tortures, as she simply collects questionable data points and hazards guesses about the reliability of the supposed links. In some way, it reminded me of the half-baked theorizing and data mining that we did in our team projects during our MBA studies. We fashioned ourselves as tough business people, staying up late at night, snacking on energy bars and bravely muscling our way into those deeply deep business strategies. What utter codswallop we manufactured in so many of those meetings, PowerPoints and Excel sheets! The only difference was that we picked our data points and figures from Bloomberg terminals, HBR case studies, Google and the like, to impress our professors into giving us good grades. In contrast, the Mayas get their suspect data points from savagely tortured suspects. But hey, it is all in a day’s work, eh? A dead giveaway of the director’s self-seriousness is the title Bigelow gave to the movie. Originally it was called For God and Country (a definitely more appropriate title, with all the double-edged insinuations of the expression), but she needed some tough military-doublespeak jargon, so she took the term for thirty minutes after midnight, when the attack supposedly occurred: Zero Dark Thirty. Zero Dark Twenty-Eight-and-Twenty-Three-Seconds would simply not do.

 

Maya is presented as a determinedly dogged hero even if she looks like a rather delicate woman. She doesn’t flinch in the face of the tortures, both ‘live’ and on DVD; she pushes through the bureaucracy; and in good old-fashioned cowboy style, she is gonna git that summabeatch who killed mah freends, and she is gonna gun down that dirty dog what did it. By the way, the bureaucratic red tape here is that annoying problem of not being able to torture the prisoners more frequently, and faster, to get the “information,” so that it takes 121 days to hunt down UBL. If only they could have freely tortured every Muslim and/or Arab in the world, they might have gotten the information sooner, and UBL dead sooner. At least that is the underlying assumption of the second half of the movie.

The director Kathryn Bigelow (who is older, but interestingly similar in appearance to Jessica Chastain’s Maya) made a name for herself as a tough woman in a man’s world of machismo and brawn by making gritty, tough-guy movies (Point Break, K-19, and the Oscar-winning The Hurt Locker). What better way to out-tough-guy the tough guys than by being an even meaner “m*th*rf*ck*r,” as Maya defines herself? So, with the military firepower of the US army, she creates the ultimate tough-guy movie, in which the willowy heroine is the toughest m*th*rf*ck*r of them all, the one who brings in UBL—something even the greatest nations with the toughest tough guys could not do. And as ‘toughness’ is measured in the meanest, most savage and bestial treatment of the so-called enemy, she is going to have the balls to do it. Is this the new American hero? (At least Tarantino’s heroes were definitely anti-heroes.) Ask that of the thousands of children orphaned or dead, the women machine-gunned in the back, and the tortured men who had the misfortune not to be born American.

But they are collateral damage. In other words, Dead.

 

 

Postscript: The subsequent J. J. Abrams film Star Trek Into Darkness (prominently dedicated to post-9/11 veterans) again underscores the obligatory homilies about the self-destructiveness of revenge, the cautionary note about becoming evil in the process of hunting down evil people, and all that colloquial yada yada. Interestingly, the film features not just one but two “beating up the bad guy” scenes: one where the galactic hero of good, Captain Kirk, and another where the self-sacrificing superhero Mr. Spock, systematically proceeds to give the baddie Khan what could only be described as an ‘epic beating.’ In both cases, Khan is held at gunpoint by the good guys and beaten as Kirk and Spock give vent to their anger and desire for revenge on their unresisting prisoner (until the soft-hearted ‘female, nurturing character’ Uhura cries out to them to stop). And this is supposed to be OK. Much interstellar hay is made in the movie of violating the ‘Prime Directive,’ with nary a whiff of Geneva Convention principles echoing onward into the 23rd century—a century which provides drunken Kirk with his bar brawls, complete with the primitive and yet more equitable conventions that govern a brawl: both guys have an equal opportunity to punch the other. Though on that even playing field, it is probable that Kirk would have been creamed.


Certified Copy: A Reaction

Abbas Kiarostami’s film Certified Copy (Copie Conforme in its original title, 2010) presents a fascinating encounter, as if it were the snapshot of a marriage, played out by two strangers who meet for the first time and end up spending the day together: a span of many hours of a morning and an afternoon, captured in the hour and a half of the film. Without dwelling on details of plot, the issue I want to discuss is the distinction between what is actually said and the sub-text that inevitably arises as the conversation proceeds. On one level it is a conversation between an author (the man), who feels obliged to defend his book of art criticism to a critical reader (the woman), who came to hear him speak on his book tour and afterwards invites him for a coffee.

But very quickly we realize that the discussion is not purely intellectual, as examples get drawn from the woman’s family—her sister’s stammering but adoring husband, her rebellious teenage son, her absent husband whose work seems more important than family, and so on. The author seems oblivious to the offence he is causing the woman as he blandly advances his counter-arguments (in typical intellectual fashion) to her obviously personal issues. She valiantly tries to match him on that same abstract intellectual level, but her emotions intrude, and from the first of many acidic comments we see the contours of a more familiar kind of argument—that between a married couple.
The author (the man) wants to make one last point before changing the subject, to which the reader (the woman) responds that he simply “wants to have the last word.” She takes him to see a painting that she thinks will be of interest to him, but when they get there he doesn’t seem interested. She complains to him about issues she is having with her teenage son, and he responds by defending the son, much to her irritation and dismay. They discuss marriage as an institution but again the discussion quickly becomes a husband-wife argument as she projects her experience of dealing with an absentee husband, and he projects his experience of dealing with the unreasonable demands of a wife. It is a familiar argument, and many of us may have experienced it in different shades in our relationships.
Interestingly, while the woman initially comes across as stubborn and petulant while the man is balanced and urbane, both go through transformations as the film progresses. The man becomes sulky and irate, the woman sympathetic and kind. The ending is ambivalent, but a realization seems to come to the woman at least: that the two of them have different (and incompatible) conceptions of the world. The whole conversation (and most of the film), peppered with arguments, feints and thrusts, reconciliations and breaks, is revealed to be a futile attempt by each to convince the other of the rightness of his or her worldview. Even as the film ends, this conundrum remains explicitly unresolved—as it does between married couples in the real world. And the familiar arguments continue (Who is right? Me, of course!), raising questions of the futility of marital arguments, or the acceptance of them, depending on your disposition.

A Minor Surgery

Just the other day I finally got to see the surgeon for a minor surgery that I needed done. The procedure here in Canada (where national health insurance covers all citizens and permanent residents) is that one first sees a General Practitioner, or GP (generally a family physician), who will then refer the patient to a specialist if required. In most cases, even where the need for a specialist is obvious (major fractures, for example, almost invariably require surgery), the specialist cannot treat a patient without a letter of referral from a GP. Thus arise the inevitable delays in getting prompt treatment. A GP visit is first required (and that takes time to book), a visit whose function is absolutely not to treat the patient (beyond a suggestion for painkillers or something equally ineffectual from the sympathetic GP), but simply to acquire that letter of referral. Then a specialist visit is required, which generally takes even more time, as specialists’ calendars are booked months in advance. The purpose of this first specialist visit is for the specialist to verify that the GP’s diagnosis is indeed accurate and that this particular specialist’s treatment is required (crafty rascals, those GP chaps!). Still no treatment is provided, beyond perhaps the psychosomatic relief the patient may feel in finally being able to articulate the complaint to someone who knows and cares—so far it has been like talking to the hand, so to speak. Then comes the second visit to go over the special procedures (to continue with our surgical example, the surgical operation itself). Finally a date for the surgery is set in the surgeon’s calendar which, as noted above, is heavily booked, so the said appointment occurs weeks or months later. In sum, a fractured bone could take anywhere from a few weeks to several months before any actual therapy even begins.

One also has to bear in mind the time taken off work for the various doctor’s appointments (one is lucky to be seen within the hour after the actual appointment time, generally thanks to heavy overbooking by the receptionists to ensure that—even factoring in cancellations—every minute, nay, every second, of the doctor’s time is fully booked into the billable hours of the day). Then there is the possible need for assistance: a fractured bone that impedes general mobility—say a broken leg—inevitably requires one to secure the help of a relative or friend for the doctor’s visits. Even if the fractured bone—say a broken finger—does not condemn its suffering owner to such abject dependence, the business of travelling back and forth between GP, specialist and other hospital visits is at the very least a cruel inconvenience. I cannot imagine anyone with a broken bone (of whatever functionality) joyously looking forward to the difficulty of movement—the travelling about, and the endless waiting among other ailing souls, tattered magazines, and the sharp disinfectant with all its unpleasant associations—which would tax even a perfectly healthy and fit person, and far more a person in considerable pain! And these costs must be borne, not to mention the fact that the poor bone is still as broken as ever, and possibly creaking even more ominously with each tweak of wear and tear, through the endless days until the great day: the day of surgery and repair. And then the whole healing process begins, with possibly further follow-up visits, time off work, and abstinence from even an occasional consoling scotch; but at least one is generally thankful that some form of therapy has been effected, and one fishes out one’s stiff upper lip to wear for occasions such as these, and hopes for the best.

This brings me back to the minor surgery I needed. Apparently a cyst had formed above my wrist, and had recently become especially inflamed, causing pain and declining mobility in the fingers of my left hand. Now, this is no major medical condition to which discoverers’ names are assigned and immortalized; it is so common that even as late as the seventies it was called the ‘Bible cyst,’ so named from the recommended home-grown treatment: one generally hammered the cyst back down into the wrist, and the Family Bible (a staple in every God-fearing household) was generally a handy hammer, and perhaps more effectual than another comparable volume for its divine interventions as well. However, recent medical wisdom exposed what must have been obvious from the start: that such a violent mashing of a liquid-filled cyst would forcibly disperse the liquid through the tissues and tendons, causing recurrence of the cyst, and possibly infection or some such undesirable consequence. Therefore, it is now recommended that a quick incision be made, the offending cyst permanently and safely excised, and the incision closed—in other words, a minor surgery.

At the hospital, I was misdirected three times (to three different wings), and finally ended up where I had started, at the counter which I had initially taken to be the correct place: the Day Surgery Unit. Paperwork done, duly ticketed, taped and stamped, I was put in a consulting room adorned with diplomas and degrees extolling the professional excellence of a certain Karen Armstrong, M.D., Ph.D., Fellow (doubtless a jolly good one) of the Royal College of Surgeons, and holder of a few other impressive-sounding memberships in similarly august organizations. A pleasant young man in his twenties, possibly stepping out of a GAP commercial, walked in and introduced himself as Dr. Fernando Spencer, in an endearing Spanish accent that recalled, perhaps, an audio-book reader of Harlequin romances. After a couple of preliminary questions, and after scanning my requisite GP letter of referral, X-rays and ultrasound scan (done on different days, in different parts of the city), he confessed to me that some mistake had occurred, and I really needed to see someone else. At least he had the courtesy to sound apologetic: in an I-am-sorry-for-your-lousy-luck-but-hey-it’s-not-my-fault sort of way.

My previous visits to the GP at a drop-in clinic, to the radiologist for X-rays, and to the ultrasound clinic for the scans had been characterized by an immense sense of haste in terms of “face time.” These people had absolutely super-busy schedules from which they were squeezing in a few minutes to do you the super-big favour of a consultation. They generally zipped into the consulting room, scribbled comments against a few rapid-fire questions, interrupted you if you exceeded the word limit of ten words per comment, and had you packing out of the consulting room within minutes. Everything about the demeanour conveyed impatience, other (more important) preoccupations, and an overall sense of deep urgency, as if the world were coming to an end in the next few hours and you had the temerity to prey on the doctor’s precious time with unnecessary questions about your puny affliction. As the specialist was somehow perceived to be higher in the pecking order of medical professionals—at least that did seem implied by the long waiting lists and the royal letters of referral required to be even granted an audience—I was conditioned to expect an even hastier visit with an even more perfunctory examination. It was therefore with well-conditioned resignation that I accepted my Sisyphean fate: all this time and effort, all these months of pain and weakened use of my left hand, had been for nothing. I would have to go back to square one and re-start the whole process, hoping for an eventual therapy.

Dr. Fernando Spencer quickly disappeared with my file, I thought permanently; after all, he had performed his function in this consultation, which happened to be the discovery that a mistake had been made in sending this patient to him when obviously another special kind of specialist was intended. Surprisingly, he returned and explained my next steps, if I wanted to continue in this quest of getting my cyst removed. Pausing significantly, he added: that was, if a cyst removal was indeed necessary in the first place. He reasonably observed that a measure of mobility had returned to my fingers, that the pain had dulled (and perhaps I had become more accustomed to the pain, as well as to gingerly using the troublesome wrist to avoid further agitation), and that it might subside in time with the nightly application of a warm, soaked hand-towel. This time, he took my hand, applied some pressure to the bump, seemed to examine it with his clinical fingers, and simulated something of a doctor-patient conversation with me. After he had outlined some of the risks associated with surgery, not to mention the fact that I would have to wait a few more months to see the correct specialist, I considered the hot-towel therapy perhaps the best option. In fact, my mother, a veritable font of home-grown remedies (it was from her that I got the ‘Bible cyst’ story), had suggested it to me when I first incurred the inflammation.

I should end with an apology for possible typos, and other mechanical errors in this piece. You see, I am at a slight disadvantage these days: the fingers on my left hand are not as agile as they used to be. And my wrist is wrapped in a thick hot hand-towel, which sometimes encroaches on to my keyboard. The surgery was not merely minor, it was irrelevant.

Scholarship ain’t bagged by American Lingo

In the world of finance academia, there is a belief that an American accent is indispensable for the aspiring researcher or professional; thus many of our professors give well-intentioned advice to non-American PhD students (typically from China and India) that they improve their accents as part of their professional tool-kit for their future careers. I expect that the rationale here is that an American accent is equated with clarity of speech, idiomatic American usage, and a demonstration of adaptability to the needs of one’s (presumably American) audience. Let us examine this assumption piece by piece, starting with the last element and working back to the first:

A presumably American audience: the audience of a researcher consists mostly of fellow researchers, professors, and professionals in industry (CFOs, consultants, investment bankers and so on). While there is an undeniable preponderance of speakers of American English in these areas, there is an arguably increasing number of non-American, non-native English speakers in these occupations, whether as researchers, professors, or finance professionals. It would therefore be more appropriate that all speakers (including American English speakers) pay attention to their accents and adopt a neutral, non-culture-specific idiom to ease communication. The BBC World Service famously moved from a quintessentially British accent to a more neutral one to accommodate the needs of its vastly non-British listenership. Perhaps that strategic move is instructive here, as the world of finance is becoming rapidly international, rather than specifically American.

Idiomatic American usage: In the USA, the business protocols developed in the past (20th) century were culturally specific to America’s perception of itself as not formal, not pompous, but jocular and friendly. Thus, opening a formal dinner speech with a self-deprecating remark or humorous anecdote, starting off a client meeting with a comment about the weather, or illustrating the value of a strategic move to a Board of Directors with a baseball metaphor (play hardball, home run, etc.) or a Wild West metaphor (guns blazing, circle the wagons, etc.) have all entered American business communication. While these metaphors have enhanced communication among the Americans who share them, they have been pointedly detrimental in communicating with non-Americans (or even with Americans who do not share them). George Bernard Shaw famously described England and the USA as two countries divided by a common language, by which he meant precisely such localized metaphors. If a British researcher (say, Ms. A) were to talk about “batting on a sticky wicket” or “bowling a maiden over,” she would be as unintelligible to an American audience as she would be perfectly intelligible to, say, Mr. B of an Indian audience. In turn, both the British and Indian researchers would be equally at sea listening to an American grippingly narrating an unexpected breakthrough “at the bottom of the ninth, with the bases loaded.”

Clarity of speech: The three elements which influence clarity of speech are fluency (smooth, enunciated delivery), comprehensibility (being easily understood by an average listener) and pronunciation (which is self-explanatory), and all three can be as lacking in American speakers as in non-American speakers: the oratorical styles of successive Presidents George W. Bush and Barack Obama are a case in point. To the extent that researchers must develop their fluency, an argument can be made for its necessity in communicating with an audience. Fluency, however, is an attribute of performance that is not automatically bestowed on speakers of American English, but on effective communicators in any language. Comprehensibility to a generally educated audience is one of the most intellectually taxing of tasks, as it requires an understanding of the subject matter comprehensive enough to structure, organize and simplify one’s research to its essential elements. Richard Feynman, the Nobel prize-winning physicist, is arguably better known for his ability to communicate the most complex of concepts to his audience than for the discoveries themselves, an ability which underlined his extremely sophisticated understanding of his subject matter[1]. In contrast, as George Orwell satirized in “Politics and the English Language,” the more typical befuddled academic is often prone to hide defensively behind a cloak of jargon, the undigested mishmash of his or her research in whatever language. Finally, the issue of pronunciation is in itself a thorny one, as some pronunciations are more easily comprehensible to an international audience than others. The confusion of “R” with “L” has long been a source of amusement poked at Japanese and Chinese speakers of English, but it does not fundamentally block listeners’ comprehension (perhaps because it is so ubiquitous).
Still, misplaced stresses on syllables could lead to some confusion (“Indifference curve” comes across differently if the stress is placed on the “diff” rather than on the “in”). Perhaps a solution could be to encourage researchers to adopt a “neutral” pronunciation (similar to the BBC version) understandable by all users of English, whether native speakers or not.

However, when all is said and done, one needs to revisit the primary goals of a researcher: to do top-quality research, to come up with key knowledge-enhancing insights, and (for the more pragmatic of us) to convert these insights into practical and useful application. The need to communicate in a common language is undeniable, and communicating effectively is definitely an advantage to a researcher (as it is to most other professionals in all fields), but the assertion that everyone should work on developing an American accent smacks of an egocentric belief that all non-American researchers out there should adapt to the convenience of their American colleagues. For better or for worse, the English language has become the dominant language of scholarship, as Latin was in medieval Europe, and as Arabic and Chinese were in the Middle and Far East respectively. However, it is salutary to observe that while the Science Citation Index reports more than 95% of scholarly articles as published in English, more than 50% originate from non-English-speaking countries; the linguist David Crystal has estimated that non-native speakers of English outnumber native English speakers by three to one. A post-2008 America may still want to assert its version of English as the norm, but as research in finance becomes increasingly international (as do other forms of scholarship), current and future generations of scholars may come to view American English as a quaint affectation of scholarly interest only to linguists and historians of the English language, rather than as the definitive modern Latin of 21st-century academia.


[1] The Feynman Lectures are still the textbook for undergraduate physics courses at U of T. My first encounter with Feynman was through this textbook, which prompted my subsequent search into his other work, including the popular bestsellers Surely You’re Joking, Mr. Feynman! (1985) and The Meaning of It All (1998), both of which I wholeheartedly recommend to the generally interested reader as an introduction to what is sometimes literally rocket science.

MY TWO LITTLE COPPER COINS

The Microsoft hostile bid for Yahoo in 2008 provides the backdrop to an HBR case that MBA students at the Ted Rogers School of Management were doing in a Corporate Finance course. There were 11 teams of five or six members each in a case competition, in which they took on the role of investment bankers to either Yahoo’s Board of Directors or Microsoft’s Board. I was invited as one of the judges to that case competition, which took place on Nov 7, 2011. What follows is a note I wrote to the students giving my personal take on the strategy considerations behind that proposed deal, as well as on the absolute necessity of disciplined reasoning and considered attention to human detail in understanding the way business operates in empirical reality.

Thank you for letting me be one of your judges in your recent case competition. It was enjoyable, educative and illuminating for me, as I hope it was for you, listening to each other and learning in greater detail about the case. I wanted to make a few brief comments on the case, and perhaps as good a starting point as any would be the one word that was common to all the presentations: synergy. This word has become one of the great clichés of businesspeak: most people know what the word means in a theoretical context, but few who blandly use it have any clue what it translates to in actual execution. All the teams spoke of possible synergies between Microsoft and Yahoo, but if you recall, I was most keen in pressing you to explain what that actually meant in real-life terms. I was not asking for technical details but for some common-sense, logical ideas on how these synergies were to be achieved, or indeed whether they could be achieved at all.

In your role as investment bankers, whether for Microsoft or Yahoo, you are very much like matchmakers arranging a marriage. Perhaps many of you, especially those of Indian or Asian origins, may have availed yourselves of a matchmaker's expertise in finding a life partner for a long and happy future. At a very basic level, then, you need to see if the partnership makes sense; you cannot just thrust two people into marrying one another and simply say that they will find "synergies" to make their marriage successful. That would be tantamount to arranging a marriage between a mouse and an elephant, and attributing the possibility of viable offspring to "mouse-and-elephant synergies." My point is that things have to logically, and logistically, make sense in practical terms before you rush off to your Excel sheets to make all sorts of models. Incidentally, how many of your teams began their very first team meeting for this project with everyone impressively opening up laptops, seriously keying in numbers, designing PowerPoint slides, or writing factoids on a whiteboard? Any good case meeting starts with a conversation about the companies involved, and whether the proposed recommendation is actionable. Then we can start working on the numbers to test the accuracy of our strategic analysis. To return to the marriage metaphor, we need to see if the couple in question are actually compatible before we start negotiating dowries, pre-nuptial agreements, or wedding gown financing.

So, let’s start with the basics. We can all agree that any Microsoft/Yahoo union would be an enterprise that takes on the market leader, Google, significantly (but not necessarily exclusively) within the search engine business. Unlike more traditional businesses with long business cycles and entrenched competitors from huge corporate organizations, any major internet-related business has radically short business cycles. The only way a business remains the market leader is by being significantly better than the competition. As in the Olympics of Ancient Greece, no records are kept of the runners-up—just the gold medalists. So, being second-best (or third or fourth-best) is not really an option[1]. Further, just because your business leads in one industry does not mean it can lead in another. Take the sad and lamentable story of Blackberry: RIM came up with a secure email communication and internet access innovation that opened up the age of smartphones and swept up the corporate world[2]. However, when Apple came up with the iPhone, Blackberry suffered a near fatal blow. So, Blackberry tried to respond with its pathetically pathetic Playbook (basically a copycat of Apple’s iPad), and what a disaster that was! My point is that in this business, simply copying or trailing behind the market leader’s innovative product is completely ineffective. Steve Jobs famously sneered at Zune and the similar MP3 players which emerged to compete with the iPod (and its famous click wheel with the single button at its center)[3]. All these copycat products ended up simply making imitation click wheels, without understanding or being capable of replicating the design innovation that made the original such a radical and efficient advance in the human-machine interface—even my daughter at the age of 3 learned how to use my iPod to play her favorite songs.
And it is interesting to note that, to date, the iPhone has the best touch-screen interface in the mobile phone industry. This is innovation!

So, to return to the Microsoft and Yahoo Union (MYU): if there is any intention of competing with Google, then MYU has to create a radical innovation. This is true for any IT-related business, but especially against Google, which was no one-hit wonder but a massively game-changing company, product and service. Indeed, if I may indulge myself, I would divide the history of the internet into two periods, the way we divide western history into B.C. and A.D. In the history of the internet, there were the dark ages B.G. (Before Google), and the era of light A.G. (After Google). So, that is the level of innovation that is needed if MYU wants to unseat Google. Some of your presentations actually hit on this point—not simply copycat or unoriginal search engines hopefully (all too hopefully) cooperating in some hokey “synergies,” but an actual game-changing innovation. Good job!

For some of you, that innovation would be the Panama Project advertising system at Yahoo, which may or may not be the greatest thing since sliced bread, but the key point is that it was still in beta testing. Just think about it. Is there any possible way that a product still in beta will be able to compete with the existing, tried-and-tested, Google ad-incorporated search engine? Even Google rolls out its excellent products in beta, never to compete with market leaders—Google Chrome in beta was never competing with Internet Explorer. It was only later that Google Chrome became ubiquitous, though Internet Explorer still firmly controls the business world (my work laptop uses only IE, and for security reasons I cannot even download and install Firefox or Chrome, which are vastly better browsers). So, I am not buying the Panama Project “synergies” just yet. And if I may dip into some of the organizational behavior courses I took during my MBA, I would hazard a reasonable guess that the inevitable cost-cutting due to duplication of functions in the new MYU company will drive a significant portion of Yahoo’s and Microsoft’s top talent right over to Google, which is famous for the loving care with which it treats its employees (if you don’t know about it, simply search for “Top 10 reasons to work for Google,” or, more appropriately, google it!). Many of you chose the WACC of Yahoo as a proxy in your valuation to get your range for the share price, but how realistic is that value? Sure, you could do the math, make a lovely financial model, and impress Professor Arup that you have mastered DCF and Comps valuations, but at the end of the day do you not think that you may have grossly overpriced Yahoo? If you still remain defensive on this point, I will draw your attention to the sad and lamentable tragedy of AOL Time Warner. This time, you may want to wiki it instead of googling it.

But let us say, for the sake of argument, that MYU aspires to produce a totally, insanely great product that will change the face of the internet as we know it. Then the question arises of whether this is indeed possible. Let us go visit the hopefully happily married couple of the mouse and the elephant. My guess is that no matter how much they love each other, they may have some challenges making little elephant-mouse babies, and the problems start right from conception, if you’ll pardon the intended pun. Microsoft has a long and established tradition of creating, and foisting on the unsuspecting public, sub-standard products (MS-Windows, Internet Explorer, MS-Office, etc.). Its business processes and its corporate culture are ruthlessly competitive, with a heavy emphasis on production targets and deadlines rather than on innovative products or elegant design thinking[4]. Yahoo is innovative, and its business processes and products do tap into the cultures of innovation, so much so that it has become the homepage of countless users (who, willingly or not, get their hourly newsfeed of the sexual peccadilloes of their favorite celebrities). Then there are, of course, Yahoo Finance and Yahoo Auctions, which are almost household words (at least in business breadwinner households). If Microsoft acquires Yahoo, presumably the Microsoft corporate culture will prevail over the Yahoo culture. Microsoft performance targets, milestones, logistics and ultimately people will prevail over those of Yahoo. Therefore any projections of future growth over the next 3–5 years are merely hockey-stick projections. (Why did you do valuations for 3–5 years, and then terminal value after 5 years? Are there any precedent transactions which suggest 3–5 years as a suitable growth period?)
A valuation of Yahoo with a recommended purchase price will have to incorporate not just Yahoo’s independent current market value, but, more significantly, what its value will be to Microsoft once the purchase is completed and the new MYU starts generating revenues. Of course, as investment bankers you are duty-bound to present the rosiest, floweriest of financial pictures about projected revenues, but any board of directors worth their salt will question the logic behind the numbers. And, in my experience, they don’t get convinced because someone mumbled “synergies” or “cost savings” or any other half-baked justification. Even the famous dean of the Rotman School of Management (and one of the top 50 Leadership Gurus), Roger Martin, earned a comment on one of his Harvard Business Review blog articles earlier this year telling him to “put it back in and bake the other half,” from a reader who signed himself ‘Incredulous[5].’ The proposal needs to be actionable and to make intuitive sense.

So, to summarize: we first discussed our role in the merger, then we looked at the main competition (i.e. Google). We had to assess how to strategically take on Google, and decided that we need a massively innovative product/service. Then we looked at the logistics of Microsoft and Yahoo working together to actually produce this great innovation, and the likelihood was not encouraging. We also recognized that we have no good precedent transactions on which to base projected future cash flows. Further, we recognized that Yahoo’s calculated market value (based on DCF) may not be an accurate proxy for how the new MYU company will perform. Your comparative valuations used ratios of comparable deals at the time of purchase, but did not consider how the post-merger actual performance compared with the performance predicted by the pre-merger financial models. So, the key determinant of your valuations, the WACC, was more of what financiers colloquially call a WAG (Wild-Assed Guess). Does that mean that we should simply throw up our hands, say this acquisition won’t work, and ask why bother? If you wanted to be ethical about it, perhaps you should not even be making this pitch to Microsoft and Yahoo—MYU has all the signs of a failure about it. However, as a conscientious student at the Ted Rogers School of Management, and as a successful future investment banker, you are going to have to do some valuation and come up with some price. By having this discussion of the actual strategic realities of the proposed acquisition, you are now in a better position to decide which valuation model to use and to address its weaknesses and strengths; you know which numbers to pull off Yahoo’s balance sheets and how much validity to give them; and you can make realistic assumptions in your financial models and in your sensitivity analyses. You will be able to convert the WAG into a SWAG (a Scientifically Wild-Assed Guess).
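To make the point about the WACC being a WAG concrete, here is a minimal sketch of the textbook DCF mechanics the teams used: discount a few years of projected free cash flows, add a Gordon-growth terminal value, and see how much the answer moves with the discount rate. Every cash-flow figure below is invented purely for illustration; these are not Yahoo's actual numbers, and `dcf_value` is my own hypothetical helper, not anything from the case.

```python
# Minimal DCF sketch: how sensitive is enterprise value to the WACC assumption?
# All cash flows are invented for illustration only.

def dcf_value(cash_flows, wacc, terminal_growth):
    """Discount a list of projected annual free cash flows at `wacc`,
    then add a Gordon-growth terminal value after the final year."""
    pv = sum(cf / (1 + wacc) ** t for t, cf in enumerate(cash_flows, start=1))
    terminal = cash_flows[-1] * (1 + terminal_growth) / (wacc - terminal_growth)
    pv_terminal = terminal / (1 + wacc) ** len(cash_flows)
    return pv + pv_terminal

# Five years of hypothetical free cash flows ($ millions)
fcf = [900, 1000, 1100, 1200, 1300]

# Shift the WACC by one point either way and watch the valuation swing.
for wacc in (0.09, 0.10, 0.11):
    print(f"WACC {wacc:.0%}: EV = ${dcf_value(fcf, wacc, 0.03):,.0f}M")
```

Run it and you will see the terminal value dominates and the total swings by billions for a one-point change in the WACC, which is precisely why a borrowed proxy WACC, ungrounded in strategic reality, is a WAG rather than a SWAG.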
And finally, you have done the strategic thinking behind the whole process, so that when you go to make your pitch to either Yahoo or Microsoft, you can explain the rationale behind the numbers (which hugely improves your chances of selling the board of directors on the deal). In a word, when you say “synergy” you will know exactly what you are talking about, and then it ceases to be a cliché and becomes a sound business plan upon which you can stake your reputation and your future success. Thomas Watson, CEO of IBM, famously coined the injunction to his employees: Think! (with the red dot on the letter i reminiscent of the red mouse-ball on IBM personal computers). Even more famously came Steve Jobs with the legendary Apple commercial and its inspiring message, “Think Different.” Good luck!


[1] Jack Welch of General Electric had his famous injunction to all of the GE Business Units to be “first, second, or get out of the business” which made GE the global powerhouse that it was during his period at the helm.

[2] Ted Kernaghan, billionaire philanthropist (and board member of RIM), once bragged to me that “Even the President of the United States carries a Blackberry!” So great were the heights scaled by a small company from Waterloo, Ontario, which became a worldwide phenomenon.

[3] I highly recommend Walter Isaacson’s posthumously published biography, Steve Jobs (2011)—especially now that Indigo is selling it at a discounted price of $25—which gives you some interesting insights into this titanic genius. For a brief preview of the man see my blogpost: https://tonymayadunne.wordpress.com/2011/10/10/lucifer-rising/

[4] I am using the word “elegant” in a very specific design sense of the word. Matthew E. May’s book, In Pursuit of Elegance: Why the Best Ideas Have Something Missing (2009) is a wonderfully readable description and discussion of elegant solutions.


Lucifer Rising

There was a certain poetic justice to my being informed through an iPhone of the passing of Steve Jobs on Wednesday, October 5, 2011, underscoring the words of US President Barack Obama: “there may be no greater tribute to Steve’s success than the fact that much of the world learned of his passing on a device he invented.”[1] That bright day was otherwise one of small personal celebration for me in a long and dark year of degradation, and I invited my friends—two of whom joined me in a toast to the life of this departed man on that bittersweet evening. The third raised her glass with reluctance, and after much reservation. Surprising? Perhaps not. Jobs was many things, but he was certainly no poster boy for amiability. As David Warren succinctly summarized in The Ottawa Citizen, “[Steve Jobs] made it company policy to give not one penny to any philanthropic cause. Who pitched entirely to the mass market, with cleverly purposeful branding. Who imparted intangible fashion qualities to those products, through fanatic attention to industrial design. Who rode often brutally over opponents; who had anger management issues; and was the boss from hell to anyone who didn’t perform according to his exacting specifications.”[2] I raise these issues at the outset to forestall any accusation of hagiography in this blogpost.

Yes, he was all these things. He was also all the things that the many have said in tribute, and there is no better epitaph for him than the script he wrote for the famous Apple “Think Different” commercial celebrating “The misfits. The rebels. The troublemakers. The round pegs in the square holes. The ones who see things differently. They’re not fond of rules. And they have no respect for the status quo. You can quote them, disagree with them, glorify or vilify them. About the only thing you can’t do is ignore them. Because they change things. They push the human race forward. And while some may see them as the crazy ones, we see genius. Because the people who are crazy enough to think they can change the world, are the ones who do.”[3] I have watched and shared that commercial with friends and colleagues over the years, and, even in my possibly thousandth viewing of it today, the tightness rose in my throat, and a tear of inspiration came to my eye. Those carefully chosen words. And his Stanford Commencement Address, which in the intervening years—between when he delivered the address and when he died—showed all the world that he meant every word of what he said. As much is being said about him, I will restrict myself to a few unobtrusive observations.

Calligraphy Class: Having dropped out of his ‘registered’ courses at Reed College, Steve Jobs impulsively slipped into a course on calligraphy and studied the history, the art, and the science of the written word, for no reason other than curiosity itself. He was not to know it then, but, as he points out in his Stanford Address, learning about calligraphy was what inspired his design of the beautiful fonts so characteristic of Apple computers, and indeed of what millions today enjoy on all personal computers—it changed the face of typeface as we know it. One can only imagine academic counselors, parents and other well-meaning individuals scratching their heads in dismay at this dissolute and irresponsible Steve “wasting his time” on such calligra-rubbish instead of dutifully studying his requisite courses, getting his requisite grades, securing a requisite job, and living out the requisite middle-class college-educated California life.

Knick Knack was a 3-minute animated short directed by John Lasseter, made at a then-unknown computer graphics division which Steve Jobs purchased in 1986 after his infamous dismissal from Apple—and amid the flagging fortunes of his NeXT Computers, whose “hardware was gorgeous but far too expensive for the education market it was intended for.” Even at NeXT, Jobs was in quest of perfection, and nothing less would do. Anyway. Knick Knack follows the ill-fated attempts of a snowman to get from Alaska to sunny Miami—a plot which must have touched Jobs’ heart in some way, and to which he gave his enthusiastic support (see the handsome acknowledgement made to him in the closing credits). I remember watching it for the first time among the previews to Disney’s The Little Mermaid (1989), and being entranced by the exquisite shading, color, and sound effects, which made the main feature pale in animated comparison. This little animated short was the precursor to Toy Story, and to the award-winning Finding Nemo, The Incredibles, and Up. The computer graphics division became Pixar. John Lasseter, who went on to become the company’s Chief Creative Officer, eulogizes: “[Jobs] saw the potential of what Pixar could be before the rest of us. Steve took a chance on us and believed in our crazy dream of making computer animated films; the one thing he always said was to simply ‘make it great’.”[4] To date, the films made by Pixar all number among the fifty highest-grossing animated films of all time. All this came after Jobs’ great—almost biblical—fall from Apple, when he rose Phoenix-like from the ashes of defeat to even greater heights. Apple became one of the greatest companies of the 20th century, and Pixar is poised to become one of the greatest of the 21st.

Philanthropy: Jobs’ controversial position on philanthropy has not gone unremarked. In fact, he made it company policy not to contribute to philanthropic causes, which, as is often the case, has led to a questioning of his motives—that he was a selfish capitalist, concerned only with his personal successes and gains, with no thought for the fate of his fellow men and women. But should the question of motives be applied to the benevolent actions of other famous philanthropists (John D. Rockefeller, Andrew Carnegie, Michael Jackson, and here in Canada, Peter Munk, Hal Jackman and Michael Lee-Chin among numerous others), there is the rather awkward issue of self-propagation: their generosity is engraved with their names and their purchase of posterity. Then there is the issue of organizational ineffectiveness that has stigmatized so many Not-for-Profit organizations, which are generally the beneficiaries of philanthropic generosity. Perhaps here the Bill & Melinda Gates Foundation (with generous support from Warren Buffett) has taken a significant step, in its rigorous efficiency, to ensure the maximum extraction of value for the maximum number of people in need. An excellent Canadian organization, Charity Intelligence, has set as its mission to inform potential donors of how donor capital actually translates to worthy causes, stating that its reports are “not intended to usurp or replace these [emotional] stories, but rather to complement them with the numbers and measured results.”[5] But this blogpost is not a polemic against fame-seeking donors, nor a castigation of organizational inefficiencies in Not-for-Profit organizations—they do indeed provide much needed support to the less fortunate people of our world, for which the dispossessed are grateful (I, myself, have been the beneficiary of charity at needful points in my life).
Still, it is pertinent to note that the name of Steve Jobs graces no great Foundation, Museum, Library, University, Hospital or similar enterprise. As a matter of fact, when I mentioned the passing of Jobs to my mother (who is aware of such social initiatives), her response was: Who is that? Steve Jobs perhaps danced to a different drummer. In an age of Corporate Social Responsibility, Jobs is off the radar. When Warren Buffett and Bill Gates launched the Giving Pledge[6] initiative inviting some of the richest people to donate half their fortunes to philanthropic causes (or effectively be subjected to a ‘Wall of Shame,’ vigorously reported in print, internet and other twittering media), Jobs declined[7]. Now, we may never know what his true motives were, but, if I may indulge in a controversial partisan view, Steve Jobs was not about to join the latest bandwagon of CSR self-promotion. Looking back at his many speeches, his actions, his designs and his products, we have to accede (even if grudgingly) to the common expression: he didn’t just talk the talk; he walked the walk. And his creations are a testament to the uncompromising excellence he demanded of everyone, most of all himself.

So, how do we assess the man? I take inspiration from one of the memorable movies of my childhood, The Empire Strikes Back, in which Master Yoda castigates his young protégé, Luke Skywalker, who hesitantly offers to “try” a challenging task. Yoda firmly says: “Do. Or do not. There is no try.” For Steve Jobs there was no half-hearted Try; it was always an imperative Do! It was the same uncompromising stance he demanded of all who worked with, and for, him. It was the signature of excellence on all his inventions, his innovations, and his life. In him, I recall the magnificent hero of John Milton’s great 17th-century epic poem, Paradise Lost—Satan, who defiantly proclaimed upon his banishment from heaven that it was “Better to reign in Hell, than to serve in Heaven.” His name was Lucifer, the Bringer of Light, the Prince of the Morning.

Naked Emperors

Here is the Polish poet and Nobel Laureate Wislawa Szymborska, commenting with her signature comic touch on how poets fare when asked to reveal their occupation—

Bureaucrats and bus passengers respond with a touch of incredulity and alarm when they discover they are dealing with a poet…. But there are no professors of poetry. That would mean after all that poetry is an occupation requiring specialized study, regular examinations, theoretical articles with bibliographies and footnotes attached, and, finally, ceremoniously conferred diplomas. And this would mean, in turn, that it’s not enough to cover pages with even the most exquisite poems in order to become a poet. The crucial element is some slip of paper bearing an official stamp. (xi-xii)

—that nevertheless finds echoes in other occupations that have not been sanctified, authorized and deodorized for commonplace use.

In my case, it happens when I tell people that I am a Trainer (Dog trainer? Circus trainer? Fitness trainer?), or should I say a Coach (Football coach? Little League coach?), or a Leadership Speaker (inevitable comparisons—flattering though they are—to my namesake Anthony Robbins). Outside the corporate world, it is the bestselling writers of self-help books who most readily come to mind: some flavors-of-the-month like James Redfield (of The Celestine Prophecy fame), and others of the more perennial variety, in the Dale Carnegies and their more recent avatars, Deepak Chopra and Dr. Phil. These figures shape the public perception of my occupation, and derive their credibility from the enormous marketing engines that drive readers to their books and participants to their seminars. But what exactly is being sold here? Szymborska (again on poets) provides a pertinent observation and stringent criterion:

And yet it wasn’t so long ago… that poets strove to shock us with their extravagant dress and eccentric behavior. But all this was merely for the sake of public display. The moment always came when poets had to close the doors behind them, strip off their mantles, fripperies, and other poetic paraphernalia, and confront—silently, patiently awaiting their own selves—the still-white sheet of paper. For finally, this is what really counts. (xii)

What really counts. For a poet, it is a great poem; for a trainer, it is creating an enduring change in the thinking and stance of the mentee. The big secrets of success are not really secrets. Most people know the list: discipline, hard work, leadership qualities, teamwork, out-of-the-box thinking, initiative… and we can run the gamut of desirable attributes, skills and knowledge, but the bottom line is that training is not just imparting content; it is inspiring performance. Unlikely though the simile may be, great trainers can be, in their own humble way, like great poets: they inspire us.

And lest we dismiss these comparisons as frivolous for the hard world of business, let us recall that before our MBAs, our certifications, and our professional designations, there was business—that formative cornerstone of civilization—which existed for millennia. We may have evolved to our particular structures and logic of business, but the business of training, of educating, of coaching, and indeed of mentorship (Mentor, a figure from Greek mythology, was tutor and guide to the son of Odysseus) has ancient and enduring origins. Even among the most hardened of businesspeople, there are memories of great teachers, guides, advisors… and yes, trainers, who shaped their thinking and actions. We look at the great achievements of great men and women, of great companies, of great inventions and great innovations, and all too often we see only the lone warrior in the spotlight; we do not acknowledge the supporting apparatus that lies in the darkness below the stage, the whole foundation upon which the soaring pinnacle was built.

So, what really counts? In business parlance we call that the return on investment, the ROI, whose financial measures may be in dollars and cents, but which in training is expected to be a productive change in behavior that improves business performance. I say “expected to be” because the criteria used to measure performance are amorphous, difficult to quantify, and regrettably subjective. Therefore, when training is called for (and especially in its most august manifestations, Performance Coaching or, more recently, Leadership Development), how do we choose the best person for the job? All too often we choose the person with the most slips of paper bearing an official stamp, the people with the most mantles, fripperies and other paraphernalia, so to speak.

This is not to deny the importance of certification, credentials and experience, which all help ensure a certain level of quality, certain benchmarks for performance, and even a common foundation of principles and accepted practice through which practitioners in the field communicate with one another. But (and here is the big BUT), when credentials become the most significant criterion by which we evaluate a practitioner’s performance, we are losing sight of what really counts. And naturally, when that becomes the evaluative criterion, it shapes the output.

Classic behavioral theory (once shunned in business and industry as too much airy-fairy psychological nonsense) is making an interesting reappearance in the new fields of Behavioral Economics and Behavioral Finance. This should not surprise us common-sense thinkers in other aspects of our lives. And one of the little precepts we quickly learn (as children, and later as adults) is that the criteria used to evaluate performance will shape the output of that performance. If better grades will get one the promised bike, many an enterprising kid will work for the grade (irrespective of whether learning actually took place), which provides an all too typical paradigm that the kid will follow in later life. So, kids focus on grades, poets on fripperies, and trainers on slips of paper bearing an official stamp, when they should be focusing on learning, better poetry, and inspiring greater performance, respectively.

In the fable of the Emperor desirous of new clothes, perhaps he needed to ask himself what he really wanted: flattery? Or beautiful garments? So long as flattery seemed to be his criterion in choosing tailors, we know what he got. But for the rest of us, royalty or not, what expectations do we have of our tailors, our poets, and our trainers? And is it what really counts? If not, we can surely expect some kid in the street to hoot at a naked emperor.

Work Cited:

Szymborska, Wislawa. “The Poet and the World” (Nobel Lecture). Poems New and Collected 1957–1997. Trans. Stanislaw Baranczak and Clare Cavanagh. New York: Harcourt Brace & Company, 1998.