Category: Corporate Culture


The trouble with being SMART

January 30th, 2016 — 1:41am

2016 is here. Worldwide, managers are setting SMART – Specific, Measurable, Achievable, Relevant, and Timely – performance goals. Even though the underlying theory – Management by Objectives (MBO) – has lost credence, this catchy mnemonic, which George Doran coined in 1981, has not. The time is upon us to retire SMART for all managers and executives from whom we need discretionary effort.

MBO lost credence because “The boss knows best” paternalism no longer works well. Global companies are transforming structurally to free emerging businesses from top-down strictures. Open source communities, coordinated by respect for expertise, not central authority, are creating technologies and products. Innovative workplaces are giving employees time off the clock and free resources, and benefitting from their unmeasured, untracked tinkering. Such environments thrive on distributed leadership and decentralized, uncounted action, and SMART goals can’t add to, and inevitably subtract from, them.

The problems with SMART run deeper and can damage even organizations that don’t need to unbundle business units or use open source approaches. The business environment has fundamentally changed. Companies no longer compete individually, but as members of networks: Apple couldn’t create the iPhone, or Airbus the A350 aircraft, without collaborating with others. Network members may be located half a world away, and inevitably have their own strategies, processes and cultures. So, complexity, uncertainty, and ambiguity abound, letting problems and opportunities flash across these networks with blinding speed and meaningfully affect performance. SMART goals implicitly assume staid environments that are far removed from these realities and can keep executives from responding appropriately.

Problems with SMART arise from virtually all elements of the acronym. ‘Specific’ goals, clear-cut and definite, are easy to articulate and act on. They enable quick assessments of individuals’ successes. However, when used extensively, they reduce discretionary activity and limit broader action. I once facilitated a meeting between two groups of senior executives, each from a well-known global company, whose businesses had merged. One group described how its corporate values drove performance evaluation and gave it freedom to act. The other retorted that its values were the five tasks set for each manager by his or her boss; each manager could, and did, decline to work on any initiative unless specifically tasked to do so. Guess which company had acquired the other? Guess which one’s stock price has usually outperformed the other’s?

‘Measurable’ goals have become an unshakeable article of faith, commonly justified by physicist Lord Kelvin’s dictum, “If you can not measure it, you can not improve it.” Such goals make it easy to decide not just whether someone has performed, but how well. In so doing, they implicitly emphasize efficiency (doing something optimally, even if it is the wrong thing) over effectiveness (doing the right thing, which may be hard to discern). To make this point, I often ask senior executives to identify a single factor whose absence would destroy their businesses. They inevitably – and quickly – converge on ‘Trust.’ They are right – how much would you get done if you had to personally check every single word you were told? I then ask, “How do you measure trust?” They don’t – and can’t: this critical driver of business success is immeasurable. Instead of spouting Lord Kelvin out of context, executives should internalize the words attributed, perhaps apocryphally, to Albert Einstein: “Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted.”

Since ‘Achievable’ may produce inadequate outcomes, many companies set ‘Ambitious but achievable’ goals. Regardless, this criterion disregards our knowledge about human motivation. As Daniel Pink has brilliantly summarized in a YouTube video popular in business schools, carrot-and-stick approaches improve performance only when work is physical. Intellectual work benefits from allowing people to develop mastery in a field and giving them the autonomy to act therein. So, the quintessential carrot-and-stick nature of Achievable goals limits their relevance to managers. To drive supernormal performance, we should instead give people responsibility for accomplishment, and allow them to set their own targets consistent with organizational goals.

Goals are supposed to be ‘Relevant’ – not just to the organization, given its environment – but also to specific individuals and groups for whom they are set. The first criterion is undoubtedly reasonable, and on a prima facie basis, so is the second: why set a goal that isn’t applicable to, or deliverable by, the people in question? In reality, ‘relevance’ inevitably results in an enduring, widespread problem: organizational silos that hinder collaboration. For example, a sales team accountable for customer satisfaction is likely to have conflicts with a supply network team accountable for minimizing inventory. However, these silos would collaborate in their own interest if each was assessed (in part) on the other’s accountability – in effect measuring them for something that wasn’t relevant to their daily work.

How could ‘Timely’ not be legitimate? Very simply because it has become code for “as soon as possible.” We have made a virtue of speed to the exclusion of every other meaningful and important organizational goal. Business textbooks assign critical importance to “first mover advantage” even though irrefutable examples of its falseness are readily available. When was the last time Apple launched a truly first-in-the-world product? Was Google the first search company? Was Facebook the first social media offering in its niche? How are Chinese and Indian multinationals, Johnny-come-latelies to international markets, giving established Western firms a run for their money? When ‘timely’ equates solely to speed, creativity, effectiveness and yes, even efficiency, suffer, sometimes irreparably.

What should executives do? They should reserve SMART goals solely for people who have limited discretion. For everyone else, they should begin goal setting with non-specific, qualitative, diffuse, even “can’t be done” goals that treat time as only one of several criteria and that give people autonomy and mastery. Indeed, they should urge people to propose goals for themselves. They should add SMART goals only where these are truly unavoidable, and there too, with enough fudge-factors to ensure they don’t become limiting or constraining.

In effect, throughout the goal setting process, they should ask themselves: Am I paying attention to issues that truly matter? Am I truly leading an organization of people, or am I merely checking boxes to show that my job matters? Being SMART is easy, but that doesn’t make it right.

———

A shorter version was published by Forbes on January 12: http://www.forbes.com/sites/forbesleadershipforum/2016/01/12/it-may-be-time-to-get-rid-of-smart-management/#2715e4857a0b6e160d5e3bbc


Grokking Jobs on Campus

September 1st, 2011 — 2:01pm

I’ve been Executive in Residence at Babson College since January. As Fall creeps up on New England (You’re beautiful, but can you please stay away for a little longer?) and students return, my thoughts are a continent away, at two other campuses: The California Institute of Technology and the Apple campus in Cupertino.

This summer, I learnt of a piece of Caltech lore: when Apple visits Caltech to recruit undergraduates in computer science, it brings an open checkbook. Even unreasonable salary expectations don’t preclude the hiring of those whom it likes. Initially, the story seemed inconsequential.

Then, a few days ago, Steve Jobs resigned his position as Apple’s CEO. Apple’s iconic co-founder has reportedly lived a decidedly iconoclastic life, at least in comparison with those of the CEOs of most global companies. He dropped out of college but, living on friends’ sofas, continued to attend classes he liked. Exposed in this way to calligraphy, he incorporated a range of fonts, not just Pica and Elite, on the original Macs. He then dropped out altogether and went to an ashram in India, from where he returned a Buddhist. He embraced the counter-culture and reportedly regards his doing so as a critical formative experience. In short, as a young man, he was the complete antithesis of the people that Apple is seemingly hiring at Caltech.

I am not begrudging the Caltech seniors, particularly those who have worked diligently, their high-paying jobs! Nevertheless, the juxtaposition of these events raised in my mind a critical question for Apple and a more general one for businesses and academia. The roots of these questions lie in an amazing interview Jobs gave to Wired magazine in 1996, before he returned to Apple. In part, he said:

“Some people think design means how it looks. But of course, if you dig deeper, it’s really how it works. The design of the Mac wasn’t what it looked like, although that was part of it. Primarily, it was how it worked. To design something really well, you have to get it. You have to really grok what it’s all about. It takes a passionate commitment to really thoroughly understand something, chew it up, not just quickly swallow it. Most people don’t take the time to do that.

(Jobs probably used the word grok very deliberately; if you don’t grok it, read Robert Heinlein’s Stranger in a Strange Land.)

“Creativity is just connecting things. When you ask creative people how they did something, they feel a little guilty because they didn’t really do it, they just saw something. It seemed obvious to them after a while. That’s because they were able to connect experiences they’ve had and synthesize new things. And the reason they were able to do that was that they’ve had more experiences or they have thought more about their experiences than other people.

“Unfortunately, that’s too rare a commodity. A lot of people in our industry haven’t had very diverse experiences. So they don’t have enough dots to connect, and they end up with very linear solutions without a broad perspective on the problem. The broader one’s understanding of the human experience, the better design we will have.”

I wonder if those responsible for on-campus hiring at Apple have grokked this interview. People have long debated Apple’s ability to create lifestyle-altering experiences in a post-Jobs era. A die-hard Apple fan, I had no doubts it could – if it institutionalized Jobs’ perspective on design. (I call this making of the “private knowledge of an individual the public knowledge of many” organizational learning.) However, if Caltech’s lore is true (and broadly representative), Jobs’ insights haven’t become organizational. This won’t be a problem tomorrow, but will be when the individuals so hired rise to managerial positions. Will they prize staff who lack deep knowledge but who, by virtue of their life experiences and broad knowledge, can connect seemingly unconnectable dots?

More broadly, in a world that prizes “deep, micro-knowledge” more than “broad, macro-knowledge,” how do we produce great designers, managers, and indeed, leaders? How do we ensure people are, in Jobs’ words, “able to connect experiences they’ve had and synthesize new things. And the reason they were able to do that was that they’ve had more experiences or they have thought more about their experiences than other people.”

I am not arrogant enough to believe I have the answer, but will leave you with a proposal. For college students, I’d make a “semester abroad” a requirement, not an option. And an American going to Western Europe (or vice versa) wouldn’t count.

What do you think?


The Joys and Perils of Dancing on a Knife’s Edge

February 25th, 2011 — 11:09am

The tumultuous crowds that brought down a dictator in Egypt had an unintended impact far from their homeland: they drowned out – rightfully! – the announcement of a strategic partnership between Nokia and Microsoft. I admire the Nokia I researched; yet I acknowledge it is currently in deep trouble. I have long disdained Microsoft for its product quality and its reliance on monopolistic power instead of innovation (sole exceptions: Xbox and Kinect; and yes, I admit that Office 2011 for the Mac is far better than iWork!). So, what do I think of this alliance?

A key “prerequisite” question is: Do I still believe the ideas in The Spider’s Strategy? Absolutely! Toyota’s “unexpected acceleration” fiasco and its resultant recalls of millions of cars didn’t discredit Lean Enterprise. Why then, should Nokia’s recent challenges discredit Networked Organizations? Indeed, Nokia got into trouble because in the key area of product innovation, it stopped applying the ideas that powered its 17% compounded annual organic growth rate (in revenues and operating profits) from 1995 to 2006.

Nokia violated a subtle rule embedded in my third Design Principle, “Value and nurture organizational learning.” It used to learn rapidly by setting seemingly impossible targets that demanded the periodic reinvention of its business model. Simultaneously, to keep control, it insisted its managers follow a “no surprises” policy. This brilliant rule is the proverbial knife’s edge. Balance well, and you can pull off miracles. Tilt toward “big risk” and you can lose your shirt. Tilt toward “no surprises” and you will bring innovation to a screeching halt. As it grew, Nokia made the mistake many other large companies have: it tilted toward “no surprises.” So, unlike Apple, it didn’t build a network of complementary product makers to buttress its proprietary Symbian software. Unlike Google, it didn’t attract a different type of sustaining network by making Symbian open source – until it was too late.

The alliance with Microsoft was in the cards from the day Nokia’s Board appointed Stephen Elop CEO. Nokia’s press release spoke of a strategy to “build a new global mobile ecosystem” with Windows Phone software at its core; “capture volume and value growth to connect ‘the next billion’ to the Internet in developing growth markets;” and make “focused investments in next-generation disruptive technologies.”

The second element – a continued focus on markets like India and China – is key, though the notoriously developed-world-focused financial analysts may not care. Apple has ignored these markets and Windows still has a true monopoly among operating systems. These facts, plus Nokia’s still dominant market share there, give the alliance a strong base on which to build; Nokia can instantly create volume for the Windows Phone, and seamless integration with Wintel computers may give it an edge over low-cost Chinese phone makers. At the very least, this element will buy the alliance time; at best, the “next billion” is a huge market. That’s where the first element is also critical.

To bring the alliance value, the goal of building a mobile ecosystem must truly assimilate the lesson of a recent New York Times story about a start-up company that hoped to build a business around enabling group dates. The founders noted that the site’s users were mostly South or East Asians, but filed that fact away as “Interesting, but Unimportant.” Success came only when they reluctantly acknowledged that group dating wouldn’t fly in the US and shifted their focus to India. The world, as Thomas Friedman said, is flat. But that doesn’t mean people’s needs are the same everywhere. That’s why the word “global” in the language of this strategic element is troubling. Its use may seduce financial analysts, but unless the ecosystem is tailored to specific markets, it won’t amount to a hill of beans. At one time Nokia knew this lesson; it had anthropologists in Indian villages whose work strengthened its market position there. Does it still remember that lesson, and can it convince a monopoly to learn it too?

The third element is critical for the long term and most troubling: will two companies that haven’t created any disruptive technology recently be able to do so in the near future? Nokia’s Chairman Jorma Ollila had championed the Networked Organization philosophy and, as CEO, had managed its phenomenal growth. I could make a cogent case that he and the Board had no choice but to create the alliance with Microsoft. (Which would explain why they pursued Mr. Elop in the first place.) Now, he must ensure that Mr. Elop realizes that his most critical tasks are (1) putting into leadership positions those within Nokia who are still capable of dancing gracefully on a knife’s edge and (2) using his deep knowledge of Microsoft to convince Mr. Ballmer to do the same. Then, and only then, will the alliance succeed. If so, I may one day once again become an enthusiastic customer of both companies.


Better on a Camel?

June 9th, 2010 — 3:54pm

It has been exactly 99 days since I last posted. Hadn’t meant all this time to pass, but life intervened. So, I’m going to welcome myself back by first looking back 40 years.

If you are old enough – or have a deep interest in commercial flying – you might know that long ago, British Airways used to be British Overseas Airways Corporation. During those days of genteel competition, airlines’ acronyms often became amusing nicknames. BOAC was “Better on a Camel;” industry insiders used this moniker affectionately. Decades later, however, one would truly be better off on a camel than on British Airways.

Why? Three words: People, people, people. BA and its employees are constantly at war. Their mutual acrimony routinely spills over into public and affects passengers. Both sides seem to loathe the customers who keep them employed.

In the late 1990s, BA put up signs at Heathrow, threatening to prosecute passengers who were discourteous to its employees. It neglected to tell its employees that they too needed to be polite. And with that omission, they unleashed trouble. At a check-in counter once, I expressed mild irritation that I was not given the seat I had reserved. Red Queen style, the agent literally turned crimson with fury. How dare I complain, he asked? If I didn’t like the seat he was giving me, I didn’t have to fly.

Fast forward a few years to an ever lengthening Business Class check-in line. One of the two agents designated to attend to it was enjoying a long, uproariously funny phone conversation. A passenger left our line and requested him politely to terminate what was clearly a non-urgent call. The agent followed the man back to the line and as the rest of us stood around stunned, began screaming, “Who are you to tell me what work I must do?” His rant lasted a couple of minutes and then, he went back to his call.

Fifteen minutes later, he was still on his call and the line was becoming ever longer. Another passenger screwed up his courage and asked a passing agent to summon a manager. This one also became Red Queen incarnate, “You’re telling me to do something? Who are you to tell me what I should do?” He hadn’t heard the “please” the rest of us did and felt it was completely appropriate to abuse a premier passenger.

I’m not making these up! More recently, a business class counter at Brussels was open, but the BA agent was missing. I chatted with a couple of other waiting passengers. Each of us had multiple such horror stories. One called me “lucky” since I only had to fly to London, while he was stuck with BA till Sydney.

This is one sad, sad airline whose service is worse than even the deficient service (by Asian standards) available in the US. As I am writing this, BA cabin crew are finally on the strike that judges had forbidden twice before. Once was last December, but by the time the judicial edict came down, they had hurt thousands of vacationers during the Christmas holiday period. Another time was last April. I was on a round-the-world business trip that began in Europe and I actively avoided all BA long-haul flights even though they were theoretically the most convenient. Unable to avoid a short Madrid-to-London flight, I waited with bated breath for signs of trouble. Fortunately, I wasn’t affected.

Some readers might blame such behavior on the presence of unions. Maybe so, but they are, at worst, only partly at fault. To me, the clear onus for such disregard for customers must be placed on management. BA management, it seems, has long believed that “service” means more amenities. BA has generally been among the leaders in introducing new technology – like flat bed seats in business class. But in the far more difficult area of creating a more positive corporate culture, in well over a decade, its management has failed – miserably, in my opinion. Nor has their approach to management created much value for their shareholders. Which raises the question: Why do they still have their jobs? (I know, Richard Branson’s been asking this for a long time.)

Economists point to the virtues of free markets; if enough people felt like me, they say, we could take our business elsewhere and punish BA. In a world of networks, however, that is not true; BA is a key member of the One World alliance and as long as I choose to fly One World, I will have to put up with BA, at least occasionally.

Sometimes, good things have very bad consequences.


“Don’t be evil” meets “Do no harm”

March 1st, 2010 — 3:33pm

Last week, an Italian court gave three Google executives six months’ suspended sentences. Their case dealt with a video uploaded on YouTube in Italy, giving the court (and the prosecutors) jurisdiction. The video, which showed a group of teenage boys bullying a boy with autism, quickly became an Internet sensation. A couple of weeks later, Google received a complaint and removed the video within three hours. By then over half a million people had viewed it. The Google executives were deemed guilty of violating privacy laws.

In the real world, most issues worth reflecting on – like this one – have no simple answers. This one asks us to weigh the relative benefits of privacy and free speech. I don’t know all the possible arguments people made about this case, but I’ll address a few that I heard repeatedly.

The first ignores the specific details of the case and suggests it was a cynical ploy by the billionaire Italian Prime Minister Silvio Berlusconi to clip the power of the Internet since it was threatening his vast “old media” empire. I don’t know much about the Italian judicial system, but if one has a reasonable understanding of realpolitik and of Mr. Berlusconi’s repeated cavalier disregard of a variety of laws, this view is hard to dismiss as a ridiculous conspiracy theory. If true, the court’s decision could have a very negative impact around the world.

It isn’t unheard of for ruthless executives to take unethical, albeit legal, positions to further their ends. The big deal here is that this decision was handed down in a Western democracy on an issue with very high stakes. Undoubtedly, many ruthless people are currently assessing how they could win similar rulings in their bailiwicks. The Ahmadinejads and Mugabes of the world are preparing arguments along the lines of “But this is acceptable in the West.” So, the decision has made the world much more fraught with risk for decent people.

The second viewpoint has attracted most commentators. In essence, it compares the Internet to traditional communications – like telephones and the post office. Telephone companies aren’t subject to criminal charges when their equipment and services are used to plan crimes, no matter how nefarious. So, why should companies like Google?

I am not a lawyer, but for me, this argument doesn’t have legs. Progress in law almost always lags progress in technology. In The Spider’s Strategy, I argued that our legal systems haven’t caught up with the fact that sense-and-respond capabilities are erasing the traditional boundaries of companies and taking us into uncharted territories. So, inadequate laws shouldn’t be a defense here.

Besides, Google’s defense was that it took down the video within 3 hours of being informed about it. The real question is: should it have acted proactively? After all, when I go to my local post office, I am routinely, proactively asked to confirm that the letter or package I am shipping has nothing dangerous in it. Legally, the post office doesn’t have to ask me (at least not that I know of!), but it is commonsensical for it to do so, if for no other reason than to protect its own people. I am sure that if I give them cause for concern, someone will take some proactive action and at least screen my package. Indeed, increasingly, the post office is trying to screen all packages.

But Google responds that every minute, twenty hours of video are uploaded to its systems around the world. It just doesn’t have the ability to screen everything. This argument also seems specious. Google doesn’t have to screen everything. However, couldn’t it – doesn’t it – have filters to screen on an exceptional basis? If a tag or a comment says “school yard bully,” couldn’t that particular video be checked out? Let’s assume that this filter would itself get swamped by volumes. How about using an additional decision point? “If a video hits 100,000 views, or if a video is shooting up the popularity index very rapidly, check its appropriateness.” Saying “We want the right to search every book in the world and make money out of giving people access to them” seems incompatible with “We can’t possibly be expected to scan every video – or even a fraction of the videos – currently on our system.”
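For readers who want to see the idea concretely, here is a minimal sketch, in Python, of the kind of exception-based screening the previous paragraph proposes. The term list, thresholds, field names and the should_review helper are my own hypothetical illustrations of the logic, not anything Google has actually described.

SUSPICIOUS_TERMS = {"school yard bully", "bullying", "fight"}
VIEW_THRESHOLD = 100_000          # absolute popularity trigger
GROWTH_THRESHOLD = 10_000         # views gained in the last hour

def should_review(video):
    """Return True if this clip should be routed to a human reviewer."""
    text = " ".join([video["title"]] + video["tags"] + video["comments"]).lower()
    if any(term in text for term in SUSPICIOUS_TERMS):
        return True                # a tag or comment raises a red flag
    if video["views"] >= VIEW_THRESHOLD:
        return True                # the clip is already very popular
    if video["views_gained_last_hour"] >= GROWTH_THRESHOLD:
        return True                # the clip is shooting up the popularity index
    return False

# Example: flagged by both the tag text and the rapid growth, without scanning every upload.
clip = {"title": "School yard bully caught on camera", "tags": ["fight"],
        "comments": [], "views": 42_000, "views_gained_last_hour": 15_000}
print(should_review(clip))  # True

The point of the sketch is simply that exception-based rules of this kind are cheap to apply; they do not require screening every upload, only the small fraction that trips a trigger.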

The third viewpoint focuses on biases rooted in the divergent histories of people around the world. Americans favor the freedom of information over all else because the nation’s birth was in part driven by the oppression of a government using information inappropriately. That’s why it is the First Amendment to the US Constitution (part of the Bill of Rights, which enshrines the first ten amendments); the Constitution was adopted on September 17, 1787 and the Bill of Rights on December 15, 1791. In contrast, there is no explicit “right to privacy” in the US Constitution or its amendments; this right was imputed to exist (on the basis of several of the other Bill of Rights amendments) as late as 1965 by a much disputed ruling of the US Supreme Court.

In contrast, Europe has suffered severely as a result of a lack of a fundamental right to privacy. Throughout history, dictators and totalitarian regimes have terrorized their people by collecting huge amounts of secret information and using these to justify punishments, torture and killings. And so, it is no surprise that Article 8 of the European Convention on Human Rights says, “Everyone has the right to respect for his private and family life, his home and his correspondence.”

Supporters of the “information first” logic point out that today’s totalitarian states block access to information, particularly that acquired through the Internet. So, Google rightfully stood by its corporate motto and “did no evil:” It shouldn’t have – and didn’t – act preemptively to block the video, but took action when it was appropriate. Supporters of the “privacy first” logic, (which, incidentally, the Italian court adopted) argue that Google had a fiduciary responsibility to protect the autistic child’s right to privacy. Above all, Google should have “done no harm.”

In the years to come, we will face the two facets of this third viewpoint over and again. Sense-and-respond capabilities will not only benefit businesses and society, but will also raise this issue in ways that we can’t even imagine. (For example, listen to “Different Strokes.” This “On the Media” program from National Public Radio discusses technology that tracks where someone goes on the Internet on the basis of his/her typing pattern.)

My own bias is towards privacy; I think it will increasingly become hard to live as an individual unless privacy safeguards are strengthened. And the day when this becomes a real issue for everyone is not far off; it will happen, as I’d indicated to a pharmaceuticals industry audience in May 2002, because of genetic-profile based medicine. Even “open information” stalwarts in the US will have to think about whether they want companies and governments to have unfettered access to their own specific genetic structures. That is why I did not howl in protest when I read about the Italian court’s decision – but as I indicated in my discussion of the first viewpoint, I am not one hundred percent convinced that it was the right decision.


The Michael Crichton Strain

January 29th, 2010 — 11:01am

Michael Crichton was the author who ensured that English speaking children know – and can perfectly pronounce – the names of at least ten dinosaurs. I read the first of his 26 novels, The Andromeda Strain, in 1976 and several others – including the ones about dinosaurs, Jurassic Park and The Lost World – in subsequent years. He also created the extremely popular TV show ER; I didn’t see even a single episode of the show. He passed away in November 2008.

I liked reading his books because many, if not all, of them dealt with the complexities of a world I knew well: the intersection of advanced technology and business. However, I am definitely not a “Crichton groupie;” I stopped reading him in the early 1990s, because I felt that his 1992 book, Rising Sun, had racist undertones. This decision means that Mr. Crichton may well have held positions about which I know absolutely nothing.

Mr. Crichton’s writings introduced me to an extraordinarily powerful idea: humans are creating ever more complex technological systems without truly understanding their implications. They think they can completely control these, but the reality is they can’t. For example, consider the following extract from a speech on environmentalism, as it is reported on “Michael Crichton, The Official Site”: “Most people assume linearity in environmental processes, but the world is largely non-linear: it’s a complex system. An important feature of complex systems is that we don’t know how they work. We don’t understand them except in a general way; we simply interact with them. Whenever we think we understand them, we learn we don’t. Sometimes spectacularly.”

I couldn’t help but be reminded of this idea when the news about Toyota’s ever-expanding recall came into the public spotlight. How could a company so admired and emulated falter so badly? One explanation is that the Company’s relentless pursuit of growth over the last decade caused it to take its eye off quality. Toyota’s new CEO, Akio Toyoda, shares this view; when he got the job in October 2009, he apologized profusely in public for the quality problems that Toyota had experienced. As time would tell, those were nothing compared to what’s happening right now. (I will return to this explanation in a future post.)

A second possible explanation drove me to introduce Michael Crichton here: we are building cars so complex that we really don’t understand how they function and why they do what they do. So far, no one knows what ails the Toyotas. Is it a mechanical problem with the accelerator pedal made by the US company CTS? These pedals are being replaced not just on Toyotas but also on other cars. But even Toyota doesn’t think this is the key explanation. Mechanical problems are generally easy to diagnose because we can actually see what’s wrong. The “improper floormats” explanation is also, at best, a secondary one. Right now, the focus seems to be on the electronics that control acceleration – and possibly, even the embedded software. Yet no one has figured out what this problem is. So, unless the real story has not been made publicly available (which is always possible), this explanation is still speculation; perhaps informed speculation, but speculation nevertheless.

Many years ago, I had started writing – and then abandoned – a book on manufacturing. In that effort, I had assailed a belief that some software companies popularized in the 1990s: “Get it 80% right and ship; customers will tell you what is wrong, and you can fix it then.” An incredibly simplistic belief in the power of being first to market drove this view; I hope it gets buried soon, for Apple is only the latest company to show that first mover advantages are highly overrated. Couple this view with Mr. Crichton’s lesson and the dangers of following it become immediately obvious.

In 2006/2007, I was writing The Spider’s Strategy. I pointed out that the holy grail of modern product development – “make it modular” – had major limits. Companies like modularity because it gives them (1) the flexibility to use the same parts in different places and (2) the ability to outsource design and manufacturing work in discrete chunks. I cited examples of product failures that had afflicted some of the best known brands in the world, including Toyota, and argued that the weakness of this thinking lay in the electronics and software. This limitation made it essential for companies to collaborate closely with their design and manufacturing partners.

Toyota understood this fact better than most other companies. This is why it focused on building strong partnerships with its suppliers. Those partnerships had helped it make the jump from a Lean company to a networked company. It is truly sad that along the way somehow its management unlearnt this critically important lesson.


What Took You So Long, Mr. Whitacre?

December 9th, 2009 — 2:40pm

Edward Whitacre, the Chairman of the Board of GM, has been very active during the last few days. On December 1, he – formally, GM’s Board – fired CEO Fritz Henderson. An Associated Press article in The New York Times reported on opinions expressed by two – unnamed – people who were close to Mr. Henderson. It noted that “…the board [was] upset that the automaker’s turnaround wasn’t moving more swiftly and Henderson [was] frustrated with second-guessing …” The same people also suggested that “[Henderson] was frustrated from the beginning by the board and government push for faster change and other questions about his decisions.” Mr. Whitacre has taken on the task of interim CEO while the Board searches for a replacement. In all likelihood, he or she will be from outside the industry.

Three days after making this decision, Mr. Whitacre appointed a new management team. He reached down into the senior middle management cadre and appointed Mark Reuss – until recently the head of GM’s Australian operations and a newly appointed VP for Engineering – to run GM North America. He expanded the responsibilities of three women executives and sidelined Robert Lutz, the Vice Chairman who ran product development and who had hinted publicly that he would be replacing Mr. Henderson. In a public statement about these decisions, Mr. Whitacre noted that GM’s top-heavy management was stifling good mid-tier managers, and that he wanted to “… give people more responsibility and authority deeper in the organization, and hold them accountable.”

Of course I cheered! In this blog, last December (“What’s Good for General Motors is Good for America”) I wrote, “… this company cannot be trusted to reform itself …” and listed five conditions that the US government should ask for in return for bailing out GM. These included: “Mr. Wagoner and his top lieutenants must resign in an orderly fashion …;” “ … over the next five years GM … must be reduced in size, so they are no longer ‘too big to fail.’ This will require mandatory spin-offs of relatively independent businesses …;” and “…no one in the top spots in any of the restructured companies should come from the senior-most ranks of these companies …”

Then, in January (“Marie Antoinette’s Soulmate”), I tore into Mr. Lutz: “If anyone has any doubts about why GM is really flirting with bankruptcy, Mr. Lutz’s comments (during an NPR interview) should have clarified the issue. The Vice Chairman of a company which went with a begging bowl to Congress acted as if he was Marie ‘Let them eat cake’ Antoinette’s soul mate. CEO Rick Wagoner and GM’s Board should have repudiated his statements by publicly firing him …” I also opined that a pre-arranged bankruptcy would not solve GM’s core problem. During the negotiations, I said, “No one will be focusing on changing the culture that allows people like Mr. Lutz to be top dogs. And without changing culture – encouraging collaboration, being open to others’ ideas, being willing to take considered risk, managing learning every day, etc. – these companies will stumble from one disaster to another. Changing culture takes great effort, committed leadership and time. All three will be in short supply during the negotiations …” I added, “I would like to see … an orderly departure of people like Rick Wagoner and Bob Lutz, and a shifting of power to less jaded executives running smaller companies created by splitting up the behemoths.”

Then in April (“The King is Dead! Long Live the King”), I challenged the criticism made of the firing of Rick Wagoner: “Imagine, for a moment, that a President of the US (… ‘POTUS’) was at the end of an eight year tenure and he … had not been able to turn around the economy. Would you call him a failure? Sure you would! Mr. Wagoner has been CEO for 8 years; prior to that he was GM’s CFO, President of North American Operations, and COO. A comparable track record in US national politics would have been Secretary of Treasury, (a hands on) Vice President and then POTUS. In effect, Mr. Wagoner had many more than 8 years to fix GM. Under the circumstances, the fact that he might have ‘made progress’ is simply not good enough!” I ended that post with, “The King is dead. I hope the new King – or kings, as I have argued earlier – come from middle ranks or better yet, from outside the industry.”

So, Mr. Whitacre has made many of the executive changes I wanted. Hopefully, the new blood will transform GM’s ossified culture and structure and take the strategic steps I suggested. As long as Mr. Henderson was the CEO, there was no hope of this happening. His concern that the Board was pushing too hard indicates that he, like Mr. Wagoner, would have found eight years too short for reforming GM.

Now there is hope that at least some of the money the US government used to bail out GM will be returned.


“Stargazer, you with your head in the heavens …”

November 28th, 2009 — 2:05pm

Around this time every year, American manufacturers and retailers fill the airwaves with countless advertisements. A couple of days ago, I saw many from Toyota, touting the legendary quality of its cars. But for the first time in a long while, these sounded hollow. Toyota has just announced yet another recall – affecting four million cars – for possible uncontrollable acceleration. The problem has resulted in a few deaths. I immediately told my wife, “This is Toyota’s ‘Audi moment.’”

In the 1980s, Audi’s slogan was “The art of engineering” and its cars were doing very well in the premium/luxury segment. One year, some customers complained about accidents at start up; the cars moved before the drivers wanted to, often causing accidents. Audi denied the problem and blamed driver error. The media picked up the story and ultimately, the US government mandated a new safety feature for all cars: one cannot shift the gear to ‘Drive’ or ‘Reverse’ without having a foot on the brake. But by that time, Audi’s sales had plummeted – if I recall correctly – from about 50,000 a year to under 10,000. Audi’s reputation didn’t recover for many years.

I have long admired Toyota’s management prowess, and praised some of its policies and experiences in The Spider’s Strategy and in this blog. However, Toyota has clearly not learnt from Audi’s experiences. It first denied the problem and then blamed its customers. When it – very belatedly – acknowledged the issue, it blamed accelerator pedals getting stuck on floor mats and said it would retrofit the existing pedals.

Toyota’s response is far from appropriate. It won’t be able to execute the retrofitting till April 2010. What are the legions of Toyota customers supposed to do till then? Help slow global warming by not driving? Moreover, not everyone is convinced that the pedals are at fault; many blame a software malfunction. (This isn’t far-fetched; in The Spider’s Strategy, I described earlier software problems in Toyota – and other high end cars – as one of the motivations for networked companies.) Toyota disagrees sharply – but nevertheless, is changing the software in some cars.

Toyota’s advertisements reminded me of a Neil Diamond song: “Stargazer, you with your head in the heavens / You’ll never get by walkin’ that high off the ground / Moon dreamer, I’ve been around and I’ve seen it / The higher you get – the harder they let you down / You pay your dues, it seems forever / And if you’re clever you may be in for a while / Then you’re out of style.” I wondered why its executives didn’t realize that since quality is their claim to fame, a plausible challenge of that capability can be devastating. I also mused about the appropriateness of focusing advertisements on quality while a major recall is the lead news item on the evening news. Finally, I pondered why good executives don’t understand that blaming large numbers of customers is always a losing strategy in a crisis. Perhaps it is because they forget – with their “head in the heavens” – that they can’t afford to be “walkin’ that high off the ground.”

Only time will tell if Toyota is “out of style” now, having been “in for a while” because of its earlier “cleverness.” Recently, its new CEO, Akio Toyoda, apologized abjectly to shareholders and customers for Toyota’s many recent failings and vowed to return to policies that had made it one of the most admired companies in the world.

Great (Adaptive) companies do make mistakes, just like lesser ones. What distinguishes them is what they do next. They acknowledge their mistakes, quickly correct them and determine how to obviate the entire class of such mistakes in the future. Mr. Toyoda, the ball’s in your court.


A Tale of Two Indias

October 21st, 2009 — 2:03pm

Earlier this month, I finished a fifteen day trip to India. I formally met executives at two of the country’s largest business groups, and several others in social settings. A journalist posed a question that set me thinking about how business had changed since I lived and worked there.

In 1984, I returned to India after finishing my MBA and spending 13 months in the US as a management trainee at American Express Bank. In other posts, I’ve mentioned my experiences as a trainee creating policy papers for the AEB Board. My return quickly cut me down to size.

My 13 months in New York hadn’t taught me the basics of banking. For example, the lesson I was taught about “Letters of Credit,” the grease of the wheels of international trade, was: reject any “Bill of Lading” that differed from its LC, no matter how small the discrepancy. This categorical rule was meant to protect the Bank if contract disputes emerged between the buyer and seller of the goods covered by the LC and BL. But in India, 100% of the BLs had multiple differences from their LCs. Rejecting them all would shut down the Bank. My India-trained colleagues handled these decisions effortlessly, whereas I, supposedly a “high potential” junior manager, couldn’t without help. (Incidentally, the (rare) managers from the Sub-Continent who got posted abroad, often received multiple, quick promotions to the levels they would have normally achieved had their careers unfolded in the West.)

I also learnt there were many things we could not do, even if they made good business sense. Often, my colleagues (and even my boss) signed off on some transactions with indecipherable squiggles. Moreover, when they had to approve certain types of transactions (e.g., giving a valued client a better deal on its foreign exchange transaction than was allowed by India’s central bank), they almost always had to make client calls; in those cases I signed for them. Ultimately, I realized that these transactions violated arcane Indian laws; when these were minor, the officers squiggled; when they were non-trivial, I became a convenient fall guy.

Today, the situation is very different. Many arcane rules have been repealed; one does not need to break the law every day in order to do one’s job. A palpable degree of confidence radiates from business executives about the prospects of their own firms in particular, and the economy in general. In the 1980s, virtually nothing I had studied in my MBA was applicable in India. Today, many companies are brilliantly run – and could teach a lot to the rest of the world.

The country’s liberalization and rapid growth have, however, produced one downside: a top executive I met bemoaned the fact that the country was teeming with MBAs who wanted to be Managing Directors immediately, without putting in sufficient time to learn their profession. (And I could sympathize with this viewpoint. I watched an interview of a newly appointed, very young, CEO on a TV business channel. The interviewer fawned over him in a manner typically reserved for film stars and cricket players. Which young person wouldn’t want such treatment?) The fact that virtually none of the many business schools that have sprung up all over the country require work experience for admission exacerbates this problem; selections are made mostly on the basis of the Indian equivalent of the GMAT and/or the candidate’s academic record. The desire of the young MBAs clashes with the realities of corporate life and is producing a serious problem: Indian executives estimate that 20% – 40% of their professional staff change employers each year.

Corporate leaders must address this challenge, even if these numbers are a wild exaggeration. I suspect that they will have to take a good hard look at their human resource policies to craft a uniquely “modern Indian” solution. A key aspect must be the strengthening of company-specific management training; such an investment will convince seasoned managers that the company is truly interested in their professional wellbeing. Until Enron went down in flames, Tom Peters and others were preaching that every manager should adopt a “Me Incorporated” mindset; this doctrine is still prevalent on today’s Wall Street. In India, the most visible example of this mindset is the bitter feud between the two Ambani brothers, each a billionaire many times over. In a networked world, the rest of Indian industry simply cannot afford to fall into the same trap.

A model of what is needed already exists in India and I was privileged to visit it: The Tata Management Training Center in Pune is one of the oldest corporate universities in the world. It is using everything from in-class programs for senior executives to eLearning tools to meet the needs of managers at all levels. Some courses last for a few days, while others are delivered “Executive MBA” style over several months. Chetan Tolia, its Managing Director, told me that some 5,000 managers walk through TMTC’s gates each year. If other major business houses emulated the Tatas in this regard, India could develop a truly formidable competitive advantage.

The million dollar question is: Will they? The ten million dollar question is: What impact will this have on developed economies?


Time to Re-read “What is Strategy?”

July 22nd, 2009 — 4:45pm

For the uninitiated, “What is Strategy?” is the name of a best-selling Harvard Business Review article that Michael Porter, a “University Professor” (i.e., the highest of the high) at the Harvard Business School and the Grand Poobah of Strategy, wrote in 1996. I will address only one of its many ideas in this post. I thought of it because of a recent visit to an upscale mall – and an announcement by a major company.

The visit was to an Apple retail store. I needed to connect my Mac to our Sony plasma TV, but could not remember the exact pin-configuration of the TV’s socket. The Apple employee helping me suggested that I ask at the Sony Style retail store located nearby and so, there I went.

You may recall that Sony began opening these stores when Apple started eating its lunch. The stores would make the vast array of great Sony products accessible to consumers. The moment I told a salesperson – who looked like a supervisor – that I was there for information, not to buy, he visibly lost interest in me. Not that the store was busy; you might have been able to hear a pin drop if you cupped your ear. Undeterred, I asked my question. The salesperson responded, “Do you have internet access at home?” “Yes,” I said, “But how does that help me now?” “Well,” he replied, “When you go home, look up the answer on our website.” “You can’t do that here?” I asked. “No,” he said, walking away. The ludicrousness of the idea that I would search their website instead of looking at the back of my TV did not even occur to him. And he is supposed to persuade affluent consumers to spend their money? In the time he spent losing a once and future customer – perhaps forever – my teenager used my iPhone to get the information.

At the Apple store, the same salesperson greeted me again. He apologized for not thinking of going online and gave me the cable I needed. My wife asked for his help in selecting a graduation gift for my niece, who was finishing high school. He showed us several fun software titles, but my wife picked up an expensive productivity program. “Oh gee,” he said sarcastically, “I just finished school and in the Fall, will start college. And my aunt gets me productivity software! How nice!” We laughed, saw his point and decided to defer the purchase. He lost an immediate sale, but he reinforced the link between Apple and me.

Porter’s article says that strategy is about “fit.” Multiple small, individually inconsequential items must work together seamlessly for a strategy to be successful. The reason why Apple’s retail stores work – one in two purchasers of a Mac in an Apple store is a new Apple customer – is that they are a seamless part of Apple’s corporate strategy. From the Genius Bar to the highly knowledgeable, non-pushy employees, everything fits together perfectly, just like the components of any Apple product. (Even the employees’ clothes match those of the Steve Jobs-like pitchman in its highly effective advertisements, “Hello, I’m a Mac” “And I’m a PC.”)

Sony once knew this lesson, but has forgotten it. Retailers speak of “location, location, location.” Sony’s location did not help it seal a relationship with me.

It is in this context I have been waiting to see how Microsoft’s newly announced retail stores will turn out. So far, this venture has been defined by location: the stores will be near Apple stores to give consumers non-Apple options. This is strategy?

For the sake of Microsoft’s shareholders (of which, regretfully, I’m one), I hope that the people in Redmond have thought this out a bit more. And if they haven’t, they should take this opportunity to first read Chan Kim and Renée Mauborgne’s book, “Blue Ocean Strategy.” The essential thesis of this book is that too often, companies compete head to head with each other, leaving blood in the waters (“Red Ocean”) instead of seeking “Blue Oceans” where there are no established competitors. The Redmond strategists should also remember Porter’s message about fit: business history is full of examples of companies which tried to copy an effective strategy of a competitor, but failed miserably. The copying was typically superficial and small, seemingly inconsequential elements did not fit together. The Sony Style stores are a great example. Oh wait, Wintel machines and Windows Vista are even better ones.

