What to do when people are put in harm’s way because solid, technical advice on designing, deploying and operating complex systems is ignored by decision makers not competent in the target technology? Does the ethical responsibility for the consequences pass to the decision maker somehow absolving the advisor? Or should engineers forswear company loyalty for their higher duty to preserve human safety? At what point should deference to the boss or the client be replaced by mutiny – action outside the chain of command to prevent an accident?
Patterns in Bad Behaviour
Valuable lessons are seldom learned in bliss. As catharsis for a painful experience, I offer you one of mine:
The manager of a large project, a small component of which was a safety critical system, chose to go live with the system prior to completion of system test. In my professional view this placed the public at risk. To put it mildly I was extremely angry. Nothing in my university education or thirty years of systems engineering had prepared me for this. Nothing I could do or say could get this decision reversed. I had no idea how to respond to such a blatant disregard for systems engineering best practice. The system failed in use. Luckily no one was hurt.
What follows is the sum total of my reflection and research on this subject. It is an attempt to channel what anger is left in me to some useful purpose. My research revealed that the management behaviours I experienced matched an ancient pattern: Deus Ex Machina, a plot device used by bad playwrights. Further, the best response to Deus Ex Machina is equally ancient. It's called "speaking truth to power".
At some point in your career you will encounter Deus Ex Machina. In my case it was frustrating, but it had nowhere near the impact of the Deepwater Horizon oil rig disaster, which provides my case study. When you experience it, I hope my story will find you better prepared than I was.
The Tragedy of Deepwater Horizon
Some time prior to his high dive from the burning rig, survivor Mike Williams witnessed a tipping point in management decision-making. He called it "a chest bumping incident", in which a conservative process for capping the well, proposed by the drilling contractor Transocean, was overridden by a BP executive who mandated a faster but riskier procedure. The motivation was time saving, and hence cost reduction. The result: a high pressure methane gas release on Deepwater Horizon's deck, ignition, eleven deaths, the destruction of the rig and an 87-day oil spill that continues to foul the Gulf of Mexico.
Speaking Truth to Power
Could this disaster have been averted if the Transocean contractor had pushed harder for his conservative well capping strategy? Possibly not. As with most disasters, the cause cannot be traced to a single event; there were many contributing factors. However, the subsequent commission of enquiry acknowledged that multiple instances of pushback against bad practice could have saved the day.
The act of speaking against the strongly held beliefs of those more powerful than yourself was given a snappy title by the Quakers in the 1950s. They called it "speaking truth to power". "Power" can refer to a testosterone-fuelled boss of ecclesiastical bearing, secure in his exclusive relationship with a higher power and convinced of his invincibility. Power can also be vested in a set of commonly held beliefs (sometimes the result of corporate groupthink). Mathematician, astronomer and philosopher Galileo spoke out and suffered a triple whammy of consequences for his outrageous assertion that the Earth and planets revolve around a stationary Sun. His pronouncement had him dragged before the Inquisition in Rome in 1633. For speaking truth to 1) the established dogma of the Catholic church, 2) his holiness the Pope and 3) God, he was forced to recant, his books were declared heretical and banned, and he was confined to internal exile in Florence until his death nine years later. As they dragged him away, this giant of 17th-century science and technology whimpered, "Eppur si muove" ("And yet it moves").
No question, speaking truth to power is dangerous. In researching this subject the first thing I discovered was that human reflections on speaking truth are older than time. Further, the associated patterns of behaviour and predictable outcomes have been documented and played out in myth, legend, theatre and real life countless times. The ancient Greeks were the first to document the risks of exercising virtue in a virtueless world. Speaking truth to power is a central theme of Sophocles' fifth century BC play Antigone. Early in the play, King Creon's guards draw straws for the life-threatening task of informing the King that his niece Antigone has defied a recent edict and has the popular support of his subjects. Creon refuses to back down, believing that submitting to public opinion would be a sign of weakness and a threat to his power. In a courageous act, Creon's son Haemon tells his father, "Your presence frightens any common man from saying things you would not care to hear." Fast forward to modern times. Would you like the job of telling Jack Welch, Andy Grove, Rupert Murdoch or Larry Ellison, "You are wrong, wrong, wrong!"?
Today speaking truth to power can still get you killed (I cite: Muammar Gaddafi's Libya, Bashar al-Assad's Syria, the Mafia). In the systems engineering context it can get you fired and ruin your career. On the other hand, not speaking truth to power can cause the destruction of companies and the death of innocent people. King Creon's failure to listen brought death and ruin to his family and destroyed his country. In real life, on Deepwater Horizon, failure to communicate and act on good engineering advice killed eleven people, caused the world's worst oil spill and cost billions.
So how do we recognise a situation where moral courage is required? How does a professional engineer decide to either suppress his or her moral objections or summon up courage and make a stand? The answer lies in recognising patterns in situations and human behaviours, understanding your options and formulating considered responses. This is how I came upon the Deus Ex Machina pattern which is best illustrated by the events at Deepwater Horizon.
In May 2010 President Barack Obama established the National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling (the "Oil Spill Commission", or OSC). The OSC's report is available online. Among its conclusions are:
The explosive loss of the Macondo well could have been prevented.
The immediate causes of the Macondo well blowout can be traced to a series of identifiable mistakes made by BP, Halliburton, and Transocean that reveal such systematic failures in risk management that they place in doubt the safety culture of the entire industry.
Among the list of management failures were:
(1) ineffective leadership at critical times; (2) ineffective communication and siloing of information; (3) failure to provide timely procedures; (4) poor training and supervision of employees; (5) ineffective management and oversight of contractors; (6) inadequate use of technology; and (7) failure to appropriately analyze and appreciate risk.
In describing BP's interactions with its subcontractors, the report lends credibility to Mike Williams's story:
BP's contractors were unduly deferential toward BP's design decisions. A Weatherford centralizer technician described the prevailing view as: "Third party, we do what the company man requests." In several instances, BP's contractors expressed private reservations about the plans and procedures at Macondo but did not more forcefully communicate to BP that there were better ways to do things.
Not speaking truth to power (and not listening – Creon-like) is proving expensive. BP and its subcontractors Transocean, Halliburton and Cameron International are currently defendants in a civil suit. BP and its partner Anadarko Petroleum alone are facing US$17 billion in civil fines.
Another interesting aspect of the report was its depiction of the inexorable drive to save costs. From the report:
Bias in Favor of Time and Cost Savings
On any drilling rig—no matter who is the operator—"time is money." BP leased the Deepwater Horizon at a rate of about $533,000 per day. The high daily cost made the rig the single greatest expense for drilling the Macondo well. It also gave BP a strong incentive to improve drilling efficiency.
The Chief Counsel's team observed that the Macondo team understandably made individual decisions consistent with an orientation toward efficiency but did not step back to consider what the safety implications of those decisions were when taken together.
Here’s the rub. Every day BP managers could summon up 533,000 reasons to ignore any engineering advice that might delay the project. This paragraph leapt off the page and resonated in my guts. This is the money/time steamroller that I faced when suggesting a delay of one to two weeks to complete system testing. In addition I was up against prior commitments to politicians and the public which transformed the steamroller into a runaway train. Would you stand in front of a runaway train?
My point is: if you are going to speak truth to power you had better: 1) have courage and 2) be prepared.
The OSC report recommends increased regulation of offshore drilling, especially in deep water, and enhancement of the independence, technical capability and integrity of regulatory agencies. It also recommends that oil companies develop specialised Safety Cases for each rig, addressing both construction and operation, and notes that this is standard operating procedure in North Sea drilling. It further recommends improvement of well containment technology. In a tone of barely concealed frustration it rages against
… the simple inability – of BP, of the federal government, or of any other potential intervener – to contain the flow of oil from the damaged Macondo well.
Overall it indicted BP for allowing rapidly evolving technology to outstrip its ability to manage risk.
Finally the report makes the surprisingly candid admission that no amount of government regulation will stop another Deepwater Horizon. Future safe operating must spring from the safety culture of the organisations doing the work. The word “culture” is used 17 times.
So there we have it. Culture. The shared values of an organisation. The ideas and behaviours that you would hang on to even though, when tested by circumstance, they might cause you pain. This is where a safety culture operates, at the most primal level of the human psyche, conscience on one hand and greed on the other.
Wielding Technology With Wisdom
On a proud day in 1975 I stood on a stage and accepted an electrical engineering degree from a University Vice-Chancellor. I did not bow as was the custom (I recall I winked at him). This was my day and my degree; I had knocked myself out for four years to get this far and I was not going to bow to anyone for it. The implications of accepting this qualification were clear and exciting. It gave me a licence to operate, to build systems that improve the lot of humanity and, in the process, have fun with many fast moving technologies. It took me more than 30 years to fully appreciate another implication: it gave me the power to save people's lives or, by inaction or incompetence, be responsible for human suffering and death.
Engineers are taught technology; they receive no instruction on how to wield it with wisdom. My engineering course did not cover ethics or philosophy. I have to say that for more than 30 years this was not a problem, at least until that fateful day when I sat in a meeting, faced a cost/time runaway train and realised I had no response.
Picture this. A project manager, surrounded by the aura of power, confidence, hubris and the divine right to rule that is implicitly bestowed on the leader of a half-billion-dollar project, looks down from his throne and, referring to the functional safety program, says, "I thought we weren't going to do this." When shown system test results that reveal many serious defects he works through them, downgrading the severity level of many test failures. In previous meetings he had crashed the system development schedule; that is, given us the end date and told us to rearrange all the tasks accordingly. This the scheduler dutifully did, creating a schedule that everyone knew was impossible, thus rendering impotent any serious attempt at professional project monitoring and control.
There was a sense amongst the team that somehow we should be fighting back. This was a blatant violation of everything we knew to be professional, honest and right. But how? How can you buck the momentum of a massive project proceeding inexorably to delivery? How can you do what you know is right in the face of massive cost/time pressure and political will? How can one be a righteous person in an unjust world? Upon reflection there was no magical solution. But I do know this: 1) at this point my engineering education clocked off; 2) at this point my classical education should have kicked in, the wisdom of the ages providing persuasive arguments for full support of the safety program; and 3) at this point the lack of a classical education could have got someone killed.
Deus Ex Machina
Deus Ex Machina is a Latin phrase describing a convention of the classical theatres of Greece and Rome (400 BC to AD 400). Its meaning, "god from the machine", refers to a plot device commonly used by bad playwrights to extract themselves from story problems – usually endings. Bereft of ideas, they would simply manufacture climaxes by having actors portraying gods lowered onto the stage on a platform with ropes and pulleys (hence "god from the machine"). Apollo or Athena would then settle everything, determining who lives, who dies, who marries and who is damned for eternity.
Even at the time, this was recognised as contrived, aesthetically unsatisfactory bad practice, indicative of a playwright lacking in creativity. The Greek philosopher and polymath Aristotle, student of Plato and teacher of Alexander the Great, criticised the device in his Poetics, where he argued that the resolution of a plot must arise internally, following from the previous logic of the play. He said:
It is obvious that the solutions of plots too should come about as a result of the plot itself, and not from a contrivance, as in the “Medea” and in the passage about sailing home in the “Iliad”.
If a safety critical project were a play and the plot a functional safety program plan, its passage home would be the full implementation and testing of the recommended safety features in the system product. The abrupt termination of the story by the contrived, illogical and unsafe intervention of the uninformed was recognised as bad practice in 300 BC and remains so today. The analogy of "safety interrupted" to a Greek tragedy is prophetic. Greek tragedies typically end with half the cast dead and the rest damned for eternity – the living dead, cursed to live with what they've done.
Recognising the Deus Ex Machina Archetype
The key to dealing with bad behaviour is to first recognise it for what it is. For example, psychologists use archetypes as a diagnostic tool for treating mental illness. An archetype is a model of human behaviour. An understanding of the archetype informs the treatment process. I was fascinated to note that just the act of informing a manager that he was acting out documented pathological archetypal behaviour caused him to stop short and reflect on what he was doing. I have called the archetype that contributed to the Deepwater Horizon disaster and appeared in my personal situation: Deus Ex Machina.
The Deus Ex Machina pattern:
- Technical specialists in possession of the facts make informed evaluations of safety risks and formulate risk management strategies.
- Recommendations are made to management who have the power to act (or do nothing) and are ignored. The reasons: a) cost/time pressure, b) ambition, c) political expediency, d) hubris, e) ignorance of the technical issues in play, f) ego.
- Management cuts corners. Design, construction and operational risks are taken. For example, risk management programs are cut short or eliminated to save money or, as in my case, schedules are crashed, test programs are not completed and safety critical systems are deployed prematurely.
Left unopposed the Deus Ex Machina pathology is likely to cause dangerous failures which result in loss of life, destruction of property and environmental damage.
Post-incident behaviours often include telling lies and attempts to shift the blame from managers to technical people. Protestations that, “I didn’t know it was going on” are common.
Opposing Deus Ex Machina
OK. So a so-called management god has parachuted into your project and is telling you to do things you know are wrong and potentially dangerous. You've recognised the Deus Ex Machina archetype. Now what do you do?
Social scientist Albert O. Hirschman, in his classic Exit, Voice, and Loyalty, offers three options to employees who disagree with company policy:
- Loyalty. Remain a loyal “team player” (shut up and do what you’re told)
- Voice. Try to change the policy (speak truth to power)
- Exit. Tender a principled resignation (quit).
In normal circumstances company loyalty is a good thing. Loyalty and teamwork build cohesive organisations that can do great things. Loyalty secures promotions and improves lifestyles. This alone is enough to drive most people down that path regardless of what a company does. Everyone has a mortgage and kids to educate. Add to this our human need to be accepted by our peers. To be ostracised and alienated from our friends is to be less than human.
To move your position from loyalty to voice can be a difficult decision. Disasters involving loss of life are high-impact, low-frequency events. Probability is always involved; specific incidents are almost impossible to predict. They are often the result of a slow and tortuous degradation of a company's attention to safety. Witness Union Carbide at Bhopal and BP at Deepwater Horizon.
In making the decision to speak out, ask yourself:
- By remaining loyal am I colluding with a bad course of action that may harm innocents?
- By saying nothing am I lending my name, and any credibility it may invoke, to an immoral act?
- By doing nothing am I supporting a boss who has forsworn company values, professional values or both?
- By not voicing am I providing a negative role model for my organisation?
- Is loyalty to this company more important than my professional reputation?
Point five may be more important than you think. Jesse Gagliano was Halliburton’s cementing engineer on Deepwater Horizon. The OSC report goes on:
Documents show that before the blowout, BP engineers thought Gagliano was not providing “quality work” and was not “cutting it”. They highlighted that Gagliano had a habit of waiting too long to conduct crucial cement slurry tests. Three days before the blowout, Morel complained that he had “asked for these lab tests to be completed multiple times early last week and Jesse still waited until the last minute as he has done throughout this well.” Morel found “no excuse” for the tardiness.
Guilty or innocent, Gagliano’s name is mentioned 17 times in a disaster post-mortem report commissioned by the President of the United States on the world’s worst oil spill. Would you like that on your resume?
Let’s be clear. By giving voice to your objections and speaking truth to power I am not talking about voicing disagreement on minor detail or suggesting a small change in a course of action. I am talking about a major step to the left, spending millions where no millions were budgeted, delaying by weeks when no delay can be tolerated. I am talking about declaring that the Earth revolves around the sun in conflict with established dogma that the sun revolves around the Earth. And by doing so directly challenging the decisions, authority, integrity and wisdom of powerful people.
Before you voice consider the following:
- Are your arguments based on demonstrable fact rather than emotion?
- Is there any whiff of self interest in your position?
- Are you personally clear on the moral issues involved – what moral wrong is being committed here?
- Are you doing this out of spite, anger or intense dislike of an individual?
- Do you understand the consequences of your actions and do you accept them?
Be strategic about delivering your message. To gain traction your fact based argument must be delivered in a clinical professional manner to the right people at the right time. Delivery may not be a single event. It may be more effective to put your case progressively over a series of progress or planning meetings.
Demonstrate empathy. Show real concern for the management position. "Yes, I understand we need to deliver on time, but sticking to that date is likely to have these negative consequences …" To demonstrate empathy is to experience the emotions of those whom you are attempting to influence. Remember Bill Clinton's famous response, "Ah feel your pain." Bear in mind that regardless of how a manager may present himself he is probably not evil. He is more likely ill-informed, not technically literate and focussed elsewhere (on costs and schedules).
Declare your motivation. Make it clear that you are acting on your professional understanding of right and wrong.
Present facts. In the special case of a functional safety program you need two things: 1) a Functional Safety Plan and 2) records such as hazard logs, safety requirements specifications and test results that lend credibility to the safety effort. The Functional Safety Plan, signed off by the boss, is your licence to operate. It gives you the right to raise objections if the project is deviating from plan. I had a plan, but my mistake was not pushing it high enough in the project hierarchy. I learnt from bitter experience that the plan must be signed off by the person with the organisational power to delay the deployment of a system and allocate money and other resources if they are required. Had I done that, the sentence "I thought we weren't going to do this" could never have been uttered. A plan commits its approver to a course of action. If there is no sign-off there is no commitment.
Drawing the boss' attention to the detail of a Functional Safety Plan on a large project can be problematic; on multibillion-dollar projects it's nigh on impossible. Don't be daunted; insist. It seems, by necessity, the managers of very large projects are more politicians than technologists. It is therefore unlikely that just e-mailing a copy to the boss will procure understanding. You must eyeball him. Sit down with him and go through it.
If, despite your best efforts, you are still ignored then, as a last resort, consider apprising senior people of the legal term "wilful blindness". It means looking the other way: putting yourself in a position where you will be unaware of facts that may render you liable. This is the core of Rupert Murdoch's problem with the News of the World phone hacking scandal. Most legal jurisdictions hold that if you intentionally fail to be informed about matters that make you liable, you are still responsible in law. A typical response of senior people after a disaster is "I didn't know xyz behaviour was going on." Witness BP, News International, Royal Bank of Scotland, Enron and Lehman Bros. In the engineering context senior people must be made aware that the project has a functional safety program, that it is a life and death matter, and that should their actions either knowingly or unknowingly reduce the effectiveness of the program they will be liable for the consequences.
Control your anger. Anger is a delicious emotion but it has no remedial effect. The need to voice often arises in the emotionally charged situations you find in deadline death marches. People are overworked and irritable and the boss is totally focused on a delivery date. Anyone who threatens that date often becomes the enemy. The motivation to voice often arises out of anger. Anger at a situation. Anger at an individual. It made me angry to see the discipline of Functional Safety Engineering denigrated, devalued and literally ploughed under by cost/time pressure. Substantial bonuses were being paid for early delivery and I was in the way.
To be an effective advocate you must control your anger. In his reflections on anger, Aristotle says there are times when anger is called for and appropriate. In fact, if one does not become angry over a grave injustice, he says one cannot be considered virtuous. The secret is in knowing when to be angry and then how to direct it usefully. The virtuous person, he says, becomes angry at the right time, over the right issue, and to the right degree. This is good advice. In a modern business environment an angry person can easily be characterised as an out-of-control malcontent whose pronouncements are not worthy of consideration.
Recognise you are not alone. If you speak out you will be, at least in a physical sense, alone. The less courageous will draw away from you. Your peers may even tell you that your passion is unseemly and unprofessional. You’ll be accused of being disloyal, not a “team player”, an angry malcontent, someone who doesn’t know what they’re talking about, someone who doesn’t see the big picture … and so on.
But you are not alone. Be aware that you are something more than an engineer presenting a technical point of view. You are an advocate. You speak for thousands of dead; past victims of avoidable dangerous failures (reflect on the eleven dead of Deepwater Horizon). You also speak for human beings who unknowingly may not have long to live if your recommendations are ignored. I like to calmly announce to rooms full of angry managers that not only do I speak for past dead and those whose lives are threatened but also the millions of engineers past and present who have developed the best practices that are being violated on this project (“I got you surrounded,” I say). In short, you represent the quantifiable truth as we currently know it. It’s also helpful to point out that these best practices are embodied in the international standards usually called out as conditions of contract. Our contract!!
Make them feel small. The larger a project, the more multidisciplinary it becomes. The people in charge may not be experienced systems engineers and may therefore not value the systems engineering discipline, or even accept that there is one. I struck one manager who felt he was qualified to give us technical direction because he knew how to operate a personal computer. In managing this perception it is useful to think of a cathedral. Go to one and experience the high vaulted ceilings, the massive space and, by comparison, your own insignificance – a speck of dust in the scheme of things. You can have the same experience by just looking up at the stars. It has a calming effect on everyone regardless of religious persuasion. The basic science, mathematics, technologies, processes and practices emanating from millions of man-years of thought and practice are our cathedral. Make it clear to detractors that we are operating from a massive body of knowledge and that to act against accepted best practice may expose them to unwanted risk.
Classify the bad behaviour. Make the players aware that they are engaging in a behaviour pattern that has been documented in the literature for more than two millennia, a pattern that has a high probability of ending in tears (reference the Deus Ex Machina Archetype).
Hold them to their commitments. You did have a Functional Safety Plan … didn’t you? They understand its content … don’t they … because you reviewed it with them … didn’t you? And most important of all, their signatures are on it; they made a commitment … didn’t they?
Make the consequences real. Many people view safety as a worthwhile idea but a somewhat abstract thing, easily pushed into the background when profits are threatened. Causing a human being to act requires wrapping that idea around an emotional charge. Make the consequence of death real, emotional, palpable and possible to those who would make bad decisions. I tell the story of a chemical engineer I once worked with. He undersized a pressure relief system on a pilot reactor killing a lab technician. I met him 15 years after the incident, the eternal sadness of remorse was still in his eyes.
The executives who took risks with Deepwater Horizon have had their "religious experience". The concept of people dying is no longer abstract to them. By now they are card-carrying risk managers. If you can contribute to someone having a simulated religious experience before the fact, you may have done them a great service.
Keep a diary. After-the-fact memories quickly fade. I guarantee you at some point you will have to remind someone of what they said or agreed to. You will also have to recall your own actions with perfect accuracy. This is not possible without a diary. Log who said what to whom and what was done on what date. Record decisions and their associated rationale and make sure you have copies of critical documents such as plans, safety requirements specifications, hazard logs and test results. Assume that one day a commission of enquiry will knock on your door and ask what went wrong and what you did to prevent it. If you are a safety authority your door will be the first they approach.
The decision to remove yourself from the situation (quit your job, fire the client) is usually driven by concern for your professional reputation and your ability to live with yourself. Attempting to change a bad situation with a principled resignation is usually ineffective. However, high-profile people who have taken this step more than once report that the outcome may not be negative. It can enhance your professional reputation and, if you've behaved professionally, absolve you of legal liability (remember the diary). The people who have threatened you with not-eating-lunch-in-this-town-ever tend to move on or end up in jail, and new horizons soon present themselves.
Unfortunately for engineers, a resignation will not make an unsafe system safe. If you feel strongly enough about it you may elect to blow the whistle (engage with regulatory agencies or the media). This step should not be taken lightly: gather up your evidence and take legal advice.
The Ethical Duty of a Systems Engineer
The Deepwater Horizon incident serves as a reminder of what man-made systems can do if poorly managed. And with each passing year we roll the dice on bigger technological risks. With each passing year, unknowingly, the general public’s dependence on us increases. Travellers routinely board larger and more complex aircraft taking safety for granted, people travel in driverless trains, uninhabited air vehicles circle overhead, we implant computing devices in our bodies and citizens make their homes closer to the blast range of chemical processing plants unaware of their dependence on safety systems.
Those who commit Deus Ex Machina are not evil, they are unknowing. In this environment we systems engineers are responsible for keeping people safe. Only we truly understand the technology, its potential for good and its dark side. Everyone else is helpless. Knowingly or unknowingly they trust us.
We have an ethical duty to come out of our mathematical sandboxes and take more social responsibility for the systems we build – even if this means career threatening conflict with a powerful boss. Knowledge is the traditional currency of engineering, but we must also deal in belief. The techniques of persuasion must become part of the engineering toolbox. If the safety integrity of a system is compromised by a bad management decision it is our duty to speak truth to power and change belief systems.
The alternative is to risk enduring regret for the shortened lives of the people who put their faith in our skills. Think it over and I’ll leave you with this:
Cowardice asks: Is it safe?
Expediency asks: Is it politic?
But Conscience asks: Is it right?
– Martin Luther King Jr.
- CBS News (2010), Video Extras: Escape From Deepwater Horizon, Survivor Mike Williams Recalls His Harrowing Escape, [Online], Available: http://www.youtube.com/watch?v=qnrIE8TrgSA [4 Mar 2012]
- Oil Spill Commission (2011), Deep Water: Gulf Oil Disaster and the Future of Offshore Drilling, [Online], Available: http://www.oilspillcommission.gov/ [4 Mar 2012]
- O’Toole J., Speaking Truth to Power: Old Tales and New of Leadership, Organizational Culture, and Ethics, [Online], Available: http://www.scu.edu/ethics/practicing/focusareas/business/truth-to-power.html [4 Mar 2012]