Dear Dad,

Thanks for the e-mail. I enjoyed the joke, although I don’t really get the part about the nun and the rabbi. What you described them doing seemed out of character somehow.

Glad to hear your European trip was such a success. A German version of our new book, Selling on the ‘Net, would be great, but since neither of us speaks German I’m not sure how we’d collect and analyze a bunch of German Web pages. Any ideas?

Speaking of the book, I know I promised to shamelessly promote it in my column. It just seems so crass, weaving in the title and publisher and pretending my readers really need to know it’s called Selling on the ‘Net by Herschell Gordon Lewis and Robert D. Lewis, published by National Textbook Company, for $39.95.

It would be one thing if there were any possibility of subtlety. Anyway, I’ll try.

You asked about my description of information theory a few weeks ago. I know it was a long shot. To answer your question, though … I may be, but since that’s hereditary you may not want to push the subject too far.

And besides, if it hadn’t been for the typesetting problem it would have been far clearer. Information entropy really is a great measure of complexity. Let me show you how you’d use it. You use four programs for most of your work: WordPerfect, AOL, the American Heritage Dictionary, and that CD-ROM with the world’s famous literature. WordPerfect is critical – give it a 5. AOL is important, so it gets a 3. You use the other two, but you could easily do without them, so they get 1 each. Add ’em together and you get a total of 10 points.

The formula should have been printed as:

-SUM[p(i)log(p(i))]

(You need the base 2 logarithm, so divide the regular, base 10 log by log(2) to convert it. Last time I used algebraic notation, but typesetters have a hard time with that.)

To get p(WordPerfect), divide WordPerfect’s score of 5 by the score total of 10 to get .5. Plugging all of your scores in:

-[.5*log(.5)+.3*log(.3)+.1*log(.1)+.1*log(.1)]/log(2)

which comes out to around 1.7. That’s a pretty low score – a pretty reasonable work situation. When it gets much over 2 things are getting complicated.
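If you’d rather let the computer do the arithmetic, here’s the whole calculation as a short Python sketch. The program names and weights are the ones from the example above; nothing else about it is official.

import math

# Importance scores from the example above: 5 = critical, 3 = important,
# 1 = could easily do without.
scores = {
    "WordPerfect": 5,
    "AOL": 3,
    "American Heritage Dictionary": 1,
    "World literature CD-ROM": 1,
}

total = sum(scores.values())  # 10 points

# Information entropy: -SUM[p(i) * log2(p(i))]
# (math.log2(p) is the same as math.log10(p) / math.log10(2).)
entropy = -sum((s / total) * math.log2(s / total) for s in scores.values())

print(f"Complexity score: {entropy:.2f}")  # 1.69 -- the "around 1.7" above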

I know most business people avoid any statistic more complicated than a percentage, Dad, but that doesn’t have to be an absolute rule. If the best measure takes a bit more algebra, that shouldn’t be a hard barrier, should it?

Look at the formula for Return On Investment (ROI). That’s a polynomial series. A lot of people don’t understand why you put all of your costs into an artificial “Year 0” (you pretend all of your costs happen up front, before you start figuring annual cash flows), but if you don’t, the polynomial can have more than one right answer. That gets hard to explain. (“Well, boss, the return on investment is either 15.3% or -5.8%, and we have no way to choose which one is more valid.”)
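If the two-answers claim sounds like hand-waving, here’s a short Python sketch of it. The cash flows are invented for illustration (an outlay, an inflow, then a decommissioning cost), and numpy does the root-finding:

import numpy as np

# Illustrative cash flows: Year 0 outlay, Year 1 inflow, Year 2 outlay.
cash_flows = [-1600, 10000, -10000]

# Setting NPV to zero and multiplying through by (1 + r)^2 turns it into
# an ordinary polynomial in x = 1 + r, with the cash flows as coefficients.
roots = np.roots(cash_flows)

# Keep the real, positive roots and convert back to rates of return.
irrs = sorted(x.real - 1 for x in roots if np.isreal(x) and x.real > 0)

print([f"{r:.0%}" for r in irrs])  # ['25%', '400%'] -- take your pick, boss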

Or compound interest. You take the interest rate (plus 1) and raise it to an exponent equal to the number of years you’re compounding. People who don’t understand compound interest don’t even know if they’re losing ground to inflation, so it’s pretty important.
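Here’s that formula as a runnable sketch. The 4% interest rate and 3% inflation rate are placeholders, not a forecast:

principal = 1000.00
savings_rate = 0.04  # 4% annual interest
inflation = 0.03     # 3% annual inflation
years = 10

# Compound interest: principal * (1 + rate) ^ years
nominal = principal * (1 + savings_rate) ** years

# Deflate by the same formula to see the value in today's dollars.
real = nominal / (1 + inflation) ** years

print(f"Nominal: ${nominal:,.2f}")  # $1,480.24
print(f"Real:    ${real:,.2f}")     # $1,101.44 -- barely ahead of inflation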

I know Americans take pride in being mathematically challenged, Dad. That doesn’t mean I should encourage the “Gumping of America” (GoA, to use the technical term) in my column. Ignorance may make good box office in Hollywood, but it’s a bad way to manage a business.

Okay, I’ll stop. Talk to you soon.

Love, Bob

P.S. I promise, I’ll find some way to flog Selling on the ‘Net in my column. Sigh.

Culture is the new IT governance.

No. It isn’t. Not yet. Culture should be the new IT governance.

The IT Governance Institute’s definition of IT governance is as good as any: “… leadership, organizational structures and processes to ensure that the organization’s information technology sustains and extends the organization’s strategies and objectives.”

Nothing wrong with the concept. IT’s priorities should be driven by business considerations. Setting them through the consensus of its stakeholders seems sensible.

It is sensible. Where IT governance goes sideways is where oversight usually goes sideways: A failure to understand that Homo sapiens has two subspecies: Steven Spielberg and Jeffrey Lyons. Either you helped make the movie or you’re a critic.

In most companies, most of the time, the IT Steering Committee is Jeffrey Lyons. It doesn’t really exist to set IT’s priorities. It exists to review, critique, and for the most part reject suggestions as to what IT’s priorities might be.

The IT Steering Committee, that is, isn’t a strategy-setting team that collaborates to decide how the company can best take advantage of what IT has to offer. Instead it’s become a group of critics, who see their job as ensuring IT doesn’t go off and waste precious company resources on pointless technological extravagances.

In case the problem still isn’t clear: too often, the IT Steering Committee’s mission isn’t to help put good ideas into practice. It’s to prevent bad ideas from becoming bad practice. The result: It makes sure the business never tries anything except the safest ideas.

Which is one reason, and a very important one, that shadow IT is on the rise: Departments commissioning their own information technology don’t have to jump through any of the IT Steering Committee’s flaming hoops.

There’s another, related reason: The company has to be careful how it allocates its “scarce IT resources” so they provide the maximum return.

This sounds convincing, until you ask why IT resources are so scarce. Usually, it’s for one or both of two reasons.

The first has been pointed out in this space several times before: Companies try to cut costs by trimming the IT budget, not realizing this is like trying to cool a room by blowing cold air at the thermostat. The more cold air you blow, the more everyone swelters.

The second reason is a terrible trend: IT’s resources are scarce because of the fondness boards of directors and top-level business executives have for financial engineering.

Here’s the math: In their most recent fiscal year, the Fortune 500 will have earned an aggregate $945 billion. But as reported by Bloomberg last fall, they’ll “invest” 95% of it in stock buy-backs, leaving only $47 billion for all forms of reinvesting to achieve competitive advantage. All new IT spending has to come out of that residue.

If IT resources are scarce, it’s an artificial and deliberate scarcity. Rather than fight for these artificially scarce resources, business managers at all levels are increasingly walking away from the struggle, instead rolling their own IT through a combination of SaaS solutions and cloud-hosted custom systems written by outside developers.

As pointed out last week, this avoidance of formal IT oversight creates three very real risks: re-keying of data that should flow automatically through IT’s integration mechanisms; exposure of sensitive corporate data to outsiders who have no business seeing it; and failure to adhere to the company’s painstakingly arrived-at set of official data definitions, which will, in turn, make both re-keying and automated integration problematic.

Which in turn leaves only three possible solutions. The first is to live with the problems — probably not a good idea, as they are preventable without all that much additional effort.

The second is to apply existing enforcement mechanisms to shadow IT. They’ll work, but they’ll slow down something whose principal virtue is that it speeds things up.

That leaves the best alternative: Culture. Members of a culture enforce its norms through social coercion, greatly reducing the need for official sanctions. It’s efficient, because everyone in the company internalizes its culture without any formal training. Employees know the rules.

The downside: Establishing and maintaining the desired culture is hard work — not hard the way nuclear physics is hard, but hard the way laying cinder block is hard.

But it’s worth it. The right culture delivers the right results without the heavy hand of enforcement, letting leaders apply a much lighter touch.

Thomas Kuhn published The Structure of Scientific Revolutions in 1962, redefining the dialog about how science progresses, changing it from a purely philosophical prescription to one that incorporates the sociology of how real scientists actually behave.

In it, to his everlasting damnation, he introduced the phrase “paradigm shift,” which has since been tortured to insanity by a generation of management consultants who never bothered to actually read Kuhn’s seminal work.

It’s a shame, because the underlying idea — that major advances entail a complete change of perspective and worldview — is quite valuable. A paradigm shift doesn’t disprove old ways of looking at things. It makes them irrelevant.

For example: IT is in the midst of a paradigm shift that makes the old idea of requirements irrelevant.

“Requirements” is a holdover from a time when functional managers and end-users asked IT for a system. IT, responsible for delivery of working software that did something useful, asked the logical question, “What are your requirements?” What we got in response was a series of attributes. Software that possessed those attributes “fulfilled the requirements.” Whether it did something useful for the business wasn’t IT’s problem.

That isn’t how things work anymore, for which we should be eternally grateful. IT and functional managers now share responsibility for making sure business change happens. With this new scope, IT and representatives of every affected part of the enterprise must collaboratively redesign the target business function.

Defining requirements is irrelevant to this process. Instead, we must create a high-level “functional design” that describes new business processes, employee roles, and the technology that will be used by employees in their new roles to implement the new processes.

“High level” is a vague term. In practice, it means allowing no more than seven boxes in any diagram and no more than seven bullets of text in any narrative. When necessary for clarity, you can elaborate each box or bullet with one more level.

Once everyone agrees that this new design will work, you can drill down to as many more levels as necessary to create a detailed specification from which developers can code, testers can test, and trainers can train employees in their new roles.

It might seem that the difference between requirements and design is simply word play. It isn’t: Requirements are attributes; designs describe how things will work. This is far more than a semantic distinction.

It’s a different paradigm.