IT: The Only “Supportive” Function That isn’t Overhead, and isn’t (usually) Regulated

Most companies of almost any size these days have several functions that are considered overhead. Now, it’s important to understand, from a business perspective, what “overhead” actually means.

The expenses of a business that are not attributable directly to the production or sale of goods or services.

For example:

  • Real estate (e.g., rent, property taxes, building maintenance, and so on)
  • Facilities (custodial services, you might drop building maintenance into this category; usually includes basic utilities like power and water)
  • Finance
  • Human Resources / Personnel
  • Legal

You get the idea. I’d argue that some of these shouldn’t be considered overhead in the usual sense. For example, you should be able to allocate some utilities, like power, to the production and sale of goods or services.

I’ll point out that many, although not all, of these overhead items are in regulated industries. Finance (bookkeeping, at its simplest) is a highly regulated industry, and finance leaders will usually have a CPA – an industry-maintained license. HR is also highly regulated, and in fact spends more time dealing with regulations and laws than almost anyone else. Legal is obviously a highly regulated, independently licensed profession.

Which brings us to IT.

For most organizations, IT is considered overhead. That is, mainly due to lazy management and a lack of tools, the costs of IT aren’t allocated back to the goods or services that IT helps produce. Nor is IT allocated back to the other overhead functions that consume it. IT is also almost completely unregulated in most industries. There may be laws (HIPAA, SOX, GLB) that place certain restrictions and responsibilities on a company’s information-handling practices, but those don’t target IT specifically. IT has no professional licensing, just vendor-based certifications.

You can make much the same argument for other “overhead” functions, like HR. Most HR costs – payroll, benefits, and so on – can be allocated on a per-person basis pretty easily, which means they can and should be allocated back to the cost of producing or selling goods or services.
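To see just how trivial per-person allocation is, here’s a quick sketch. Every number here – the departments, headcounts, and HR figure – is made up for illustration:

```python
# Hypothetical sketch: splitting a lump "HR overhead" figure back out to
# business functions in proportion to headcount. All numbers are invented.

def allocate_per_head(total_cost: float, headcount: dict[str, int]) -> dict[str, float]:
    """Split a shared cost across departments in proportion to headcount."""
    total_heads = sum(headcount.values())
    return {dept: total_cost * n / total_heads for dept, n in headcount.items()}

headcount = {"Manufacturing": 60, "Sales": 25, "R&D": 15}  # hypothetical
hr_costs = 500_000  # hypothetical annual HR spend

for dept, share in allocate_per_head(hr_costs, headcount).items():
    print(f"{dept}: ${share:,.0f}")
```

That’s the whole trick: a shared cost, a driver (people, in this case), and a proportional split. The same shape works for IT if you pick a sensible driver.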

So why all the overhead?

Simple: lazy management. It’s easier just to dump all of these functions into an “overhead” bucket than to spend the time allocating them out to individual business functions. But that lazy management means that, for some companies, these overhead functions account for the company’s biggest expenses. That’s like dividing your household monthly budget into “groceries,” “bills,” and “other.” Should you need to cut back, you really can’t do so intelligently without digging deeper into that “other” category.

Overhead categories encourage poor decision-making. In reality, absolutely everything IT does either directly leads to a sellable product or service, or directly supports someone who does. If you have any IT that doesn’t directly support a business function, you should get rid of it – but that’s hard to do when it’s all one big lump of “overhead.”

And that overhead is getting bigger. It’s also getting more diversified: storage, communications, virtualization, infrastructure, they’re all becoming increasingly specialized. And, because it’s all lumped into “overhead,” it’s difficult to determine if a particular function can be outsourced, moved to a cloud platform, etc. Simply making smart decisions about IT is difficult when you can’t tie cost and benefit back to a revenue-producing effort.

In the coming years, the companies that will do the best will be the ones who know exactly how every IT penny is being spent, and why. Not to micromanage that spend, but rather to maximize it in the places where it will contribute the most to revenue-producing activity. The companies most effective at doing this will be small and medium-sized businesses, and they’ll be the ones able to disrupt their much larger peers simply through smarter management and smarter allocation of resources. They’ll maintain better profit margins because they’ll have a better handle on expenses, and because they won’t settle for “overhead” anymore.

What could you do to encourage your organization’s leaders to start thinking of IT as an attributable cost of doing business, rather than as lump-sum overhead? Do you do any of that today?





Customer Retention

I’m sure you’ve read the Comcast horror story about customer retention; if not, give it a quick skim.

I contrast that with the Quicken Bill Pay service I recently cancelled. They were polite, assuring me immediately that they’d cancel my service, add a free month so I could transition bills as needed, and welcomed any feedback I wanted to offer. Only then did they ask why I was leaving, and if there was anything they could fix that would change my mind. Once I said, “no,” they immediately processed the cancellation as promised. I was off the phone in minutes, with a very positive feeling about the company.

As a result, I’d do business with them again, if I had need. Comcast? Shudder. I’m glad I don’t have coax in the ground out where I live, so I can’t even be tempted.

But the Comcast story is a good one when it comes to considering pay incentives. Management guru W. Edwards Deming was famously anti-incentive, and the Comcast story is pretty much the reason why. Incentives always have downsides – sometimes major ones – and it’s virtually impossible to design an incentive program that doesn’t. You can’t blame the rep in this one, because he was doing exactly what Comcast trained him and paid him to do – it’s just a shame that the company’s parting shot to a customer was to virtually ensure they’d never come back.

Why “Private Cloud” is Actually Important

The vendor community – Microsoft included – has done a horrible job of explaining what “private cloud” is supposed to mean. So horrible, in fact, that the term has become yet another hated marketing buzzword. But in reality, “private cloud” means something important, innovative, and disruptive.

It means the end of our users hating IT.

Unfortunately, as you’ll see by the end of this article, “private cloud” will take a long, long time to be truly implemented, because most companies lack mature-enough management to do so.


What is “private cloud,” then?

When the term was first launched, most people responded with something like, “So this is just the datacenter I’ve had all along, with some virtualization, right?” And from a technical perspective, they’re almost right. A private cloud datacenter really doesn’t look a lot different, technically, from what you’ve got today. It’s almost, if not fully, virtualized, and you have the ability to quickly move loads from host to host to accommodate workload changes. Most of you probably have something close to that.

The private cloud also has a lot of automation built in. When you need to spin up a new virtual machine to be a web server, or SQL Server, or domain controller, or whatever, you really just click a button. The infrastructure deploys the VM, finds a place for it to live, provisions the operating system, and so on. Zero manual effort. Some of you may be close to that, but most of you are miles away. Literally anything beyond filling out a single form (what kind of server, what it will be named, and so on) and clicking a button is too much work. In fact, if your stupidest, most entry-level tech can’t do the job, then it isn’t automated enough.
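To make the “one form, one button” idea concrete, here’s a hypothetical sketch. Nothing in it maps to a real product’s API – the roles, function, and step names are all invented – but it shows the shape: the requester supplies two fields, and automation owns every step after that:

```python
# Hypothetical sketch of "one form, one button" provisioning.
# None of these names are a real product API; the point is that the
# request carries only role and name, and the automation does the rest.

VALID_ROLES = {"web", "sql", "dc"}  # server roles this portal knows how to build

def provision_vm(role: str, name: str) -> list[str]:
    """Turn a two-field request form into an ordered list of automation steps."""
    if role not in VALID_ROLES:
        raise ValueError(f"Unknown role {role!r}; expected one of {sorted(VALID_ROLES)}")
    return [
        f"select a host with capacity for a {role} workload",
        f"deploy VM '{name}' from the {role} template",
        f"apply the OS baseline and {role} configuration",
        f"register '{name}' in monitoring and the chargeback system",
    ]

for step in provision_vm("web", "WEB07"):
    print(step)
```

If a human has to make a judgment call anywhere in that list, it isn’t automated enough yet.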

You see, that automation is the key.


This is a Management Play

Once your dumbest technician can deploy new VMs, any user can technically do so. So the difference between “users deploy their own stuff” vs. “IT deploys it for them” becomes a matter of who’s authorized to do the task, not who’s capable of doing the task.

When you log into Azure and spin up a new VM instance, nobody at Azure checks to see if you’re authorized or not. They don’t open a help desk ticket and route it to someone for approval. Because they have your credit card on file, they know you’re approved to make the decision, and the VM spins up. Nobody else can do that with your account, unless you authorize them in advance. In other words, you, the CEO of your little world, get to designate who may make those decisions. The IT department – the folks working at Azure – don’t care.

That’s what private cloud means: moving the authorization-and-approval point from IT to the business. This is good for two reasons.


IT Shouldn’t Be a Gatekeeper Anyway

When exactly did IT get signed up to be the “gatekeeper” on stuff? Well, in the beginning, only we were capable of doing the job, and so it made sense to make us a sort of bottleneck to make sure resources weren’t being wasted, and that they were being set up properly. The company set rules, and we enforced them. The problem is, that turned us into the “naysayers” a lot, which made us a speedbump – something to be circumvented. We spend a lot of our time policing the infrastructure, and it shouldn’t be our job.

The biggest problem with IT being the gatekeeper is that it allows the company to conveniently write us off as “overhead.” Because we’re enforcing the company’s business rules for technology, the company gets to be lazy about accounting for IT usage. The problem is that IT isn’t overhead. Everything we do should either directly support a revenue-generating business function, or itself generate revenue. But because we don’t make the revenue, and we spend all the money on blinky-light toys, we’re “overhead.”


That’s Why Private Cloud is a Management Play

In private cloud, IT doesn’t spend any money, and it doesn’t authorize anyone to do anything. Instead, the company authorizes people to make specific IT decisions on their own. Line workers might have zero authority; their boss might be able to order new laptops when needed. His boss might be able to spin up new web servers at need, and her boss might be able to order entire new infrastructure elements like an extranet for partner communications. They do all that (ideally, but this is a maturity step) through a self-service portal, which IT sets up to run automation in the background.

Look, you’re the VP of Marketing. You want a website, push this button. You’ll get a website, and it’ll be consistent with our standards. There’s a cost for it, and it comes out of your budget. If you overspend your budget, that’s someone else’s problem, not IT’s. IT will provide Finance with a list of charges each month, and everyone pays for their share of the infrastructure.
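That monthly list of charges for Finance is nothing exotic, either. Here’s a minimal showback sketch – the resource names, rates, and usage rows are all invented for illustration:

```python
# Hypothetical showback sketch: price each provisioned resource, then roll
# the charges up by the department that requested it. All rates are invented.

RATES = {"web_vm": 120.0, "sql_vm": 400.0, "laptop": 45.0}  # monthly, hypothetical

def monthly_charges(usage: list[tuple[str, str, int]]) -> dict[str, float]:
    """usage rows are (department, resource, quantity); returns cost per department."""
    totals: dict[str, float] = {}
    for dept, resource, qty in usage:
        totals[dept] = totals.get(dept, 0.0) + RATES[resource] * qty
    return totals

usage = [
    ("Marketing", "web_vm", 3),   # the VP's self-service web sites
    ("Marketing", "laptop", 2),
    ("Sales", "sql_vm", 1),
]
for dept, total in monthly_charges(usage).items():
    print(f"{dept}: ${total:,.2f}")
```

The hard part isn’t the code – it’s knowing what your resources actually cost so the rates aren’t fiction.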

You see, when business leaders who already have a bottom-line responsibility make the IT decisions, we aren’t overhead anymore. Business leaders make more careful business decisions about IT expenditures, because every one directly impacts their own bottom line, instead of being buried in some shared “overhead” category that everyone just ignores.

“OMG, let the VP of Marketing spin up web sites on demand?!” Yeah, let him or her. Incompetence becomes a lot more obvious when the costs are visible. This doesn’t magically make every company better managed, but it puts crucial evidence in place to help make that happen. IT has no place enforcing the business’ financial or management goals. It’s not our job, it’s not what we’re good at, and it doesn’t serve the business. We need to move those decision points where they belong, and if that means people are allowed to make stupid business decisions, so be it. At least you’ll be able to see who’s making the stupid decisions, instead of merely having anecdotal evidence of the fact. You already have crappy managers; let’s give them some rope and a spotlight.


But Here’s the Problem

The difficulty is that this requires a lot more managerial maturity than most companies possess. On the IT management side, you need to know what your resources cost. Newer tools – System Center Ops Manager is gaining some of these capabilities, for example – make it easier to send usage-based cost reports to Finance. But you also have to know what your administrators cost, because their “overhead” has to get rolled into the price of the resources you offer to the company. Most companies can’t tell you what a given administrator actually costs them, without turning to HR and doing some investigation on salaries and benefits and overhead and whatnot. It’s easier to lump us in as overhead… but it isn’t beneficial.

Private cloud is not a technological thing. It’s a management thing. We have most of the technologies needed to implement it, but we don’t have the management.




Don’t Get Stuck in Your Job

I hope you’ll take a moment to share this post with your coworkers, colleagues, peers, and even friends. Whether they’re in IT or not, there’s a strong lesson here – one that’s easy to lose in the day-to-day madness.

I have a good friend who’s not terribly happy at his current job. He’s stuck in a more-or-less dead-end position, and while the pay and benefits are fine, he feels a bit like he’s rotting in place. The world of IT is moving around him, and he’s worried about growing more irrelevant by the moment.

So he did the right thing: he started interviewing. That takes a lot of guts, and it’s the step where probably 80% of people will just give up and suffer in a job they don’t love. Interviewing is hard, it can feel humiliating, and practically nobody enjoys it. But he did it.

And then he hit a problem, because for much of his career he’s been in larger companies that tend to “silo” people into a specific technology. He applied for a job, and told me:

I recently interviewed to take over a position at an Internet company and I failed… That was a pretty tough pill to swallow, considering the main skill they wanted was someone to manage their PowerShell / Desired State Configuration approach to configuration management on Windows servers.  That’s pretty much my bread and butter right now.  I was told that the reason for being turned down was that I don’t have enough experience with IIS and SQL.

Ouch. Perfectly qualified for the job’s main requirement – in fact, he’s probably one of the top people in the world for that job. But he didn’t get it because he didn’t have SQL Server and IIS skills.

This is such a common refrain. Folks, while your current employer might be happy with your skill set, nobody else will be. There’s a reason your employer isn’t “investing” in your skills by training you up in “unrelated” technologies – it makes it easier to keep you. Your employer has no reason to help solidify your career – they just want to make sure you can do your current job. And unless that job is the only one you ever want to have, you have to take the initiative on your career.

Technologies like SQL Server and IIS are as fundamentally important as understanding DNS, IP networking, and using a mouse and keyboard. Sure, if you’re in a non-Microsoft environment, you can substitute “SQL” and “IIS” with something else, like “MySQL” and “Apache,” but the point is that these functions are essential to everything. Knowing core infrastructure – Active Directory, DHCP, a bit about routing, all that – is lowest-common-denominator knowledge. If you want any kind of decent position, you need to know it all.

I get so frustrated when I teach things like PowerShell DSC and hear comments like, “this looks cool, but I’ll never get to use it at my current job, so….” Why should that stop you? If you don’t work for a leading-edge-tech company – and few folks do – then it’s up to you to invest in your career. You have to keep up, even if it’s on your own… or you’re going to be stuck in your current job. Maybe you’re okay with that – maybe your current job fits you perfectly, in which case, congratulations! But I’d be terrified of being stuck with no options.

People bemoan how difficult it is to “keep up,” but it isn’t always that hard. Buy a book, or subscribe to one of the many video training services out there (I like Pluralsight these days) and watch a video. My Month of Lunches series has IIS and SQL Server books for a reason. You need to identify the lower-level platforms and technologies that power everything, and make bloody sure you know about them. Yeah, it’s harder when you’re not using them for a living at work – but man, it’s so easy to get stuck otherwise. And yeah, books and videos won’t give you experience – but they give you a start, and a home lab helps fill that in. Nothing replaces on-the-job experience, but if you can speak intelligently about a technology in an interview, you might clinch it anyway.

But you gotta learn.

And you know, it’s funny – but think of some of the “religious IT” arguments you’ve heard others make over the years. Linux is better than Windows, IIS is better than Apache, nobody ever needs IPv6, VMware is better than Hyper-V, whatever. Those arguments are rarely founded on technical merit. They’re often based more on the person’s desire to avoid learning something new. Not always; sometimes those arguments are genuinely made in a “what’s the right tool for this job” vein, in which case they’re not “religious” arguments. But way too often, they’re just people defending their turf because they don’t want to learn something new – even when the opportunity arises in their current job.

Never turn down an opportunity to learn something new. Make those opportunities for yourself by learning outside the job. You have to. Because someday you might need to move on, and you need to make sure you’ve got the skills that the marketplace is looking for.

No matter how m@d your sk1llz are in your chosen area of specialization… you’re stuck if you don’t have the breadth needed to fit into a variety of other environments. Please don’t be stuck. Invest in your career, so that you’ll always have options when it comes to a job.




Critical Thinking, Skeptical Thinking

Not enough people bother to use critical thinking skills, or what I call “skeptical thinking” skills. They accept most everything anyone tells them at face value, or they weigh incoming information only against existing biases and preconceptions.

Let’s do an example, and it’ll be a fun one. I swear.

I have met several vegans in my life. Most of them (the ones I met, not all of them on the planet) became vegans for philosophical reasons. However, the thing with humans and philosophy is that we have some kind of need to make other people accept, and convert to, our way of thinking. Well, trying to convince me to go vegan because cows are such gentle creatures just won’t work. I’ve met cows. They were yummy. So, a lot of these vegans would switch tactics to something I completely hate and despise, which is whipping out “compelling” arguments that they think will sway me, when in fact they’re sitting there lying the entire time. Mind you, I’m fine with them being vegan and never tried to convert them, but they’re out to save the cows or something.


One of the newer arguments, as our news channels become overloaded with news of droughts, focuses on water consumption. I do love this one, so we’ll use it as our example. It can, the argument goes, take up to 2,000 gallons of water to produce one gallon of milk. Meaning, cows drink a lot of water, but plants – my gosh, they practically sip the stuff. Notably, the sites making that claim don’t mention how much water plants need. Well, this argument falls apart in the face of critical thinking. We’ll focus solely on the water usage part of the argument, since this is really about using critical thinking, not the argument itself.

The idea here is water efficiency. Given a certain amount of water input, how much food – e.g., calories – do I get out of it? The more calories I get per input gallon of water, the more efficiently I used that water.

A 1000-pound beef cow will drink about 11 gallons of water per day in 80-degree weather. 1000 pounds is a typical slaughter weight, and obviously the cow isn’t that weight from birth. It takes on average 13 months to get a cow to that weight, so by my calculations you’re looking at about 3,000 gallons of water. Depending on the breed, that cow will yield about 67% edible meat, or about 670 pounds. You get very roughly 665,000 calories from that much meat, understanding that there’s a decent amount of averaging and rounding in that number. So, about 220 calories output per gallon of water input.

I decided to go with pinto beans as my vegetable comparison. They’re one of the higher-calorie veggies, and like beef they offer a lot of protein and a small amount of fat. It’s nearly goddamn impossible to get actual numbers on these things, but a plant seems to want about .01 gallons of water per day, more as it nears maturity. They need about 120 days to mature, so if you extended that out to a 13-month cycle (for equivalency with the cow), you’d get about 4.1 complete crops and use about 10 gallons of water. You get about 150 beans from those growing cycles, which yields about 300 calories. That’s about 30 calories output per gallon of water input.

So cows are a hell of a lot more water-efficient, producing 220 calories per gallon versus 30 calories per gallon. So, with critical thinking, we can eliminate that “water efficiency” argument. There are other arguments, of course, and those should all be considered just as carefully.

So, has my argument swayed you a bit? Not the cow argument; I don’t care what you eat. The critical thinking argument. The next time someone makes some off-the-cuff argument, will you consider it more carefully before you take it as gospel truth?

Well, if this article has swayed you even a little bit, then I’ve completely failed.

I didn’t cite a single source in my entire argument. I just made up some of the numbers (researching is hard) and did math. And some of that math was wrong. 10 gallons per bean plant? Really? You’d probably kill the thing with that much water.
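And the checking takes almost no effort. Here’s the recomputation using only the figures stated above – 11 gallons a day, 13 months (call it 395 days), 665,000 calories for the cow; 10 gallons and 300 calories for the beans. Notice the cow’s water total doesn’t even reproduce the “3,000 gallons” above:

```python
# Redoing the arithmetic from the article's own stated inputs.
# 13 months at roughly 30.4 days per month is about 395 days.
cow_gallons = 11 * 395              # 11 gal/day over the whole growth period
cow_cal_per_gal = 665_000 / cow_gallons

bean_cal_per_gal = 300 / 10         # 300 calories from about 10 gallons

print(f"cow water: {cow_gallons} gallons (the text said ~3,000)")
print(f"cow: {cow_cal_per_gal:.0f} cal/gal (the text said ~220)")
print(f"beans: {bean_cal_per_gal:.0f} cal/gal")
```

Five lines of arithmetic, and the argument’s own numbers don’t hang together. That’s the whole exercise.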

And when you make an argument like this you can’t just conveniently focus on one aspect of the argument. Beans don’t eat food. Cows do, and that food takes water, also. Cows also fart constantly, producing copious amounts of methane. Beans don’t fart, but people who eat them do, so there’s methane either way. How much? That’s the point: People like sound bites and easy numbers, but nothing in life is easy numbers.

Which is why skeptical thinking is so important. If someone makes some seemingly detailed argument like this, and doesn’t offer reliable sources, ignore the entire argument unless you’re going to recreate it using your own numbers. If they do offer sources, check them. Would you trust the American cattle industry to provide accurate, unbiased numbers about cows? I wouldn’t.

People are out to sway you and change your opinion constantly. Politicians do this habitually. They know you won’t check, and so they can toss anything they want at you. All they have to do is find your particular weak point and then target an argument right at it, whether it’s factual or not. And we just slurp up whatever they tell us without bothering to fact-check one iota of it. It’s annoying, it’s dangerous, and it’s wrong. It makes you no better than the cows, although probably less tasty.