Sunday, March 22, 2015

Value * Easy = Consumption

Over the past decade I've watched a constant stream of IT executives and technology professionals treat cloud as a threat to their careers.  Viewed through the eyes of an internal IT shop whose business has been a captive customer, I can understand the worry.  Now they're being asked to enable innovation instead of taking orders; to bring solutions to the business instead of begrudgingly accepting new challenges.  What I can't understand is why they don't see the other side of the cloud coin, the very equation which drives cloud adoption: Value * Easy = Consumption.

Public Cloud has been built on three value propositions: availability of resources, delivered in a short time period, without capital investment.  Those three values align with the strategic goals of every CxO no matter how you write them:
  • "Do more with less"
  • "Improve agility, elasticity, efficiency"
  • "Reduce costs"
  • "Shift from maintain to innovate"
  • "Remake the cost curve" (*my personal favorite)
Those are just a sample of quotes from CxOs I've worked with over the past decade.  Moreover, each of those CxOs held a common opinion of IT: too slow and too expensive for the value delivered.  This is the environment into which AWS started selling its cloud capabilities, back before we had the phrase "cloud computing".  It's important to remember AWS grew out of Amazon's own internal needs; it was not the result of market surveys and product development.  Although Fortune 1000 adoption of public cloud has been slow, the concepts of cloud computing rapidly penetrated corporate America in an attempt to bring the AWS value proposition to the enterprise.  Shifting from a hardware-centric view to a capability-centric view of infrastructure is a major upheaval in approach.

Given that very few companies have been successful in adopting cloud, what's the holdup?

Whereas a CIO can buy "Value" in the form of tools (BMC, VMware, etc.) or rent it (AWS, Google, Azure, etc.), the truth is they can't buy, rent, lease, borrow or even steal "Easy".  Making something easy isn't easy, and cloud is anything but easy.  Put yourself in the shoes of a business executive such as the Chief Marketing Officer or the Chief Financial Officer.  In your world you have very few hard assets, having shifted almost everything to a lease model, from office space and PCs to digital advertising and audit.  You can shift your spend as your business changes throughout the year.  What you need are technology solutions able to meet your need for agility, elasticity and efficiency.  How does your IT team respond?  By building out large-scale data centers, buying servers, and writing software.  Do any of these approaches appear to be in sync with the CMO's and CFO's needs?  No.  In fact strategic planning with IT is so difficult that these leaders are increasingly willing to go outside the company, which is not easy, to get what they need.  They're willing to stake their reputation and the success of their team on convincing the CEO, corporate security, the office of the general counsel, and fellow business leaders that going around IT is the right strategy.  Then they spend money on consultants and hire talent to move in the new direction.  And yet all of that is considered easier than getting solutions from IT, the place they would prefer be the first, last and only stop.

Without a concerted effort to make cloud use easy, the entire equation is upset.  Easy is the governor on the economic engine of cloud.  Having cloud capabilities, being able to deliver the "Value", isn't enough.  When done right, "Easy" is a multiplier of "Value" and drives consumption significantly beyond expectations.  At that point IT executives and technology professionals no longer view cloud as a threat to their careers; it morphs into a driver of career opportunity.  Their own value increases dramatically, particularly their strategic value to the long-term success of the business.  In my experience it's much more rewarding to have a seat at the table discussing how to accomplish some new goal than to be berated as the barrier to accomplishing an old one.
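The multiplier effect is easy to see with numbers.  Here's a minimal sketch in Python; the figures and the 0-to-1 scale for "Easy" are my own illustrative assumptions, not a formal model:

```python
# Toy illustration of Value * Easy = Consumption.
# The numbers and the 0-to-1 scale for "Easy" are hypothetical,
# chosen only to show "Easy" acting as the governor on consumption.

def consumption(value: float, easy: float) -> float:
    """Easy throttles Value: 0 = painful to use, 1 = frictionless."""
    return value * easy

# A high-value platform that is painful to consume...
hard_platform = consumption(value=100.0, easy=0.1)   # 10.0
# ...loses to a lower-value platform that is frictionless.
easy_platform = consumption(value=60.0, easy=1.0)    # 60.0

assert easy_platform > hard_platform
```

However you calibrate the scale, the point stands: drive "Easy" toward zero and no amount of "Value" produces consumption.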

Cloud in the enterprise will never be a success without "Easy".

Sunday, March 15, 2015

We're Well On Our Way to Serverless Computing

As I discussed in my first post, I came up with an idea I titled "Serverless Computing" in 2002.  At the time I was frustrated by the limitations of web, application and data server capabilities.  I was implementing a rather amazing B2B marketplace for a drug company: a leading-edge architecture I had developed with XML at its core, along with Java messaging services and styling objects to render the final views.  The same architecture and implementation had to support multiple lines of business without any crossover.  My frustrations led me to start questioning everything.  If things weren't working, why was I continuing to do everything the same way I had before?

In the middle of a snowstorm in central Connecticut, sitting in a frozen rental car waiting for warmth (I'm from the South), I had an epiphany.  I realized most of the constructs of computing are driven by human needs, not the computer's.  It dawned on me that all my architecture work was about putting stakes in the ground as anchors for our thinking and development of the portal.  I was reasonably good at refactoring applications to improve security, efficiency, and efficacy; why not apply the same thinking to architecture?  I scurried down a path of thinking that led me to the conclusion that our modern architectures are built on so many layers of abstraction that we've lost sight of the why.  I was perpetuating the problem by blindly following the norm.

Ask yourself this: where did the concept of a "server" come from?  Many people refer back to the origination of client/server computing: decoupling the processing unique to each user (the client) from the processing common to everyone (the server).  Once software was decomposed into two complementary applications, it could run on two different computers, where the server does the heavy lifting and is therefore optimized for its workload.  In reality client/server is an extension of the mainframe architecture, where desktop PCs replace the dumb green-screen terminal and, by virtue of having a processor on board, share some of the processing load.  That's all good, but what drove the creation of the mainframe, and therefore client/server, was economics.  By centralizing processing power and enabling remote access, mainframes delivered a reasonable economic model for the automation of basic business tasks.  Dumb terminals made sense when people and the mainframe were local, few applications existed, applications were simple, and the cost of infrastructure was high.

Today none of the original drivers of mainframes and client/server exist, yet we still use the architecture unchanged.  If you walked someone from 1965 through an ultra-modern data center, they would surely mistake the mass of pods for mainframes.  Those massive data centers are nowhere near where the people who use them work.  In fact we can no longer assume employees even work in buildings, or from 8am to 6pm. The software landscape consists of billions of applications, with thousands more created every day.  And the cost of infrastructure is so low, thanks to density and scale, that a modern smartphone has more processing power, memory, storage and network bandwidth than a "server" did just a decade ago.  We are surrounded by highly capable, network-accessible computing devices which spend the majority of their lives I/O bound, just waiting around for something to do.  Why are we letting all that computing power go to waste?  We're ignoring the real promise of cloud computing, a concept closer to P2P than to the Internet and what we think of as public cloud today.  I'm talking about massive distribution of applications, data and infrastructure; the kind of infrastructure people cower at when talking about cloud, but fully embrace when talking about the Internet of Things.

We need to rethink our approach to computing.  Period.

When you tear out the non-value-added elements of client/server, the one tenet which survives is decoupling: separating the user interface from the business logic.  Decoupling's primary value proposition is at the software layer, not hardware, as we are reminded every time a web server goes down.  And hardware, when viewed through cloud computing optics, is nothing more than a pool of resources (compute, memory, storage, etc.).  If we take the widely available computing resources we have on our client devices and run our user-oriented "server" software there, we gain several benefits:
  • decreased impact of "server" outages
  • reduced complexity of "server" environments
  • federation of power consumption over the entire power grid
  • elimination of the need for large, centralized data centers
  • reduced long haul bandwidth requirements
  • increased barriers to DDoS attacks and reduced risk of penetration, as key data never leaves the premises
Preposterous!  You're crazy! Insane! Never! Yet that's precisely the direction we're heading in; in fact, we're fast approaching the arrival platform.
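To make the decoupling concrete, here's a toy sketch in Python: the "server" half of a decoupled application running on the client device itself, reachable only on localhost so the data never leaves the machine.  The endpoint and logic are hypothetical stand-ins for illustration, not a production design:

```python
# Hypothetical sketch: user-oriented "server" software running on the
# client device itself.  The business logic stays decoupled from the
# UI, but the process serving it lives on localhost, not in a distant
# data center.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def business_logic(n: int) -> int:
    """The decoupled logic that would traditionally run in a data center."""
    return n * n

class LocalHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        n = int(self.path.lstrip("/"))
        body = json.dumps({"result": business_logic(n)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the sketch quiet

# Bind to a free port on the loopback interface only; nothing is
# exposed beyond this device.
server = HTTPServer(("127.0.0.1", 0), LocalHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/7") as r:
    result = json.load(r)["result"]
server.shutdown()

print(result)  # 49
```

The UI is still cleanly separated from the logic behind an HTTP interface; the only thing that moved is where the "server" runs.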

The outlets I submitted my whitepaper to in 2002 (including my employer at the time, IBM) told me I was crazy.  Nobody asked me to explain my thinking or even gave the idea a second thought.  Yet today I'm more convinced than ever it's the endgame of where we're heading.  Consider the rise in popularity of Docker, a container-oriented tool which approaches virtualization correctly (as opposed to the crazy idea of virtual machines, which replicate a bloated operating system many times over).  Consider the rise of microservices, self-contained services which are distributed with the core application.  We are at the threshold already.

Moving over the threshold requires two things.  First, a tweak to Docker so it can be deployed seamlessly as part of an existing operating system install (much like Java), along with the management tools required in a massively distributed system.  Second, we need similarly scaled data federation tools, which I don't believe exist today (for more on data federation see my entry on The Data War and the Mobilization of IT, or my upcoming entry on Data Analytics in the Network).

Just imagine how the world of business computing would change if we eliminated just 20% of the web and application servers.  How about reducing web and application server instances for consumer cloud offerings such as Office 365, or for your bank?  Go way out on a limb and consider the adoption of P2P tools such as BitTorrent Sync.

And by the way, I'm still waiting for someone to provide mainframe based public cloud services.  Where is the new EDS?

Monday, March 9, 2015

Going Against the Grain

I've struggled for the past two months to write this entry.  It started with the topic of innovation and why companies are struggling at it, but that quickly devolved into a "how to be more innovative" treatise.  However, if you're like me, you've already read several great articles and heard numerous speakers lay out a foundation for innovation.  And at some point you realize you're reading the same thing over and over again because, for whatever reason, nobody's listening to the message.  So another person jumped in to say it a second time.  Then a third time.  Fourth time.  Fifth.  Well, that's obviously a broken path, so rather than be the sixth I realized I needed to take a new direction.

In that moment of despair after endless edits, thinking perhaps my argument was flawed (which would explain why it was so hard to capture), it dawned on me to go back to the basics.  Yes, innovation is a struggle. But why?  Is it really just because innovation requires a willingness to invest in failure which is anathema to a company focused on quarterly results?  I don't think so.  I think the problem is more basic and has to do with the cultural proximity of the smart people who invent, the entrepreneurs who innovate, and the public who wants everything better, cheaper and faster.

Through the 1960s the US had a healthy habit of churning out earth-shattering inventions which drove economic growth for decades.  However, invention requires patience, a tolerance for failure, and funding.  As companies tightened up their bottom lines through the 1970s and 1980s, we subdued this habit in the name of global sourcing and cost cutting, moving our research off-shore to locales with lower-cost labor.  Of course nobody considered the opportunity cost of this shift.  One of the most often discussed results of this approach is how billions of dollars in economic growth have been shifted from the US to foreign countries, raising their standards of living and education while ours have remained stagnant or dropped.  But there is another opportunity cost rarely considered: what happens when you move research half-way around the world to a culture which doesn't share the same appetite for change and risk as the United States?

Our culture in the United States has an acute case of individualitis.  In fact the "American Dream" is based on the concept of the individual controlling their destiny through dedication and hard work.  Our government was established to protect the rights of the individual from tyranny.  The reason a free-market capitalist system works in the US is that it's the only system in which the talent and effort invested by the individual delivers a powerful dividend.  Capitalism is the great equality engine because it rewards innovation.  The rise of American economic power started with the Industrial Revolution, but it wasn't fueled by invention, as is so often argued.  Innovation, the use of inventions to solve real problems, was the real rocket fuel.  Alexander Graham Bell invented the telephone, but the switchboard was the innovation which connected people over great distances.  On its own the phone did very little.  Morris Tanenbaum invented the silicon transistor, but the silicon wafer was the innovation which brought microprocessors to the masses.

Our American culture embraces innovation.  We like cool, new technologies which purport to make life better, even when they don't.  Although there are certainly pockets the world over, there is no better market for launching new products that challenge the status quo or establish entirely new segments.  Our software businesses prove this on a daily basis.  Despite repeated efforts by large companies to move software development off-shore, the most innovative software is still largely developed in the United States, for the US market.

So back to the question: why are companies struggling with innovation?  I believe it's because we've added noise between each step of the invention-to-insight-to-innovation process by separating the functions on a cultural plane.  I've learned over the past thirty years not to underestimate the importance of culture.  If you want to sell on anything other than price, you have to innovate.  But to innovate, you need access to invention, and that access is much more than reading whitepapers and listening to lectures.  People need to share more than a language; they need to share cultural experiences.  How are we as leaders enabling cultural exchange to occur as part of our daily routine?  How are we growing ourselves by creating interactions with other cultures at work and at home?

We need to consciously choose to go against the grain; to recognize that even when we speak the same words, we can mean two different things when we cross the cultural divide.  Now that my eyes are open, it's incumbent upon me to make the time to move forward.  I'm sure to many this is all very obvious, in which case, although I'm admittedly late to the party, at least I'm on my way.