Tuesday, April 19, 2016

Newton’s Three Laws and the Public/Private Cloud Debate

For the past five years I’ve noticed a change in my conversations with CFOs on the use of cloud computing.  I’ve always known CFOs are the driving force behind the adoption of public cloud.  Who could say no to better, cheaper, faster with lower capex and the ability to reduce the hard assets of IT, which all too often anchor technology in the past?  However, today I increasingly hear CFOs push back on public cloud, driven by a single concern: predictability.

From a financial point of view, there is a critical assumption built into the pay-as-you-go model: that consumption, and therefore cost, is predictable.  In the early days public cloud was always cheaper; however, great strides in private cloud technology, maturation of the space, and the challenges of moving to a cloud-centric IT platform have muddied the waters.  As someone who has never been a proponent of private cloud, I feel the water is as clear as ever.  IT infrastructure has never been built on predictability.  For decades, networks, servers, and storage have been designed to a “just in case” standard.  The result is tremendous bloat, which is being engineered out through virtualization, but the workloads are no more or less predictable.  What’s missing is the understanding that public cloud isn’t just about capex vs. opex; it’s also about momentum and friction.
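To make the predictability concern concrete, here is a toy sketch of the trade-off.  All numbers and function names are hypothetical, invented for illustration only: "just in case" provisioning costs the same no matter what happens, while pay-as-you-go tracks consumption, which is exactly what makes it hard to budget.

```python
# Sketch: why cost predictability matters in the capex vs. opex debate.
# All numbers are hypothetical, for illustration only.

def on_prem_cost(peak_capacity_units, unit_capex=100.0):
    """Fixed cost: capacity is provisioned 'just in case' for the peak."""
    return peak_capacity_units * unit_capex

def public_cloud_cost(monthly_usage_units, unit_rate=10.0):
    """Pay-as-you-go: cost tracks actual consumption month by month."""
    return sum(u * unit_rate for u in monthly_usage_units)

# A steady workload vs. a spiky one, both peaking at 10 units.
steady = [8] * 12
spiky = [2, 2, 3, 10, 2, 2, 2, 2, 9, 2, 2, 2]

fixed = on_prem_cost(peak_capacity_units=10)
print(fixed)                      # 1000.0 either way: easy to budget
print(public_cloud_cost(steady))  # 960.0: close to fixed, still forecastable
print(public_cloud_cost(spiky))   # 400.0: far cheaper, but hard to predict
```

The spiky workload is the cheaper one on public cloud, yet it is the one a CFO cannot forecast; the steady workload barely beats owning the hardware outright.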

Infrastructure assets are evil; they anchor us in the technology of the day and act as a damper on innovation.  I’ve lived the nightmare of being forced to architect a solution around existing infrastructure assets, sometimes merely to justify their existence, regardless of their impact on the solution.  Assets grow like a planet: swirling gases of expectation come together, forming a gravitational field which attracts more mass in the form of processes and protections.  Soon the planet becomes so massive it starts attracting its own moons of ancillary assets into orbit.  Where there’s mass, we have to respect Newton’s three laws.  Objects at rest tend to stay at rest and thus have no momentum, which is not the message the CEO wants to hear in relation to innovation.  Objects require a force to accelerate, and that force is dampened by friction; the greater the mass, the more friction works against the building momentum.  And since for every action there is an equal and opposite reaction, generating the momentum required for change demands ever-increasing investments of time and energy.

Like a car travelling toward the horizon that gets closer to the mountains but never the moon, private cloud gets a company into virtualization but not cloud computing.  I know some will argue I’m too optimistic; however, I point to the success of companies such as Netflix, whose competitive advantage is the result of investing early in learning the lessons of public cloud.  I also point to respected executives like Chris Drumgoole, COO of IT at GE, making bold public statements about their migration not just to cloud, but to public cloud.  What he has said publicly, many others have told me privately; but first they must get their CFOs to look past the head fake of equating predictability with public cloud risk.

Sunday, July 5, 2015

Cloud and the Omni-channel Customer Experience

We are truly living in a connected world.  Today I checked the weather on my computer and paid some bills before checking my email on my phone en route to my SUV.  Once I sat down, my SUV automatically connected to my phone and minutes later showed an incoming text from my daughter, whom I called back via voice command.  She texted me her order for a meal, which I bought via an app while waiting in a parking lot, and then we returned home to watch a video she had created earlier in the day, mirrored from her phone to the TV.

Impressive to say the least.  Remember, NONE of the solutions I used were designed by or bought from the same company.  Yet amazingly they meet my need: to help me make the most of my time.  In our connected world we have the opportunity to always be busy, to cram as much as possible into our daily life.  Mobility, eCommerce, Social Media, Collaboration: all let us take advantage of those previously lost moments in time, whether waiting in the car or sitting at the airport.  And at the foundation of all of this technology, which is able to work so seamlessly together, is cloud computing.

While my consumer-oriented world is full of neat new toys, the world of retail is trying to figure out how to play a bigger role; not just Retail as in stores, but retail as in producer/consumer interaction, including banks and healthcare, automotive and high technology.  I haven't seen a business strategy or talked to a C-suite executive in retail in the past three years without the topic of Omni-channel finding its way into the conversation.  An Omni-channel Customer Experience is a simple concept: the creation of a common look and feel across all channels through which a customer interacts with a company.  Simple in concept, yet frustratingly difficult in reality, but companies know their customers expect a consistent experience.  CIOs know whoever delivers on the promise best has the opportunity to create some daylight as they pull ahead.

Why is omni-channel difficult to implement in a single company, yet already a thread holding my personal life together?  The difference is cloud computing.  Each of the solutions I used today was built in the last five years on a cloud foundation.  However, cloud continues to prove elusive for corporate America, for numerous reasons.  There is simply no way to build an omni-channel customer experience and avoid cloud computing, yet focusing on cloud computing won't deliver the experience nirvana either.  It's easy to understand why everyone from CEOs to CIOs is frustrated.  It's the enigma of modern IT: all the data, compute, and storage one could possibly ever need, and nothing put together in a way that makes it truly useful.  It's the cost of holding on to outdated models, content to reap the benefits of one technology generation without considering the next.  Companies today are islands; islands of applications and data, if not servers and storage as well.  Data in isolation is almost worthless in today's world of Real Time Analytics and Big Data.  And making it all the more frustrating, you can't hurry cloud, you just have to wait.

Cloud Computing is the single most important technological shift that has happened in Information Technology.  For the first time it's not about a domain, such as the network or data or applications.  Cloud is about everything, from technology to taxes.  Like the artillery shell that's a degree off when fired, those who get cloud wrong will simply miss the mark, measured in cost in the short term, but ultimately measured in customer satisfaction and solvency.

Sunday, March 29, 2015

The Last Mile and the Future of Cloud

"Look back to where you have been, for a clue to where you are going"

It's an unattributed quote, but I find it applies repeatedly throughout technology.  So often what appears new is really a twist on a tried and true approach.  Anyone who's spent any time around networks knows the phrase "last mile".  It's a reference to the final leg of the network connecting a home or office.  When AT&T was broken up in 1984 by the US Government, AT&T emerged as the long distance company while local service was divided into seven "baby bells" including Pacific Telesis, Ameritech, and BellSouth.  Experts believed long distance held the promise of higher profits while the Regional Bell Operating Companies (RBOC's) were doomed to a capital intensive, low margin struggle.

The experts were wrong.

The experts were wrong.  Owning that "last mile" turned out to be very profitable; so profitable that one of the RBOCs, Southwestern Bell, changed its name to SBC, bought three of its fellow RBOCs, and then bought its former parent, AT&T.  Although mobile networks are great for connecting smartphones and tablets, and satellites can deliver radio and television, it turns out nothing yet can replace fiber and copper for bandwidth and low latency.  After the Telecommunications Act of 1996, last mile services exploded, and today instead of just the local phone company there are a variety of competitors, including Google.  And providing that last mile of service continues to be a significant revenue driver.

Let's put the last mile conversation to the side and switch gears.  Today large corporations are investing billions of dollars in Big Data, growing their analytic capabilities to generate the oxygen required by their growth engines.  These fierce competitors are slowly realizing there simply isn't enough time available to:

  1. capture data at the point of origination
  2. move the data across the country
  3. filter the data to focus on the most valuable elements
  4. combine the data with other data to broaden the perspective
  5. execute analytics on the data
  6. generate a result
  7. communicate the result back across the country
  8. leverage the result to drive some benefit
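The eight steps above can be sketched as a simple pipeline.  Every function here is a hypothetical stand-in, not a real API; the point is that each stage, however fast, sits between the event and the action.

```python
# Toy sketch of the eight-step round trip described above.
# Every function is a hypothetical stand-in, not a real API.

def capture(sensor):               # 1. capture at the point of origination
    return {"device": sensor, "reading": 42}

def transmit(record):              # 2. move the data across the country
    return record                  # (a network hop; each hop adds latency)

def filter_valuable(record):       # 3. keep only the most valuable elements
    return {k: v for k, v in record.items() if k in ("device", "reading")}

def enrich(record, context):       # 4. combine with other data for perspective
    return {**record, **context}

def analyze(record):               # 5-6. execute analytics, generate a result
    return "alert" if record["reading"] > 40 else "ok"

def respond(result):               # 7-8. send the result back and act on it
    return f"action:{result}"

record = capture("thermostat-7")
result = respond(analyze(enrich(filter_valuable(transmit(record)),
                                {"site": "plant-3"})))
print(result)  # action:alert
```

Collapse steps 2 and 7, the long-haul hops, and the rest of the chain suddenly has time to breathe; that is the whole argument for the edge.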

If the network operates at the speed of light, how can there not be enough time?  Beyond the reality that light slows down in fiber (by about 31%), there is no single direct link between the user and the corporate data center.  Users have to be authenticated, security policies applied, packets routed, applications load balanced.  The multitude of events that occur, each one very quickly, adds up to a delay we call latency.  In a world where everything is measured in minutes, latency goes unnoticed, but in our Internet world we are moving from seconds to sub-second time frames.  Think about how patient you are when navigating a website.  After that first second ticks off the clock, people begin to wonder if something is wrong.  It's the byproduct of taking our high-speed access to the Internet for granted.  Marketers want to collect metadata about what you're trying to do, figure out how they can influence you, and insert themselves into your decision chain; and they only have the time between when you click the mouse and when the browser refreshes.  Hopefully now you can understand why that eight-step process of moving data across the country is so unappealing.
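The back-of-the-envelope math is worth doing once.  Assuming round numbers (roughly 4,000 km coast to coast, light in fiber at about 69% of its vacuum speed), propagation alone eats a surprising share of a sub-second budget, before a single router, firewall, or load balancer touches the packet:

```python
# Back-of-the-envelope latency math (assumed round numbers, not measurements).
C_VACUUM_KM_S = 299_792   # speed of light in vacuum, km/s
FIBER_FACTOR = 0.69       # light in fiber travels ~31% slower
DISTANCE_KM = 4_000       # roughly coast-to-coast in the US

one_way_ms = DISTANCE_KM / (C_VACUUM_KM_S * FIBER_FACTOR) * 1000
round_trip_ms = 2 * one_way_ms
print(f"{one_way_ms:.1f} ms one way, {round_trip_ms:.1f} ms round trip")
# 19.3 ms one way, 38.7 ms round trip
```

Nearly 40 ms round trip is pure physics, with nothing left over for authentication, routing, or the analytics itself; and a marketer's window between mouse click and browser refresh is not much bigger.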

For the past three years I have advocated an alternative approach: putting servers as close to the end user as possible (commonly called "the edge").  Where "the edge" is located depends on the conversation; however, the furthest out it can be is the start of that last mile, the last point on the network before it connects to the end user.  Today the edge extends as far as the same city for large populations; more often it's a region or even a state.  Although my serverless computing concept could be part of the answer and move the analytics into the end user's computer, in truth at least some analysis needs to occur off-site, if for no other reason than to stage the right data.  Moving analytics closer to the edge requires us to move compute and storage resources closer.

Let's return to the "last mile".

If you looked at a map of the network which serves your home or business, you would notice the wire goes from your house, out through a bunch of routers, switches, and signal boosters until it finally reaches a distribution point owned by your provider (or in which they lease space).  These locations are often large, having previously housed massive switching systems for the telephone network, and they are secure, built like Cold War bomb shelters.  What if these locations were loaded with high-density compute and storage, available much like a public cloud, to augment the resources within the corporate data center?  If a business can operate while leveraging public cloud resources, what we lovingly refer to as a Hybrid Cloud model, then wouldn't it make sense to push the resources as far out toward the edge as possible?

I'm hoping you are increasingly buying into this idea, or at least skeptical enough to wait for it to implode, and want to know how it is broadly applicable.  I do not see this as a panacea, any more than I do serverless computing, cloud, big data, mobility, or any other technology.  However, I do see it at a minimum as moving the conversation forward on how to deal with a world where our endpoints are no longer fixed, and at a maximum as another arrow in the quiver.  Consider how much data is being collected today; from the GPS location in your phone to the RFID tag on your razor blades, we are living in the Data Age.  Every single device powered by electricity is a likely candidate to be Internet enabled, what we call the Internet of Things.  Each of these devices will communicate something, creating new data every day and adding to the pile of data that already exists.  To deal with the onslaught, companies need to filter what's coming in, remove the noise, and then execute their normalization routines (standardizing date formats, for example) to get the data ready for use.  Since, compared to the cost of moving data, everything else is free, there is an economic incentive to move data as short a distance as possible.  Handling the grunt work of analytics locally could have a dramatic impact on overall system speed.  And over time, having local compute resources will enable software architects to push analytics closer and closer to "the edge".
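Here is a minimal sketch of what that edge-side grunt work might look like, using date normalization as the example.  The input formats and sample readings are assumptions for illustration; the idea is that noise gets dropped and formats get standardized before anything travels upstream.

```python
# Minimal sketch of an edge-side filter-and-normalize pass.
# The input formats and sample data are hypothetical.
from datetime import datetime

INPUT_FORMATS = ["%m/%d/%Y", "%Y-%m-%d", "%d-%b-%Y"]  # assumed source formats

def normalize_date(raw):
    """Standardize any recognized date string to ISO 8601 (YYYY-MM-DD)."""
    for fmt in INPUT_FORMATS:
        try:
            return datetime.strptime(raw, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    return None  # noise: drop it locally rather than ship it upstream

readings = ["04/19/2016", "2016-04-19", "19-Apr-2016", "garbage"]
clean = [d for d in (normalize_date(r) for r in readings) if d]
print(clean)  # ['2016-04-19', '2016-04-19', '2016-04-19']
```

Four raw records went in, three clean ones came out, and the garbage never left the building; multiply that by every IoT device on the network and the bandwidth savings alone justify doing the work locally.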

Today I am unaware of anyone working on this issue, despite pushing for it and finding a few leading-edge Fortune 500 executives already facing this challenge.  The truth is we live locally and we're served locally, so why not compute locally?  I see this as a gift of future revenue sitting on the doorsteps of the telcos and cable providers, waiting for them to create the product.  However, I don't believe they realize what they have.  No last mile provider has made a splash in cloud or big data, yet they own the last mile; they own the gateway that links the world to their customers.  Moving the public cloud from super-regional data centers to the local central office where the last mile connects could make the telcos instantly relevant in cloud and give them a nearly insurmountable competitive advantage over today's public cloud leaders like Amazon and Microsoft (perhaps this is part of the reason Google created Google Fiber).  Imagine a legacy infrastructure being resurrected to meet an emerging need.  But then, as I said at the beginning, "Look back to where you have been, for a clue to where you are going."

P.S.  I was so excited to see the headline "IBM, Juniper Networks work to build networks capable of real-time, predictive analysis", until I realized it was the opposite of integrating data analytics into the network.  Oh well, my quest lives on.