"
Look back to where you have been, for a clue to where you are going"
It's an unattributed quote, but I find it applies repeatedly throughout technology. So often what appears new is really a twist on a tried and true approach. Anyone who's spent any time around networks knows the phrase "last mile". It's a reference to the final leg of the network connecting a home or office. When the US Government broke up AT&T in 1984, AT&T emerged as the long distance company while local service was divided among seven "baby bells" including Pacific Telesis, Ameritech, and BellSouth. Experts believed long distance held the promise of higher profits while the Regional Bell Operating Companies (RBOCs) were doomed to a capital-intensive, low-margin struggle.
The experts were wrong.
Owning that "last mile" turned out to be very profitable; so profitable one of the RBOC's, Southwestern Bell, was able to buy those other three RBOC's, changed it's name to SBC, and then bought it's former parent, AT&T. Although mobile networks are great for connecting smartphones and tablets and satellites can deliver radio and television, it turns out nothing yet can replace fiber and copper for bandwidth and low latency. After the Telecommunications Act of 1996, last mile services exploded and today instead of just the local phone company there are a variety of competitors including Google. And providing that last mile of service continues to be a significant revenue driver.
Let's put the last mile conversation to the side and switch gears. Today large corporations are investing billions of dollars in Big Data, growing their analytic capabilities to generate the oxygen required by their growth engines. These fierce competitors are slowly realizing there simply isn't enough time available to:
- capture data at the point of origination
- move the data across the country
- filter the data to focus on the most valuable elements
- combine the data with other data to broaden the perspective
- execute analytics on the data
- generate a result
- communicate the result back across the country
- leverage the result to drive some benefit
If the network operates at the speed of light, how can there not be enough time? Beyond the reality that light slows down in fiber (by about 31%), there is no single direct link between the user and the corporate data center. Users have to be authenticated, security policies applied, packets routed, applications load balanced. Each of these events happens quickly, but together they add up to a delay we call latency. In a world where everything is measured in minutes, latency goes unnoticed, but in our Internet world we are moving from seconds to sub-second time frames. Think about how patient you are when navigating through a website. After that first second ticks off the clock, people begin to wonder if something is wrong. It's the byproduct of taking our high speed access to the Internet for granted. Marketers want to collect metadata about what you're trying to do, figure out how they can influence you, and insert themselves into your decision chain; and they only have the time between when you click the mouse and when the browser refreshes. Hopefully now you can see why that eight-step process, moving data across the country and back, is so unappealing.
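To make that arithmetic concrete, here is a rough back-of-the-envelope sketch in Python. Every distance, hop count, and per-step cost is an assumption chosen for illustration, not a measurement from any real network:

```python
# Back-of-the-envelope latency sketch for the eight-step round trip above.
# Every number here is an assumption chosen for illustration, not a measurement.

SPEED_OF_LIGHT_KM_S = 299_792      # speed of light in a vacuum, km/s
FIBER_FACTOR = 0.69                # light in fiber travels roughly 31% slower
ONE_WAY_KM = 4_500                 # assumed coast-to-coast fiber route length

def propagation_ms(km):
    """Milliseconds for a signal to traverse `km` of fiber."""
    return km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR) * 1000

# Assumed one-time costs: authentication, security policy checks, load balancing.
fixed_ms = 30 + 10 + 5
# Assumed routing/switching cost per hop, and number of hops in each direction.
per_hop_ms, hops_each_way = 1, 12
# Assumed time to filter, combine, and analyze the data at the corporate data center.
analytics_ms = 150

total = (2 * propagation_ms(ONE_WAY_KM)        # ~43 ms just for the glass, round trip
         + fixed_ms
         + 2 * hops_each_way * per_hop_ms
         + analytics_ms)

print(f"Estimated click-to-result time: {total:.0f} ms")
# Over a quarter of a one-second budget is gone before the result
# can be leveraged to drive any benefit at all.
```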
For the past three years I have advocated an alternate approach: putting servers as close to the end user as possible (commonly called "the edge"). Where "the edge" is located depends on the conversation, however the furthest out it can be is the start of that last mile, the last point on the network before it connects to the end user. Today the edge extends as far as the same city for large populations; more often it's a region or even a state. Although my serverless computing concept could be part of the answer and move the analytics into the end user's computer, in truth at least some analysis needs to occur off-site, if for no other reason than to stage the right data. Moving analytics closer to the edge requires us to move compute and storage resources closer as well.
Let's return to the "last mile".
If you looked at a map of the network which serves your home or business, you would notice the wire goes from your house, out through a bunch of routers, switches and signal boosters, until it finally reaches a distribution point owned by your provider (or where they lease space). These locations are often large, having previously housed massive switching systems for the telephone network, and they are secure, built like Cold War bomb shelters. What if these locations were loaded with high density compute and storage, available much like a public cloud, to augment the resources within the corporate data center? If a business can operate while leveraging public cloud resources, what we lovingly refer to as a Hybrid Cloud model, then wouldn't it make sense to push the resources as far out toward the edge as possible?
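As a toy sketch of that placement decision, consider the snippet below: prefer the central office at the start of the user's last mile, fall back to a regional cloud zone, and only then to the corporate data center. The site names, distances, and capacity flags are invented for illustration; a real scheduler would weigh far more than distance:

```python
# Toy sketch of pushing hybrid cloud resources toward the edge.
# Names, distances, and flags are invented for illustration only.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Site:
    name: str
    distance_km: float    # rough network distance from the end user
    has_capacity: bool    # free compute/storage right now?
    has_data: bool        # is the data this job needs already staged here?

def place_job(sites: List[Site]) -> Optional[Site]:
    """Pick the closest site that can actually run the job."""
    candidates = [s for s in sites if s.has_capacity and s.has_data]
    return min(candidates, key=lambda s: s.distance_km) if candidates else None

sites = [
    Site("central-office-metro", 15, has_capacity=True, has_data=True),
    Site("regional-cloud-zone", 600, has_capacity=True, has_data=True),
    Site("corporate-data-center", 4500, has_capacity=True, has_data=True),
]

print(place_job(sites).name)   # -> central-office-metro, the start of the last mile
```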
I'm hoping you are increasingly buying into this idea, or at least skeptical enough to wait for it to implode, and want to know how this is broadly applicable. I do not see this as a panacea, any more than I do serverless computing, cloud, big data, mobility or any other technology. However I do see it at a minimum as moving the conversation forward on how to deal with a world where our endpoints are no longer fixed, and at a maximum another arrow in the quiver. Consider how much data is being collected today; from the GPS location in your phone to the RFID tag on your razor blades, we are living in the Data Age. Every single device powered by electricity is a likely candidate to be internet enabled, what we call the Internet of Things. Each of these devices will communicate something, creating new data every day and adding to the pile of data that already exists. To deal with the onslaught, companies need to filter what's coming in, remove the noise, and then execute their normalization routines (standardizing date formats, for example) to get the data ready for use. Since, compared to the cost of moving data, everything else is essentially free, there is an economic incentive to move data as short a distance as possible. Handling the grunt work of analytics locally could have a dramatic impact on overall system speed. And over time, having local compute resources will enable software architects to push analytics closer and closer to "the edge".
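As a rough illustration of that grunt work done at the edge, here is a small sketch of filtering out noise and standardizing date formats before anything travels across the country. The field names and the list of formats are assumptions, not anyone's actual schema:

```python
# Edge-side preparation: drop the noise, normalize dates, ship only what's valuable.
# Field names and formats are assumptions used purely for illustration.

from datetime import datetime
from typing import List, Optional

KNOWN_DATE_FORMATS = ["%m/%d/%Y", "%d-%m-%Y", "%Y-%m-%d"]

def normalize_date(raw: str) -> Optional[str]:
    """Standardize any recognized date format to ISO 8601 (YYYY-MM-DD)."""
    for fmt in KNOWN_DATE_FORMATS:
        try:
            return datetime.strptime(raw, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    return None

def edge_prepare(readings: List[dict]) -> List[dict]:
    """Keep only well-formed, valuable readings; the rest never leaves the edge."""
    prepared = []
    for r in readings:
        date = normalize_date(r.get("date", ""))
        if date is not None and r.get("value") is not None:   # filter out the noise
            prepared.append({"device": r["device"], "date": date, "value": r["value"]})
    return prepared

raw = [
    {"device": "meter-7", "date": "03/15/2016", "value": 42.1},
    {"device": "meter-7", "date": "garbage", "value": None},   # noise stays local
]

print(edge_prepare(raw))   # only the clean, normalized record is shipped upstream
```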
Today I am unaware of anyone working on this issue, even though I have pushed for it and found a few leading edge Fortune 500 executives already facing this challenge. The truth is we live locally and we're served locally, so why not compute locally? I see this as a gift of future revenue, sitting on the doorstep of the telcos and cable providers, waiting for them to create the product. However, I don't believe they realize what they have. No last mile provider has made a splash in cloud or big data, yet they own the last mile; they're the ones who own the gateway that links the world to their customers. Moving the public cloud from super-regional data centers to the local central office where the last mile connects could make the telcos instantly relevant in cloud and give them a nearly insurmountable competitive advantage over today's public cloud leaders like Amazon and Microsoft (perhaps this is part of the reason Google created Google Fiber). Imagine a legacy infrastructure being resurrected to meet an emerging need. But then, as I said at the beginning, "Look back to where you have been, for a clue to where you are going"
P.S. I was so excited to see the headline "IBM, Juniper Networks work to build networks capable of real-time, predictive analysis", until I realized it was the opposite of integrating data analytics into the network. Oh well, my quest lives on.