Monday, October 18, 2010

Project Clouds

As the debate over the value of Public Clouds rages on (not sure why, but it does), companies are just now starting to see the immediate value of cloud computing. I realize that people who run IT are risk averse (which is why there is so little innovation coming out of IT), resistant to change (which is why things rarely get better), and rarely comprehend the technologies they oversee (I won't expound on this one). However, they have one tremendous trait their predecessors did not: they understand the business. This viewpoint is starting to leak into the thinking about cloud computing and how to take advantage of it. Rather than fighting the governance, security, ownership, and ego wars of change, cloud computing can be applied to projects!

The real driver behind project clouds is speed to market. Things have to happen quickly to seize opportunities. Sun Tzu teaches that whatever you are doing, you should assume your enemy is doing as well, and therefore the winner is often the one who achieves the objective first. I have found over my twenty years in IT and consulting that the larger the company, the longer simple tasks take in IT. Nowhere is this more evident than in the time it takes to provision environments, applications, and other technology tools for projects. Today the business cannot wait for IT to go through all their risk-avoidance, cost-containment, change-prevention processes.

Every project needs its own set of tools, from document sharing to issue tracking and often more specialized, project-specific tools. Take my most recent example: we had our cloud-based project management, issue tracking, document sharing, and prototyping solutions up and running in one day. It took over two months for IT to provision the sandbox and development environments. Worse, the environments were over 30 days late on a three-month development timeline. Real-world differences like these point to the need for using cloud solutions. Pay-as-you-go pricing, instant availability, scalability: what's not to like? Adopting the cloud is itself a risk-avoidance technique: it's the lowest-cost option, and it requires no changes from IT.

And security is no longer an excuse for sitting on the sidelines. With billions of dollars in transactions already traversing public networks and clouds, we've proven that existing security solutions are adequate to the challenges of the cloud.

IT can either enable this process or fight against it. However, nobody should ever choose to stand in the way of progress; history has not been kind to those who have.

Tuesday, August 31, 2010

Everything Should be Software

If there's one lesson we've learned over the years, it's that software is king. The tech titans of the world are Microsoft and Google, not IBM and HP. Hardware has its place but suffers from multiple obstacles, including the laws of physics and the depth of education required to drive innovation. Innovation on the software side, however, can be as easy as doing something for the first time, and there are hardly any obstacles. As a result we have proven that software is significantly easier to build, manage, update, and deploy than hardware.

The question I have is: why does hardware exist in so many different forms today? Many have jumped on the bandwagon, incorporating low-power processors with embedded Linux to create appliances from DVRs to EKGs and even engine control modules. What about inside the data center? We pretend today there is a big difference between the "pizza box" 2U rackmount servers with one or two multi-core CPUs and the big-box 16-way multi-core machines which require a small power plant to operate. In the end the only difference between the two is which offers the lowest cost per unit of performance.

Looking introspectively at what we today consider hardware or infrastructure applications many are nothing more than specialized servers. The true value is in the software, not the hardware. Examples include backup devices, switches, bridges, routers, DNS, directory services, and the list continues to grow.

Virtualization has taken us to the limit of the existing hardware architecture. The next step is the continued commoditization of the hardware where specialization is moved into software. The key enabler of this migration is the capability of the modern processor, both in speed and bandwidth. Many hardware devices have migrated to "appliances" and will soon be migrating to software packages. Why does this matter? Because we can leverage the existing virtualized environment to move, grow, and shrink these services just as we do application, web, and database servers.

I consider the FPGA (field-programmable gate array) the ultimate model for the data center. Such a data center would be composed of a generic set of servers configured on the fly to provide ALL of the services required, from load balancing to routing, naming services to applications, data to system management. This move would improve resiliency, lower overall TCO, and provide the most efficient infrastructure possible at any given time. Considering the end of rotating storage is on the horizon, such a static hardware architecture would provide the ultimate in dynamic services!
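The FPGA-style data center described above can be sketched in a few lines. This is purely illustrative: a pool of identical, generic servers whose roles (load balancer, DNS, application, and so on) are assigned and reassigned entirely in software. The node names, role names, and demand figures are assumptions for the sketch, not a real scheduler.

```python
def assign_roles(nodes, demand):
    """Map generic, interchangeable nodes to service roles based on current demand."""
    assignment = {}
    pool = list(nodes)  # copy so repeated calls re-plan from the full pool
    for role, count in demand.items():
        assignment[role] = [pool.pop() for _ in range(min(count, len(pool)))]
    return assignment

# Eight identical boxes; their "function" is just a software decision.
pool = [f"node{i}" for i in range(8)]
morning = assign_roles(pool, {"load_balancer": 1, "dns": 1, "app": 4, "db": 2})

# Demand shifts: the same hardware is simply re-mapped, no racking required.
evening = assign_roles(pool, {"load_balancer": 2, "app": 2, "batch": 4})
```

The point of the sketch is the FPGA analogy itself: the hardware stays static while the service topology is reconfigured on the fly.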

Monday, July 19, 2010

The Secret Sauce

Those of us who have been in IT over the past decade have seen a steady increase in the focus on business/IT alignment. Once the focus of Enterprise Architects, it's now become a part of everyone's job. But what exactly is the end goal? Ask the business leaders and the common answer is a more agile IT capable of making changes closer to the rapid pace of business and further from the lethargic, slothlike pace IT has followed for 30 years.

Offended? Well, the reality is that compared to the business, IT is lethargic. The business has become nimble, and I've finally discovered the secret to their sauce: assets. The business carries as few assets as possible. Everything today is an expense. Buildings, labor, equipment, everything. What was once capital is now leased, resulting in fewer assets. Fewer assets to manage means greater agility. One specialty retailer I know of can put up a new store, from signing the lease to opening the door, in two days. They can shut one down in one day. Wow! Now that's agile.

How about IT? Most companies I know cannot set up a server, from request through completion, in two days. In fact I don't know of a company that can decommission a server in one day. See why the business thinks we're lethargic?

So what we need on the IT side is more of that special sauce. We need to reduce our capital-intensive, asset-driven footprint. Sure, we already lease servers and data centers, but that's not enough. We need to adopt cloud computing and leverage other people's assets to deliver against our agenda. The best way to make money is to use other people's money. That's what cloud is all about. It's the opportunity IT has been longing for to gain the agility required to meet the business on their terms, in their timeframes. From spinning up a new server or storage to rolling out a new application, timeframes need to be measured in hours and days, not months and years. SaaS solutions can put companies at the starting line on important enterprise solutions such as ERP, CRM, SFA, and MRP in days, at a cost significantly lower than installed monoliths like SAP and Oracle. And where do the savings go? Right to the bottom line!

Another secret? Innovation requires continuous tinkering and feedback loops. Product companies test, test, and retest. Focus groups, test markets, viral campaigns. It's all about iterations. Cloud can help IT in this approach as well, minimizing both the cost and time required for each iteration. More iterations means better results.

I always knew the answer lay in the secret sauce!

Wednesday, June 2, 2010

Cloud Ain't Easy

If there's one thing I've learned over the past decade working with grid computing and virtualization technologies, it's that innovation isn't easy. Cloud computing is an innovation initiative. It's not one thing but rather the implementation, configuration, integration, and automation of several things. It's not available from one provider, off the shelf, in an easy-to-install-and-support package.

No. Cloud Computing is more like a jungle. Lots of custom scripting and some custom development. Lots of building interim and bridge utilities. Lots of pushing vendors to deliver what you need rather than what they want to provide. Add to it the confusion over what cloud is (as recently as Vegas InterOp in April I heard someone state cloud was just another name for server virtualization) and it's no surprise it's a jungle.

Cloud only makes sense when it's done right. It needs a solid architecture at its foundation. It needs a strategy and roadmap to ensure it's built from the start to deliver real business value. It needs to align with business and technology goals. And it needs to pay for itself in a world where nobody wants to pay for infrastructure. It can be done, just ask Bechtel, but it's not easy.

First there's the skills gap. Many, if not most, organizations plan to go the cloud computing route on their own. The underlying strategy is vendor-driven, mistaking the vendor's best interests for their own. Whether it's Amazon, VMware, or Citrix, I assure you that tying yourself to their strategy only ensures you'll be a profit generator for them. The most prominent skills gap is in enterprise architecture, which is where cloud's rubber should be hitting the road. But we don't have enough EAs in the world, and especially not enough with cloud skills. So instead companies forge ahead on their own with predictable results:
- lack of synchronization between technology teams
- lack of security standards and policies which address the technology footprint of cloud
- poor or missing governance structures
- growth of stealth IT in the form of expensed SaaS and even IaaS cloud services
- lack of understanding by the business of how cloud can enhance what they do today

Second there's the focus on cost takeout. True, cloud optimizes the infrastructure stack, which means more bang for the buck. But doesn't lower cost mean you can perform more iterations for the same budget? Learning is an iterative process, so if we can learn more for the same dollar, don't we become smarter? Isn't this the foundation of innovation? Trying something new, tweaking it, and trying again until we have a better mousetrap?

Third there's the belief that IT controls technology. Technology goes wherever the money is, and today with a credit card anyone can move to the cloud.

Fourth and most scary there's the "Cloud at any cost" crew who are innovating in the cloud without proper planning and strategy, exposing their companies to massive risk which cannot be quantified. Who owns the data? Is the company properly indemnified from patent infringement? Where does the data go? Who has the authority to ramp usage up or down? Is any regulated data involved? What if the cloud vendor disappears or goes bankrupt?

Cloud ain't easy. It requires planning and strategy. It requires both a business and a technology point of view. Most of all it requires people to realize that with a new, disruptive technology the old rules don't apply. It's time to throw out the nice, easy-to-use cookie cutters we've created for the Internet.

Cloud simply ain't easy, but it sure is compelling!

Wednesday, March 31, 2010

But It's Not About the Technology

Today the conversation on cloud computing continues, having migrated slowly from definitions and scope to adoption and migration. The primary focus of IT professionals, from the CIO and CTO through system administrators, is cost reduction: maximizing the use of existing hardware. Secondary, in fact a very distant second that only a few really seem to understand, is the application of cloud technologies to reduce complexity by drawing distinct lines of responsibility between the layers of the IT stack. Both goals are reasonable, even commendable, but they only scratch the surface of the value of cloud computing. With Cloud, it isn't about the technology.

So what is it about? The business. As with so many other technologies that have come before, we are again falling into that time-honored trap of applying technology for its own sake. Saving a few dollars here and there may sound like the interests of the business are first, but that's a very myopic point of view. Very few IT organizations have taken the time to explain cloud computing to their business leaders and engage them in the discussion. In fact I am actively looking for my first example so I can praise them, learn from them, and write about them. Stay tuned, but don't expect any posts on the topic in the near future.

Cloud Computing provides a new model to loosely couple organizations, providing them a dynamic, opportunity-based model to share services and data. As the landscape in various industries shrinks through competition and acquisition, pure head-to-head rivals fade, replaced by the new model of "co-opetition". Companies today find themselves increasingly competing as well as cooperating with their partners. The concept isn't new, but instead of being a rare exception it has emerged as a predominant model. How do you draw distinct lines in the sand to separate you from a competitor in one area when you want to share information and assets with the same company in another area? In the past we have created physical partitions which, by their static nature, often limited the growth of the partnership because the partitions either didn't or couldn't change easily. Workarounds become common, leading to wasted labor, inefficiencies, and increasing complexity. Cloud technologies, with virtualization at the core, provide the toolset required to dynamically map and remap resources. Adopt a cloud model whereby each partition becomes a cloud, and the opportunities explode.

I am a firm believer that the value of cloud computing lies in its ability to take collaboration further, extending our model of human-to-human cooperation between companies to our technology assets. Integrations between companies today are based on a steady-state business process: accounts receivable, purchase order, shipment notification, etc. A new model will emerge, Cloud Keiretsus, whereby federations of companies will be loosely coupled with clouds providing the foundation. Data will move into, around, and out of the cloud as necessary to fulfill the objectives of the organizing principle, whether it's to bring a product to market, manage a financial service offering, or develop a new drug. In this model, when two or more organizations begin working together a collaboration cloud is provisioned as a foundation to facilitate the sharing of information: emails, prior work, document management, etc. As the relationship progresses and opportunities to share data emerge, the cloud operates as both a repository and gateway managing the flow. New applications can be rapidly developed in the cloud to fulfill new needs, and existing services can be exposed to automate processes as they cross over into the scope of the relationship.

An important advantage of this model is the ability to share and enforce standards, even when they differ, on both sides. Mapping the workflow of a process between organizations will enable compliance without requiring either company to abandon its own standards. Company A can map its SDLC to that of Company B without explicit conversion. Exceptions, where Company B has additional requirements, can be mapped into the workflow of Company A. For example, such an approach would reduce the ramp-up time and cost of establishing the Program/Project Management Office for new projects and simplify the execution of the project.
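The phase mapping described above can be made concrete with a small sketch. The phase names, the vocabulary translation, and the extra compliance gate are hypothetical examples; the point is that each company keeps its own standard while a simple translation layer handles the differences.

```python
# Hypothetical mapping of Company A's SDLC phases into Company B's vocabulary.
COMPANY_A_TO_B = {
    "requirements": "define",
    "design":       "architect",
    "build":        "construct",
    "test":         "verify",
    "deploy":       "release",
}

# Company B requires a gate that Company A lacks; rather than forcing A to
# adopt a new standard, the exception is attached to the mapped phase.
B_EXTRA_GATES = {"verify": ["security_review"]}

def translate_phase(phase_a):
    """Express a Company A phase in Company B's terms, carrying along any
    additional Company B requirements as exceptions."""
    phase_b = COMPANY_A_TO_B[phase_a]
    return phase_b, B_EXTRA_GATES.get(phase_b, [])
```

Here `translate_phase("test")` reports the phase as Company B's "verify" plus its extra security-review gate, while "design" maps cleanly with no exceptions; neither company converts its internal process.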

As the world becomes smaller companies are pushed to cooperate combining resources to compete effectively. The next weakest link in the chain is the rapid integration of processes to ensure everyone involved is "singing from the same sheet of music." Cloud Computing provides a new architecture enabling collaboration between companies at the business process level. Those who see and take advantage of the opportunity first will be the best positioned to redefine their markets.

Sunday, January 17, 2010

You need it real time? Really? Real Time???

Business intelligence and executive dashboards are all the rage, soon to be complemented, if not already, by customer dashboards. Although I'm no ERP specialist, I get involved in all kinds of projects, and one of the most glaring inconsistencies I face is the demand for real-time data that changes daily or even weekly.

First, we need to drop the term real time and focus on the term most recent. Real time to a guy like me from the plant-floor automation/data acquisition/embedded systems/engine control module world means something TOTALLY different. Real time means just what it says: in the moment. No delays.

Second, business people need to think about data with the fourth dimension of time. Why ask for a data update when the data hasn't changed? Whenever I hear the request for real-time data I push back by asking, "How often does the data change?" Sometimes it's hourly, sometimes periodic throughout the day. Most often it's daily or weekly. Rarely does the data change minute to minute. More important to understand is how the data drives a decision. In one project we grabbed real-time service data when all the executives needed was an hourly update. In turn, that hourly update was used to perform daily staffing analysis, and even then only when the service data exceeded set boundaries. Since the designers never asked, it was just assumed the data had to be real time. We changed from a real-time posting to the executive information portal to an exception-based alert and made everyone's life that much easier. This is one of hundreds of examples, from healthcare to consumer packaged goods to retail, that I have in my work history.
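The exception-based alert described above is trivially simple, which is part of its appeal. A minimal sketch, with illustrative metric names and thresholds (the actual boundaries would come from the staffing analysis):

```python
def check_service_level(calls_waiting, lower=0, upper=25):
    """Return an alert message only when the metric leaves its normal band."""
    if calls_waiting > upper:
        return f"ALERT: {calls_waiting} calls waiting exceeds threshold {upper}"
    if calls_waiting < lower:
        return f"ALERT: calls waiting below {lower}, check the data feed"
    return None  # within bounds: nothing posted, nothing to refresh

# Hourly samples: only the out-of-bounds reading generates any traffic.
hourly_samples = (12, 18, 31, 9)
alerts = [a for a in map(check_service_level, hourly_samples) if a]
```

Four hourly readings produce exactly one alert instead of a continuous real-time feed, which is all the daily staffing decision ever needed.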

Why is this a big deal? Speed costs. It does in racing and it does in technology. The faster you want to go, the more it's going to cost you. In IT we're not doing our jobs if we comply with the request just because the business "wants it". When we load up systems with unnecessary requests we slow everyone down; that's the opportunity cost of bad design. A great architectural option is the data intermediary, which sits between the application and the data store to cache query and service results (if the data hasn't changed, the service response won't either).
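A minimal sketch of that data intermediary, assuming a simple version counter that is bumped whenever the source data actually changes (the backing query function and key names are illustrative):

```python
class DataIntermediary:
    """Sits between the application and the data store, serving cached
    results until the underlying data changes."""

    def __init__(self, query_fn):
        self._query_fn = query_fn  # the expensive call to the real data store
        self._cache = {}           # query key -> (version, result)
        self._version = 0          # bumped only when the source data updates

    def data_changed(self):
        """Invalidate cached results; called when the source data updates."""
        self._version += 1

    def query(self, key):
        cached = self._cache.get(key)
        if cached and cached[0] == self._version:
            return cached[1]       # data unchanged: same response, no round trip
        result = self._query_fn(key)
        self._cache[key] = (self._version, result)
        return result

# Illustrative backend that records how often it is actually hit.
calls = []
def backend(key):
    calls.append(key)
    return key.upper()

portal = DataIntermediary(backend)
first = portal.query("daily_sales")
second = portal.query("daily_sales")  # served from cache, no backend hit
portal.data_changed()                 # source data updated
third = portal.query("daily_sales")   # cache stale, backend hit again
```

Two of the three queries reach the data store; every repeat request in between is absorbed by the intermediary, which is exactly the unnecessary load the paragraph above is arguing against.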

Educate the business, add the time dimension to all data, and consider how the data drives decisions when building interfaces, services, reports, and dashboards. And the added side benefit? Less complexity!