Despite my best intentions, I keep having conversations about cloud computing. That probably means it's almost mainstream. As both of my faithful readers know, I've been a bit critical of cloud computing, not so much the idea itself as everyone piling onto the concept even where it's inappropriate. I also find the discussion of private versus public clouds mostly irrelevant. That's a business decision about how you want to cost out IT; it has nothing to do with technology.
Having had a bit of time to think about it, here's what I think cloud computing is and isn't about. In a nutshell:
It's about running enterprise applications, wholly or in part, somewhere out in the IT infrastructure. You don't care where, so long as it's somewhere appropriate. From a software perspective, it means instantiating application objects without caring where that happens, so long as it meets the needs of the object (see the sketch after this list).
It's about metered usage, either as a service or in-house. Paying for only what you use is very attractive.
It's about better application resource utilization. You save money when you don't overbuy. Another way to look at it: you align resources with how critical each part of the application space is.
It's about flexibility. Being able to run application objects anywhere in the infrastructure means less dependence on specific assets, which makes for better availability and more cost savings.
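To make the first point concrete, here's a minimal J2EE-flavored sketch of the "don't care where" idea. The OrderService interface and the JNDI name are hypothetical; the point is that a client asks the naming service for an object by logical name and gets back a working reference. Which physical server actually hosts that object is the container's concern, not the client's.

```java
import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.NamingException;

public class OrderClient {
    public static void main(String[] args) throws NamingException {
        // Ask the naming service for the object by its logical name.
        // Which machine in the cloud hosts it is invisible to the client.
        Context ctx = new InitialContext();
        OrderService orders = (OrderService) ctx.lookup("ejb/OrderService");

        // From here on we just use the object; the container decides
        // where it lives and how it's replicated.
        orders.placeOrder("widget-42", 3);
    }
}

// Hypothetical remote business interface: the client codes against
// the contract, never against a server address.
interface OrderService {
    void placeOrder(String sku, int quantity);
}
```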
Public versus private cloud arguments are only relevant when talking about how you pay for IT. If it's in your best interest to convert a capital expense into a variable one, by all means go for the public cloud. The same is true for personnel costs: you might not have the expertise to run a private cloud, so you either hire it or go outside. These are classic outsourcing decisions.
Often, a cloud uses a virtualized hardware environment (storage, servers, and networks), but it doesn't have to. The virtual application space is what matters. That's why we have middleware.
The last point is key. While virtualized servers and storage provide a great environment for running a cloud, they're not necessary. It's the middleware environment that matters. For example, a lot of what we think of as cloud computing is achievable using existing Java 2 Enterprise Edition (J2EE) technology. A J2EE application server such as JBoss performs all the tasks needed to build a cloud. It handles:
Persistence
Coherency
Distribution
Synchronization
Object Caching and Reuse
J2EE allows you to instantiate objects on any physical server running the J2EE application server. It doesn't matter whether that server is virtual or not; virtualization might be a good idea, but it's not necessary to make a cloud.
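Here's a minimal EJB3-style sketch of that division of labor (the Customer entity and the bean are made up for illustration): the annotations tell the application server what to manage, and the persistence, caching, coherency, and distribution from the list above come from the container rather than from application code.

```java
// Customer.java
import javax.persistence.Entity;
import javax.persistence.Id;

// A plain entity: the container's persistence engine handles storage,
// caching, and coherency; the class itself knows nothing about any of it.
@Entity
public class Customer {
    @Id
    private Long id;
    private String name;

    public Customer() {}                              // required by JPA
    public Customer(Long id, String name) {
        this.id = id;
        this.name = name;
    }
}
```

```java
// CustomerService.java
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

// A stateless session bean: the application server decides which node
// instantiates it, pools instances, and routes calls to them.
@Stateless
public class CustomerService {
    @PersistenceContext
    private EntityManager em;                         // injected by the container

    public void register(Long id, String name) {
        // Persistence and synchronization are the container's job.
        em.persist(new Customer(id, name));
    }
}
```

Notice that nothing in either class names a server. Deploy this to a JBoss cluster and the container decides where the instances live.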
One look at Google's cloud SDK tells the story. You import a library of Java objects that interact with the Google cloud and voilà! Your application objects are running in their cloud. You could conceivably run some objects in their cloud and others in-house. It's not quite that easy, of course, but it's pretty close. Google provides all the infrastructure you need to instantiate and manage application objects elsewhere. How they do it is unimportant.
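For flavor, here's roughly what that looks like with the low-level datastore API from the App Engine Java SDK (the entity kind and property names here are made up): you call a factory, get a service handle, and your data lands somewhere in Google's cloud without you ever naming a machine.

```java
import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;

public class GreetingStore {
    public void save(String message) {
        // The factory hands back a handle into Google's infrastructure;
        // where the data physically lives is Google's problem.
        DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();

        // "Greeting" is a hypothetical entity kind for this sketch.
        Entity greeting = new Entity("Greeting");
        greeting.setProperty("content", message);

        datastore.put(greeting);  // persisted somewhere in Google's cloud
    }
}
```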
The ultimate cloud would, of course, be virtual everything; that way you get maximum alignment and utilization. Virtual servers using virtualized or federated storage, with middleware that provides a virtual application space, would meet the needs of a cloud nicely.
But in the end, it's the software that counts. The application is what it is really all about.