There is no shortage of cloud providers. From traditional co-location providers such as Peak10 and Rackspace that now offer cloud and platform services, to platform providers like Amazon and Microsoft, the list continues to grow. Each provider is trying to differentiate itself, and the expansion of cloud services is valuable to the enterprise consumer. But as cloud providers compete for market share, they are adding more products and services to make the customer relationship stickier. That stickiness, disguised as value-added features, can create risk, especially as your company consumes more services.
Looking back to the 1980s, you start to see similarities to the days of Big Iron (legacy mainframes). The goal of the Big Iron vendors was to lock you into their platform, get you to turn on a bunch of services, and start the meter running. Every month you would get a bill for all those services. The problem was that the cost of switching away from Big Iron was so high, due to tightly coupled services and capital investment, that most companies couldn't afford to change and didn't have any viable alternatives.
The same was true in the early 2000s with the introduction of the J2EE specification. Application interoperability was a big driver behind the specification. In reality, many of the vendors that implemented the specification also built in proprietary objects and APIs to keep you on their platform, again making the cost of switching high and the dependency on the vendor even higher.
This pattern of hardware and software providers locking customers into expensive solutions isn't new, and we are starting to see it emerge with cloud providers. Consider Amazon's cloud and the abundance of cheap services it offers. The more applications you connect to those proprietary services, the more risk your organization assumes around Amazon's pricing and future product releases, particularly if your application designs and development practices are still maturing.
Remember: moving to the cloud wasn't just about scaling and innovating more quickly; it was also supposed to deliver cost savings and operational efficiencies. To achieve those objectives, you have to design and build your systems with interoperability in mind, or understand the risk you are taking on with a single cloud vendor.
The good news is jumping from cloud to cloud is getting easier as long as you plan for it up front. Consider the following design and implementation best practices when building software in the cloud:
1. Evaluate multiple cloud providers
2. Build platform images for a couple of cloud providers
3. Test platform images and applications to better understand total costs for each cloud
4. Understand what services are proprietary and how to mitigate/abstract them
5. Develop a strategy to monitor costs and cloud pricing
6. Test and run multiple clouds in parallel
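Point 4 above, abstracting proprietary services, is the practice that most directly keeps switching costs down. A minimal sketch of the idea in Python follows; the `ObjectStore` interface and the class and function names are hypothetical, and the in-memory implementation stands in for the vendor-specific ones (S3-backed, Azure Blob-backed, and so on) you would write in practice:

```python
from abc import ABC, abstractmethod


class ObjectStore(ABC):
    """Provider-neutral interface that application code depends on."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class InMemoryObjectStore(ObjectStore):
    """Local implementation, used here for illustration and testing.

    Switching clouds means writing another ObjectStore implementation
    (one per vendor SDK) without touching the application code that
    calls put() and get().
    """

    def __init__(self) -> None:
        self._objects: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]


def save_invoice(store: ObjectStore, invoice_id: str, pdf: bytes) -> None:
    # Application code is written against the interface, not a vendor SDK.
    store.put(f"invoices/{invoice_id}.pdf", pdf)
```

The design choice is the familiar ports-and-adapters pattern: the proprietary surface area is confined to a handful of adapter classes, so a pricing change or product retirement by one vendor becomes a bounded engineering task rather than a rewrite.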
If you need help with cloud strategy or evaluating the potential risk with your current cloud strategy, we can help. Connect with us to learn about how the UDig team is helping businesses realize the benefits of the cloud.