A large number of companies — large enterprises, government agencies as well as SMBs — have started moving more workloads onto the cloud. However, a number of IT leaders realise that traditional cloud infrastructure options fall short of meeting the intense computing demands of today’s enterprises. That’s why upgrading to a modern, secure, second-generation cloud infrastructure is a prerequisite for businesses in the new decade.

Four factors enterprises need to keep in mind to achieve superior cloud economics:

Estimate cloud costs better

Most organisations’ cloud bills comprise compute nodes (CPUs), storage and networking. A good first step is to assess your needs across those services carefully. In networking, pay particular attention to the cost of outbound data transfer, especially if you need to move large datasets from the cloud to customers and partners, or back to your own data centres.

A number of CIOs have already had unpleasant surprises when it comes to data egress charges. While it is common knowledge that moving data into a given cloud costs little or nothing, what is less widely known is that moving that data out again can be both expensive and time-consuming.

One way to get a more accurate estimate of your cloud costs is to try a cloud cost calculator that prices the resources you will need to meet your requirements before actual deployment.
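To see how these line items combine, here is a minimal back-of-the-envelope sketch. The rates and figures below are purely illustrative assumptions, not any provider’s actual pricing; a real estimate should come from the provider’s own calculator.

```python
# Rough sketch of a monthly cloud cost estimate across compute, storage and egress.
# All rates are hypothetical placeholders, not real provider pricing.

def estimate_monthly_cost(
    vcpu_hours: float,                   # total compute hours across all nodes
    storage_gb: float,                   # average data stored during the month
    egress_gb: float,                    # outbound data transferred during the month
    rate_per_vcpu_hour: float = 0.05,    # assumed $ per vCPU-hour
    rate_per_gb_storage: float = 0.025,  # assumed $ per GB-month of storage
    rate_per_gb_egress: float = 0.08,    # assumed $ per GB moved out of the cloud
) -> float:
    """Return an estimated monthly bill (in dollars) for compute, storage and egress."""
    compute = vcpu_hours * rate_per_vcpu_hour
    storage = storage_gb * rate_per_gb_storage
    egress = egress_gb * rate_per_gb_egress
    return compute + storage + egress

# Example: 20 VMs with 4 vCPUs each running all month, 10 TB stored, 5 TB moved out.
print(estimate_monthly_cost(vcpu_hours=20 * 4 * 730,
                            storage_gb=10_000,
                            egress_gb=5_000))
```

Even with rough numbers like these, the egress line often turns out to be larger than expected, which is exactly where the unpleasant surprises tend to come from.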

Deeper understanding of data

Though the cloud is fast becoming the destination for more and more corporate data, some organisations may still prefer to keep sensitive data behind their own firewall in the near term, whether for regulatory or other reasons. In such a scenario, you will want your public cloud provider to work closely with your in-house IT team so that the two environments stay in sync. Ensuring smooth operations between internal systems and external cloud services can boost both savings and morale.

Further, many organisations prefer to run flexible, pay-as-you-go cloud services within their own data centres, which is one more reason to ensure consistency between the services running internally and those operating on the technology provider’s cloud infrastructure.

Price/performance equation

When it comes to different public clouds, the cost difference typically boils down to price/performance. If your organisation pays both for the data being processed and for the time taken to process transactions on that data, a cloud that processes workloads faster is naturally the better option. In short, the same volume of data will cost considerably more to process on a cloud with longer processing times.

Application performance matters too: execution speed becomes all the more important for high-performance workloads and online transaction processing. Here again, choosing a provider that offers faster processing and higher throughput reduces your cloud usage time, and with it your overall bill.
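To make the arithmetic concrete, here is a minimal sketch with entirely hypothetical figures: a cloud with a higher hourly rate can still produce a lower bill if it finishes the same workload in fewer hours.

```python
# Illustrative price/performance comparison between two hypothetical clouds.
# All figures are assumptions chosen for the arithmetic, not real pricing.

hours_a, rate_a = 10.0, 2.00   # Cloud A: 10 hours at an assumed $2.00/hour
hours_b, rate_b = 6.0, 2.50    # Cloud B: 6 hours at an assumed $2.50/hour

cost_a = hours_a * rate_a      # $20.00 to process the workload on Cloud A
cost_b = hours_b * rate_b      # $15.00 to process the same workload on Cloud B

print(f"Cloud A: ${cost_a:.2f}, Cloud B: ${cost_b:.2f}")
# Despite the higher hourly rate, the faster cloud costs less overall,
# because you are billed for fewer hours to process the same data.
```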

Check regulatory/compliance needs

Working with a cloud provider that has two or more cloud regions in your country helps you meet disaster recovery (DR) and data residency requirements while keeping data latency low, saving you both time and money. From a pure DR perspective, it certainly helps if your provider offers two in-country regions (one for your primary data centre and another for DR) in separate seismic zones to spread the risk. If regulators allow it in the future, some organisations may even choose to host their DR site in another country.

Last but not least, if your cloud provider can let you deploy public cloud services in a location of your choosing, that gives you the true power of choice.

By making the right cloud choice, you can help your organisation change growth orbits and sustain success in the new decade.

Khanna is CIO, Polycab; and Rajan is VP-Technology Cloud, Oracle India
