Four industries where on-premises infrastructure beats the cloud
Cloud computing has fundamentally transformed information technology by offering enterprises an allegedly cheaper, more flexible, and largely maintenance-free alternative to purchasing their own IT infrastructure. Despite the rush to migrate, however, cloud solutions are not always the best or least expensive choice, particularly in industries that work with very large, complex data sets, perform intricate mathematical calculations at scale, hold invaluable digital intellectual property, or are subject to strict compliance standards.
Let’s take a look at four industries where in-house IT infrastructure still beats the cloud.
Cybersecurity
In the cybersecurity field, the ability to process extremely large data sets as quickly as possible is crucial, especially as cyber attacks shift away from “lone wolf” one-off hacks and towards highly organized, sophisticated operations carried out by well-trained cyber criminals. IBM estimates that the average organization encounters 200,000 security event alerts each day. An enormous amount of computing power is required to run the SIEM systems that not only detect those anomalies but also analyze them, separate the false positives from the possible attacks, and deliver actionable information to security analysts – more power than even the most robust cloud solution can deliver.
In addition to latency problems, the bandwidth consumed would become very expensive, very quickly, making on-premises equipment the most cost-effective solution over time. On-premises cybersecurity hardware also prevents chain-reaction breaches, where client organizations end up getting hacked because the cloud vendor their cybersecurity provider relied on was compromised.
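The SIEM pipeline described above – detect anomalies, analyze them, filter out false positives, escalate the rest – can be sketched in a few lines. Everything here (field names, scores, thresholds) is hypothetical and exists only to illustrate the shape of the triage step, not any real SIEM product’s logic:

```python
# Illustrative SIEM-style triage: score raw alerts and keep only those
# likely to need an analyst's attention. All field names and thresholds
# are hypothetical placeholders.

def triage(alerts, min_score=50):
    """Return alerts scoring at or above min_score, highest first."""
    scored = []
    for alert in alerts:
        score = 0
        if alert.get("severity") == "high":
            score += 40
        if alert.get("source_reputation", 1.0) < 0.3:  # low-reputation source
            score += 30
        if alert.get("repeat_count", 0) > 10:          # repeated attack pattern
            score += 20
        if alert.get("known_false_positive"):          # drop known noise
            score = 0
        if score >= min_score:
            scored.append((score, alert))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [alert for _, alert in scored]

alerts = [
    {"id": 1, "severity": "high", "source_reputation": 0.1},
    {"id": 2, "severity": "low", "known_false_positive": True},
    {"id": 3, "severity": "high", "repeat_count": 25},
]
print([a["id"] for a in triage(alerts)])  # → [1, 3]
```

At 200,000 alerts a day, it is the scoring and enrichment behind each of these checks – reputation lookups, correlation across events – that consumes the computing power the article describes, not the filtering itself.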
Ad tech
The internet has transformed the way in which consumers and businesses shop. Long buying cycles have been replaced by “just-in-time,” last-minute purchasing decisions, and advertising-weary prospects are using ad blockers, email filters, and DVR fast-forward functions to tune out traditional advertising. To reach these elusive prospects, companies are turning to ad tech, which employs extensive market research and big data analytics to deliver highly targeted marketing messages to qualified prospects at the precise time that they are ready to buy.
The complex data sets and high-level analytics that power the ad tech industry require computing power that is already beyond what any cloud solution could offer. As the industry matures, ad tech firms will require even more power and more storage space. On-premises equipment can be scaled much more quickly than cloud solutions, and latency problems are avoided.
The ad tech industry also faces intellectual property issues. As Amazon, Google, and other cloud providers enter the market research and ad tech spaces themselves, questions arise as to the safety of digital intellectual property stored on a competing company’s cloud service. Dropbox listed Amazon’s move into the file-sharing space as one of the reasons why it decided to ditch AWS for its own equipment.
Life sciences
The life sciences industry is grappling with a “data avalanche” of clinical results, disease states, scientific studies, and individual patient data – and moving it all to the cloud brings stratospheric bills and latency problems. Because researchers work on limited budgets, simply storing this data in the cloud could deplete a project’s funding – and that’s before anything is actually done with it.
Cyber security and compliance issues also come into play due to the sensitive nature of this data. Storing patient data on the cloud may result in an organization running afoul of HIPAA and other privacy regulations if the cloud provider gets hacked, even if the hack turns out to be the provider’s fault.
Finally, on-premises hardware, unlike cloud solutions, keeps running even if the internet is down, which makes on-site equipment a must for scientists who are performing research in remote areas where internet access is spotty.
Design and engineering
Much like cyber security, ad tech, and the life sciences, design and engineering involves performing intricate calculations on very large, complex data sets, as well as running memory-intensive software and generating terabytes of new data every day. A cloud solution would be both wildly expensive and far too slow. In particular, cloud servers offer very poor inter-node communication; a physical server equipped with a high-performance communication architecture such as Intel’s Omni-Path is less expensive than a cloud solution and offers low communication latency, low power consumption, and high throughput.
Design and engineering companies also face a constant threat from digital intellectual property theft; everything from new product prototypes to R&D data could be targeted by competitors, cyber extortionists, or even foreign governments. Digital IP is simply too sensitive to be stored in the cloud, especially since hackers are increasingly targeting cloud providers as the industry grows.
Beyond cloud-first hype, a balanced approach
Despite its drawbacks, cloud computing does have a place in many organizations’ IT ecosystems. Many companies use on-premises equipment to handle standard functions and store highly sensitive data, and turn to cloud solutions when they need extra capacity or a home for less-sensitive data.
Instead of migrating to the cloud because it’s trendy, and “everyone” says it costs less and offers more flexibility than in-house equipment, enterprises should take a step back, consider their individual computing needs, and perform an objective cost analysis.
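A first pass at that cost analysis can be as simple as a few lines of arithmetic: compare up-front hardware plus yearly operating costs against a cloud bill that grows with the workload. Every figure below is a placeholder assumption – plug in real quotes from your vendors before drawing any conclusions:

```python
# Back-of-the-envelope TCO comparison: on-premises vs. cloud.
# All dollar amounts and growth rates are hypothetical placeholders.

def on_prem_tco(hardware_cost, annual_opex, years):
    """Total cost: up-front hardware plus yearly power/cooling/staff."""
    return hardware_cost + annual_opex * years

def cloud_tco(monthly_bill, annual_growth, years):
    """Total cost: monthly bill compounding as data and usage grow."""
    total, bill = 0.0, monthly_bill
    for _ in range(years):
        total += bill * 12
        bill *= 1 + annual_growth  # bill rises with data volume
    return total

# Hypothetical numbers for a data-heavy workload over five years:
years = 5
print(f"on-prem: ${on_prem_tco(400_000, 60_000, years):,.0f}")
print(f"cloud:   ${cloud_tco(15_000, 0.20, years):,.0f}")
```

Even a crude model like this makes the article’s point concrete: a flat cloud bill may look cheaper than a hardware purchase in year one, but if the bill compounds with data growth, the curves can cross well within the equipment’s useful life.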