How big data is a key part of Intel’s data centre vision
Intel recently shared its long-term strategic vision of how corporate data centers will evolve. Big Data processing plays a central role, driven by a future of escalating data volumes from mobile, cloud, and “Internet of Things” sources. Intel is starting to pull together a portfolio of offerings as a foundational infrastructure for analyzing Big Data.
Not surprisingly, this is a hardware-centric strategy that allows Intel to sell more server engines to run Big Data software applications efficiently. Intel believes it has the right chips (x86) to manage and analyze this data. But the chipmaker is also taking a proactive role, thinking further up the stack and fishing for Hadoop-related software opportunities.
Big Data is one of several demand-side drivers shaping the data center of the future
Intel laid out a detailed blueprint of key infrastructure for building the data centers of today and tomorrow, anchored to several demand drivers – continuing uptake of mobile devices, the rise of “Internet of Things”, and increasing cloud-resource consumption.
Mobile devices generate huge volumes of machine-to-machine (M2M) data every second, and Internet-connected devices and appliances promise to spew out even larger troves of data that potentially hold valuable insights. Both markets are still works in progress, but both will increase demand for scalable Big Data analytics server and storage infrastructure.
Storage is of particular interest for Intel: as a CPU provider, it has a vested interest in the efficient allocation of processing and storage resources. Big Data factors into the equation because it drives greater demand for combined (and more intelligent) storage/server hardware.
To this end Intel seems to be investing heavily in optimizing servers to take advantage of Big Data platforms such as Hadoop. That’s understandable; it is in the market to make money on sales of its semiconductors and servers – not software. Intel is therefore optimizing its hardware and in turn developing API hooks that help ISVs shift compute loads to software that takes greater advantage of chipset instructions – particularly through initiatives in data encryption and graph processing.
Both have been Achilles’ heels of Hadoop up to now. Intel’s hardware-centric efforts to optimize software stacks should therefore be a boon for addressing the cost, complexity, and security issues around implementing Big Data analytics solutions.
Intel continues to expand its portfolio for Big Data
Intel announced its entry into the Hadoop market in March 2013, offering its own distribution of the open source framework, though some rival Hadoop distributors assert that Intel has been dragging its heels in contributing its code back to the Apache community. There are many Hadoop distributions on the market, but one differentiator for Intel is silicon-based (Xeon) encryption/decryption, which allows customers to analyze Big Data sets securely without compromising performance.
Intel has added to that with Intel Enterprise Edition for Lustre software, its distribution of Lustre, the open source parallel distributed file system, which provides fast access to integrated data sources.
Intel is pairing the two, essentially allowing Hadoop to run on top of Lustre, aided by the inclusion of management tooling and adaptors. The net benefit is that users can access data files directly from the global file system at faster rates, speeding up analytics processing, making more productive use of storage assets, and simplifying storage management.
Intel continues to invest in Big Data hardware enablement
Intel is still a long way from being a household name in Big Data analytics. But it seems intent on investing heavily in this area, as demonstrated by ongoing research at Intel Labs into data-intensive computing platforms, machine learning, parallel algorithms, visualization, and computer architecture.
Intel has also established a Big Data Center hosted at MIT with a remit to encourage new data-intensive user experiences. In June 2013 Intel, in conjunction with Dell and Revolution Analytics, also launched a Big Data analytics center housed at the Dell Solution Center in Singapore.
Intel’s investment arm, Intel Capital, has also taken stakes in Big Data technology firms such as 10gen and Guavus.
Can Intel really become a serious player in Big Data?
Intel’s Big Data strategy is not just about buzzword compliance. It is aimed at entrenching and reinforcing its x86 chips in Hadoop compute clusters, an area in which it already has a strong foothold. Ovum’s opinion is that Intel’s Big Data initiatives remain focused on protecting its x86 franchise: the more tightly Big Data processing is wedded to x86, the more x86 processors, and more powerful ones, customers will need.
While the overarching angle for Intel is that Big Data sells more Intel hardware, it is not being passive about Big Data. Rather, it is taking a more proactive role: not just coming up with processor optimizations, but actively engineering firmware further up the stack, as well as its own Hadoop distribution.
So is Big Data driving Intel into the software business? Ovum believes that Intel is not necessarily trying to sell software (although its McAfee unit gives it a software business that is arguably one of the best-kept secrets in its marketing); rather, its Hadoop optimizations will in turn further protect its core x86 franchise. Intel is essentially developing its Hadoop distribution as a wedge.
This is not so much an effort to sell more Hadoop distributions as to encourage other Hadoop distributions to support its APIs. We believe that Intel will pursue an OEM strategy here, with SAP (with HANA) a likely initial partner. The key question is whether customers really care which brand of chip does the processing. Most will not, but they do care about investing in a standardized x86 cluster architecture that is already in place. Intel is probably not the only hardware vendor intent on making its chips Big Data-friendly; it faces competition from the likes of graphics processing units (GPUs) for high-performance computing (HPC) number crunching in Big Data applications.