Bitcasa CEO Brian Taptich: Competing with Microsoft, Amazon, Google a “suicide mission”
Brian Taptich, the CEO of developer-centric cloud storage provider Bitcasa, has told this publication that Microsoft shuttering its unlimited OneDrive storage policy is “definitely not a failure” and other players in the space are risking a “suicide mission” by competing against the hypervendors on their own terms.
Earlier this week, the Redmond giant announced it was to cap each Office 365 subscriber's OneDrive account at 1 TB after some users interpreted 'unlimited' to its fullest extent. Some had stored entire movie collections in Microsoft's cloud, with usage topping 75 TB – or 14,000 times that of the average customer – in extreme cases.
For Bitcasa, this represents things coming full circle. This time last year, as Microsoft was opening up its OneDrive service for users to store the entire history of recorded cinema, Bitcasa was shutting down its own $10 all-you-can-eat plan for a similar reason; there was more than a suspicion of businesses using and abusing individual unlimited accounts.
Taptich said he could “empathise” with Microsoft after making the decision to stop unlimited accounts, but stopped short of saying this was akin to waving the white flag. “This is definitely not a failure or an admission of failure,” he tells CloudTech. “I suspect Microsoft just learned that, however theoretical models may have supported the efficacy of offering unlimited storage at a fixed – and low – fee, in the almost entirely frictionless world of data transfer, the first users who show up to the all-you-can-eat buffet break the model with their unimaginable volumes of data.”
Bitcasa’s move away from unlimited, Taptich argues, was not so much a case of trying to recoup lost money as of shifting engineering resources away from a black hole. But it wasn’t simply a case of making the decision, clicking their fingers and switching over.
The company suspected the unlimited plan wasn’t working out as early as 2013, but it took until late 2014 to wind it down. As Taptich explains: “There were two important reasons to continue offering the service. Because we had grown to over 30 PB of data under management from users across 120 countries, the additional time provided an amazing sandbox to test and refine the scalability and performance of the underlying infrastructure that is now the backbone of our developer-focused platform.”
He adds: “We [also] had a small but passionate user base, and we made the decision to invest in building systems that would provide our users relatively seamless ways to transition to our new tiers or, in some cases, to retrieve their data.”
Taptich argues that, while the past year has not been without its challenges, the company’s biggest struggle has been to stay patient. “Companies like Apple and Google have understood for a number of years that whoever owns the customer data owns the customer, but it’s only in the past 12 months that the balance of the connected – and increasingly mobile – ecosystem has woken up,” he explains. “They are playing catch up, and we have a platform which satisfies their desire to maintain customer ownership without having to custom build solutions.”
Despite arguing that fighting “trench warfare” with Google, Microsoft, and Amazon is not a smart plan – the battle to provide public cloud infrastructure is being fought against companies with trillions of dollars in combined market capitalisation – Taptich insists there is an “enormous amount of opportunity” for smaller players long term.
Take enterprise-centric file sync and share vendor Egnyte. While its funding pales in comparison to that of Dropbox and Box, its laser focus on enterprise customers – and in particular on picking up second-generation enterprise customers – gives it a viable target market.
“You have to realise that we are in the middle of an extraordinary transformation from local, device-based resources to remote, cloud-based resources,” Taptich explains. “All data will eventually reside remotely, and the speed with which this happens is purely a function of connection speed and ubiquity, and security.
“Whether it takes two years or 20 years, this is a fundamental shift that represents many hundreds of billions of dollars,” he adds. “And in the short term, successful companies will be laser focused on providing services which enhance the 11-nines reliability of the public cloud, will be targeting customers with a user base every bit as scaled, and will be building a financial model that benefits from the plummeting costs of underlying public cloud infrastructure.”
At the very least, this is the theory which Bitcasa expects to come true.