Citigroup is cutting costs by making storage simpler
23 September 2016, 03:49
Citigroup is using software-defined storage to build an infrastructure that could last 25 years – while generations of hardware come and go.
The financial services company needs to transform its storage architecture to deal with growing and changing demands, says Dan Maslowski, global head of storage and engineered systems. By simplifying its architecture, Citigroup expects to slash its operational expenses, which make up most of its storage costs.
Citigroup’s need for storage is growing so fast that if costs don’t go down, the company’s spending on storage might eat up its entire IT budget in a few years, Maslowski told an audience at the Storage Developer Conference in Santa Clara, California, on Tuesday.
“We can’t satisfy demand and capacity fast enough. More importantly, we can’t do it at a cost that is effective for our customers,” Maslowski said. Sheer scale is also a limit: a major financial company may have as many as 50,000 storage points around the world, and it simply couldn’t manage them if each one needed manual intervention by an administrator.
Public clouds offer an alternative that, at first blush, looks both cheaper and more flexible than what Citigroup can give its internal customers. But Maslowski said public clouds aren’t as inexpensive as they seem when you factor in things like the cost of migrating data back out of them.
Citigroup’s solution is to emulate some of what the cloud companies do and deliver services in a similar way: Users just order capacity and service levels, not worrying about the underlying infrastructure.
The company does this by building a comprehensive software architecture – the part that’s expected to last 25 years – and then rolling in a standard hardware configuration that’s swapped out every few years as technology improves. It's using best-in-class hardware, and its software is designed to be able to work with anyone's equipment.
Citigroup has come up with a hardware pod that will have 1TB of RAM, 28 computing cores for data services, between 4 and 12 petabytes of flash, and probably some spinning disks. The company plans to deploy 28 of these next year.
So far, the company has gone from no software-defined storage to more than 20 petabytes deployed in three regions of the world. Where implemented, it’s cut the cost of storage by 60 percent, Maslowski said.
Though it’s meant to make things simpler, a project like this has its challenges. One is finding the right people. Maslowski says he’s hiring now. The kinds of experts he's looking for can command jobs at hot cloud companies, so there's competition.
But the biggest challenge right now involves vendors' software, Maslowski said. It would be impossible for his team to learn and deal with all the unique software tools and user interfaces that come with vendors' hardware, he said. So Citigroup wrote an API (application programming interface) and asks all the suppliers it works with to adopt it.
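To illustrate the idea, here is a minimal sketch of how a single vendor-neutral API can sit in front of interchangeable hardware backends, with users ordering only capacity and a service level. All names and tiers here are hypothetical, for illustration only, and not Citigroup's actual interface:

```python
# Hypothetical sketch: a stable, vendor-neutral storage interface with
# swappable hardware backends. Every name here is illustrative.
from abc import ABC, abstractmethod


class StorageBackend(ABC):
    """Interface each hardware supplier would implement."""

    @abstractmethod
    def provision(self, capacity_gb: int, tier: str) -> str:
        """Create a volume and return its identifier."""


class VendorAFlash(StorageBackend):
    # Stand-in for one supplier's flash array.
    def provision(self, capacity_gb: int, tier: str) -> str:
        return f"vendorA-{tier}-{capacity_gb}gb"


class VendorBDisk(StorageBackend):
    # Stand-in for another supplier's spinning-disk array.
    def provision(self, capacity_gb: int, tier: str) -> str:
        return f"vendorB-{tier}-{capacity_gb}gb"


class StorageService:
    """The long-lived software layer: users order capacity and a
    service level; the service picks a backend behind the scenes."""

    def __init__(self) -> None:
        # Map service tiers to whichever hardware currently backs them;
        # swapping out a hardware generation means changing only this
        # mapping, never the API users see.
        self.tiers: dict[str, StorageBackend] = {
            "gold": VendorAFlash(),
            "bronze": VendorBDisk(),
        }

    def order(self, capacity_gb: int, tier: str) -> str:
        return self.tiers[tier].provision(capacity_gb, tier)


service = StorageService()
print(service.order(500, "gold"))  # routed to the flash backend
```

The point of the design is that the user-facing `order` call and the backend interface stay fixed for decades, while the concrete backend classes come and go with each hardware refresh.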
“Standardization on interfaces ... makes my life a lot easier,” Maslowski said.
Finally, part of his job is to ease the fears of those who resist the change, including storage administrators who have built their careers on knowing how to manage particular kinds of hardware.
“Not everyone sees the value of this, and some people are actually pretty scared about it,” he said.