By Christina P. O’Neill
Real-time analytics used to be the province of the biggest and richest companies, but the declining cost of random access memory (RAM) and the availability of open-source software for real-time computing are putting immediate analysis within broader reach. Midsized and small banks can increasingly access in-memory computing (IMC) power that until a few years ago was economically out of reach. The caveat: IMC products cannot be all things to all users. The upside: IMC can help smaller banks reduce their risk exposure and stay abreast of the competition.
Max Herrmann is executive vice president of marketing at GridGain Systems, a California-based open-source and commercial in-memory data fabric solutions provider. He notes that financial-service firms are increasingly adopting technologies to help manage big-data information and queries, as the cost for memory continues to drop – by an estimated 30 percent per year, according to GridGain. “Smaller banks have to be more careful; each decision [has more impact],” Herrmann said. “It takes a lot for a big bank to fail, but small banks can’t be wrong.”
Massimo Pezzini, vice president and research fellow at Gartner Group Inc., notes that IMC technology isn’t new – it has been in market use for the past 10 to 15 years. But as online and mobile banking, particularly mobile, have increased banks’ workload of transactions that bring in no revenue, IMC has become a more attractive proposition. Institutions that can offload this data from their core systems to an IMC platform can both increase their knowledge and responsiveness and reduce their operational cost.
Using IMC, bank staff can run queries on the IMC data without needing to engage their central IT staff, and the reports can be generated in seconds rather than hours or days, Pezzini said. Individual customer risk profiles can include up-to-the-second data to evaluate business-loan or mortgage applicants; older methods rely on risk profile data that may be at least a month old.
There’s a catch, said Michael Matchett, senior analyst and consultant at Taneja Group. “The challenge with in-memory solutions is that DRAM is volatile, meaning that if you unplug the power, you lose the data.” Solutions to the problem range from adding batteries or capacitors to the memory, to logging transactions to persistent disk to support eventual recovery, to replicating the data to other nodes – much as home computer users may keep copies of their work on flash drives or external storage.
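The second of Matchett’s options – logging transactions to persistent disk so state can be rebuilt after a power loss – can be illustrated with a minimal write-ahead-logging sketch. The `DurableCache` class and its log format here are hypothetical illustrations, not code from GridGain or any other IMC product:

```python
import json
import os

class DurableCache:
    """Toy in-memory key-value store that survives a power loss by
    appending every write to an on-disk log before updating RAM."""

    def __init__(self, log_path):
        self.log_path = log_path
        self.data = {}       # the volatile, in-RAM copy
        self._replay_log()   # recover prior state after a restart

    def _replay_log(self):
        # Rebuild the in-memory state from the persistent log.
        if not os.path.exists(self.log_path):
            return
        with open(self.log_path) as f:
            for line in f:
                entry = json.loads(line)
                self.data[entry["key"]] = entry["value"]

    def put(self, key, value):
        # Persist first, then update RAM: if power fails mid-write,
        # the log still allows recovery of a consistent state.
        with open(self.log_path, "a") as f:
            f.write(json.dumps({"key": key, "value": value}) + "\n")
            f.flush()
            os.fsync(f.fileno())  # force the write to physical disk
        self.data[key] = value

    def get(self, key):
        # Reads are served entirely from RAM - this is the IMC speedup.
        return self.data.get(key)
```

Creating a fresh `DurableCache` against an existing log file simulates a restart after a crash: the constructor replays the log and the in-memory data reappears. Production systems combine this with the replication Matchett mentions, so a surviving node can serve data while a failed one recovers.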
What Users Want
GridGain Systems recently announced a survey of nearly 200 IT decision-makers in the financial services industry about their priorities and their adoption of real-time analytics. The report, “A Cautious Revolution: Financial Services’ Prudent Embrace of Real-Time Analytics,” found that accessing the information wasn’t the problem. Respondents said that traditional disk-based processing and the variety of data structures and sources result in informational silos that create barriers to real-time informational access (see below).
“In these findings, we see an industry facing some growing pains, but we also see that it is clearly poised to continue taking the lead in adopting data processing innovations,” said GridGain’s Herrmann. Large banks have taken a selective approach to implementing IMC, placing about 10 percent of their data on IMC with GridGain and moving additional data into flash memory.
58% Say their companies use in-memory technology
42% Say risk analysis is the most important use of real-time analytics
40% Cite processing speed as a challenge in their current systems
39% Identify scalability and speed as their next desired upgrade
22% Prioritize cloud-based analytic tools
19% Prioritize security upgrades
18% Prioritize improved uptime
Source: GridGain Systems
Tweet and Re-Tweet
The explosion of social media falls into the real-time purview, with both positive and negative comments in the mix. “It’s so easy to re-tweet,” Herrmann told Banker & Tradesman. “Retweets can go viral through unexpected sources. [Financial institutions] have to find those posts. It’s no longer possible to wait a few days. You have to find them and analyze what happened.”
Disk storage won’t go away, he said – there are many applications for which it is the most economical option. But RAM access is faster, and cloud computing has made its capacity almost unlimited, as financial institutions grow increasingly accepting of and comfortable with the cloud – provided they can be assured of adequate cybersecurity.
Taneja Group’s Matchett said running an IMC solution in the cloud doesn’t really require a different approach. Cloud providers can supply an environment in which a distributed IMC grid spans geographic regions, without a financial institution having to build its own geographically distributed data centers.
Costing out the Solutions
In the last few years, customers have embraced the pay-per-use benefit versus making capital expenditures on resources from which they will not derive immediate benefit, Herrmann said.
For banks that have merged or acquired, the cost of legacy systems has to be realistically assessed, Matchett said. “One of the questions you might ask as a systems designer is if you can eliminate some tiers of costly intermediate infrastructure, and focus on memory on one end with cheaper, slower, higher capacity disks on the other,” he said. “As you are looking at legacy systems, over time, [they do] depreciate. … Sunk cost is a fallacy if I can only make it work for another year. That cost is gone, and what’s the best thing I can do now?”
Christina O’Neill is editor of custom publications for The Warren Group.