The Data Tsunami: Combating the Overwhelming Supply of GRC Data
The institutional capital markets are dealing with the aftermath of one of the most aggressive periods of regulatory intervention since the Great Depression. The costs and consequences of non-compliance within the financial services industry are greater than ever before. At the same time, the industry is going through tough times, with thinner margins, all-time-low volumes and weak investor confidence. As a result, firms face a new reality: do more with less while adhering to stricter regulations. Existing legacy processes and technology across the front, middle and back office do not help meet these goals, and today’s risk and compliance data management techniques have largely failed to meet internal expectations and regulatory requirements.
The financial crisis produced a structural change in the operating environment of the financial services industry, driving permanent shifts in regulatory and shareholder expectations. GRC requirements vary widely by jurisdiction: in the US, rules such as Dodd-Frank and its corresponding Volcker Rule, along with initiatives such as the Consolidated Audit Trail (CAT), and in Europe EMIR and MiFID II, are giving sleepless nights to senior executives at financial institutions. The pressure of a constantly intensifying regulatory environment requires institutions to adapt and expand their in-house compliance programs in response to compliance requirements that are increasing in breadth, depth and complexity.
Sharing information within an institution is now a business imperative, as regulations force market practitioners to share data and compile information from disparate systems to meet complex compliance requirements. However, most legacy data management infrastructures deployed by financial institutions have proved ill-equipped to handle these new challenges. Existing IT infrastructure is struggling to deliver on several fronts: speed, performance and scalability, to name a few.
What institutions need is a combination of dynamic, flexible data management architectures and aggregation capabilities, layered on top of user-friendly front ends. Combined into an elegant solution, these capabilities will allow market participants to extract value from their data management structures beyond regulatory reporting. Pre-trade risk checks, risk management and trade analysis mean little if they occur in silos, and as new regulations come into force, firms that cannot monitor across the enterprise in a holistic way will fall foul of regulatory authorities. Those that can put their terabytes and petabytes to work beyond mere regulatory compliance will gain the advantage. Regulations are both a challenge and an opportunity for institutions.
This 18-page TABB Group note, “The Data Tsunami: Combating the Overwhelming Supply of GRC Data,” discusses the deeper issues and challenges facing the industry, as well as the latest trends. It also describes in detail the forward-looking solutions that address many of the pain points institutions currently face. These new tools will be indispensable on the journey ahead, which is why we expect a heightened sense of urgency. IT departments at financial institutions are struggling to implement these new regulations, and the note also highlights the initiatives of a number of the larger vendors in this space, including Axiom SL, Bloomberg Vault, BWise (NASDAQ), NICE Actimize, and SunGard.
This report can be downloaded by TABB's Research Alliance Data and Analytics (DnA) Members and qualified media. For more information or to purchase the report, write to email@example.com.