Well-publicized bank transgressions are frequently labeled "compliance failures." To those of us who work in the industry, however, that label can feel like an insult. Anyone who, like me, has worked in banking, anti-money laundering (AML), software development, audit readiness, compliance and process engineering knows how easy it is to blame "the system." That excuse is often coupled with process lapses, and the official explanation of failure becomes "we need new software and better training for our (low-level) back-office staff." This may be true. It is quite difficult to grasp the totality of compliance mandates and then implement effective software and process solutions, especially in huge financial institutions. Geographic spread and fuzzy organizational lines can also cause compliance problems.
Ineptitude and avarice collide frequently; sometimes the collision is deliberate. And in cases of financial malfeasance and extreme noncompliance, IT systems are always involved. People drive the fraud. Processes can either support or hide the crime, depending on the quality of back-office controls, but the proof lies in understanding the data, the software, and how they can be manipulated. Application software, the tables within those applications, and discrete data elements are the fundamentals, the ABCs, of understanding how financial crimes are committed and detected. The following two stories are based on true events and are emblematic of how financial crimes and compliance failures are enabled. One was committed by a single person, the other by a cadre within the bank. Both matched the MO of the recent HSBC AML debacle: a revolving door of under-qualified people in Compliance and Audit, and an active intent to obfuscate by continually changing systems.
The recent guidance (warning, really) from the FDIC on the need for financial institutions to perform due diligence when selecting AML software puts the burden of proving compliance squarely on the financial institution. It also points to the need for an enterprise solutions architecture, one that builds on existing structures, on how things really are, rather than on pushing through a vendor package. While there is no doubt that commercial off-the-shelf (COTS) products play an integral part in AML compliance, there is also no doubt that AML software depends on the quality and uniformity of the data the financial institution supplies. The systems, data, processes and organizational structure of the enterprise form the infrastructure of compliance, and these must be understood and documented to ensure that the COTS "solutions" are just that. If, for example, a bank wanted to institute an automated customer risk scoring system, many questions would need answers before software could be selected and installed...
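To make the risk-scoring example concrete, here is a minimal sketch of a rule-based customer risk score in Python. The risk factors, field names, weights and thresholds are all illustrative assumptions for this example, not any vendor's or regulator's model; a real system would need the bank to answer exactly the data questions raised above before any of these inputs could be trusted.

```python
# Hypothetical sketch of rule-based customer risk scoring.
# All factor names, weights and thresholds are illustrative assumptions.

HIGH_RISK_COUNTRIES = {"XX", "YY"}  # placeholder country codes
CASH_INTENSIVE_BUSINESSES = {"casino", "money services business", "pawn shop"}

def risk_score(customer: dict) -> str:
    """Return a coarse risk tier ("low"/"medium"/"high") for a customer record."""
    score = 0
    if customer.get("country") in HIGH_RISK_COUNTRIES:
        score += 3
    if customer.get("business_type") in CASH_INTENSIVE_BUSINESSES:
        score += 2
    if customer.get("expected_monthly_wires", 0) > 50:
        score += 2
    if customer.get("pep"):  # politically exposed person flag
        score += 3
    if score >= 5:
        return "high"
    if score >= 2:
        return "medium"
    return "low"
```

Even this toy version shows where the hard questions live: which fields exist in the core system, who keeps the country and business-type reference data current, and what happens when a field is simply missing.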
A lonely woman falls for a silver-haired gentleman, the face of a criminal enterprise in Africa, in one of the most common romance scams known to law enforcement. She loses her house and her savings. She is devastated both financially and emotionally, taken in by shysters who understand the vulnerabilities of the lonely. And in this true case, the lonely woman is also a victim of the American banking system.
In one critical AML review of a mid-tier bank, the examiners admitted they had a hard time understanding how the transaction monitoring system worked.
A financial institution can have a comprehensive anti-money laundering program, a staff of experts, and a million-dollar (or more) specialized computer system in place and still miss potential problem customers because it failed to collect or use important data. In the end, a successful anti-money laundering program comes down to data: discrete pieces of information that must be collected, analyzed and presented in meaningful ways. No matter what automated or procedural controls you have in place, the program's success depends on meaningful data. One of the first steps in creating an effective program, then, is to develop a data plan: understand what data must be captured, how to capture it, how to analyze it, how to report it and how to use it.
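One small, concrete piece of such a data plan is simply verifying that the required data elements were captured at all. The sketch below checks customer records against a required-field list; the field names are assumptions chosen for illustration, and a real plan would draw them from the institution's own documented requirements.

```python
# Illustrative sketch: report which required AML data elements are missing
# from a customer record. Field names are assumptions for this example.

REQUIRED_FIELDS = ["customer_id", "legal_name", "date_of_birth",
                   "country_of_residence", "occupation", "expected_activity"]

def missing_fields(record: dict) -> list:
    """Return the required data elements that are absent or empty in a record."""
    return [f for f in REQUIRED_FIELDS if not record.get(f)]

record = {"customer_id": "C-1001", "legal_name": "Jane Doe",
          "country_of_residence": "US"}
print(missing_fields(record))
```

A check this simple, run across the customer base, is often the first hard evidence of the data gaps described above.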
What started out as an apparently straightforward Transaction Monitoring System Validation project took an interesting and cautionary turn at an international bank recently. The Project Team assembled for the task, as well as executive management at the Bank, expected that the Validation would uncover some less-than-perfect data mapping from the core banking system to the Transaction Monitoring System. A completely new Compliance staff had reviewed the Bank's (to them) unfamiliar Transaction Monitoring System and could see that something wasn't quite right: wires were not appearing properly on reports, General Ledger account numbers were showing up instead of customer account numbers, and unnecessary transaction codes, like wire fees, were clogging the system. It seemed like a simple, methodical task of documenting the current mapping and making the appropriate changes.
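The symptoms described (General Ledger numbers where customer account numbers belong, fee codes clogging the monitoring feed) lend themselves to automated checks. The sketch below sorts a transaction feed into clean, mis-mapped and excluded buckets; the GL account format and the transaction code names are assumptions for illustration, not the Bank's actual scheme.

```python
# Hypothetical sketch of the mapping checks described above: flag
# transactions whose account field matches a General Ledger pattern,
# and set aside codes (such as wire fees) that should not be monitored.
# The account format and code names are assumptions for illustration.
import re

GL_ACCOUNT_PATTERN = re.compile(r"^9\d{5}$")  # assumed GL number format
EXCLUDED_CODES = {"WIRE_FEE", "SVC_CHG"}      # assumed non-monitorable codes

def validate_mapping(txns):
    """Split transactions into clean, mis-mapped and excluded buckets."""
    clean, mismapped, excluded = [], [], []
    for t in txns:
        if t["code"] in EXCLUDED_CODES:
            excluded.append(t)
        elif GL_ACCOUNT_PATTERN.match(t["account"]):
            mismapped.append(t)  # GL number where a customer account belongs
        else:
            clean.append(t)
    return clean, mismapped, excluded
```

Running a filter like this over a day's feed turns "something wasn't quite right" into counts and examples that the mapping documentation effort can act on.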