Data Analysis In a Compliance Best Practices Program

Thomas Fox - Compliance Evangelist

I. What is Data Analysis in a Best Practices Compliance Program?

This paper will discuss the use of data analysis in a best practices compliance program under the Foreign Corrupt Practices Act (FCPA), UK Bribery Act or other anti-corruption compliance regime and how it can be used by the Chief Compliance Officer (CCO) or compliance practitioner to support best practices. My partner in this exploration is Joe Oringel, a co-founder and Managing Director at Visual Risk IQ, a data analytics services firm.

Being a recovering trial lawyer, I began with the basics: what are data analytics and data analysis? Oringel kept it simple, saying that it is merely using data to answer questions. He noted that such analysis predates computers; Sherlock Holmes became well known for using deductive reasoning to draw conclusions from data-based evidence. In the 21st century business world, the best evidence that we have as to whether something took place is most often digital evidence. Oringel pointed to a variety of authoritative sources, which state that modern data analysis is a process of inspecting, cleansing, transforming and modeling data with the goals of highlighting useful information and supporting decision-making. In short, data analysis is answering a question with data. 

Oringel next pointed to another set of definitions for data analysis, derived from Thomas Davenport, a well-known academic and author who teaches at Babson College. Davenport incorporates the notion of time to categorize data analytics as answering questions about the past, the present or the future. Incorporating time into analytics focuses these efforts so you can build repeatable patterns into the questions that should be asked and answered. 

Oringel, who has both academic and professional training as an internal auditor, said that external financial auditors, like the Big Four, usually focus on answering the question, “What has happened?” This is really a focus on historical transactions, looking backwards and looking at the reporting of transactions, for example what was recorded in the books and records of the company? How was the transaction recorded? Why was the transaction recorded a certain way?

I next turn to the difference between data analysis and traditional internal auditing or sampling. Oringel believes this is the most significant technology-driven change of the last 25 to 30 years, due to the advent of the personal computer and the associated spreadsheet and database software, which allow auditors to base their conclusions not on a sample of data but on an analysis of the entire population of data. 

He said “In the late 1980’s, early 1990’s, the predominant technique that internal auditors used was sampling. If an audit was designed to vouch fixed assets, auditors would pick a sample of 25 or more fixed assets; re-compute, or test, the acquisition date, and the disposition date; and finally re-compute depreciation by hand. If the fixed assets in our sample were properly recorded, then we looked up on a statistical chart or table and concluded that we were sufficiently confident that all of the fixed assets at the company were properly stated.” 

He further said “with today’s digital accounting software, every fixed asset can be downloaded and the depreciation re-computed based on the acquisition date and the disposition date and the various depreciation rules for each asset class. If there are any differences in the valuation of any asset, the differences can be found through data analysis. Data analysis allows a company’s auditor, whether internal or external, to re-compute or model the financial recording of transactions, as they ought to be recorded and, therefore, have even greater confidence than if they had tested using sampling. By analyzing every asset and related transaction, a company is able to test the entire population and be much more confident in the results. This has obvious implications for any FCPA audit as there is no materiality standard under the FCPA.”

Data analytics can transition from a review of historical transactions to a review of current transactions simply by asking similar questions of similar data, but with a change in focus. That change is to answer the question “what is happening now and what should we do about it?” instead of merely “what has happened?” When your bank or credit card company puts a freeze on your charge card because of suspicious transactions, it is using data analysis as an alerting function. More sophisticated companies use these sorts of data analytics tools and processes as part of their compliance programs for areas like monitoring for improper payments or identifying vendors who may match entities on a Denied Parties list. 

This use of monitoring as an alerting task is a logical next step for compliance teams, but most have not yet taken it, for any number of reasons. The transition from data analytics as historical analysis to alerting through continual or continuous monitoring can be a challenge, and it is still an emerging best practice. Continual or continuous monitoring establishes these alerts and prompts us to take action based on something that happened only moments ago. 

I asked Oringel if he could provide an example along the lines of the Department of Justice (DOJ) and Securities and Exchange Commission (SEC) jointly released FCPA Guidance, which says that the goal of a best practices compliance program should be to prevent, detect and remedy matters before they become FCPA violations. He translated the FCPA Guidance into “stop, find and fix”. He believes the key is the time period from which you are pulling the data: if you are looking at transactions that happened six or nine months ago, then your analytics are serving as a reporting function. He gave an example in which a business development person entertained a government official yet did not seek preapproval to do so. Unfortunately, the amount spent was more than was allowed under the company’s Gifts and Entertainment Policy for entertaining a foreign official. Now the compliance function needs to fix that policy violation and make sure that it does not happen again. 

The next frontier for data analytics is a move from alerting to predictive analytics, which is using data analysis to predict what will likely happen in the future. This allows us to move from answering questions about what has happened in the past or present to what will likely happen in the future. While predictive analytics is common in many industries and processes, like Commercial Lending or Insurance, it is not at all common in compliance. Yet. 

The “find” capability goes from the past to the present and to the future and may be where the most advanced audit and compliance teams go next. This moves to an almost prescriptive posture, where, because you were able to predict, or have an insight, you are able to deliver a risk management solution to that potential situation going forward. 

Oringel concluded by saying that it is this future orientation, with data analysis as a predictor, that he believes is the next step in the compliance function’s use of data. A company can score high-risk employees in a business unit by identifying the salespeople who tend not to respect the organization’s T&E policies: those who spend too much on lavish meals or engage in other activities that contradict company policies, such as neglecting mandatory compliance training or simply being routinely late with expense report submissions. 

A more sophisticated approach would be to consider that if the employees in certain business units or functions have a low rate of complying with corporate expense policies, how can we trust those same employees to comply with our discount policies? The use of predictive analytics in the audit and compliance arena suggests that employees who are proving to be non-compliant on some matters may well warrant closer and more frequent inspection of the other things for which they are responsible. Using this information to direct audit and inspection efforts allows teams to identify and possibly correct errors in those other areas.
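The kind of employee risk scoring described above can be sketched in a few lines of code. This is an illustrative model only, not any firm's actual methodology; the signal fields, weights and names are all hypothetical.

```python
# Illustrative sketch only: score employees on simple non-compliance
# signals and rank the riskiest. Fields, weights and names are hypothetical.
employees = [
    {"name": "A. Smith", "late_expense_reports": 6, "policy_violations": 2, "training_overdue": True},
    {"name": "B. Jones", "late_expense_reports": 0, "policy_violations": 0, "training_overdue": False},
    {"name": "C. Lee",   "late_expense_reports": 3, "policy_violations": 1, "training_overdue": True},
]

def risk_score(emp):
    """Weighted sum of non-compliance signals; higher means riskier."""
    return (emp["late_expense_reports"]
            + 5 * emp["policy_violations"]
            + (10 if emp["training_overdue"] else 0))

# Rank employees so audit attention goes to the highest scores first.
ranked = sorted(employees, key=risk_score, reverse=True)
```

In practice the weights would be tuned to the organization's own history of policy violations, but even a crude ranking like this can direct audit effort toward the riskiest individuals.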

II. Setting Up Your Analysis

The next phase is how to set up a data analysis program and how to use it to help monitor for a compliance program. I asked Oringel how he helps clients think through a project that involves data analytics. As a lawyer, I was intimidated by the issues of not only how to get the data but how to use it going forward. Oringel then laid out their firm’s five-step process and said that for any Visual Risk IQ analytics project, the steps are: (1) Brainstorming, (2) Acquire and Map Data, (3) Write Queries, (4) Analyze and Report, and (5) Refine and Sustain.

A. Step 1 - Brainstorming

It all begins with Step 1, brainstorming. Any data analysis project in a compliance setting, or any business context, begins by picking the business questions to answer with data. So in an initial meeting, Visual Risk IQ’s team might ask one or more of the following opening questions: What do we expect to find if we do a detailed review of this data? What policies should have been followed? What would a mistake or even fraud look like? The data to be reviewed could be expense reports, accounts payable invoices or sales contracts. The key to successful brainstorming is to identify the questions you want to ask and answer, and then identify the digital data sources that can best answer those questions. This process should be iterative, with questions being refined based on the available sources of digital data. This brainstorming process is central to how Oringel and his team help clients develop queries specific to their organization. 

B. Step 2 - Acquire and Map the Data

Acquiring and mapping data can be a technical step, but most modern software can create files that can be easily read by basic data analysis software, such as Microsoft Excel, as well as more advanced tools. Mapping data is simply identifying, naming, and categorizing the data fields (e.g. text, dates, numbers) so that the software tool can best interpret the data for analysis. Many data sources are internal (e.g. sales or expense transactions) but increasingly external sources from vendors and business partners are used too. Even the US Government is an occasional data source for analytics, as various Federal Departments publish watch lists of debarred individuals and companies. 
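As a hedged sketch of the mapping step, the snippet below reads a small exported CSV and coerces each field to its proper type (text, date, number) before analysis. The file layout, column names and values are hypothetical.

```python
# Sketch of "map the data": read an exported CSV and coerce each field to
# its proper type (text, date, number). Layout and values are hypothetical.
import csv
import io
from datetime import datetime
from decimal import Decimal

export = io.StringIO(
    "invoice_id,invoice_date,amount\n"
    "INV-001,2016-03-01,1617.95\n"
    "INV-002,2016-03-15,250.00\n"
)

rows = []
for rec in csv.DictReader(export):
    rows.append({
        "invoice_id": rec["invoice_id"],                                            # text
        "invoice_date": datetime.strptime(rec["invoice_date"], "%Y-%m-%d").date(),  # date
        "amount": Decimal(rec["amount"]),                                           # number
    })
```

Coercing dates and amounts up front is what lets later queries sort, subtract and total the fields correctly rather than comparing raw strings.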

Once the data is loaded into the analysis tool, control totals should be compared to source systems for completeness and accuracy. Oringel recommends comparing record counts, grand totals, and even selected balances for a sample of records to make sure that nothing was lost in translation into the data analysis tool. Once data is confirmed to be complete and accurately loaded and mapped into the analysis tool, then the real fun can begin. 
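The completeness check Oringel recommends can be as simple as the sketch below, which compares a loaded file's record count and grand total to control totals reported by the source system. The figures are illustrative only.

```python
# Sketch of a control-totals check: confirm nothing was lost in translation
# by comparing record count and grand total to the source system's figures.
# All numbers are illustrative.
loaded = [1617.95, 250.00, 980.50]   # amounts loaded into the analysis tool

source_record_count = 3              # count reported by the source system
source_grand_total = 2848.45         # grand total reported by the source system

assert len(loaded) == source_record_count, "record count mismatch"
assert abs(sum(loaded) - source_grand_total) < 0.01, "grand total mismatch"
```

Only once both checks pass should the queries themselves be trusted; a silent truncation during export would otherwise skew every downstream result.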

C. Step 3 - Writing the Queries

Oringel identified Step 3 as writing the queries. Though it can be valuable to double-check the accuracy of reports that are provided from existing internal and external systems, Oringel recommends using data analysis to answer questions that are not readily reported from internal systems. Often comparing data across multiple data files can yield the most interesting results.  

While writing queries surely sounds technical, it can be quite simple. Sorting data from oldest to newest or biggest to smallest is often only a few clicks of the mouse. Once the data has been sorted by several different columns, business insights can come quickly. Writing queries is simply expressing the business questions you laid out in the brainstorming session, using software in a way that makes the answers easy to understand. 

A simple example would be “Show me any purchasing transaction that didn’t have the proper pre-approval.” This answer can be identified by comparing the dates between purchase orders and invoices, and then looking for any vendor invoice date that is prior to the purchase order date. Other query techniques are similarly simple, yet effective.
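That pre-approval query can be expressed directly in code. The sketch below, with hypothetical data and field names, flags any vendor invoice dated before its matching purchase order.

```python
# Sketch of "show me any purchase that lacked proper pre-approval":
# flag any vendor invoice dated before its matching purchase order.
# Data and field names are hypothetical.
from datetime import date

purchase_orders = {"PO-100": date(2016, 5, 10), "PO-101": date(2016, 5, 12)}
invoices = [
    {"invoice": "A-1", "po": "PO-100", "date": date(2016, 5, 15)},  # after the PO: fine
    {"invoice": "A-2", "po": "PO-101", "date": date(2016, 5, 9)},   # before the PO: flag
]

flagged = [inv for inv in invoices
           if inv["date"] < purchase_orders[inv["po"]]]
```

The entire population of invoices is tested, not a sample, which is exactly the shift from sampling to data analysis discussed earlier.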

D. Step 4 - Analyze and Report Results

Oringel said that Step 4 is to analyze and report the results. I have wondered how a compliance practitioner would be able to not only view but then use such information. He said that Visual Risk IQ’s tagline comes from this notion: “See. Analyze. Act.” has been a part of their firm since 2006. By summarizing results in a way that measures something important, an action step becomes apparent. In the example above, if a vendor’s invoice date pre-dated its purchase order, the action step is to understand whether the invoice was received later than the date on the document itself. Perhaps the vendor backdated the invoice in hopes of earlier payment; alternatively, the purchase order may have been created after the fact to cover up the lack of required pre-approval. 

Oringel recommends summarizing the results of data analysis in visual form, for example by using color, size and location in a graph, so that the compliance practitioner can quickly see the data, understand what has happened and conclude whether the picture supports a decision that the transaction was or was not compliant. 

E. Step 5 - Refine and Sustain

That brings us to Step 5, which Oringel identified as refine and sustain. Part of this step is about fixing the root cause of any problem identified through data analysis. I certainly believe one of the key functions for any compliance practitioner, and one of the first things you should do, is to make sure any violations of your policies and procedures do not progress to illegal conduct. 

Yet there are other remedial steps that Oringel believes are critical at this stage. He said that when a condition or transaction is identified as a potential issue, documenting the next action step and ensuring its proper completion is important. If an employee incorrectly submitted a personal or duplicate expense (e.g. they claimed $20 for a lunch yet were listed as having attended a lunch paid for by someone else on the same day) and were reimbursed for that personal expense on a travel expense report, then the organization should ask for reimbursement of that expense and ensure thorough follow-up. 

Consistent action when these circumstances arise is important. Seeking and obtaining reimbursement for improper expenses should not be based on whether the employee is an officer or a manager or an individual contributor, or even the amount of the error. 

I turn briefly to the COSO Framework, which was updated in 2013 and became much more prescriptive with respect to the elements of an effective internal control program. There are five components under the COSO Framework, and the fifth and final component is monitoring activities. Monitoring activities are those that management should perform to ascertain that the control environment, risk assessment, control activities, and information and communication components are present and functioning. 

The only way that I know to make sure that the principles of effective internal controls have been followed is to do some monitoring. Oringel turned to one of his favorite subjects for an analogy: how his children are performing in school. He believes that he and his wife have set a robust “tone-at-the-top” around the importance of attendance, homework and strong academic performance, and that they provide some direction for the children about what is important in terms of their results at school. There are some control activities he can utilize, such as reviewing their schedules, homework and how much time they spend studying versus playing video games, but the best technique to make sure they are achieving the academic outcomes he and his wife want for them is to do some monitoring and evaluation of their performance. 

A way to do that is to monitor their academic performance through an application used in his hometown called PowerSchool. It allows the parents and the students, together or separately, to log on and answer the questions, “Was the homework assignment turned in?”, “What was the grade on the homework assignment?” and “Was the most recent grade better or worse than last time?” Oringel said, “We use PowerSchool as a data-driven monitoring tool to make sure that our kids are performing in school the way that we want them to.” 

III. Case Studies

A. Data Analysis to Prevent Employee Fraud

We next review how data analytics can be used to help detect or prevent bribery and corruption where the primary sales force used by a company is its own employees. Several significant corruption actions in China, involving both the FCPA and Chinese domestic law, involved China-based employees defrauding their company by using false expense reports to create a pot of money to use as a slush fund to pay bribes. Here you can think back to the Eli Lilly FCPA enforcement action from 2012 up to the 2014 GlaxoSmithKline Plc (GSK) problems as examples of where employees used their expense accounts not for personal use but for greater corporate malfeasance. 

I asked Oringel how data analysis might help a CCO or compliance practitioner detect such conduct, and also move towards helping prevent such conduct in the future. Oringel related case studies from his organization where they used data analysis to review employee expense reports and how that experience can be used to formulate the same type of data analysis for a CCO or compliance practitioner. 

As previously discussed, Visual Risk IQ recommends beginning with brainstorming. This step includes understanding an organization’s Procurement and Travel & Expense policies, and asking questions about how those policies can be circumvented. One common circumvention technique is to split a larger purchase across multiple smaller transactions, so the firm has designed its data analytics queries to detect such split transactions.

In the example we discussed, Visual Risk IQ’s client uses procurement cards (P-cards) for certain low dollar-value expenses. The company’s procurement card limit for most employees is $3,000 for a single transaction and $10,000 in aggregate spend for a single month. The company wanted to identify any use of P-cards for larger dollar transactions that may have required capitalization as fixed assets, in addition to identifying inappropriate or personal purchases. Through the use of data analytics, Oringel shared how his team identified the purchase of a $9,500 computer system that an employee had split into multiple invoices across multiple days, using one invoice per day from the same computer vendor. The transactions looked like those listed below:

 

Date        Purchase    Vendor          Amount
Monday      Computer    XYZ Computers   $2,800
Tuesday     Monitor     XYZ Computers   $2,400
Wednesday   Printer     XYZ Computers   $1,800
Thursday    Software    XYZ Computers   $1,500
Friday                  XYZ Computers   $1,000
                        Total           $9,500

In total, the five transactions easily circumvented the organization’s $3,000 single transaction limit and their capital expense limit as well. The single computer system purchase was with the same merchant but split across multiple days and invoices. Clearly this series of transactions was a problem.
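A split-transaction query of the sort that surfaced this purchase can be sketched as follows. The limit mirrors the example above, but the code is illustrative rather than Visual Risk IQ's actual query, and the second vendor is hypothetical.

```python
# Sketch of a split-transaction query: group P-card purchases by vendor and
# flag vendors where every individual purchase is under the single-transaction
# limit but the purchases together exceed it.
from collections import defaultdict

SINGLE_TXN_LIMIT = 3000  # per the policy in the example above

purchases = [
    ("XYZ Computers", "Monday",    2800),
    ("XYZ Computers", "Tuesday",   2400),
    ("XYZ Computers", "Wednesday", 1800),
    ("XYZ Computers", "Thursday",  1500),
    ("XYZ Computers", "Friday",    1000),
    ("Office Depot",  "Monday",     120),  # hypothetical comparison vendor
]

by_vendor = defaultdict(list)
for vendor, day, amount in purchases:
    by_vendor[vendor].append(amount)

suspected_splits = {
    vendor: sum(amts)
    for vendor, amts in by_vendor.items()
    if len(amts) > 1 and max(amts) <= SINGLE_TXN_LIMIT and sum(amts) > SINGLE_TXN_LIMIT
}
```

A production query would also window the purchases by date range, but even this crude grouping catches the five-invoice computer purchase immediately.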

Oringel contrasted the above example with a similar issue they identified related to split transactions. The organization had an employee who was responsible for maintaining and scheduling a fleet of over 100 vehicles. One of her responsibilities was paying various bills related to the vehicles, including bills from the State Department of Motor Vehicles and the taxes billed individually per vehicle. Visual Risk IQ wrote queries, similar to those that identified the inappropriate computer system purchases, and identified this employee as one who routinely exceeded the P-card’s single transaction limit with the same vendor when multiple transactions in a month were evaluated together.

Their split-limit query identified that this employee often completed multiple transactions with the same vendor, the State Department of Motor Vehicles, on the same day. However, the “aha!” moment was quite different from an employee splitting transactions to purchase items above her limit in violation of company policy. Here Visual Risk IQ’s data analysis demonstrated that those transactions were not fraudulent, improper or inappropriate; rather, the employee’s spending limit needed to be raised, because the card was being used as intended and this employee had more spending responsibilities than most others in the organization. There were benefits to paying the tax bills via P-card, but the organization had set her spending limit before vehicles were managed centrally, so with the larger fleet and central management of vehicles, the organization needed to raise her spending limit specifically for that vendor. For other transactions, she would have the same limits as other employees, but because her responsibilities involved registering so many vehicles, Visual Risk IQ recommended that the root cause be remediated by changing some of the controls in place. 

Another area that Oringel and Visual Risk IQ have focused on is travel and entertainment (T&E). Oringel advocates using analytics to identify out-of-policy expense reports and out-of-compliance expenses. This is achieved using logic similar to that described above for accounts payable; when applied to employee expense accounts, Oringel said the behavior it detects is often called “double dipping”. This means an expense is recorded once on a T&E report and then a second time on another expense report, a P-card charge or another type of expense. These are examples that can be uncovered with data analytics, and from there you can move to determine whether the mistake was intentional or unintentional. 

In the case of double dipping, Oringel said a key is to look for the same airfare, hotel or meals being reported on multiple employees’ T&E expense reports. He gave the following example: “An employee takes another employee out for a business meal and pays for the meal on one expense report, while, at the same time, the coworker records the same meal, same day, same city, and claims that employee as one of their attendees. We find these sorts of situations with our analytics, and these are clear examples of suspicious transactions that ought to be discussed with both employees.”

Other examples of double dipping include duplicate transactions between meals and per diem allowances, or mileage and company vehicles or rental cars. Oringel noted those are all things that can be identified with data analytics that are very difficult for an individual approver to see on a single expense report. He cautioned it is not that the approver is not doing a good or prudent job, “but typically, when you’re tasked with approving an employee’s expense report, what we have is just their single report in front of us. It’s difficult to recall who would have submitted a report one or two months ago, and it’s very possible that somebody submitted an airplane ticket when the ticket was purchased, and then six weeks later when they took the trip, that air expense could be reported a second time.” 
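A "double dipping" check of the kind described can be sketched as a pairwise comparison of expense records across employees; all the records, field names and the amount tolerance below are hypothetical.

```python
# Sketch of a double-dipping query: find the same meal (same day, city and
# expense type, similar amount) on two different employees' expense reports.
# Records and tolerance are hypothetical.
from itertools import combinations

expenses = [
    {"employee": "Kim", "date": "2016-06-01", "city": "Houston", "type": "meal", "amount": 84.50},
    {"employee": "Ray", "date": "2016-06-01", "city": "Houston", "type": "meal", "amount": 84.50},
    {"employee": "Lee", "date": "2016-06-02", "city": "Dallas",  "type": "meal", "amount": 40.00},
]

suspicious = [
    (a["employee"], b["employee"])
    for a, b in combinations(expenses, 2)
    if a["employee"] != b["employee"]
    and (a["date"], a["city"], a["type"]) == (b["date"], b["city"], b["type"])
    and abs(a["amount"] - b["amount"]) < 1.00   # near-identical amounts
]
```

This is precisely the cross-report view a single approver never sees: no individual report is wrong on its face, but the pair is suspicious.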

Oringel said the same issue could arise with P-card purchases if an approver considering a single $2,500 purchase approves that purchase on Monday and then again on Friday. Had those two transactions occurred on the same day, in excess of the employee’s spending limit, the approver might not have approved both of them; but because they were submitted on different dates, it may well appear to the approver that they were two separate transactions. With data analytics, you are able to aggregate those multiple trip or P-card reports into a single screen or report, to help a reviewer or an approver determine whether the transactions comply with company policies, both individually and in the aggregate.

B.  Third Parties and Duplicate Invoices

Next we consider how data analytics can be used to help detect or prevent bribery and corruption where the primary sales force used by a company is third parties. The vast majority of FCPA violations and related enforcement actions have come from the use of third parties. While sham contracting (i.e. using a third party as a conduit for the payment of a bribe) has lessened in recent years, there are related data analyses that can be performed to ascertain whether a third party is likely performing legitimate services for your company and is not a sham.  

I asked Oringel how data analysis might help a CCO or compliance practitioner detect such conduct and also move toward preventing such conduct in the future. Oringel described different case studies from his organization’s clients where they used data analysis on accounts payable invoices and how that experience can be used to formulate similar data analysis for a CCO or compliance practitioner. There are a number of more complex analytics that can be run in combination to identify suspicious third parties, and some of the simplest are to look for duplicate or erroneous payments. 

Oringel said that a key to moving from detection to prevention is the frequency of review. It is common for organizations to periodically review a year or more of accounts payable invoices at a time for errors or overpayment. Changing this from a one-time annual or biannual event to something that is done daily or weekly dramatically changes the value of such internal controls. This more frequent, preventative analysis is integral to the foundation of how Visual Risk IQ works with many of its clients. While the company does perform periodic look-back audits, it also works with technology to accomplish the same queries on a daily or weekly basis. This allows organizations to find duplicate payments or overpayments after the invoice has been approved but prior to its disbursement. So instead of detecting a payment error three or six months after it is made, you prevent the money from leaving the company altogether. 

Oringel provided several client examples where duplicate invoices had been submitted but were not immediately caught. In one instance, Invoice No. 0000878-IN was paid for $1,617.95. Thirty days later the same vendor re-submitted the same invoice due to non-payment, but it was recorded without the hyphen and was not detected by the system of controls. The problem was that it was the same invoice with slightly different writing on the face of it, and both were scanned into the company’s imaging system and queued for payment. The Visual Risk IQ team used data analysis to locate such overpayments and to identify that the second payment should not be made because it matched one that had been previously approved.
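One simple way to catch this class of duplicate is to normalize invoice numbers before matching, as in the sketch below. The invoice number and amount come from the example above; the vendor name and the third record are hypothetical.

```python
# Sketch of a duplicate-payment query: strip punctuation from invoice
# numbers so "0000878-IN" and "0000878IN" from the same vendor for the
# same amount are recognized as one invoice.
import re

payments = [
    {"vendor": "Acme Supply", "invoice": "0000878-IN", "amount": 1617.95},
    {"vendor": "Acme Supply", "invoice": "0000878IN",  "amount": 1617.95},  # re-submission
    {"vendor": "Acme Supply", "invoice": "0000912-IN", "amount": 430.00},
]

def normalize(invoice_no):
    """Keep only letters and digits, uppercased."""
    return re.sub(r"[^A-Za-z0-9]", "", invoice_no).upper()

seen = {}
duplicates = []
for p in payments:
    key = (p["vendor"], normalize(p["invoice"]), p["amount"])
    if key in seen:
        duplicates.append((seen[key], p))   # (original payment, suspected duplicate)
    else:
        seen[key] = p
```

Run daily or weekly against approved-but-unpaid invoices, a check like this prevents the second disbursement rather than merely detecting it months later.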

In another example, Oringel detailed a query with which a compliance practitioner could compare vendor names and other identifying information, for example address and country, against data from a watch list such as the Politically Exposed Persons (PEP) or Specially Designated Nationals (SDN) lists, as well as against names and other identifying information in your vendor file. He gave an example where a duplicate payment of more than $75,000 was made. One payment in that amount was made to a law firm named ‘Kilpatrick Stockton’ and the second was made to a different vendor, the law firm ‘Kilpatrick Townsend’. Oringel and his team recognized that these were related entities, even though they had been established as different vendors in the vendor master. Because the amount and the date were similar enough, as detected by data analysis, the invoices warranted a human inspection. 

Oringel said such an inquiry could also be used to test in other ways. He posed the example of a “vendor [who] has the same surname as a vendor on the specially designated national terrorist list, or a politically exposed person. They share the same name as an elected official down in Brazil. How do we make sure that our vendor or broker is a different John Doe than the John Doe that is a politically exposed person in that country? It is only upon closer inspection that you can determine that the middle names are different and the ages are different, one has an address in Brasilia and the other is in Sao Paulo.” He noted that until you inspect the other demographic information about your vendors, consultants or third parties and compare it to watch list individuals, you just do not know. That is what data analytics is designed to do: help you go from tens of thousands of “maybes” to a very small number of potential issues that need to be researched individually.
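A rough sketch of such name screening is below, using a standard-library string-similarity ratio as a stand-in for the much richer matching real screening tools use. The vendor list, watch list and threshold are illustrative; near matches are routed for human review rather than treated as conclusions.

```python
# Sketch of a watch-list screen: compare vendor names to a denied-parties
# or PEP list with a similarity ratio so near matches surface for human
# review. Names and the threshold are illustrative only.
from difflib import SequenceMatcher

watch_list = ["Kilpatrick Stockton", "John Doe"]
vendors = ["Kilpatrick Townsend", "Acme Supply", "John Doe Ltd"]

def similarity(a, b):
    """Case-insensitive similarity ratio between 0.0 and 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

THRESHOLD = 0.7  # above this, a human should compare the full records
for_review = [
    (vendor, listed)
    for vendor in vendors
    for listed in watch_list
    if similarity(vendor, listed) >= THRESHOLD
]
```

The output is a short queue of "maybes" for an analyst to clear by checking addresses, ages and middle names, exactly the triage Oringel describes.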

One of the important functions of any best practices compliance program is not only to follow the money but to try to spot where pots of money could be created to pay bribes. Through comparison of invoices for similar items among similar vendors, he has seen data analytics uncover overcharges and fraudulent billings. Oringel said that continual transaction monitoring and data analysis can prove their value through more frequent review, including via the Hawthorne effect, the observation that individuals tend to perform better when they know they are being monitored. 

Oringel emphasized that the techniques used in transaction monitoring for suspicious invoices can be easily translated into data analysis for anti-corruption. Software allows a very large aggregation of suspicious payments “not only by day or by month, but also by vendor or even by employee who may have keyed the invoices” into your system. As these suspicious invoices begin to cluster by market, business unit or person a pattern forms which can be the basis of additional inquiry. Oringel stated, “That’s the value of analytics. Analytics allows us to sort and resort, combine and aggregate, so that patterns can be investigated more fully.” 

This final concept, of finding patterns discernible through the aggregation of huge numbers of transactions, is the next step for compliance functions. Yet data analysis does far more than simply allow you to follow the money. It can be a part of your ongoing third party monitoring as well, by allowing you to surface third parties who may have come into your company without proper compliance vetting. Such capabilities are clearly where you need to be heading.  

C. Bribes on the Other Side of the Ledger and Best Practices Going Forward

Having previously discussed Visual Risk IQ case studies involving review of employee expenses and duplicate payments to vendors, we now reflect on an area not usually considered: rebates and other adjustments to customer revenues that can be a source of a pot of money to pay bribes. 

Adjustments to revenue (returns, rebates and discounts) are quite different from the duplicate invoice situation we previously explored, where someone has provided services and then overbills or bills multiple times for those services. A similar mechanism was used by Hewlett-Packard’s (HP’s) German subsidiary to pay bribes: the business unit made fraudulent sales to corrupt distributors, booked the revenue, then a couple of quarters later repurchased the equipment at a higher price and pooled those price differences to pay bribes. A similar scheme was used to fund bribes paid to senior-level Petrobras employees, where corrupt companies would provide a discount to Petrobras, yet the money was not rebated or credited to Petrobras but diverted to Swiss bank accounts.

Oringel introduced the techniques one would use to analyze what accountants call a contra-revenue account (CRA), which is generally recognized as the account in which you might record a discount or a rebate. He further explained that these are the ways through which gross revenue gets reduced to net revenue. This is yet another way a pot of money can be developed from which bribes can be paid. 

Oringel and his team have tackled this issue when performing data analytics around rebates. Visual Risk IQ is located in North Carolina, a state that still has a large domestic furniture manufacturing industry, one in which rebates, particularly in the form of advertising allowances, are fairly common. He explained a fact pattern similar to the following: a "furniture manufacturer sells an independent dealer a mattress with a wholesale cost of $1,000; and if that mattress brand is advertised and promoted during the 4th quarter, and that mattress sells during the 4th quarter, then the dealer can claim an additional $100 discount to be used for that advertising, yielding them a net wholesale price of only $900 for the mattress."
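The allowance arithmetic in that fact pattern is simple to express. The function below is a sketch only, paraphrasing the eligibility rule from the example above (advertised and sold in Q4):

```python
# Net wholesale price after a conditional advertising allowance, per the
# mattress example: $1,000 wholesale, $100 allowance if the unit was both
# advertised and sold during Q4.
def net_wholesale(wholesale, allowance, advertised_q4, sold_q4):
    if advertised_q4 and sold_q4:
        return wholesale - allowance
    return wholesale

print(net_wholesale(1000, 100, True, True))   # eligible: dealer nets $900
print(net_wholesale(1000, 100, True, False))  # not sold in Q4: full $1,000
```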

Visual Risk IQ was asked by a client to use data analysis to help determine whether there were improper or suspicious claims for advertising allowances by the channel partners of a furniture manufacturer. After comparing the relative discounts between dealers, based on both percentage and absolute dollar amount, the team began to compare orders by month to advertising allowances claimed. These analyses were used to select dealers for additional scrutiny under the advertising allowance rebate program, an approach that differed from prior reviews, which had relied primarily on statistical sampling. Oringel built an analysis that compared order size by month with prior claims for advertising allowances for each of the dealers buying furniture from this manufacturer. By comparing order size to advertising allowance claims, the team identified dealers that were claiming disproportionate allowances relative to their orders and expected on-hand inventory. Indeed, certain dealers were claiming to have significantly negative on-hand inventory balances during the holiday selling season, based on their past orders and the timing of these large allowance claims.
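The core of that comparison can be sketched as follows. This is not Visual Risk IQ's actual code, and the dealer's monthly figures are invented, but the logic mirrors the screen described above: cumulative orders minus cumulative claimed sales yields an implied on-hand inventory, and a dealer whose balance goes negative is claiming to have sold stock it never bought:

```python
# Hypothetical implied-inventory screen: a dealer's claimed sales can never
# exceed the units it has ordered to date, so a negative running balance is
# a red flag. Monthly figures below are invented for illustration.
def implied_inventory(monthly_orders, monthly_claimed_sales):
    """Return month-by-month implied on-hand inventory."""
    balance, history = 0, []
    for bought, sold in zip(monthly_orders, monthly_claimed_sales):
        balance += bought - sold
        history.append(balance)
    return history

# A dealer steadily orders ~30-40 units/month (400 for the year) but claims
# roughly a year's worth of sales in Q4, when the allowance is largest.
orders  = [30, 40, 30, 35, 40, 30, 35, 30, 30,  35,  35,  30]
claimed = [10, 10, 10, 10, 10, 10, 10, 10, 10, 150, 150, 100]

inventory = implied_inventory(orders, claimed)
suspicious = any(level < 0 for level in inventory)
print(inventory, suspicious)
```

A production version would also carry a forecasted minimum on-hand inventory per dealer, as Oringel describes below, rather than flagging only balances below zero.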

Oringel further explained, “by identifying what was estimated to be an expected and minimum on-hand inventory, based on dealer size and prior order history, a forecasted allowance was computed. The additional scrutiny devoted to dealers whose claims yielded unusually low levels of inventory resulted in disallowed rebates and allowances after additional customer sales documentation was not provided as requested.” Visual Risk IQ and the client team “found, as you might expect, since this was the first time that advertising rebates were ever audited with such a data analytics approach, that there were many dealers and channel partners that appeared to be following the rules, but there were also several that really did appear to be problematic.”

I asked Oringel if he could provide any examples where he found issues involving the client's channel partners. He stated, "An example of one of those problematic channel partners was a dealer that had sold almost a year's worth of furniture in a single quarter. To help put some numbers on this, they had purchased, in the preceding 12 months, about 400 units, with order size varying between 25 and 50 units each month. Yet nearly all of these 400 units were claimed as Q4 sales, which was the quarter with the largest advertising allowance."

The Visual Risk IQ team asked some thoughtful follow-up questions when they compared the pattern of purchases with the sales claimed under the advertising allowances. "Do these orders make sense? Why did they keep ordering in February, March, April, 40 pieces, 30 pieces, 40 pieces all year, if they were not selling any of them in Q1 and Q2?" Finally, he added, "And how did they get from 400 to nearly zero in Q4?" Using these and other questions, together with the data analytics, the company was able to successfully challenge some of the advertising allowance monies claimed by certain dealers.

These CRAs are similar to customer rebates and can be ripe for abuse. Improper accounting of customer rebates can be used to create a pot of money to pay a bribe, yet a CCO may not consider these accounts for review. Oringel's example shows the power of data analytics across the wide variety of transactions that could be used to pay bribes.

IV. Best Practices Going Forward

If there has been one consistent message the Department of Justice (DOJ) has communicated since at least the Lanny Breuer days, it is that a compliance program should evolve, both as your company's business evolves and as standards and technology evolve. Data analysis is moving to the forefront of best practices. Moreover, data analysis may be the only way a CCO or compliance practitioner can gain visibility into a large volume of data to identify trends and issues.

The case studies Oringel has laid out demonstrate the techniques that can be brought to bear for the compliance function. Finally, it is through data analytics that a CCO can move the compliance program from detective to preventive to prescriptive, so that it can spot, and then stop, issues and trends before they become FCPA violations.
DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Thomas Fox - Compliance Evangelist | Attorney Advertising
