An Overview of Instruments and Tools to Detect Fraudulent Financial Statements

The major corporate frauds reported over the past twenty years have undermined the confidence of stakeholders, shareholders, investors, savers, and regulators, and have sharpened the focus on prevention and detection activities and tools. Fraud is costly, so preventing and detecting fraudulent events early is essential to avoid heavy losses and damage to corporate financial health. Fraud detection is a dynamic discipline in which traditional auditing activity is strengthened with newer tools such as Benford's Law and the Beneish Model. This paper focuses on the usage and application of these instruments in detecting accounting fraud.


Introduction
The economic world has experienced many significant frauds over the past twenty years. In the early 2000s, the financial world was shocked by some of the worst accounting frauds in history, at companies including Enron and WorldCom. These massive scandals, mostly in the accounting area, reduced stakeholders' confidence to a minimum and caused them large losses. Financial statement figures and information that are free from errors, whether in the form of mistakes, manipulations or misestimations, are essential for the correct functioning of capital markets. Dalnial et al. [1] noted that fraudulent financial reporting usually occurs in the form of falsification of financial statements in order to obtain some benefit, and primarily consists of manipulating elements by overstating assets, sales and profits or by understating liabilities, expenses or losses [2].
Accurate and detailed financial reports allow more efficient resource allocation and better contracting [3]. The evaluation and detection of errors in financial statements are therefore among the most important tasks for economic operators such as investors, auditors and regulators.
Fraud detection has therefore become one of the highest priorities for capital market participants and other stakeholders in the financial reporting process [4][5], and it is a continuously evolving discipline that can nowadays take advantage of new technologies and techniques.
This paper aims to analyze two of the most powerful methods for detecting financial statement anomalies: the Beneish Model and Benford's Law. The Beneish Model is a mathematical method based on financial ratios and their analysis, while Benford's Law is an observation about the actual frequency of the digits in data.

Methodology
This study is primarily a literature-based study with limited use of secondary data, paying particular attention to highlighting the concepts behind these tools and building a case for their use in preventing and detecting financial frauds.
The research is based on the analysis of contributions obtained from the database "Business Source Complete", using the keywords "forensic accounting tools", "fraud detection", "fraudulent financial statement", "corporate fraud" and "fraud prevention". This analysis aims to allow a better understanding of the most widespread tools that can be used in fraud detection activity, and it has shown that the Beneish Model and Benford's Law are the most applied.
The study submits an overview and a comparison of these instruments.
Fraud is, in essence, an abuse of trust, and it is "deception or misrepresentation that an individual or entity makes, knowing the misrepresentation could result in some unauthorized benefit to the individual or to the entity or to some other party".
In his influential work on frauds, Donald Cressey [7] postulated the fraud triangle theory and argued that three key elements of the occurrence or likelihood of fraud are pressure, opportunity and rationalization, where pressure is the element that forces an individual to commit fraud, opportunity is related to the chance and the skills to commit fraud and rationalization means accepting this behavior.
The Fraud Triangle has also been very significant for regulators; in fact, due to the widespread accounting fraud of the early 2000s, the Auditing Standards Board (ASB) issued, in 2002, Statement on Auditing Standards (SAS) No. 99, which significantly redefined both fraud detection procedures and a set of risk factors, better known as "red flags", which can support auditors in identifying patterns of fraud. To enhance the usefulness and effectiveness of SAS No. 99, the ASB categorized each red flag according to its relation to one of the elements of the Fraud Triangle: pressure, opportunity, and rationalization.
Following up Cressey's work, Wolfe and Hermanson [8] presented the fraud diamond theory which is generally viewed as an expanded version of the Fraud Triangle. The Fraud Diamond adds to the three factors identified by the Fraud Triangle, an element called capability.
Wolfe and Hermanson [8] claimed that although perceived pressure could coexist with an opportunity and rationalization, it is unlikely for fraud to take place unless the fourth element (i.e., capability) is also present. In other words, the potential fraudster must have the skills and ability to commit fraud.
Bell and Carcello [9] found support for the existence of fraud triangle conditions in a sample containing financial fraud companies. They estimated a logistic regression model predicting the incidence of fraud and found several risk factors associated with fraud: a weak control environment, rapid growth, management overly preoccupied with meeting analysts' forecasts and shareholders' expectations, ownership status, management that lied to auditors or was evasive, and an interaction between the control environment and management's attitude toward financial reporting.
However, Bell and Carcello [9] did not find evidence of a significant link between financial fraud and some of the more traditional risk factors associated with fraud, such as significant and unusual related-party transactions, declining industry conditions or high management turnover.
Detecting manipulation in financial statements using traditional audit tools and procedures is a hard task for auditors, and for all other stakeholders as well.
Fanning and Cogger [10] also stated that detecting fraud is difficult for three main reasons: the lack of knowledge concerning the characteristics of fraudulent management, auditors' lack of the experience necessary to detect manipulated financial statements, and managers' ability to devise new techniques to mislead auditors and investors in their reports. Skousen et al. [11] stated that fraud is currently very common and takes various forms, and that financial fraud causes huge losses not only to investors but to a country's economy as a whole. It is therefore important to prevent and detect fraud before it causes a business to collapse, devastates investors and damages the economy. Hence, knowledge and use of appropriate forensic tools, techniques and models are essential to the detection of sophisticated frauds in organizations.
Besides having helped to identify some fraud risk factors, SAS No. 99, issued by the ASB of the AICPA in November 2002 [12], defines fraudulent financial reporting as "intentional misstatements or omissions of amounts or disclosures in the financial statements designed to deceive financial statement users". According to this definition, financial statement fraud may be perceived as a violation of accounting and auditing standards, laws and regulations enforced by the relevant reporting bodies, committed intentionally with the purpose of deceiving the users of the financial statements. Many of today's largest frauds are committed by intelligent, experienced, creative people with a solid grasp of firms' controls and vulnerabilities.

The Beneish Model
Identifying earnings management is important for financial statement users to assess current economic performance, to predict future profitability, and to determine firm value [13].
Literature on earnings management has examined the amount of discretionary and non-discretionary accruals in financial statements, considering these values the main sources of manipulation. Healy's contribution [14] assumes that profits derive from a cash component and from accruals, and that an increase in the latter denotes income that has not actually been collected in cash and is hence more easily manipulated. Accruals include expenditures and revenues that have taken place in a certain period without generating a cash flow during the same period. There are many different models that estimate accruals using accounting ratios or statistical indexes; among these works, one of the most popular is the DeAngelo model [18].
The M-score Model was created by Professor Messod Daniel Beneish, after whom the tool is named.
The Beneish Model is a mathematical model that exploits some financial metrics and ratios to identify the occurrence of financial fraud or the tendency of a firm to engage in earnings manipulation. Data in the organization's financial statements are fed into a model to create the M-score, an indicator that shows the degree to which earnings have been manipulated.
Beneish used data from 1982 to 1992 in the COMPUSTAT database to develop the model; the final sample consists of 74 firms that manipulated earnings and 2,332 non-manipulators. The tests correctly identified 76 percent of manipulators, while generating 17.5 percent false alarms.
The M-Score is composed of eight ratios that capture either financial statement distortions that can result from earnings manipulation or a predisposition to engage in earnings manipulation [19]. A simplified version of the M-score with only five variables also exists. The analysis requires at least two periods of financial reporting to detect unusual facts and events; however, to identify the trend of a company's financial reporting, a five-year comparison may be preferable.
The model includes eight variables:

Days Sales in Receivables Index (DSRI):
DSRI measures the ratio of days that sales are in accounts receivable in a year compared to that of the prior year. This variable gauges whether receivables and revenues are in or out of balance in two consecutive years. An index higher than 1 indicates an increased percentage of non-cash sales compared to the prior year; a disproportionate increase in accounts receivable may be indicative of inflated revenues.

Gross Margin Index (GMI):
GMI is the ratio of the gross margin in the prior year to the gross margin in the current year. It captures the variation in gross margin: a value higher than 1 shows that margins have deteriorated in the period analyzed, with the consequence that the firm is more likely to manipulate its revenues and expenditures.

Asset Quality Index (AQI):
AQI compares asset quality in one year to that of the prior year, where asset quality is the proportion of total assets other than current assets and property, plant and equipment (PPE). An increase in the index may mean that more expenses are being capitalized to maintain or improve apparent profitability: a value higher than 1 indicates that the firm has potentially increased its intangible assets or its cost deferral, generating a potential earnings manipulation.

Sales Growth Index (SGI):
SGI measures the growth in revenue: a value greater than 1 indicates positive growth in the year under review.

Depreciation Index (DEPI):
DEPI is the ratio of the depreciation rate (depreciation expense relative to depreciation plus gross property, plant and equipment, PPE) in the prior year to that of the current year. An index higher than 1 could reflect an upward adjustment of the useful life of PPE, i.e. a slowing of the depreciation rate.

Sales General and Administrative Expenses Index (SGAI):
SGAI is calculated as the ratio of SGA expenses to sales in a year relative to the corresponding measure in the prior year. The variable is included in the Beneish Model following Lev and Thiagarajan's [20] suggestion that analysts interpret a disproportionate increase in SGA expenses relative to sales as a negative signal about a firm's future prospects, which may in turn increase the likelihood of earnings manipulation.

Leverage Index (LVGI):
LVGI is the ratio of total debt to total assets in a year relative to the prior year, giving a representation of the change in a company's long-term risk. A LVGI higher than 1 indicates an increase in leverage. The variable is included to capture debt covenant incentives for earnings manipulation.

Total Accruals to Total Assets (TATA):
TATA measures the quality of the firm's cash flows. Total accruals are computed as the change in working capital accounts other than cash, less depreciation. An increasing level of accruals as a share of total assets may indicate a higher chance of manipulation.
To compute the M-score, the indices are combined in the following formula:

M-score = -4.84 + 0.920 DSRI + 0.528 GMI + 0.404 AQI + 0.892 SGI + 0.115 DEPI - 0.172 SGAI + 4.679 TATA - 0.327 LVGI

An M-score of less than -2.22 suggests that the company is unlikely to be a manipulator; an M-score greater than -2.22 signals that the company is likely to be a manipulator.
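As an illustration, the M-score computation can be sketched in Python. This is a minimal sketch assuming the eight indices have already been computed from two years of statements; the function name and the benchmark values are hypothetical, not taken from Beneish's paper.

```python
def beneish_m_score(dsri, gmi, aqi, sgi, depi, sgai, tata, lvgi):
    """Eight-variable Beneish M-score; inputs are the indices defined above."""
    return (-4.84
            + 0.920 * dsri + 0.528 * gmi + 0.404 * aqi + 0.892 * sgi
            + 0.115 * depi - 0.172 * sgai + 4.679 * tata - 0.327 * lvgi)

# Hypothetical benchmark firm: every index at 1.0 and zero accruals.
score = beneish_m_score(1, 1, 1, 1, 1, 1, 0.0, 1)
print(round(score, 3))   # -2.48, below the -2.22 threshold
print(score > -2.22)     # False: not flagged as a likely manipulator
```

With all indices at their neutral value of 1 and no accruals, the score stays below the -2.22 threshold; raising TATA alone to around 0.1 is enough to push the score above it, illustrating the large weight (4.679) that the model places on accruals.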
Beneish highlights that the model can make two types of error: it can classify a firm as a non-manipulator when it actually manipulates (a Type I error), and it can classify a firm as a manipulator when it does not manipulate (a Type II error).
A Type I error carries a much higher cost than a Type II error, because the impact of failing to discover a fraud can be devastating, whereas the cost of a false alarm is lower, given that investors can allocate their capital to a large number of other stocks. For this reason Beneish recommends a cutoff point of -1.89, which weighs the cost of a Type I error (missing the fraud) against that of a Type II error (a false alarm).
According to Beneish, companies identified as manipulators typically lose about 40 percent of their market value on a risk-adjusted basis in a quarter. Assuming a typical equity gain of 1 to 2 percent per quarter, it would take the gains of 20 to 40 non-manipulators in the same portfolio to offset this single loss; the relative error cost of a Type I compared with a Type II is therefore 20 to 40 times. Beneish derived his cutoff point of -1.89 from a relative error cost of 40 times.
Many researchers have applied the Beneish Model to some of the biggest corporate scandals in order to identify financial statement manipulations. John Maccarthy [21] noted that if the Beneish Model had been applied to Enron Corporation, the scandal could have been discovered proactively as early as 1997, well before the company filed for bankruptcy in 2001. In another analysis, Drabkova [22], who tested several statistical and mathematical models available for detecting fraudulent financial reporting, found the Beneish Model to be the most effective in assessing the financial health of an organization. However, other studies have shown that the Beneish Model is not an ultimate detector of fraud, and that the ratios used in the model mainly help flag problematic areas for a deeper, more analytical review. Cynthia's work [23] showed that the Beneish Model did not consistently discover problems in fraudulent financial reporting. Ugochukwu, Ikechukwu and Azubuike [24] compared the effectiveness of the Beneish Model on relevant items in the financial reports of 11 selected manufacturing companies in Nigeria for the period 2008-2013; their results showed that the five-variable version appeared to be more effective in predicting existing risks of material misstatement.

Benford's Law
Benford's Law [25] is an advanced digital analysis technique based on the examination of the actual frequency of the digits in the data to detect potential anomalies and manipulations.
This law was first observed by Simon Newcomb [26] in 1881 and was rediscovered by Frank Benford in 1938, who sought to explain its behavior through several tests.
It is a mathematical tool that proposes a probability distribution for the first, second and subsequent digits of numbers in data sets, giving the expected proportions of the leading digits of tabulated numerical data. The first digit of a number is the leftmost nonzero digit; leading zeroes and the sign are ignored. For example, the leading digit of 3.14, 0.314 and -3.14 is 3 in each case.
Benford's distribution is not uniform: smaller digits are more likely to occur than larger ones. The law predicts that numbers with low first digits, such as 1, occur more frequently in data sets than numbers with high first digits, such as 8 or 9.
The formula behind Benford's Law, giving the expected frequency of leading digit c (c = 1, ..., 9), is the following:

B(c) = log10(1 + 1/c)

Valid, non-manipulated data without exceptional transactions will follow the projected frequencies.
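The expected proportions follow directly from the formula; the minimal Python sketch below (function name illustrative) tabulates them:

```python
import math

def benford_expected(digit: int) -> float:
    """Expected proportion of leading digit d (1-9) under Benford's Law."""
    return math.log10(1 + 1 / digit)

for d in range(1, 10):
    print(d, round(benford_expected(d), 3))
# Digit 1 is expected about 30.1% of the time, digit 9 only about 4.6%,
# and the nine proportions sum to 1 (the logs telescope to log10(10)).
```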
Benford's Law was introduced to the auditing and accounting literature, and researchers have since used these digit patterns to detect data anomalies. This mathematical tool can be used to detect fraud in accounting statements because manipulated numbers tend to deviate significantly from the anticipated frequencies. The Benford's Law principle has been found to apply to many sets of financial data, including corporate disbursements, and the tool is commonly used to identify fraud in insurance claims, corporate income tax, employee expense reports, vendor invoices, accounts receivable, accounts payable and fixed asset records.
It is interesting to highlight that there are several tests which can be run using Benford's Law, not only on the first digit, but also on the second, the first two combined or the last two digits.
Audit software can employ digital analyses based on Benford's Law to identify fraud and other irregularities in accounts payable, income tax forms, claims payments and other disbursements. It is very difficult for people to make up credible numbers, as invented numbers are unlikely to follow the law; thus, the principle can be used by audit staff to spot irregularities, including possible errors, fraud, or other anomalies. Auditors have usually applied various types of digital analysis when performing analytical procedures. Drake and Nigrini [27] note that Benford's Law applied to auditing is simply a more complex form of digital analysis: it looks at an entire account to determine whether the numbers fall into the expected distribution, instead of scanning a sample. Boyle [28] shows that data sets follow Benford's distribution when their elements result from variables taken from different sources that have been divided, multiplied or raised to integer powers. This explains why certain sets of accounting numbers often appear to follow closely the distribution predicted by Benford's Law: accounting numbers are often the result of a mathematical combination (e.g. accounts receivable or cost of goods sold).
To determine whether a population of data fits the Benford distribution, some preliminary tests can reveal whether Benford's Law applies to a particular data set. Wallace [29] states that the relationship between the mean and the median of a set of numbers can signal whether the data may conform to the Benford distribution: if the mean is greater than the median and the skewness is positive, the data set is likely to conform to Benford's Law. It follows that the larger the ratio of the mean to the median, the more closely the set will follow the expected distribution.
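Wallace's screen can be sketched as follows. This covers only the mean-versus-median part of the heuristic (a full check would also compute skewness), and the function name and invoice figures are hypothetical:

```python
import statistics

def likely_benford(values):
    """Wallace's screen: a mean greater than the median (right-skewed data)
    suggests the set may conform to Benford's distribution.
    This is only a heuristic, not a formal goodness-of-fit test."""
    return statistics.mean(values) > statistics.median(values)

# Hypothetical right-skewed invoice amounts pass the screen:
invoices = [12, 15, 18, 22, 30, 45, 70, 120, 310, 980]
print(likely_benford(invoices))   # True: mean 162.2 exceeds median 37.5
```

A symmetric data set (mean equal to the median) fails the screen, which is consistent with Wallace's observation that Benford-conforming data are positively skewed.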
Nigrini [30] was the first to show that Benford's Law could be used as an indicator of fraud in accounting, auditing and taxation. He analyzed the first two digits of payroll data known to be fraudulent and found that the fraudulent numbers deviated significantly from Benford's Law. Later, he applied Benford's Law to the detection of tax evasion by analyzing tax returns in the U.S. [31].
Durtschi et al [32] summarized when it is appropriate to use Benford analysis, and when it should be used with caution.
When using Benford's Law, one must start by measuring the deviation between the observed and the expected distribution of digits, which can be done in several ways. One method is the chi-square test, a standard statistical test for measuring the degree of similarity between elements in a table. Based on this statistic and the number of degrees of freedom, it is possible to assign a probability that any variation between the observed and expected distributions is due to chance alone: the higher the chi-square value, the less likely it is that the difference can be explained by chance.
Output from the processing can also include a measure of Benford compliance based on the D-statistic, a test that relies on the asymptotic distribution of the sample cumulative distribution function. For the D-statistic, values in excess of 0.10 indicate that the observed distribution differs significantly from the expected one and does not conform to Benford's Law.
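Both measures can be sketched in plain Python. The code below is an illustrative implementation, not taken from the cited works: it computes the chi-square statistic against Benford's expected first-digit frequencies and a Kolmogorov-Smirnov-style D-statistic as the maximum gap between the cumulative observed and expected distributions.

```python
import math
from collections import Counter

def first_digit(x: float) -> int:
    """Leftmost nonzero digit of a nonzero number (sign and leading zeroes ignored)."""
    mantissa = f"{abs(x):.10e}"      # scientific notation puts the leading digit first
    return int(mantissa[0])

def benford_tests(values):
    """Chi-square statistic (8 degrees of freedom) and a KS-style D-statistic
    comparing observed first-digit frequencies with Benford's expected ones."""
    n = len(values)
    counts = Counter(first_digit(v) for v in values)
    chi2 = d_stat = cum_obs = cum_exp = 0.0
    for digit in range(1, 10):
        expected = math.log10(1 + 1 / digit)
        observed = counts.get(digit, 0)
        chi2 += (observed - n * expected) ** 2 / (n * expected)
        cum_obs += observed / n
        cum_exp += expected
        d_stat = max(d_stat, abs(cum_obs - cum_exp))
    return chi2, d_stat

# Powers of 2 are a classic example of data that follow Benford's Law:
chi2, d = benford_tests([2 ** k for k in range(1, 201)])
print(d < 0.10)   # True: D-statistic stays below the 0.10 threshold
```

A production analysis would use a statistics library for the chi-square p-value, but the sketch shows how little machinery the two compliance measures actually require.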

When Benford analysis is likely useful | Examples
Sets of numbers that result from mathematical combinations of numbers (results come from two distributions) | Accounts receivable (number sold x price), cost of goods sold

Conclusions
Corporate fraud is a serious and dangerous threat for any firm and for the global economic system, as it erodes stakeholders' confidence. Nowadays, the prevention of corporate fraud is an absolute necessity, not only to restore the confidence of stakeholders but also to support a country's overall economic stability and progress. Both big and small enterprises are exposed to this threat, and they often do not realize how important an effective fraud detection activity can be.
The threat is greater when firms have a weak control structure and an ineffective corporate governance system. Although fraudulent financial reporting accounts for a relatively low percentage of fraud cases, it causes the largest losses. Because fraud is costly, prevention and rapid detection of fraudulent events are a must for every organization.
Detecting accounting fraud is not an easy task, and it requires tools that differ from traditional audit procedures and techniques; the Beneish Model, with its manipulation score, and Benford's Law are two of the most appreciated tools in the forensic accounting environment.
These tools are part of the statistical techniques that are used to improve the probability to detect an accounting fraud in addition to auditing activity.
These statistical techniques have grown in acceptance among users for identifying anomalies or red flags in financial statements. Benford's Law makes it possible to apply a data mining approach to catch accounting frauds. By contrast, the Beneish Model applies ratio analysis to detect financial statement anomalies through the comparison of at least two years of financial data.
The application of these two tools gives users of accounting data and auditors better chances of identifying anomalies that could be caused by manipulations or frauds.
The following table is a brief comparison between the Beneish Model and the Benford's Law.