Desktop Tools for Preventing and Detecting Fraud

Fraud remains at epidemic levels in the United States, so we must do a better job of preventing and detecting these crimes in businesses of all sizes. In this session, you will learn how common frauds are committed and how you can use desktop technology tools such as Excel and Access to prevent and detect fraud. Do not miss this opportunity to reclaim profits lost to fraud!

Copyright © 2012. Reproduction or reuse for purposes other than a K2 Enterprises’ training event is prohibited.


Fraud in Business Today

Fraud is alive and well! From large, multinational companies to small, family-held businesses, fraud continues to plague American businesses, with annual costs reaching into the billions of dollars. In this session, we will examine the breadth and depth of fraud in the United States to illustrate the need for more efficient and effective techniques for preventing and detecting the schemes that are draining profits and productivity from the economy.

The Association of Certified Fraud Examiners (ACFE) defines occupational fraud as "the use of one's occupation for personal enrichment through the deliberate misuse or misapplication of the employing organization's resources or assets." These frauds have four common characteristics:

• The activities are clandestine.

• The activities violate the perpetrator's fiduciary responsibilities and positions of trust within the employing organization.

• The activities are committed for personal enrichment, either directly or indirectly.

• The activities exact a cost on the employing organization.

Occupational fraud costs organizations 5% of their annual revenues, or about $2.9 trillion worldwide, based on 2010 estimates. The average loss amounts to $160,000 per instance, with 25% of all fraud losses exceeding $1 million. Small businesses are disproportionately affected, both in the incidence of fraud and in losses relative to earnings. According to the ACFE, asset misappropriation frauds lasted a median of 18 months before discovery, while financial statement frauds lasted 24 months. More than 80% of frauds are committed within one of six departments: accounting, operations, sales, executive management, customer service, and purchasing.

Occupational fraud falls into three broad categories: 1) misappropriation of assets, 2) corruption, and 3) financial statement fraud. In the 2010 ACFE report, misappropriation of assets was the leading source of fraud, representing 90% of cases, but these schemes were also the least costly, causing a median loss of $135,000. Financial statement fraud was the most costly, with a median loss of more than $4 million, but it made up less than 5% of the frauds identified in the report. Corruption schemes fell in the middle, comprising just under one-third of cases and causing a median loss of $250,000.

Behaviorally, fraud perpetrators often display warning signs that they may be engaging in fraudulent activity. The red flags most often exhibited are living beyond their means (43% of all cases) and experiencing financial difficulties (36%). Quite often, multiple symptoms are present; among them are divorce or other family problems (17%), drug or alcohol addiction (12%), never taking vacations (10%), and past employment problems (10%). However, only 15% of fraudsters had ever been charged with or convicted of a fraud-related offense.



Figure 1 displays a comprehensive system for classifying occupational frauds. The figure was taken from the 2010 ACFE Report to the Nations on Occupational Fraud and Abuse.



Figure 1 – Occupational Fraud Classification System



Now that we have a good understanding of the scope of the fraud problem facing businesses today, we will turn our attention to desktop tools for preventing and detecting fraud. We will focus on three readily available tools: 1) Microsoft Excel, 2) Microsoft Access, and 3) ActiveData. ActiveData is a dedicated tool for performing fraud analysis on financial data, similar to ACL or IDEA but at the palatable price of $250 per user. ActiveData is an add-in for Excel that uses pre-defined, automated processes, sparing users the specialized knowledge and skill otherwise required to perform sophisticated fraud analysis in Excel.

General Purpose Analyses

Accountants conduct tests and procedures during an audit or examination to help determine whether errors exist. If any of the errors uncovered are intentional, then fraud is present. Among the types of tests and procedures performed are horizontal and vertical analysis, ratio analysis, trend analysis, and regression analysis, as well as calculation of descriptive statistics and sample stratifications.

Horizontal and Vertical Analysis

Horizontal analysis and vertical analysis are analytical procedures that can be used to identify relationships between numbers that appear to be out of line. Excel is often the tool used to facilitate these two tests. In horizontal analysis, numbers are compared over time. Vertical analysis, on the other hand, compares numbers within the same time period.

Figure 2 – Vertical and Horizontal Analysis Performed in Excel

Note that while the vertical analysis shows that the Cost of Goods Sold percentage increased only two percentage points, from 56% of revenue in 2012 to 58% in 2013, the horizontal analysis shows that Cost of Goods Sold grew at a rate 155% greater than that of Revenue over the same period. Whether this anomaly is caused by fraud cannot be determined without further investigation, but we do know that additional work should be done.
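The mechanics of the two analyses can be sketched outside Excel. The income-statement figures below are hypothetical (they are not the values behind Figure 2), and the function names are illustrative only.

```python
# Hypothetical two-year figures (not the values shown in Figure 2).
data = {
    "Revenue":            {2012: 500_000, 2013: 520_000},
    "Cost of Goods Sold": {2012: 280_000, 2013: 301_600},
}

def horizontal(line):
    """Horizontal analysis: percentage change from 2012 to 2013."""
    prior, current = data[line][2012], data[line][2013]
    return (current - prior) / prior

def vertical(line, year):
    """Vertical analysis: line item as a percentage of revenue."""
    return data[line][year] / data["Revenue"][year]

print(f"Revenue growth: {horizontal('Revenue'):.1%}")            # 4.0%
print(f"CoGS growth:    {horizontal('Cost of Goods Sold'):.1%}") # 7.7%
print(f"CoGS % of revenue: {vertical('Cost of Goods Sold', 2012):.0%} "
      f"-> {vertical('Cost of Goods Sold', 2013):.0%}")          # 56% -> 58%
```

Note how the vertical percentages move only modestly while the horizontal growth rates diverge sharply, which is exactly the kind of mismatch the paired tests are designed to surface.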



Trend Analysis

Trend analysis is another technique that can provide valuable insight to auditors and fraud examiners. Again, Excel is an excellent tool for conducting this type of test. In a trend analysis test, numbers are compared over time in an attempt to identify potentially erroneous or fraudulent values. For example, auditors and fraud examiners can perform trend analysis by charting the data of interest, as shown in Figure 3.

Figure 3 – Sales Trend Analysis Chart in Excel

The chart makes it readily apparent that fourth quarter sales spike each year, followed by a significant drop in first quarter sales. Note that the spike is more pronounced in the fourth quarter of 2013. This could be the result of normal seasonal patterns or of channel stuffing used to overstate revenue and net income. Again, the auditor or fraud examiner has more work to do before drawing any conclusions.
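The same visual screening can be reduced to a simple numeric flag. The quarterly sales below are hypothetical (not the values charted in Figure 3), and the 25% threshold is an arbitrary illustration of how an examiner might flag outsized quarters for follow-up.

```python
# Hypothetical quarterly sales (not the values charted in Figure 3).
quarters = ["2012Q1", "2012Q2", "2012Q3", "2012Q4",
            "2013Q1", "2013Q2", "2013Q3", "2013Q4"]
sales = [100, 105, 110, 150, 95, 108, 112, 190]

# Flag any quarter more than 25% above the overall mean for follow-up.
mean = sum(sales) / len(sales)
flagged = [q for q, s in zip(quarters, sales) if s > mean * 1.25]
print(flagged)  # ['2013Q4']
```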

Statistical Measures and Summarizations

Statistical measures are often used for identifying outliers, numbers that fall outside expected values. Excel includes a number of statistical functions (SUM, COUNT, AVERAGE, MEDIAN, MODE, STDEV, MIN, MAX, VAR, KURT, etc.) to assist in the calculation of expected values and in the development of statistical measures, such as the median, mode, mean, standard deviation, minimum value, and maximum value. Excel also includes an add-in for performing a full range of statistical analyses; among them is the ability to calculate descriptive statistics of a sample. Simply select Data Analysis, Descriptive Statistics from the Data tab of the ribbon. The panel on the left of Figure 4 displays the descriptive statistics of a sample of sales transactions calculated in Excel.

Excel Descriptive Statistics of a Sample

ActiveData Descriptive Statistics of a Sample and a Population

Figure 4 - Descriptive Statistics Output

Similarly, ActiveData can produce descriptive statistics for a sample or population with the click of a button. Simply select Statistics on the ActiveData tab and then select the fields on which to calculate the statistics, as shown in the panel on the right in Figure 4.
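The measures behind both panels can be sketched with a few standard-library calls. The transaction amounts below are hypothetical; each entry in the summary names the Excel function it mirrors.

```python
import statistics as st

# Hypothetical sample of transaction amounts.
amounts = [120.00, 250.50, 980.25, 45.10, 45.10, 3_200.00, 610.75]

# Counterparts of Excel's COUNT, SUM, AVERAGE, MEDIAN, MODE, STDEV, MIN, MAX.
summary = {
    "count":  len(amounts),
    "sum":    sum(amounts),
    "mean":   st.mean(amounts),
    "median": st.median(amounts),
    "mode":   st.mode(amounts),
    "stdev":  st.stdev(amounts),  # sample standard deviation, like STDEV
    "min":    min(amounts),
    "max":    max(amounts),
}
for name, value in summary.items():
    print(f"{name:>6}: {value:,.2f}")
```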

Stratifications

Stratifications of data are often useful in identifying the areas of greatest risk. For example, the stratification shown in Figure 5 clearly indicates that transactions with an amount greater than $1,000 carry the greatest risk. Although they represent only about 44% of the transaction volume, they include over 91% of the total value of all transactions.



Figure 5 - Stratification of Sample Data

Excel's SUMIFS and COUNTIFS functions are very useful when attempting to stratify data. Both of these functions allow the introduction of one or more conditions to an otherwise routine SUM or COUNT function. With SUMIFS and COUNTIFS, stratifying data in Excel based on conditions becomes a very simple exercise. ActiveData can also be used to stratify data easily. Select Strata from the ActiveData tab, define the stratification bands, and then click Stratify. Click Finished to write the results, along with a chart, to a new sheet in the workbook, as shown in Figure 6.

Figure 6 – Stratification Results and Chart in ActiveData
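The banded counts and sums produced by COUNTIFS/SUMIFS or by ActiveData's Strata command can be sketched as follows; the transaction amounts and band boundaries below are hypothetical.

```python
# Hypothetical transaction amounts, stratified into bands the way
# COUNTIFS and SUMIFS (or ActiveData's Strata command) would do it.
amounts = [45, 120, 300, 750, 980, 1_500, 2_400, 8_900, 12_000]
bands = [(0, 500), (500, 1_000), (1_000, float("inf"))]

total = sum(amounts)
for low, high in bands:
    in_band = [a for a in amounts if low <= a < high]
    share = sum(in_band) / total
    print(f"{low:>6} to {high:<8} count={len(in_band)} "
          f"sum={sum(in_band):>6} pct={share:.1%}")
```

With these sample figures, the top band holds about 44% of the transactions but roughly 92% of the dollar value, mirroring the risk pattern described for Figure 5.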



Regression Analysis

Regression analysis may be one of the most important tools an auditor or fraud examiner has available. Yet many choose not to utilize this tool because they view regression analysis as being too difficult to understand or too time-consuming to perform. Fortunately, Excel is an excellent tool for performing regression analysis. Defined simply, regression analysis is the process of predicting or estimating one variable (the dependent variable) based on the values associated with one or more independent variables. For example, we may use regression analysis to predict sales (within a tolerable range of error) based on advertising expense or the previous month's sales. Because regression analysis provides a structured and organized approach to establishing expected values, it is useful for developing the expectations on which analytical procedures are based.

Excel's more powerful regression analysis functions are contained in the Analysis ToolPak add-in. To access them, select Data Analysis from the Data tab and choose Regression. Enter the cell references for the existing dependent variable in the area for Input Y Range and the cell references for the existing independent variable(s) in the area for Input X Range. Note that Excel can perform regressions on up to sixteen different independent variables. Click OK to generate the output shown in Figure 7. An R Square value of 0.80 or higher indicates that the regression model "fits" the data well. (R Square ranges from 0 to 1 and cannot be negative; it is the correlation coefficient, R, that can fall to -0.80 or below when the relationship is strongly inverse.)

Coefficients are used to predict the dependent variable (Y), in this case Payroll Expense.

Figure 7 - Sample Regression Analysis Output

Fortunately, practitioners do not need a PhD in statistics to interpret the output. Though not intended to provide a complete explanation of each of the items mentioned, the following summary serves as a useful primer in interpreting the output. First, since the R Square value is greater than 0.80, the regression shows a good "fit" to the data; as a rule of thumb, a good fit exists when R Square is between 0.80 and 1.00. Second, based on the regression analysis, payroll expenses are estimated to be $709,414 ($54,514 plus ($5.55 times 118,000)). If the recorded payroll expenses deviate significantly from the regression estimate, the legitimacy of the recorded amount is called into question. Similar results can be achieved by plotting the data on a scatter (XY) chart and then adding a linear regression trendline.
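The mechanics behind that output can be sketched with a simple one-variable least-squares fit. The headcount (X) and payroll expense (Y) figures below are hypothetical, not the data behind Figure 7, so the resulting coefficients differ from those in the text.

```python
# A one-variable least-squares fit, mirroring the intercept, coefficient,
# and R Square in Excel's Regression output. The headcount (X) and payroll
# expense (Y) figures are hypothetical, not the data behind Figure 7.
xs = [100, 105, 110, 115, 120]
ys = [610_000, 637_000, 665_500, 692_000, 720_500]

n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
sxx = sum((x - mean_x) ** 2 for x in xs)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
syy = sum((y - mean_y) ** 2 for y in ys)

b = sxy / sxx             # slope: the X Variable coefficient
a = mean_y - b * mean_x   # intercept
r2 = sxy * sxy / (sxx * syy)

# Predicted payroll for a 118-person month: intercept + slope * X.
predicted = a + b * 118
print(f"intercept={a:,.0f}  slope={b:,.2f}  R2={r2:.3f}  "
      f"predicted={predicted:,.0f}")
```

The predicted value, intercept plus coefficient times X, is the same arithmetic used to arrive at the $709,414 estimate in the text.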

Accessing Data Using ODBC

ODBC is a standard database access method developed by the SQL Access Group in 1992. With ODBC, it is possible to access data stored in any ODBC-compliant database from within any ODBC-compliant application. ODBC manages this by inserting a middle layer, called an ODBC driver, between the application and the database management system (DBMS); the driver translates the application's data queries into commands that the DBMS understands. ODBC can be used to provide two-way integration between Excel and the general ledger, although generally ODBC is used only to extract data. Most accounting software applications support ODBC. If your accounting application supports ODBC, then Excel can access historical data directly in the general ledger without re-keying any data or going through the typical export-import process used by most management accountants. Using ODBC, Excel can extract data into an Excel Table, a PivotTable, or a PivotChart. Figure 8 depicts the data access process.

ODBC can also be used from within Microsoft Access to extract data from G/L applications when the number of records extracted exceeds Excel's grid size of 1,048,576 rows. Excel can then be used to query and report the data stored in Access. Access can also be used to store and maintain data extracted from the G/L without providing users direct access to the accounting application, thereby reducing the security risks associated with direct access. Using an intermediary database, such as Access, in the extraction and analysis process also allows auditors to examine and manipulate data in Excel without modifying the source transaction data stored in the database.

Figure 8 - Accessing Data in Excel Using ODBC
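The extract-and-analyze pattern can be sketched with SQLite standing in for the ODBC source; the table and column names below are hypothetical. The point is the workflow: query the source, work on the extracted copy, and leave the source transactions untouched.

```python
import sqlite3

# sqlite3 stands in here for an ODBC data source; the pattern is the same:
# query the source, analyze the extracted copy, and leave the source
# transactions untouched. Table and column names are hypothetical.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE gl (account TEXT, amount REAL)")
con.executemany("INSERT INTO gl VALUES (?, ?)",
                [("Sales", 1200.0), ("Sales", 850.0), ("Rent", 2000.0)])
con.commit()

# Extract the records of interest into a local list -- the analytical copy.
rows = con.execute("SELECT account, amount FROM gl WHERE account = ?",
                   ("Sales",)).fetchall()

# Any manipulation happens on the copy, not in the database.
total_sales = sum(amount for _, amount in rows)
print(total_sales)  # 2050.0
```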

Using ODBC in Excel to Query Data Stored in Access

In this example, a large number of transaction records have been extracted from a general ledger into Microsoft Access for examination. To analyze the data in Excel, select Data, From Other Sources on the ribbon and then choose From Microsoft Query. In the resulting Choose Data Source dialog box, select MS Access Database as the source of the data and click OK. Select the Access database to be queried and click OK. Then, select the fields to be included in the query and the order in which the fields are to be displayed by clicking on a field and utilizing the directional arrows, as shown in Figure 9. If the queried data is to be filtered – and most likely it will need to be because of the volume of data – specify the filters, also shown in Figure 9. In this case, only checks greater than $1,000 will be included in the data queried from Access into Excel. If the data needs to be sorted, that can be specified as part of the query as well; alternatively, the data can be sorted in Excel after it is imported. In this case, the checks will be sorted in ascending invoice date order.

Select the fields to be included in the output and then filter the data accordingly.

Figure 9 – Querying Data Stored in Access from within Excel

Choose to return the data to Excel, select whether to present the data as a Table, a PivotTable Report, or a PivotChart and PivotTable Report, and select whether the data should be returned to an existing worksheet or to a new worksheet. As shown in Figure 10, the data has been queried, filtered, sorted, and returned to Excel. Once in Excel, users may apply any of Excel's commands or functions to analyze the data. Importantly, changing the data in Excel will not cause changes in the underlying Access database. This means that the auditor or fraud examiner does not run the risk of accidentally changing source transaction data by applying an audit test.



Figure 10 – Data from Access Returned to Excel Table

Note that the Excel Table is dynamically linked to the underlying data stored in Microsoft Access. The table can be updated for changes in the underlying data by simply refreshing it: right-click on the face of the table and select Refresh from the context-sensitive menu. This is very useful in situations where tests are repeated over a number of periods or when the same tests are performed on different data sources, such as different company files within QuickBooks.

Saving Queries for Future Recall

After creating a query, the query may be saved for reuse in the future. This can occur in one of two ways. First, when a query is created and the workbook is saved, the query is saved with the workbook; whenever that workbook is opened, the saved query is available. Second, users can save a query independently of a workbook, which allows the query to be utilized in other workbooks. To save a query outside of a workbook, click the Save Query button in the Query Wizard – Finish dialog box before returning the data to Excel.

Billing Schemes – Payments to Fictitious Vendors

Billing schemes, one of the most prevalent frauds committed against small businesses (due to inadequate segregation of duties), often involve submitting invoices from fictitious vendors for payment. When payments are made to a fictitious payee, the perpetrator intercepts the check and negotiates it. Billing schemes, such as payments to fictitious vendors, are involved in over 26% of cash misappropriations and generate median losses of $128,000, according to the Association of Certified Fraud Examiners. As indicated in the April 2005 issue of The CPA Journal, billing schemes often leave a number of red flags:

• An employee's home address matches a vendor's address

• An employee's initials match a vendor's name



• Checks written to "Cash"

• Using PO boxes for vendor addresses

• Missing vendor data

• Improperly formatted vendor data

• A large number of vendors added to the vendor master file

Using Queries to Uncover Billing Schemes

Many of these symptoms of billing scheme fraud can be uncovered using simple queries in Access or Excel. Accounting data can be queried to uncover changes in the vendor list, vendors with the same address as an employee, vendors without street addresses, and vendors without properly formatted EINs. In this example, Access will be used to identify vendors with PO Box addresses in vendor data imported via ODBC from a QuickBooks company file.

Figure 11 – Access Query to Identify Vendors with PO Box Addresses



To identify vendors with PO Box addresses, select Create, Query Design to open the Show Table dialog box. Select Vendor on the Tables tab and then click Add. Click Close to close the dialog box. Drag each of the fields VendorAddressAddr1, VendorAddressAddr2, and VendorAddressAddr3 in turn to the Query by Example grid. Set the Criteria for VendorAddressAddr2 to Like "P*O*" and the Criteria for VendorAddressAddr3 to Like "P*O*". Make sure to specify the criteria for VendorAddressAddr3 on an Or line, as shown in Figure 11; otherwise, the criteria will be joined with the conjunction "and," thereby requiring both criteria to be true in order for matching records to be returned. Click Run (the ! button) to run the query.

Alternatively, this same query can be performed in Excel. Using the ODBC techniques identified earlier, begin the query process by selecting QuickBooks Data as the ODBC data source. Select the Vendor table or specific fields in the Vendor table. In the Finish dialog box, select Edit Query in Microsoft Query. To identify vendors with PO Box addresses, use the query settings displayed in Figure 12. Note that the criteria syntax in Microsoft Query is slightly different from that in Access: in Microsoft Query, the criteria are specified as Like 'P%O%'.
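In standard SQL terms, the same test can be sketched as follows. SQLite is used purely for illustration, and the vendor names and simplified column names are hypothetical, not the actual QuickBooks field names.

```python
import sqlite3

# Hypothetical vendor records with simplified column names.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Vendor (Name TEXT, Addr2 TEXT, Addr3 TEXT)")
con.executemany("INSERT INTO Vendor VALUES (?, ?, ?)", [
    ("Acme Supply",   "100 Main St",   "Suite 4"),
    ("Phantom LLC",   "PO Box 982",    ""),
    ("Shell Vendors", "Attn: Billing", "P.O. Box 17"),
])

# Standard SQL's % wildcard corresponds to * in Access and % in
# Microsoft Query; the OR mirrors the Or criteria line in Figure 11.
hits = con.execute("""
    SELECT Name FROM Vendor
    WHERE Addr2 LIKE 'P%O%' OR Addr3 LIKE 'P%O%'
    ORDER BY Name
""").fetchall()
print(hits)  # [('Phantom LLC',), ('Shell Vendors',)]
```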

Figure 12 - ODBC Query to Identify PO Box Addresses

Similarly, vendors without properly formatted EINs in their records can be identified with simple queries. To identify such vendors in Excel, use the query settings displayed in Figure 13. An underscore ( _ ) is used to denote a single-character placeholder in Microsoft Query; in Access, a numeric placeholder is specified with a hash ( # ).



Figure 13 - ODBC Query to Identify Improper EINs

Identifying vendors with the same address as an employee requires that the Employee table and the Vendor table be joined. Simply drag the EmployeeAddressAddr1 field and drop it on the VendorAddressAddr2 field, as shown in Figure 14. The resulting query will identify the employees and vendors that share the same address.

Figure 14 - Query to Identify Employees with the Same Address as a Vendor
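The join can be sketched in standard SQL; again, SQLite and the shortened field names are illustrative stand-ins for the QuickBooks tables, and the names and addresses are hypothetical.

```python
import sqlite3

# Hypothetical records; the join mirrors dragging EmployeeAddressAddr1
# onto VendorAddressAddr2 in the query design grid.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Employee (Name TEXT, Addr1 TEXT)")
con.execute("CREATE TABLE Vendor (Name TEXT, Addr2 TEXT)")
con.executemany("INSERT INTO Employee VALUES (?, ?)",
                [("J. Doe", "42 Elm St"), ("A. Smith", "9 Oak Ave")])
con.executemany("INSERT INTO Vendor VALUES (?, ?)",
                [("Rapid Parts", "77 Pine Rd"), ("Elm Services", "42 Elm St")])

# An inner join returns only the employee/vendor pairs sharing an address.
matches = con.execute("""
    SELECT e.Name, v.Name, e.Addr1
    FROM Employee e JOIN Vendor v ON e.Addr1 = v.Addr2
""").fetchall()
print(matches)  # [('J. Doe', 'Elm Services', '42 Elm St')]
```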



To identify changes to the Vendors list, a simple query that returns all fields in the Vendors table is sufficient. Import the data as an Excel Table and then filter the Time Modified field accordingly. The resulting list will identify all vendors added or modified during the period under review.

Finding Duplicate Payments

Another way that fraudsters may commit a billing scheme fraud is to make multiple payments on a valid invoice. Generally, when a legitimate vendor receives a duplicate payment, the vendor notifies the customer and asks how the duplicate payment is to be handled: 1) issue a credit against future invoices, or 2) issue a refund check. Fraudsters will request a refund and then intercept and negotiate the refund check. Identifying and investigating duplicate payments therefore becomes an important part of the fraud prevention and detection process.

Microsoft Access has a query wizard for identifying duplicate values in a field or across multiple fields. From a fraud detection standpoint, this feature can be very effective in identifying potential duplicate transactions, such as duplicate payments of the same invoice. To begin the process, select Create, Query Wizard from the ribbon. In the New Query dialog box, select Find Duplicates Query Wizard and then click OK, as shown in Figure 15.

Figure 15 – Find Duplicates Query Wizard in Microsoft Access



Next, select the source data to query. The source can be a table, a query, or a combination of the two. In this case, select the QB Check Data table. Once the data source has been chosen, select the field or fields containing possible duplicate values, and then select any additional fields to be displayed in the query results. In this case, select Name and Amount as the fields that contain the possible duplicates, and then choose to display all fields in the results. Finally, name the query for future recall and click Finish to view the results as shown in Figure 16.

Figure 16 – Duplicate Payments Identified with Access Query

ActiveData also has an automated process, similar to the one in Access, for finding duplicates in a single field or across multiple fields. To begin the process, select ActiveData, Duplicates on the ribbon. In the Duplicates dialog box, select Name and Amount in the Columns to Analyze and click Finished, as shown in Figure 17. ActiveData will insert a new worksheet in the workbook that identifies the duplicate data points.

Figure 17 – Creating an Extract Duplicates Query in ActiveData



With a little ingenuity, duplicate payments can be identified easily in Excel. First, convert your data range to an Excel Table. Then, add a calculated field to combine the fields that contain potential duplicates; in this case, use the CONCATENATE function to combine the Name and Amount fields. Use conditional formatting to highlight the duplicates, and then filter by the color used to identify the duplicates, as shown in Figure 18.

Figure 18 – Finding Duplicates in Excel
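The keyed-duplicate test behind all three approaches can be sketched in a few lines; the payee names and amounts below are hypothetical.

```python
from collections import Counter

# Hypothetical check register; (Name, Amount) forms the duplicate key,
# just as the CONCATENATE column does in the Excel approach.
checks = [
    ("Acme Supply",  1_250.00),
    ("Rapid Parts",    480.00),
    ("Acme Supply",  1_250.00),  # second payment of the same invoice
    ("Elm Services",   990.00),
]

# Count occurrences of each key; any key seen more than once is a
# potential duplicate payment to investigate.
counts = Counter(checks)
duplicates = [key for key, n in counts.items() if n > 1]
print(duplicates)  # [('Acme Supply', 1250.0)]
```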

Sales Bonus or Commission Fraud

Salespersons can positively affect their earnings by selling products and services without an expectation of collection. By selling to less creditworthy customers, salespersons can more easily meet their sales targets (and thereby earn bonus incentives) and/or earn commissions on the sales. In larger companies, this moral hazard is overcome by paying commissions or bonuses only on collected sales. In smaller companies, affordable accounting applications are generally not sophisticated enough to calculate commissions on collected sales. These same companies may not have sufficient accounting staff to calculate collected sales in off-book spreadsheets, so they pay commissions and bonuses on sales whether the sales are collected or not. In these cases, a simple aging of accounts receivable by salesperson can be used to uncover these types of frauds. If the aging identifies a large number of past-due accounts for a given salesperson, this is indicative of possible false sales recorded to generate commission or bonus payments.

Using a PivotTable to Uncover Bonus or Commission Fraud

In this example, an aging of accounts receivable by salesperson will be prepared using a PivotTable connected to MAS 90 via ODBC. Fields will be selected from a single table, the customer master file, AR_Customers. Expand the field list by clicking on the plus (+) sign to the left of the table name in the left-hand pane. Then, choose the fields that are to be displayed in the report and click the (>) button to move them to the right-hand pane, as shown below in Figure 19. Select the following fields:



CustomerNo, State, SalespersonNo, CurrentBalance, AgingCategory1, AgingCategory2, AgingCategory3, and AgingCategory4.

Figure 19 - Select the Fields for the Report

After the required fields are selected, click Next several times. In the Query Wizard - Finish pane, select Return Data to Microsoft Office Excel and click Finish. In the Import Data dialog box, select PivotTable Report, specify the location for the report, and then click OK. Next, drag and drop the fields to the appropriate report quadrants in the PivotTable Task Pane, and then rename and format the column labels. To complete the table, drag SalespersonNo to the Row Labels quadrant to reveal the sales and collection performance of the sales staff. The completed PivotTable is shown in Figure 20.

Figure 20 - Aging of Accounts Receivable by Salesperson

Remember, our PivotTable is dynamically linked to the underlying data in MAS 90. To use this report in the future, retrieve the workbook and click Refresh to produce an updated report based on the transactions and postings as they exist in MAS 90 at the time of the refresh.
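The aging-by-salesperson summary the PivotTable produces can be sketched as a simple grouping; the salesperson numbers and balances below are hypothetical stand-ins for the AR_Customers data.

```python
from collections import defaultdict

# Hypothetical AR_Customers rows: (SalespersonNo, CurrentBalance,
# AgingCategory1 ... AgingCategory4). Not actual MAS 90 data.
customers = [
    ("S01", 5_000, 1_000,   500,     0,     0),
    ("S01", 2_000,   800,   200,   100,     0),
    ("S02", 1_500, 2_200, 1_900, 3_500, 4_800),  # heavily past due
]

# Sum each aging bucket by salesperson, as the PivotTable does.
aging = defaultdict(lambda: [0] * 5)
for rep, *buckets in customers:
    for i, amount in enumerate(buckets):
        aging[rep][i] += amount

for rep in sorted(aging):
    current, *past_due = aging[rep]
    print(rep, "current:", current, "past due:", sum(past_due))
```

A salesperson whose past-due buckets dwarf the current balance, like S02 here, is the pattern that warrants follow-up.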



Financial Statement Fraud

Financial statement fraud is generally committed by overstating assets, recording fictitious revenues, or understating liabilities and expenses. The following lists contain some of the common warning signs associated with each of these types of fraud.

Overstating Assets

• Journal entries initiated or recorded by members of the senior management team, particularly at year-end

• A large number of journal entries in round dollar amounts

• A significant amount of slow-moving inventory items

• A significant percentage of accounts receivable past due with relatively little bad debt expense

Recording Fictitious Revenues

• Relatively large amounts of customer refunds or adjustments immediately after year-end

• New customers on the customer master file with missing information

• Average sales per customer increasing significantly in the month or quarter that includes year-end

Understating Liabilities and Expenses

• Journal entries initiated or recorded by members of the senior management team, particularly at year-end

• Average purchases per vendor decreasing significantly in the month or quarter that includes year-end

• Relatively high amounts of expense recorded in the month after year-end

• A relatively large number of open purchase orders at year-end where the "required by" date has been exceeded

Uncovering Financial Statement Fraud

Many frauds involving overstatement of assets or understatement of liabilities or expenses involve management override of internal controls, often through journal entries recorded or authorized by executive management. In this example, selected fields from the Journal table have been extracted into an Excel Table from a QuickBooks company file. Two calculated fields are used to find even $1,000 transactions and transactions entered on Saturday or Sunday, which for this company fall outside the normal business week. After the formulas are created, the results can be filtered to identify specific transactions of interest.

Note the formulas used to identify transactions of interest in Figure 21. One formula uses the WEEKDAY function to identify the day of the week on which the journal entry was recorded; transactions entered on Saturday or Sunday evaluate to 7 and 1, respectively. The other formula uses the MOD function to calculate the remainder of the debit amount divided by 1,000; transactions entered in even $1,000 amounts return a result of 0.

Figure 21 - Identifying Suspicious Journal Transactions

ActiveData can also be used to identify transactions recorded on specific days or in even dollar amounts. To identify transactions recorded in even thousands of dollars, select ActiveData, Query Sheet, Query by Formula. In the Query by Formula dialog box, type in the following formula: MOD(JournalDebitLineAmount, 1000) = 0, as shown in Figure 22. Click Finished, and ActiveData will insert a new sheet in the workbook that contains just those transactions that are evenly divisible by 1,000.

Figure 22 – Specifying a Query in ActiveData to Identify Transactions in Even Thousands



Similarly, ActiveData can be used to identify transactions that were recorded on specific days of the week using the WEEKDAY function as illustrated previously.
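Both screening formulas can be sketched together. The journal dates and amounts below are hypothetical, and note that Python's weekday numbering differs from Excel's WEEKDAY default, as flagged in the comments.

```python
from datetime import date

# Hypothetical journal entries: (posting date, debit amount).
entries = [
    (date(2013, 12, 28), 14_000.00),  # Saturday, even thousands
    (date(2013, 12, 30),  3_475.50),  # Monday
    (date(2013, 12, 29),    600.00),  # Sunday
]

# Python's weekday(): Monday=0 ... Sunday=6, so >= 5 means a weekend entry.
# (Excel's WEEKDAY default numbers Sunday=1 ... Saturday=7 instead.)
# amount % 1000 == 0 mirrors Excel's MOD(amount, 1000) = 0 test.
flagged = [(d, amount) for d, amount in entries
           if d.weekday() >= 5 or amount % 1000 == 0]

for d, amount in flagged:
    print(d.isoformat(), f"{amount:,.2f}")
```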

Identifying Gaps in Numbers

Gaps in the sequence numbers of checks, invoices, bills, and similar documents may indicate fraudulent activity. Out-of-sequence numbers (the missing checks, invoices, or bills) can be quickly identified with ActiveData. To begin the process, select ActiveData, Gaps. In the Gap Analysis dialog box, select the column to analyze, in this case Num (the check number field). Click Preview to display the gaps detected in the Preview pane, as shown in Figure 23. Click Finished to write the results to a new worksheet in the workbook.

Figure 23 – Using ActiveData to Identify Gaps in a Series of Check Numbers

Gaps in numbers can also be quickly identified in Excel with a macro. Figure 24 contains the macro code for identifying missing numbers. The macro identifies any number in a series that is not equal to the previous number in the series plus one. Any missing numbers are written to a list.



Figure 24 - Macro to Find Out-of-Sequence Numbers
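The logic that the macro implements can be sketched in a few lines (the macro in Figure 24 is VBA; the Python below is an equivalent sketch with hypothetical check numbers):

```python
# Sketch of the gap-finding logic: any number in a sorted series that is
# not the previous number plus one reveals one or more missing numbers.

def find_gaps(numbers):
    """Return the numbers missing from a series of check numbers."""
    missing = []
    nums = sorted(numbers)
    for prev, curr in zip(nums, nums[1:]):
        # Everything strictly between consecutive entries is missing.
        missing.extend(range(prev + 1, curr))
    return missing

checks = [1001, 1002, 1003, 1006, 1007, 1010]
print(find_gaps(checks))  # [1004, 1005, 1008, 1009]
```

Sorting first means the test works even when the register is not already in numeric order.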

Understanding and Implementing Benford's Law

In 1881, astronomer Simon Newcomb noticed that the pages of his logarithm books that started with lower digits were much more worn than the pages that started with higher digits. Newcomb postulated that numbers beginning with smaller digits, such as "1" and "2," occur more frequently than numbers beginning with higher digits, such as "8" and "9"; therefore, he and others using the logarithm books would have used the pages for numbers beginning with smaller digits more often. Newcomb published his findings in The American Journal of Mathematics in 1881. Subsequently, in 1938, Frank Benford made essentially the same observation and came to the same conclusions while working as a physicist for GE Research Laboratories. Benford then tested and validated his theory, which became known as Benford's Law despite Newcomb's work a half century earlier, against twenty different lists of data with over 20,000 observations.



Benford's Law is a powerful tool in detecting fraud because it allows examiners to test numbers against recognized norms. Significant deviations from the expected values can then be investigated to determine if there is evidence of fraud. Benford's Law is especially useful at identifying bogus transactions and at spotting transactions manipulated to avoid authorization and approval. By computing the actual distribution of digits and comparing it with expected values, auditors can determine which digits are oversampled, that is, those digits that occur more often than expected. This allows auditors to concentrate their sampling and investigation in those areas having the greatest probability of uncovering fraudulent activity. For example, if the first two digits "86" occurred at a rate much greater than expected, auditors could adjust their testing to sample all, or a greater proportion, of transactions whose amounts begin with "86."

For accountants and auditors faced with the suspicion of bogus or illegal transactions entered by employees, Benford's Law provides an excellent means of highlighting transactions that do not fit expected norms. Be aware, however, that perpetrators who know of Benford's Law will likely craft their bogus transactions to fit its expected distributions. Even where there is no suspicion of bogus transactions, applying Benford's Law can still be useful as a quick check for suspicious patterns in recorded transactions.
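The expected first-digit proportions follow from the logarithmic relationship Benford documented, and a z-statistic is a common way to judge whether an observed proportion deviates significantly. The sketch below uses the continuity-corrected z formula often seen in digital-analysis practice; treat it as an illustration rather than the exact statistic ActiveData computes.

```python
# Sketch: Benford's Law expected first-digit proportions and a z-statistic
# for comparing an observed proportion against the expected one.
import math

def benford_expected(digit):
    """Expected proportion of amounts whose first digit is `digit` (1-9)."""
    return math.log10(1 + 1 / digit)

def z_statistic(observed_prop, expected_prop, n):
    """Continuity-corrected z statistic over a sample of n records."""
    correction = 1 / (2 * n)
    numerator = abs(observed_prop - expected_prop) - correction
    return numerator / math.sqrt(expected_prop * (1 - expected_prop) / n)

# Digit 1 should lead about 30.1% of amounts; digit 9 only about 4.6%.
print(round(benford_expected(1), 3))  # 0.301
print(round(benford_expected(9), 3))  # 0.046
```

A z value above roughly 1.96 indicates a deviation significant at the 95% confidence level, which is why oversampled digits stand out in the tests that follow.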

Applying Benford's Law

ActiveData provides an automated process for applying Benford's Law. Begin the analysis by selecting ActiveData, Digital Analysis to open the Digital Analysis Using Benford's Law dialog box.

Figure 25 – Setting Options for Benford's Analysis in ActiveData

In the dialog box, select InvoiceAmount as the Column to Analyze and set the Confidence Level to 95.00%. Check Chart Results, First Digit Test, Second Digit Test, and First 2 Digits Test, as shown in Figure 25. Click Finished, and ActiveData will complete and display the analysis on a new worksheet. The results of the First Digit Analysis are displayed in Figure 26.

Figure 26 – Results of First Digit Analysis Performed in ActiveData

Note that "8" and "9" are significantly oversampled, as indicated by the magnitude of the Z Statistic. An examination of the results of the First 2 Digits Test (not shown) indicates that "83," "88," "95," and "97" are significantly oversampled. The auditor should concentrate the examination on those transactions whose invoice amounts begin with the oversampled numbers.

Using familiar Excel functions such as LEFT, MID, and COUNTIFS, auditors can construct a Benford's Law test in Excel for any single digit or for the first two digits. The first step in applying the first digit test is to extract the first, second, and first two digits of the invoice amount. Auditors and fraud examiners can accomplish this by creating the following formulas in cells C2, D2, and E2, and then copying the formulas down to the extent of the data set.

Benford Test      Formula
First Digit       =LEFT(B2,1)
Second Digit      =MID(B2,2,1)
First 2 Digits    =LEFT(B2,2)



These formulas extract the appropriate digits from the invoice amount so that the number of occurrences can be counted and the proportion of each digit in the sample can be calculated. The worksheet in Figure 27 displays the results after the formulas are entered and copied down.

Figure 27 – Extracting Digits from the Invoice Amount in Excel

Next, it is necessary to count the number of occurrences of each digit now contained in column C and compare them to the expected norms defined by Benford's Law. To do this, add a second worksheet to the workbook named Data Analysis. In cell B2 of the Data Analysis worksheet, enter the following formula to count the number of instances of each digit in the Raw Data worksheet.

=COUNTIFS('Raw Data'!C:C,'Data Analysis'!A2)

Using Excel's COUNTIFS function, the worksheet counts and summarizes all occurrences of the digit in cell A2 of the Data Analysis worksheet that are found in column C of the Raw Data worksheet. Copy this formula down through cell B10 of the Data Analysis worksheet. Next, add SUM formulas to cells C11, D11, and E11 and percentage calculations to column E. Chart the expected and actual distributions of the digits using a combination column/line chart with markers to produce the Data Analysis worksheet shown in Figure 28. Note that in using Excel, we were able to produce an analysis identical to that created by ActiveData. Benford's Second Digit Test and First 2 Digits Test could be prepared and charted using similar formulas and techniques. Accomplished VBA programmers could automate this process with a macro so that the entire analysis is done with the click of a button.
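The same extract-count-compare workflow can be sketched end to end in Python; the digit extraction mirrors =LEFT(B2,1) and the tally mirrors COUNTIFS. The invoice amounts below are hypothetical.

```python
# Sketch: a first-digit Benford test mirroring the LEFT/COUNTIFS worksheet
# described above. Assumes positive amounts (so the first character of the
# string form is the leading digit).
import math
from collections import Counter

def first_digit_test(amounts):
    """Return {digit: (actual_proportion, expected_proportion)}."""
    counts = Counter(str(amt)[0] for amt in amounts)   # like =LEFT(B2,1)
    n = len(amounts)
    results = {}
    for d in range(1, 10):
        actual = counts[str(d)] / n                    # like =COUNTIFS(...)
        expected = math.log10(1 + 1 / d)
        results[d] = (actual, expected)
    return results

invoices = [1842.50, 1203.17, 875.00, 942.12, 2318.40, 1099.95]
for digit, (actual, expected) in first_digit_test(invoices).items():
    print(f"{digit}: actual {actual:.3f} vs expected {expected:.3f}")
```

Large gaps between the actual and expected columns point to the digits worth sampling more heavily.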



Note that the Benford distribution proportions are constants.

Figure 28 - Completed Data Analysis Worksheet for First Digit Benford's Law Test

In the first digit Benford's Law example presented above, far too many transactions begin with the digits "8" and "9" and not enough begin with "1" and "2." Whether these findings represent errors or fraud is not determined solely by applying Benford's Law; however, Benford's Law is useful in identifying those areas where additional testing and evaluation are required.

Limitations of Using Benford's Law

Though Benford's Law provides accountants and auditors with a powerful set of fraud detection tools, there are certain distinct and significant limitations to its use. Failing to understand these limitations could result in excessive examination of data and in drawing incorrect conclusions from it.



First, large sets of data are more likely to fit Benford's Law than smaller sets. As with virtually all mathematical models, large sets of data are more reliable. When applying Benford's Law, a small sample size or population simply may not contain enough observations to produce a reliable test.

Second, any set of data based on assigned numbers will likely not follow Benford's Law. A good example would be Social Security Numbers, which are issued based on a prescribed order and methodology. Likewise, checking account numbers will usually not adhere to Benford's Law, as banks typically bias the numbers based on factors such as whether the account is for an individual or a business and the state in which the account is opened.

Third, any set of numbers where the population has defined minimum values, maximum values, or both will likely not conform to Benford's Law. For instance, an analysis of hourly wage rates may not be consistent with Benford's Law due to the presence of minimum wage requirements. Likewise, annual contributions to Individual Retirement Accounts or 401(k) plans may be skewed due to the defined maximum amounts that can be contributed.

In general, any population of data where the category of data by its nature introduces bias or skew will likely not follow Benford's Law. From a fraud detection standpoint, this means that great care must be taken to understand all of the relevant characteristics of the data being tested. For instance, many companies automatically round selling prices so that the cents portion of the price is always "95" or "99." Failing to understand this characteristic when applying a Benford's Law test to the population of selling prices may cause an examiner to reach incorrect conclusions about the validity of sales prices.

Analytical Procedures as a Fraud Detection Tool

Analytical procedures can be a very important tool in detecting fraud. By comparing actual results to expectations and industry norms, one may be able to identify numbers or ratios that appear to be out of line. As with Benford's Law, just because a result does not fit an expected pattern, no conclusion can be drawn that the underlying data is erroneous or fraudulent, only that additional work and investigation are necessary before drawing any conclusions. Nonetheless, analytical procedures can be very effective at identifying conditions where fraud might exist. Furthermore, analytical procedures can be effective at identifying all three types of fraud: asset misappropriation, corruption, and financial statement fraud. Analytical procedures are defined in SAS 56 as noted below.

Analytical procedures involve comparisons of recorded amounts, or ratios developed from recorded amounts, to expectations developed by the auditor.

Though SAS 56 is directed to auditors of financial statements, the definition provided above is readily adaptable to fraud examiners as well. SAS 56 also states the following.

Analytical procedures are an important part of the audit process and consist of evaluations of financial information made by a study of plausible relationships among both financial and nonfinancial data. Analytical procedures range from simple comparisons to the use of complex models involving many relationships and elements of data. A basic premise underlying the application of analytical procedures is that plausible relationships among data may reasonably be expected to exist and continue in the absence of known conditions to the contrary. Particular conditions that can cause variations in these relationships include, for example, specific unusual transactions or events, accounting changes, business changes, random fluctuations, or misstatements.

The plausible relationships referred to in the preceding paragraph are particularly important when attempting to detect fraud. For example, if a company has relaxed its credit policy and standards in an effort to stimulate sales, and, as a result, accounts receivable has increased markedly, it is likely that bad debt expense would rise as well. Yet, if bad debt expense rose at a rate slower than accounts receivable, would that be plausible, considering that the company's credit policy and standards had been relaxed? The answer is most likely "no." An auditor would need to conduct a closer examination to determine whether the failure to recognize bad debts was an error or oversight, or was the result of intentional manipulation of financial statement assertions.

Common Ratios Used in Analytical Procedures

Ratio analysis is probably the most frequently used form of analytical procedure when attempting to detect fraud. Table 1 provides the formulas used to compute many of the more common ratios used in ratio analysis, all of which can be performed easily in Excel. Each of these formulas, when compared to expectations, can be very helpful in identifying potential instances of errors or fraud.

•	Gross Profit Margin: Gross Profit divided by Sales. A measure of profitability after subtracting cost of goods sold.

•	Net Profit Margin: Net Income divided by Sales. A measure of profitability after subtracting all expenses.

•	Return on Equity: Net Income divided by Shareholder Equity. A measure of the return on shareholders' investments, including retained earnings.

•	Return on Assets: Net Income divided by Total Assets. A measure of how profitably assets are being utilized in the business.

•	Return on Sales: Net Income divided by Total Sales. A measure of how profitable the company is based on sales volume.

•	Current Ratio: Current Assets divided by Current Liabilities. A measure of liquidity, designed to provide guidance on whether a company has sufficient assets to meet its immediate liabilities.

•	Interest Coverage: Earnings Before Interest and Taxes (EBIT) divided by Interest Expense. A measure of the company's ability to pay the interest charges on its debt.

•	Accounts Payable Turnover: Cost of Goods Sold (or Credit Purchases) divided by Average Accounts Payable. A measure of efficiency or activity.

•	Days in Payables: 365 divided by Accounts Payable Turnover. A measure of how many days of purchases are outstanding in accounts payable.

•	Quick Ratio (Acid Test Ratio): (Cash + Accounts Receivable) divided by Current Liabilities. A measure of liquidity that specifically excludes inventory.

•	Asset Turnover: Sales divided by Total Assets. A measure of the utilization of all assets.

•	Fixed Asset Turnover: Sales divided by Total Fixed Assets. A measure of the utilization of fixed assets.

•	Inventory Turnover: Cost of Goods Sold divided by Average Inventory. A measure of how well inventory is being managed.

•	Days in Inventory: 365 divided by Inventory Turnover. A measure of the efficiency of inventory management.

•	Accounts Receivable Turnover: Credit Sales divided by Average Accounts Receivable. A measure of the collectability of accounts receivable.

Table 1 - Common Ratio Analysis Formulas

Using Excel for Ratio Analysis

When using Excel for ratio analysis, it is often helpful to use defined names rather than cell references when creating formulas. Defined names, referred to as named ranges in Lotus 1-2-3, can be used in place of cell references when building formulas and are more easily remembered than the corresponding cell references, constants, or formulas. By default, defined names are specific to the workbook in which they are created, although worksheet-specific names can also be created. Defined names can be created in numerous ways in Excel. The easiest method is to highlight the cell or range of cells to be named and then type the desired name in the Name Box at the left end of the Formula Bar, as shown in Figure 29. Excel enforces several rules for defined names:

•	Names must begin with a letter, a backslash, or an underscore.

•	No spaces are allowed in a defined name.

•	Names that resemble cell references cannot be used.

•	Single letters can be used as names, except for R, r, C, and c.

•	Names can contain up to 255 characters and are not case-sensitive.



Figure 29 – Using Defined Names in Excel

Formulas created using defined names are self-documenting. Compare the following two formulas, both of which calculate the current ratio.

=B16/B51

=CurrentAssets2013/CurrentLiabilities2013

For those preparing or reviewing ratio analysis worksheets, the formula using defined names is much easier to read and interpret. In addition, defined names are "sticky" with respect to the underlying formula or value: once a defined name is created, it stays with the underlying cell even if that cell is moved to another location. Thus, there is no need to modify formulas after inserting rows or columns that would otherwise alter row and column addresses. This also resolves many of the problems associated with linked workbooks, where the data is stored in one workbook and the analysis is done in another.

To automate the process of calculating ratios, a workbook can be constructed with formulas that parse a trial balance to derive the ratio components (current assets, current liabilities, total assets, etc.) and then calculate the ratios from the derived components. Generally, the ratio components are calculated using array formulas that test which accounts should be included in each component sum, thereby allowing for changes to the chart of accounts.
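The parse-then-compute approach can be sketched in Python: classify each trial balance account into a component, sum the components, and compute the ratios. The account names, balances, and classifications below are hypothetical (credit balances are shown as negatives, a common trial balance convention).

```python
# Sketch: deriving ratio components from a simple trial balance and then
# computing liquidity ratios, analogous to the Excel workbook described above.

trial_balance = {
    "Cash":                 52000.0,
    "Accounts Receivable":  88000.0,
    "Inventory":            64000.0,
    "Accounts Payable":    -47000.0,   # credit balance shown as negative
    "Accrued Liabilities": -15000.0,
}

# Hypothetical chart-of-accounts classifications.
CURRENT_ASSETS = {"Cash", "Accounts Receivable", "Inventory"}
CURRENT_LIABILITIES = {"Accounts Payable", "Accrued Liabilities"}

def component(tb, accounts):
    """Sum the trial balance accounts classified in `accounts`."""
    return sum(tb[name] for name in accounts)

current_assets = component(trial_balance, CURRENT_ASSETS)             # 204000.0
current_liabilities = -component(trial_balance, CURRENT_LIABILITIES)  # 62000.0

print(f"Current ratio: {current_assets / current_liabilities:.2f}")   # 3.29
quick_assets = current_assets - trial_balance["Inventory"]
print(f"Quick ratio:   {quick_assets / current_liabilities:.2f}")     # 2.26
```

Because the classification sets, not the formulas, decide which accounts feed each component, adding an account to the chart of accounts only requires updating a set, much as the array formulas in the Excel version accommodate chart-of-accounts changes.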


