Technology Already Transforming Audits, But Regulators Must Catch Up

By:
Chris Gaetano
Published Date:
Nov 17, 2017

The age of data analytics is already transforming the audit profession, but regulators have been slow to adapt to this new world, according to speakers at the Foundation for Accounting Education's Auditing Standards Conference on Nov. 16.

Speaking at the Society's Manhattan offices, Miklos A. Vasarhelyi, Director of the Rutgers Accounting Research Center and Continuous Auditing & Reporting Lab, noted that the way people typically think of an audit involves taking samples from vast collections of data. Today, though, computers can examine every single transaction an entity has made in the audit period. He said that in an experiment his own group did, researchers took a pool of 600,000 transactions and set a materiality threshold of $3 million, which produced 30,000 aberrations, roughly 5 percent of the total data pool. 

"So, you know 5 percent is not bad, but that puts you in a dilemma here. It is materially 'wrong' because in the 30,000 transactions you pulled, their value exceeds the materiality threshold [mandated in the auditing standards] so in theory you would have to examine them one by one. But I won't examine 30,000 transactions!" he said. 

What would be reasonable, he said, would be to apply analytic techniques to the 30,000 exceptions to see which are important and examine only those. This, he said, allows for a much more powerful audit that takes every single transaction into account. However, he said, the Public Company Accounting Oversight Board (PCAOB) will not accept a full population test, as opposed to a pull of 50 or so samples, as audit evidence, even though such a test already tells him that 95 percent of the entire data set has no material price or quantity variance.
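The two-step approach he describes, screening every transaction against a materiality threshold and then ranking the exceptions analytically instead of examining all of them, might be sketched roughly as follows. The data, the distributions, and the composite risk score are all illustrative assumptions, not his group's actual method:

```python
import random

random.seed(42)

MATERIALITY = 3_000_000  # illustrative $3M threshold from the talk

# Hypothetical pool of 600,000 transactions:
# (id, dollar amount, price variance, quantity variance)
transactions = [
    (i,
     random.lognormvariate(12, 2),   # skewed dollar amounts
     abs(random.gauss(0, 0.02)),     # price variance vs. master data
     abs(random.gauss(0, 0.02)))     # quantity variance vs. purchase order
    for i in range(600_000)
]

# Step 1: full-population screen -- every record, not a 50-item sample.
exceptions = [t for t in transactions if t[1] > MATERIALITY]

# Step 2: analytic triage -- rank exceptions by a simple composite risk
# score (amount weighted by combined variance) and examine only the top slice,
# rather than all of the exceptions one by one.
def risk_score(t):
    _id, amount, pvar, qvar = t
    return amount * (pvar + qvar)

to_examine = sorted(exceptions, key=risk_score, reverse=True)[:500]

print(f"{len(exceptions)} exceptions out of {len(transactions)} "
      f"({len(exceptions) / len(transactions):.1%}); "
      f"examining top {len(to_examine)} by risk score")
```

The point of the sketch is the shape of the workflow: the screen touches 100 percent of the population, and judgment is then concentrated on a ranked subset rather than a blind sample.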

He said that his group is conducting four concurrent experiments to make its case to the PCAOB: one involving a general ledger, one a purchase-to-pay system, one an order-to-cash system, and one payroll. The experiments test which filters work best with which data sets; as he noted, running too many filters over a very large data set places too much demand on computing power, so the auditor must select the best filters for the situation. He encountered exactly this problem when using data analytics for internal audit at a large Brazilian bank with over 70 million credit card accounts. He added that the group will also pull random samples and monetary unit samples and compare their quality against a full population test. The plan, he said, is to use this controlled experiment to "show the PCAOB that this makes sense and is even better." 

"If you think you can extrapolate on one tenth of 1 percent, you cannot. ... And so what you are going to try to show is that this method is reasonably good and we know a lot about this population," he said. 

He also talked about what he called continuous process mining, essentially a continuous audit that takes place in real time, examining and classifying transactions as they happen. The experiments are being done with the cooperation of a major accounting firm in Holland as well as a very large not-for-profit. He said he was able to convince Rutgers to support the study by arguing it would be very useful for the university's own management. This represents one of the major applications of this type of technology: allowing management to better understand its own internal processes and become a more efficient organization.
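Continuous process mining of this kind, classifying each transaction against an expected process path the moment it occurs rather than after period end, can be illustrated in heavily simplified form. The purchase-to-pay activities and the event stream below are invented for illustration:

```python
from collections import defaultdict

# Expected purchase-to-pay path; any deviation is flagged for review.
EXPECTED_PATH = ("order", "goods_receipt", "invoice", "payment")

# Hypothetical event stream: (case_id, activity), arriving in real time.
event_stream = [
    (1, "order"), (2, "order"), (1, "goods_receipt"), (2, "invoice"),
    (1, "invoice"), (1, "payment"), (2, "payment"),  # case 2 skips receipt
]

traces = defaultdict(list)
flagged = set()

for case_id, activity in event_stream:   # classify each event on arrival
    trace = traces[case_id]
    trace.append(activity)
    # Flag the case as soon as its trace stops matching the expected prefix.
    if tuple(trace) != EXPECTED_PATH[:len(trace)]:
        flagged.add(case_id)

print("cases deviating from expected path:", sorted(flagged))
```

Here case 2 pays an invoice without a goods receipt and is flagged at the moment the invoice arrives, which is the real-time property that distinguishes this from a period-end review.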

He noted, however, that the Sarbanes-Oxley Act doesn't allow for this application since it would technically count as advising the audit client. He also noted that visualization of this data, another thing his group is working on, does not currently count as audit evidence. He argued that it should, as data visualization can provide great insight at any stage of the audit. 

Another speaker, Katie Greehan, a partner at RSM US, noted that regulators are starting to take this area seriously. For instance, she said, an AICPA working group has finalized a new guide on applying data analytics procedures to the audit, expected to be released next month. The intention, she said, was for the guide to focus on foundational concepts and be "tool agnostic," so it can be used regardless of what programs a firm is running. It is also meant to be very user-friendly, she said, noting that people often shy away from analytics because they don't understand it. To address this, the guide provides many practical examples of the kinds of tests an auditor might want to run. 

Beyond the AICPA, she pointed out that the PCAOB has recently added audit analytics to its research agenda, with board member Jeanette Franzel saying that the board plans to eventually issue guidance on the topic. The Center for Audit Quality has also formed a task force on data analytics with the specific purpose of giving the PCAOB insight into which standards should be updated to account for this new model. On the global level, she noted, the International Auditing and Assurance Standards Board recently asked for input on the topic as well. 

Vasarhelyi said that these measures will be only the beginning. He predicted that the use of sophisticated data analytics will only expand in the years ahead, drawing not just on an organization's internal information but on external information as well. To illustrate, he said that a student of his is writing a dissertation analyzing data from a large retailer with over 2,000 outlets in the United States in order to predict how individual stores will do. 

"So this is very exciting; we call it a predictive audit. Now, we started thinking, maybe, there are other variables that influence things, so what influences sales? Weather. You can't believe how much ... data of weather that exists in the outside world. It is an amazing amount. More than databases of large companies. That big. So we spent a lot of time collecting weather data," he said. 

What they found, he said, was that adding weather data improved predictions "tremendously": weather, it turns out, predicts when people do and do not want to shop. On top of the weather data, the group also applied macro- and microeconomic trend data for the regions the stores were in, which helped them better understand which branches would sell more or extend more credit. Finally, he said, they added social media data, monitoring what was being said about a particular company on social media and seeing how that correlated with sales. 
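The underlying intuition, that an external variable like weather can sharpen store-level sales predictions, can be shown with a toy regression. The data are synthetic and the model is a deliberately simple ordinary least squares fit; the dissertation's actual variables and models are not described in the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 365

# Synthetic daily data for one store: sales follow a slow trend
# plus a temperature effect plus noise.
temperature = rng.normal(15, 10, n)
trend = np.linspace(100, 120, n)
sales = trend + 1.5 * temperature + rng.normal(0, 5, n)

def r_squared(X, y):
    """Fit ordinary least squares with an intercept; return in-sample R^2."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

day = np.arange(n)
baseline = r_squared(day.reshape(-1, 1), sales)               # internal data only
augmented = r_squared(np.column_stack([day, temperature]), sales)  # + weather

print(f"R^2 trend only: {baseline:.3f}, with weather: {augmented:.3f}")
```

Because the synthetic sales series really does depend on temperature, the augmented model explains substantially more variance, which is the "tremendous" improvement pattern Vasarhelyi describes, in miniature.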

Greehan said that all this means big changes for the auditing profession. While she acknowledged concerns that automation might spell the end of the profession, she said she didn't believe that. What is endangered, though, is the old way of thinking about the audit, particularly its separation from technological processes. What will be needed, she said, is a change in the culture of the audit profession itself. 

"[Firms must] create a culture ready to embrace the changes. We can all get very comfortable with the 'last year' approach, but if we at our firms can create that culture ready to take on this change, it can only help. Encourage all levels at our firms to think outside the box. ... One of the challenges we have is what data do they even have to begin with and what can I do with it to get the audit evidence I need. [Think] about what types of tools or software we need to do more of this. And [think] about training, to help people get more comfortable with analytics and get that analytic mindset. And of course just [continue] to monitor the industry," she said. 
