
NextGen Magazine
Latest Version of ChatGPT Passed a Practice CPA Exam

By:
S.J. Steinhardt
Published Date:
May 23, 2023

ChatGPT 3.5 may have failed a practice CPA exam, but ChatGPT 4.0 passed it, Accounting Today reported.

ChatGPT 4.0's scores were:

AUD: 91.5 percent
BEC: 85.7 percent
FAR: 78 percent
REG: 82 percent

ChatGPT 3.5’s scores were:

AUD: 46 percent
BEC: 48 percent
FAR: 35 percent
REG: 39 percent

Accounting Today’s earlier experiment used ChatGPT 3.5, while the recent academic paper’s experiment used Version 4.0.

The researchers first tested GPT 4.0 in a "zero-shot" scenario, in which the model is given a prompt or question, along with some high-level instructions or descriptions, but no explicit training on the specific task; it relies solely on its pre-existing knowledge and general understanding to generate a response.

In this scenario, ChatGPT 4.0 performed somewhat better than Version 3.5 but still failed, with an average score of 67.8 percent; the researchers likened this performance to taking the exam without having studied, relying only on pre-existing knowledge.

In the subsequent "10-shot" scenario, in which the AI was prepared with 10 sample accounting questions to provide subject matter training and to train the AI to think like an accountant, it scored an average of 74.4 percent across all sections, 0.6 percentage points short of what is needed to pass.

The researchers then used "chain-of-thought" prompting, which the authors of the academic paper defined as "decomposing a larger problem into several intermediate steps to get the final answer." It is the functional equivalent of studying before the exam, according to Accounting Today. Using chain-of-thought prompting on a model that had previously been primed with 10 accounting questions resulted in ChatGPT passing the practice exam with an average of 84.3 percent across all four sections.
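To make the technique concrete: a minimal sketch (not the researchers' actual code or questions) of how a few-shot, chain-of-thought prompt is typically assembled. Worked examples that spell out the intermediate reasoning steps are placed ahead of the new question, so the model imitates that step-by-step style; the sample question below is hypothetical.

```python
# Hypothetical worked example; the actual study primed the model with
# 10 accounting questions drawn from exam material.
EXAMPLES = [
    {
        "question": ("A firm buys equipment for $10,000 with a 5-year life "
                     "and no salvage value. What is annual straight-line "
                     "depreciation?"),
        "reasoning": ("Step 1: Depreciable base = 10,000 - 0 = 10,000. "
                      "Step 2: Divide by useful life: 10,000 / 5 = 2,000."),
        "answer": "$2,000 per year",
    },
]

def build_prompt(examples, new_question):
    """Prepend worked examples (question, step-by-step reasoning, answer)
    to the new question, asking the model to reason in steps."""
    parts = []
    for ex in examples:
        parts.append(f"Q: {ex['question']}\n"
                     f"Reasoning: {ex['reasoning']}\n"
                     f"A: {ex['answer']}\n")
    parts.append(f"Q: {new_question}\nReasoning (think step by step):")
    return "\n".join(parts)

prompt = build_prompt(EXAMPLES, "What is the annual straight-line "
                                "depreciation for a $15,000 asset over "
                                "3 years with no salvage value?")
print(prompt)
```

The resulting string would then be sent to the model; combining several such examples (the "10-shot" part) with the explicit "think step by step" instruction is what the paper describes as chain-of-thought prompting.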

"The results of our study demonstrate that ChatGPT can perform sufficiently well to pass important accounting certifications. This calls into question the 'competitive advantage' of the human accountant relative to the machine," said the study's conclusion. "To our knowledge, for the first time, AI has performed as well as a majority of human accountants on a real-world accounting task. This raises important questions of how will machine and accountant work together in the future. We encourage research to help understand where machine and human abilities are best deployed in accounting. We also encourage research that develops and invents the capabilities for machines to perform greater amounts of accounting work—freeing accountants to innovate and add greater value to their organizations and society."

"I am amazed and excited by how fast this technology is changing,” Brigham Young University Professor of Accounting David Wood, one of the main authors of this paper, told Accounting Today. “So far, using ChatGPT in my own work has made me more productive and I enjoy it! It has allowed me to add creativity to my work and remove some of the mundane, boring parts of my job. The more I use this technology, the more I believe it is going to prove disruptive and change what we do as accountants and educators. My overall belief is that the changes will be positive, but I do think it will be a bumpy process implementing this technology into our work."