Galileo Disclosure Model
The Galileo Disclosure Model (GDM): reengineering business reporting by using new technology and a demand-driven process perspective to radically transform the reporting environment for the 21st century
Miklos A. Vasarhelyi, Rutgers University
Michael Alles, Rutgers University
DISCLAIMER: while this paper was developed under the work of the AICPA's Special Committee on Enhanced Business Reporting, it does not represent in any form the official position of the AICPA or its committee. It represents the opinions of its authors. NOTE: This is only a draft. Please do not cite or reproduce. All comments are welcome and should be sent to Miklos Vasarhelyi.
1. Overview: Problem Definition and Solution
1.1 The EBR Consortium and the Role of the Galileo Project
Since the collapse of Enron in 2001, followed by that of its auditor Arthur Andersen, there has been a series of accounting related scandals which, at the very least, raise serious concerns about the appropriateness of the current financial reporting system. In response, the AICPA has taken the initiative of re-thinking financial reporting by establishing the Special Committee for the Enhanced Business Reporting Model (EBRM), also called the Starr Committee after its chairman Michael Starr of Grant Thornton. This committee examined the proposals presented in the early nineties by another special committee, the Jenkins Committee. Despite the fact that its chairman, Ed Jenkins, subsequently headed the FASB, the Jenkins Committee recommendations were mainly not put into practice, one reason being that the late 1990s bull market made its concerns about the adequacy of GAAP seem redundant. By contrast, one of the underlying questions confronting the Starr Committee was an unanswerable one: whether the malfeasance crisis could have been avoided if the improvements to financial accounting and reporting suggested in that earlier report had been implemented.
While these questions are mostly speculative, the committee decided that the accounting profession by itself did not have the authority or the ability to create a new reporting model with its enormous societal consequences, and so in order to bring about substantive change it transformed itself in July 2004 into a broader consortium of stakeholders in the financial reporting process. The Enhanced Business Reporting Consortium (EBRC) describes itself as: "A Consortium of stakeholders collaborating to improve the quality, integrity, and transparency of information used for decision-making in a cost effective, time efficient manner." The Starr Committee, under its Public Company Task Force, produced a set of work products that serve as inputs to the EBRC. These work products were a set of sample reports that illustrate the kinds of enhanced disclosures that the consortium advocates as necessary and useful for complex organizations in today’s information economy, and which serve as a starting point for further discussion.
By design, the content of the first two sample reports is not especially “radical”. As Paul Herring, the chair of the Public Company Task Force, wrote during the process that created these reports: “Formats that follow outlines that are already in general use in the business information supply chain are likely to gain faster acceptance than those that are new… We will explore potential enhancements to the existing financial reporting format but will not consider wholesale re-structuring of the financial statements.” By contrast, the third sample report project, labeled the “Galileo project”, was the one that was meant to be far-reaching in nature. The Starr Committee examined extended financial reporting, that is, additions to the standard set of GAAP-based accounting reports, with the explicit understanding that while those reports no longer adequately serve stakeholders, the committee itself was not in a position to change GAAP.
This incremental approach may perhaps make sense in terms of change management, but it can also constrain the possible changes to the reporting model that are made available to the consortium to discuss. Thus the Galileo project serves as a remedy to the cautious approach, by being the medium to consider “extreme accounting”, including both supplements to standard reports as well as possible changes or modifications to GAAP itself. Further, while all the sample reports make some use of technology to transform the way in which financial information is presented, Galileo goes further in examining how the IT infrastructure of today’s digital economy can also fundamentally transform the process of obtaining and preparing, as well as communicating, financial information.
1.2 Back to Basics
Financial reporting would not be needed if all stakeholders in the firm shared the same information about how the firm has performed in the past and had similar expectations as to how it will perform in the future. Furthermore, this shared information would have to be correct and representative of the actual business conditions of the firm. In reality, those within the firm are inevitably in a better position to know its state than those stakeholders outside of it. Moreover, the former are not just informationally advantaged, but as managers they can actually shape the firm’s future performance. This is the fundamental informational asymmetry that bedevils financial reporting, a reflection of the conflict of interest between shareholders, who only care about the financial performance of the firm as reflected in its market price, and managers, who can directly benefit from exploiting the firm’s assets.
These informational asymmetry and moral hazard issues add the possibility of deliberately distorted reporting to the already formidable problem of measuring firm performance even in a non-strategic setting. Moreover, measuring past firm performance is largely a means towards the end of forecasting future performance, for it is only the future and not the past that affects firm valuation. Clearly managers can affect the degree to which past performance predicts future performance, thus affecting the value of financial reporting.
Adding to these measurement problems are changes in the way in which firms transform capital into returns. Once, the main function of the firm was to apply unskilled labor to physical assets, so that reporting which concentrated on the disposition of those physical assets adequately captured firm performance. Indeed, even accuracy in measuring assets could be sacrificed for other goals, such as verifiability through the doctrine of conservatism, without greatly reducing the usefulness of the reports. But today firms create value through such intangible assets as knowledge and the skills of their workers, with the result that the relationship between physical assets and performance is greatly diminished. This creates two problems: a pure measurement issue of how to account for the presence and role of intangibles, and a strategic measurement problem in that this broken relationship opens up a wider scope for managers to manipulate earnings.
An example of these problems comes from a sizable accounting transaction: the decision by Cisco Systems, in May 2001, to write down its inventory by $2.25 billion, an amount larger than the inventory value on its books. One explanation is that the write-down related to the value of inventories that could not be sold by its suppliers in the value chain, toward whom Cisco had a contractual or moral obligation. In particular, during the e-commerce boom Cisco had offered vendor financing to many dot com firms in exchange for sales contracts, while signing contracts itself with downstream suppliers in anticipation of tight demand. These obligations were not reflected anywhere in the financial reports. Of course, even granting these problems, there was also the suspicion that the sheer magnitude of the write-off resulted from the use of the well known tactic of the “big bath”, in which all the bad news is anticipated in advance, all at once, thereby creating reserves to boost income in the future.
This example, and the difficulty in disentangling its purpose, are indicative of the difficulty that users face today with financial reports. In fact, the underlying accounting fails to account for the way in which the modern firm operates and for the intangible factors which underlie value creation or destruction. Moreover, managers are able to take advantage of the resulting ambiguity to act in their own best interest and not necessarily that of the firm or other stakeholders. Most importantly, this is not an example of outright fraud or audit failure, but rather of what is arguably a far more compelling problem: the systematic inability of the current financial reporting system to meet the needs of users, to convey the ways in which complex organizations perform, and to hold managers accountable.
This example also undermines one of the arguments in support of the current financial reporting system and against changes to that system: the need to maintain comparability and consistency across firms in the ways in which they account. But even strict rules, such as those that apply to inventory valuation or special purpose entities, are no guarantee that firms will apply those rules in the same way, given the underlying ambiguity about what is being measured. This is really an argument for more information disclosure to enable stakeholders to better discern the purpose and meaning of specific business activities.
Other examples of the difficulties posed by the existing financial reporting system are reflected in many of the recent scandals, as the prosecution of the perpetrators did not deal directly with the core malfeasance issues but attacked more peripheral facts. Thus,
* Arthur Andersen was not convicted for performing bad audits but of destroying evidence.
* Martha Stewart was not convicted for trading on insider information, but for lying to federal investigators.
* Dennis Kozlowski of Tyco will likely be convicted for not paying sales taxes in the state of New York, not for plundering the treasury of his company through lavish self-awarded benefits that were “approved” by a deceased director.
The press attributes these aberrations to the hesitation on the part of prosecutors to discuss a set of “arcane accounting laws” in a court of law where jurors, lawyers and judges will have great difficulty comprehending the issues presented by armies of highly paid attorneys who, in collaboration with expert witnesses, point out the ambiguities of the law and explore the “beyond reasonable doubt” concept.
It is also striking that the parties that have been involved in many of these cases are stalwart institutions which help define the nation’s economic environment. Take for example the case of Enron, which had over 600 CPAs on its payroll, hired McKinsey for strategic advice and Arthur Andersen for audit and consulting services, and worked with Citibank, Merrill Lynch, and JP Morgan for structuring and supporting its financial operations. These firms, the best and the brightest in the business, helped Enron stretch the boundaries of accounting in order to manage its earnings. These financial institutions had entire groups devoted to “structured transactions”[1] whose main purpose was to disguise the nature of the financial transactions of Enron within the “arcane set of rules” of accounting, rules that they expected never to be revealed to the world and, in case of litigation, expected the prosecutors to avoid.
Enron also had an intricate web of additional financial relationships with its directors, who advised it on many issues while handsomely profiting from their relationship. These directors were also stalwarts of society and most likely were aware of the aggressive nature of Enron’s accounting even if they were not cognizant of the criminal profiteering of some of its top managers. Ex-regulators, leading academics and well known international figures were compensated by being on Enron’s board as well as by providing other services as external consultants. The fundamental problem is that the highly complex nature of Enron’s transactions would have been very difficult to detect by even the most committed and best trained external director.
The need for drastic change in financial reporting has been recognized by many. Arthur Levitt[2], the former chair of the SEC, commenting on Senator Carl Levin, makes a very damning statement:
… well before the Enron disaster, he saw the fiction that corporate financial statements had become: companies technically were in compliance with accounting rules, yet their financial statements were hiding huge debts and other liabilities. (p 243, emphasis added)
What is needed to update the financial reporting system to deal with this kind of complexity? The rest of this paper discusses the options in detail. Here we present the main issues and principles of a new financial reporting process.
1.3 Rethinking the Role of the Standard Financial Statements
The current financial reporting system is centered on the annual income statement and balance sheet as prepared and distributed by the firm. They serve as summary measures of the state of the firm and its performance. Such summarization and condensation inevitably results in a loss of information which cannot be in the best interest of users unless the measure perfectly captures future firm value, or the costs of more detailed information exceed its benefits to users.
Given that the former is an unlikely prospect, the rationale for the current system of disclosure is predicated on the basis that: a) users are assumed to be unsophisticated (the “widows and orphans” mentioned at the time the 1933 and 1934 Acts were passed) and incapable of processing more disaggregate information for themselves, and b) it is costly to prepare and report information on a more timely basis.
These conditions speak more of the 19th century beginnings of financial reporting than they do of the circumstances in which financial markets operate today. First, technology enables the firm to manipulate data at low cost, meaning that there is no longer a compelling reason to restrict information disclosures to an annual basis. Second, the purpose of financial reporting has shifted from its original stewardship function toward valuation and comparative evaluation, which necessitates a broader, future oriented set of information. As these statements have proven to be insufficient for the needs of more sophisticated users, they have been expanded periodically in response to demand or the latest scandal, in a largely haphazard fashion. In some cases, the statements themselves have been reconfigured (for example, to allow mark to market accounting to reduce the dependence on historical cost) or else additional information has been provided outside the statements, as through the use of footnotes. But the centrality of the two primary statements has been retained, along with their underlying assumption that it is important to restrict the scope of information provided to users in order to avoid overwhelming them (akin to the recent proposals for a condensed and simplified version of mutual fund prospectuses). The end result is a highly aggregate, episodic flow of information from the firm in which a small set of standardized information attempts to satisfy the widely varying needs of users.
This approach implies that auditing is also centered on the mandated financial statements. Thus auditing is also episodic and focused largely on whether the firm has correctly condensed and aggregated its information into those statements (which is what “prepared in accordance with GAAP” literally means). Validating information on a more concurrent basis is held to be outside the scope of the external auditor and assigned to the internal auditors instead. But it has also become steadily apparent that the mandated statements cannot be considered independently of the underlying data of the firm and the firm’s accounting and control infrastructure that gives rise to that data and records, manipulates and aggregates it. Thus, as with financial reporting, auditing has been periodically expanded, albeit also in a largely haphazard fashion, first to encompass general examination of controls, and with the passage of Section 404 of the Sarbanes/Oxley Act, to a detailed attestation of financial reporting controls.
With the financial reporting environment almost exclusively focused on the income statement and the balance sheet, it is not surprising that the financial markets have also tended to view a firm largely through the prism of those documents. In the extreme, this can lead to forms of functional fixation, where form can seem more important than content, as when information in the statements themselves dominates the market’s reaction even when information in footnotes modifies or contradicts it. In turn, firms expend vast resources in fighting accounting changes that impact the income statement even if that same information is presented elsewhere and could be readily used to recalculate the reported numbers, as in the current debate over stock option expensing.
The continuing fascination with reported net income is not, however, due to the lack of sophistication of market participants. Financial markets today include professionals who are not only capable of handling highly disaggregate financial data and forming their own conclusions about it, but actively do so. Thus some analysts simply discard the financial statements issued by firms in favor of extracting specific information from them and inserting it, along with other external information, into their own models of firm performance. However, there are some constraints, including: a) the focus of the financial reporting system on the mandated statements leaves them with few other options on which to base their analysis; b) flowing from that, the lack of other instruments of communication leads firm managers to use those statements to signal information, requiring a continuing focus on the form of those statements, independent of their content; and of course, c) the assurance that is attached to those statements alone requires that they receive disproportionate weight, again regardless of their information value. The lack of other audited information has also resulted in auditors becoming insurers of last resort, as users who are forced to view the firm through those statements come to see the auditors as gatekeepers for the firm, and so hold them responsible not only for the preparation of those statements, but also for their content.
If the financial reporting system were being built from scratch today, it would likely be aimed at the needs of these very sophisticated users rather than the “widows and orphans” type of investors predominant at the time of the 1930s acts. In particular, there exists today a large group of financial intermediaries that work on behalf of these unsophisticated users, or who interpret information for them (for example, mutual fund managers and financial analysts), so that there is no real need for these investors to personally assimilate financial information, obviating the need to pitch financial information at the lowest common denominator.
A reengineered financial reporting system would be predicated on two underlying assumptions: first, that technology has reduced the cost of preparing and reporting financial information in much finer detail and on a more timely basis; and second, that some very important users are much more sophisticated and capable of forming their own metrics for firm performance, rather than having to depend on the condensed and aggregate annual statements issued by the firm.
These two assumptions have to be applied against the value chain of financial information, which extends from the raw data of the firm at one end to sophisticated users at the other. Part of this chain takes place within the firm and part of it is external to the firm, with a handover of financial statements taking place at the boundary between the firm and its constituents. As the forces affecting the supply and demand of financial information have changed, it is surely time to ask whether the location of that boundary point is still appropriate. So the question becomes whether the firm should aggregate and condense information to such an extent before releasing it, or whether users can be assumed to be sophisticated enough to perform these functions on their own.
That is not to say that firms will not prepare income statements and balance sheets. After all, they already do so for their own internal management purposes. But there is no reason why users should be restricted to that one, perhaps self-serving and highly restrictive, method of aggregation when users can be allowed to see how that report was created and either accept it as is, or else use the underlying data as they see fit. Reducing the single-minded emphasis on just the income statement and balance sheet would not only increase the information content in the marketplace about a firm, but would also reduce the likelihood of functional fixation, since it would be clear that valuation is meant to be based on a broad set of information.
Questions that have to be examined are a) the degree of aggregation that will take place given the needs of users and the concerns of the firm about revealing competitive data, b) how much pre-processing of information will be undertaken before information is released and who is in the best position to do that processing, and c) how much validation will be provided with the information and who will provide that assurance.
These three are not independent issues, since aggregation is a form of information processing in which a great deal of information is lost. It also allows those who have access to the raw information (i.e. the managers) to shape the degree and form of condensation that suits their interests best. At present, managers, constrained only by their ability to get their interpretation of GAAP past the auditor, direct their energies towards making one metric of firm performance, earnings per share, as favorable for them as possible. Reducing the degree of pre-processing and aggregation of information by the firm would presumably also reduce the ability of firm managers to manipulate that information.
Technology can be an effective tool in providing a richer flow of information to users, with tagging, as in XBRL, being a particularly promising technology. Tagging is particularly important because it makes information content independent of its presentation, thus reducing the tendency for functional fixation. Ultimately the latency between economic activity and reporting can be reduced, in order to bring the reporting frequency more closely in line with the dynamics of the business and the needs of users.
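To make the tagging point concrete, the sketch below shows how a tagged fact can be extracted from a miniature XBRL-style instance document independently of any rendered presentation. This is an illustrative sketch only: the element names and namespace are invented stand-ins, not tags from an actual XBRL taxonomy, and a real filing would also reference taxonomy schemas and linkbases.

```python
import xml.etree.ElementTree as ET

# A miniature, invented XBRL-style instance. Real filings use a published
# taxonomy; these element names and the namespace are illustrative only.
instance = """<xbrl xmlns:co="http://example.com/taxonomy">
  <co:Revenues contextRef="FY2004" unitRef="USD" decimals="-3">22045000</co:Revenues>
  <co:NetIncomeLoss contextRef="FY2004" unitRef="USD" decimals="-3">4401000</co:NetIncomeLoss>
</xbrl>"""

root = ET.fromstring(instance)

# Because each fact is tagged, it can be pulled out by its tag name,
# independently of how any particular report happens to display it.
facts = {child.tag.split("}")[1]: int(child.text) for child in root}
print(facts)  # {'Revenues': 22045000, 'NetIncomeLoss': 4401000}
```

The same facts could be rendered as a glossy annual report, a spreadsheet, or an analyst's model input; the tagged content is identical in all three, which is the sense in which tagging separates content from presentation.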
A reengineered financial reporting system will also, of course, impact the role of auditing. With more information being issued more frequently, auditing will have to move away from an annual focus towards a more continuous auditing model. Moreover, with more disaggregate information being reported, auditing will also shift its emphasis away from verifying the way in which the firm aggregates and condenses its data, towards more data-level assurance. The degree of verification which users will demand from the broader set of data they receive will determine the extent to which data is actively audited, as opposed to being assured passively, for example, by threat of criminal liability or civil litigation.
1.4 The Role of Technology
In the new accounting environment, the firm’s databases and ERP systems will play the same role the general ledger did in the old manual reporting world, with the difference being that the reporting system will require a monitoring and control layer, probably including a continuous assurance component, which will evolve from the systems being implemented for Sarbanes/Oxley 404 certification. It is likely that firms will progressively implement such monitoring layers for their own internal management purposes, the output of which could then be adapted for external reporting. Indeed, this would have the advantage of letting the market assess the adequacy of the firm’s control systems. On the other hand, it can be argued that the reporting system will depend on the IT decisions of individual firms and so it is not clear what would compel a firm to implement the particular monitoring layer that is desired by users. In other words, the more sophisticated the infrastructure underlying the reporting system, the more difficult it will be to obtain cross sectional consistency, at least in the absence of regulation, which is unlikely in this context. This fact may constrain how extreme the new reporting systems can be, given their reliance on technology.
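As a concrete illustration of what a monitoring and control layer might look like at its very simplest, the sketch below flags transactions that breach a control rule as they stream through. The rule, threshold, and data structures are hypothetical illustrations, not part of any standard or of an actual Sarbanes/Oxley 404 implementation.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    txn_id: str
    account: str
    amount: float

def monitor(stream, approval_limit=50_000.0):
    """Yield an alert for each transaction breaching a simple control rule.

    The single rule here (an approval limit on transaction size) is a
    hypothetical stand-in for the richer rule sets a real monitoring
    layer would apply continuously.
    """
    for txn in stream:
        if abs(txn.amount) > approval_limit:
            yield {"txn": txn.txn_id, "rule": "approval_limit", "amount": txn.amount}

stream = [
    Transaction("T1", "inventory", 12_500.0),
    Transaction("T2", "receivables", 75_000.0),
    Transaction("T3", "inventory", -60_000.0),
]
alerts = list(monitor(stream))
print([a["txn"] for a in alerts])  # ['T2', 'T3']
```

The design point is that such a layer sits on top of the firm's databases and ERP systems and runs continuously, so its output could in principle feed both internal management and external reporting.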
1.5 Paper Outline
Having defined the problem facing the financial reporting system and outlined the drivers of the proposed solution we now turn to an in-depth examination of the issues raised in this introduction. This examination involves details of the specific problem areas in accounting today and then a look at the changes impacting the business environment, and especially the technological infrastructure of the firm that both undermines existing reporting systems and provides the foundation for the creation of a new and more effective system.
[1] An October 8, 2004 New York Times article by Eric Dash entitled “Parmalat Files Another Suit Naming Bank of America” relates a lawsuit by Parmalat against Bank of America, stating that “It charges that between December 1997 and December 2001 Bank of America helped certain Parmalat senior managers structure and execute ‘a series of complex, mostly off-balance-sheet transactions that were deliberately designed to conceal Parmalat's insolvency.’ … Meanwhile, the bank and its executives collected tens of millions of dollars in interest, improper payments and transaction structuring fees…” The article describes seven examples of what Parmalat claims were fraudulent and highly lucrative transactions that Bank of America managers arranged for Parmalat subsidiaries in Venezuela, Brazil, Chile and South Africa. “In some cases, what appeared to be conventional loans from Bank of America were in reality intracompany transfers or loans from other Parmalat entities,” the complaint said.
“In other cases, what appeared to be conventional debt offerings to third-party investors, supposedly underwritten by Bank of America, were in reality loans to other undisclosed Parmalat entities,” the complaint said.
The complaint said investors were intentionally misled into believing that Bank of America was standing behind Parmalat's creditworthiness when the bank's activities really suggested that it was doing all it could to reduce its risk. In some cases, the complaint said, it established secret loan guarantees and side-letter agreements so it had no risk at all.
[2] Levitt, Arthur, Take on the Street, Pantheon Books, New York, 2002.
2. An Evolving Scenario
2.1 The Intractable Problems of the Current Accounting and Reporting System
2.1.1 Consolidation
Over the last two decades the standard setters have struggled with representing businesses with multiple segments that are not fully integrated, produce different products, and operate in different geographies with differing currencies and methods of accounting. Large companies that have heavy industrial and financial components (e.g. GE and GM) blend very distinct types of numbers into one measure, which can tend to obscure financial performance rather than make it more transparent. The perennial problem of determining whether two entities are one and need to be consolidated, or whether they are different entities, has been exacerbated by the development of Special Purpose Entities (SPEs). These were originally entities with a very specific and narrow purpose, structured so that even 97% ownership did not create any co-dependency for the firm. This original definition was followed by much wider usage by many businesses, extending to the extreme abuse that Enron demonstrated. While statistics on the existence and nature of SPEs are not available, they are much more widely used than generally understood and are applied by many of the most reputable organizations in the financial markets.
The evolution of organizations in the 21st century will lead to substantial deconstruction of the business[1], in which internetworking technology will allow many functions to be outsourced, partnered, or turned over to the competition. While outsourcing can be a straightforward arrangement when implemented by a single independent firm, many forms of outsourcing will exist. Simply adding the component entities in a consolidation creates a very false sense of reality, for these relationships are often more than their simple formalization. The core issues are ownership, inter- and intra-entity transactions, obligations for residuals, and commitments over time even if not contractual. One of the reasons that the solution of the consolidation problem has eluded standard setting is that one of the motivations for consolidation is the obfuscation of individual unit performance. Consequently, standard setters never had the stomach to force substantial disclosure at the business unit level, nor the desire to force narrow business units of standardized form and standardized activity to report with the same accuracy and detail required from the consolidated entity. While the Jenkins report strongly suggested narrow and complete reporting at the line of business level, the changes effected by the FASB were limited and did not satisfy the real need that is emerging in the 21st century: creating dynamic standards and industry benchmarks for online real-time business monitoring. Comparisons among organizations should be at the sector level, not at the aggregate level where the addition of non-similar parts creates substantive obfuscation. A new type of aggregate entity should be invented and enough disclosure detail provided to allow for income calculation and asset allocation across and along the value chain.
2.1.2 Intangibles
The recent literature has started to pay increasing attention to intangible items. As discussed earlier, the new business measurement model must take into consideration a much wider set of assets, such as intellectual property, human resources, brands, marketing investments, and reputation. However, many of these items should not be added to the traditional total assets figure, as their addition could result in a very misleading total. While we can estimate the value of cash fairly easily and with a small expected variance, the value of, say, inventory or property carries a much larger variance, and the intangibles items of the balance sheet come with a severe lack of reliability. Figure 1 illustrates the different levels of precision of the different items in the balance sheet and supports the argument that these should not be added together, as this would lead to a total assets figure that is likely to be inaccurate.
Figure 1: Relative precision of balance sheet items
New types of reports, which aggregate data with similar levels of accuracy, must be created. Research is needed to help reporting agencies develop methodologies that adequately incorporate heterogeneous reliability measures and specify their components. Many of these “intangible” items, some of which are currently disclosed in the balance sheet and others merely described in the body of a financial statement or in a report to agencies that are not financial in nature, must not only be disclosed but should also be presented in some form of comparable metric. In this work the idea of POCs (points of comparison) is suggested for these non-financial variables. These POCs will serve as the basis for disclosed relationships that link the financial and non-financial variables of different companies.
Process or variable | Metric / point of comparison
Human resources | Pension retirement matrix; summary of training and investment in HR
Brand | Brand value assessment and method
Intellectual capital | Number of patents granted and applied for; R&D expenses; external valuation of IP; method of valuation and estimates
Marketing | Market share; industry ranking
Table 1: Points of Comparison
The creation of metrics that describe non-financial variables is fraught with concerns and potential inconsistencies. As can be seen above, many of the measures are estimates of a very soft nature and will share the same problems that current financial estimates possess. Intangibles will have different measurement and valuation bases, and an entire non-financial GAAP must be developed for their disclosure.
2.1.3 Materiality
The accounting profession has struggled for years with the concept of materiality. The audit opinion states that financial statements “fairly represent” the financial health of an organization. The materiality threshold is, in engineering jargon, an indication of “allowable error in measurement.” Current audit practice relative to materiality has been in place for three decades. It represents a compromise between the cost of audit investigation of manual records and the benefit to stockholders of that investigation. Information technology has changed dramatically, allowing for cheaper and more effective controls and investigation and unbalancing this archaic compromise. While the tradeoff between the accuracy of measurement and the cost of assertion remains real, the break-even point has shifted in reality but not in practice.
A likely and desirable change, leveraging technological progress, is a reorientation of the audit toward improving data quality at the client (Vasarhelyi and Cohen, 2005[2]) and providing a variable level of assertion depending on the process asserted. Clients and auditors would agree on the assertion needed for different processes, subject to minimal requirements set by statute. Business entities with real needs for data quality and validation would decide where the optimum tradeoff lies and pay accordingly. This would create a much larger economic threshold for assurance services, as companies already pay considerable attention to data quality. In the future world of a universal data bus, balkanized information transferred among interoperable Web Services will create even larger concerns for data quality.
While the concept of the financial statement audit will continue for a while, a new set of assurance types will emerge in which auditors, or other assurors, place an imprimatur on data at the tag level. This imprimatur can be at the data accuracy level (this data is 98% correct) or at the process level, where the effective controls acting on the data would be either listed or rated. Obviously these two approaches dovetail and can be used simultaneously. Furthermore, it must not be forgotten that a wider set of assurance services may emerge, with classes such as wider audits, intervening audits, ubiquitous audits, control rating audits, causal audits, etc.
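The tag-level imprimatur described above can be illustrated with a small sketch; the element and attribute names here (`accuracy`, `controlRating`) are hypothetical and not drawn from any actual XBRL taxonomy:

```python
from xml.etree.ElementTree import Element, tostring

def tag_with_assurance(name, value, accuracy, control_rating):
    """Attach hypothetical data-level assurance attributes to a reported item."""
    el = Element(name, {
        "value": str(value),
        "accuracy": f"{accuracy:.0%}",    # e.g. "this data is 98% correct"
        "controlRating": control_rating,  # rating of the controls acting on the data
    })
    return el

cash = tag_with_assurance("CashAndEquivalents", 20_000_000, 0.98, "effective")
print(tostring(cash).decode())
```

The same element could carry both attributes at once, which is the dovetailing of the data-accuracy and process-level approaches noted above.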
As continuous audit techniques become more prevalent, the entire economics of auditing and financial report preparation will change. With the cost of automatic procedures becoming negligible in an ERP environment, so will the cost of conducting analytic procedures on a real-time basis. The tradeoff between sampling and full population testing will shift, akin to the change in the materiality threshold. More generally, the evolution and ubiquity of ERPs will fundamentally lower the costs of compliance and reporting. The basic cost of preparing a report that obeys a particular auditing/accounting standard will become slight, as it is prepared by the ERP provider and pulled out as a standard product. Setup costs, however, may vary among installations, as the basic data for a new requirement may not be available.
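The shifting tradeoff between sampling and full population testing can be sketched with a toy ledger; the audit rule and the seeded violations are purely illustrative:

```python
import random

# Synthetic ledger: 100,000 transactions, a handful seeded with errors.
random.seed(7)
ledger = [{"id": i, "amount": random.uniform(10, 5000)} for i in range(100_000)]
for i in random.sample(range(100_000), 5):
    ledger[i]["amount"] = -ledger[i]["amount"]  # seeded control violations

def violates(txn):
    return txn["amount"] <= 0  # a trivial illustrative audit rule

# Traditional approach: test a small sample and extrapolate.
sample = random.sample(ledger, 200)
sample_hits = [t for t in sample if violates(t)]

# Automated approach: with negligible marginal cost, test the full population.
full_hits = [t for t in ledger if violates(t)]

print(len(sample_hits), len(full_hits))  # the full scan finds all 5 seeded errors
```

With only 5 anomalies in 100,000 records, a 200-item sample will usually miss every one, while the full-population scan finds them all at essentially no extra cost in an automated environment.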
2.1.4 Stale, erroneous, and opaque information
Annual reports have turned into major tools of public relations; currently, the idea of simply publishing the “standard packaged in the ERPS” report is unthinkable. Annual reports are tools of “spin.” While this is not a palatable thought for many, it is clear that the future is one of ever more regulatory compliance and, consequently, of organizations provisioning substantively more information. The spin mentality must give way to multi-dimensional, realistic reporting drawn directly from corporate systems and deposited or delivered to users without expensive (PR) manipulation. Specific reputational penalties must ensue from issuing stale, erroneous, and opaque information. Today’s paranoid concern over breaches of competitive intelligence, where competitors discover important economic facts about the reporting business, must give way to a more cooperative attitude toward data, in which both society and the corporation benefit from the existence of comparison benchmarks across the many facets of business. Just as entire sectors today cooperate in developing XML derivative standards to create interoperability between applications and data transitivity in the value chain, these sectors must cooperate in developing disclosure standards that can be compared and used for industry benchmarking. Competitiveness has to be preserved by fast, ever-improving processes, timely research, and aggressive data sharing, not by self-serving, paranoid opacity that slows the progress of science and interferes with the natural economic process of optimal capital allocation.
2.1.5 The specification of contractual terms in the measurement model
One clear shortcoming of today’s reporting model is its focus on realized operations and its neglect of a large set of tacit and contractual obligations that often determine much of future economic activity. Organizations, their clients, their business partners, and their suppliers are linked by a network of contracts, both formal and informal. Many of these contracts represent larger liabilities for future operations than most reportable events. For example:
* a power utility may have a fuel supply contract priced 10% over the current market price for the next 10 years
* a business concern may outsource most of its supply chain and, as a result, may have consensual obligations even if these are not contractual
* a business concern may have “return” agreements with its clients for inventory that is obsolete or cannot be used or sold
* a company may have a long-term practice of supporting local and communal projects to enhance the environment
* a company may have many social welfare practices relative to its employees that cannot be stopped
* a company may have passive obligations for environmental cleanup that are not recognized
These types of instances, and the non-reported legal contingencies, are often much larger than the liabilities typically reported in annual reports under contingencies. Only a probability-based system of contingency reporting can provide a description that is useful and realistic in an information society. Where clear obligations (and benefits) are not determinable, a deeper standard of disclosure applies, and disclosures must be prepared covering items such as:
* legal, operational, and contractual contingencies
* management compensation contracts at a much deeper level (including a taxonomy of types of compensation)
* hyperlinks to fuzzy contracts or non-standard financially engineered contracts
* descriptions of corporate litigation
* descriptions of government investigations
* etc.
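A minimal sketch of the probability-based contingency reporting suggested above, with wholly illustrative probabilities and loss ranges, might look like this:

```python
# Hypothetical contingency register: each exposure carries an estimated
# probability and a disclosed loss range instead of a single number.
contingencies = [
    {"item": "environmental cleanup", "prob": 0.60, "low": 5e6,  "high": 20e6},
    {"item": "pending litigation",    "prob": 0.25, "low": 10e6, "high": 50e6},
    {"item": "fuel contract premium", "prob": 0.90, "low": 2e6,  "high": 4e6},
]

def expected_exposure(c):
    """Expected loss, using the midpoint of the disclosed range."""
    return c["prob"] * (c["low"] + c["high"]) / 2

total = sum(expected_exposure(c) for c in contingencies)
print(f"Expected contingent exposure: ${total:,.0f}")
```

The point of the sketch is that disclosing the probability and range, not just the aggregate, lets a user recompute the exposure under their own assumptions.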
2.1.6 Valuation
The accounting profession, due to a highly litigious environment and the inherent difficulties of probabilistic measurement, has resorted to the more confirmable but less valid forms of modified historical-cost reporting. Furthermore, with the increased consideration of non-financial measurements, where organizations try to assess the value of their workforce, their intellectual property, their sustainable resources, etc., the temptation is to fall back on the historical amounts invested in these items: for example, valuing an employee based on the company’s investment in his or her education, professional training, etc. This is one example of a very intractable set of problems. The standard to apply here is whether the information user will be better or worse served by being supplied verifiable (say, historical-cost-based) investment information rather than estimates that may be more indicative of future value but are less verifiable. If an estimate is used, will the information be more or less reliable than under the old method? And can a structure be developed that lets users download this data and perform their own analytics on it?
The modern world is developing a wide set of live markets whose by-product is the online, real-time valuation of many assets. Research is needed to understand how prevalent this type of information is and how expensive it is to harness. Clearly the new economy has troves of transaction prices, valuation prices, indices, price lists, and live exchange data available on a minute-by-minute basis. While the type of asset concentration changes substantially from sector to sector,[3] current values may exist for a substantive set of assets, and temporal estimates (say weekly or monthly) of value may exist for many others.
* Some assets are to be measured using some form of high-fluctuation, transaction-based values following real-time indices
* An account for valuation changes must be created that allows valuation changes not to flow through income
* Income flow-through should happen only when asset realization occurs, and this calculation should use some form of inflation adjustment
* Where appropriate, even future indices may be applied as long as the documentation is clear
* Just as we keep depreciation schedules today for major property items, the new model should have valuation schedules for, say, the 100 largest assets of the corporation
* The economics of information today are such that constant evaluations of asset values should be doable, disclosable without prejudice to competitiveness, and usable by the user’s analytic tools
* Present value of any future income flow, with allowance for best estimators (and their variance)
* Processes, the nature of the account, inter-process controls, and other lesser items determine the reliability of numbers at the transaction, reporting aggregate, and general ledger levels, among others
* Assurance/audit processes change these values on a continuous basis (real-time seal, alarms, control tickers, points of comparison)
2.1.7 Deterministic representation of stochastic phenomena
The litigious nature of American society has led to poor compromises in the disclosure of data. The profession, stung by criticism and litigation, has often set standards requiring single-number disclosure of stochastic assessments. The profession has not issued attestations stating that a particular financial statement is reliable at, say, the 95% level, has not allowed management earnings forecasts to be stated as ranges, and has not stated that most mineral reserves are of a certain value based on the commodity prices of the last 12 months. However, it is quite clear that statements of this type would be preferable for sophisticated users.
While many statistical estimates pervade annual reports (e.g. pensions, bad debt, etc.), these are stated in a deterministic format, emphasizing the basic weakness of traditional reporting. When the distance between the report and its underlying stochastic reality grows too large, the credibility of business reporting disappears. If the variances around the values of estimates are very large, the credibility of the point estimates is very small.
The new business reporting model will have to rely on a wide set of disclosed probabilistic assessments for past results, current actions and future estimates. It is better to be about right than exactly wrong.
Figure 2 proposes a set of probabilistically oriented reports whereby all items in the traditional statements (BS, IS, & FF) are reported as point estimates with a variance measure. For example: the corporate cash level at 12/31/xx was 20m plus or minus 5%; our best estimate for the value of inventories is 60m plus or minus 15%; our current estimate for PP&E is 71m plus or minus 25%; and the intangibles in our balance sheet, originated by the merger with ABC Corporation, are 75m plus or minus 100%. Each of these numbers is composed of numbers from each division, and each of those numbers has its own intrinsic variance. While the public today glazes over at the thought of point estimates and variances, a targeted educational effort could help significantly, and the investing public can ultimately use the point estimate as a deterministic number if so desired.
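As a rough illustration of how such point estimates and variances might aggregate, the sketch below treats each plus-or-minus figure from the example above as one standard deviation and assumes independence across items (both strong assumptions made only for the sketch):

```python
import math

# Balance sheet items as (point estimate, relative uncertainty), echoing the
# illustrative figures in the text. The +/- is read as one standard deviation
# and items are assumed independent; both are simplifying assumptions.
items = {
    "cash":        (20e6, 0.05),
    "inventory":   (60e6, 0.15),
    "ppe":         (71e6, 0.25),
    "intangibles": (75e6, 1.00),
}

total = sum(value for value, _ in items.values())
# Under independence, standard deviations combine in quadrature.
sd_total = math.sqrt(sum((value * rel) ** 2 for value, rel in items.values()))

print(f"Total assets: {total / 1e6:.0f}m +/- {sd_total / total:.0%}")
```

Running the sketch shows the aggregate uncertainty (roughly a third of the total) dominated almost entirely by the intangibles term, which is the earlier Figure 1 argument in quantitative form: adding items of very different reliability yields a total whose precision is set by its least reliable component.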
Extending the reporting range to non-financial variables, key numbers in financial and non-financial units would describe non-financial items and hyperlink to the bases of the estimates, using point-of-comparison indices developed by the specific industry and relevant to the particular line of business. Probability-based quality control scorecards would complement this picture.
Figure 2: Probabilistic reporting
2.1.8 The disclosure of predictive information
Congressional hearings during the malfeasance crisis demonstrated deep skepticism about earnings projections by management. However, management is clearly best positioned to provide predictions of company performance, so the issue should be how to present and constrain this information to avoid spin and self-serving stock manipulation. If managers hold stock options or stock holdings that become available within a short period of time, they can overstate earnings to create a spike in valuation until actual results come in. This supports the argument that a new process and new requirements for predictive information must be developed.
Figure 3 breaks down information relative to its time frame. Future-oriented information there focuses on 1) leading indicators and basic relationships and 2) forecasting and models. Consequently, the emphasis is both on specific numbers and on the structure that drives these numbers.
Figure 3: Time frames of reporting
2.1.9 Semantic versus quantitative description of accounting phenomena
A company’s annual report contains traditional financial statements, footnotes, and a wide set of textual materials. An entire information intermediation industry has emerged to extract, standardize, and organize this information for the final user. Large companies can acquire S&P’s Compustat, which contains the financial data of most public US companies normalized for use. The emergence of the XBRL standard may facilitate the utilization of data and comparisons among companies at a more democratic level, where individual users with an Excel add-on can harvest the information themselves without any data transformation. However, most information contained in an annual report or an SEC filing is not the formal information of the Balance Sheet, Income Statement, or Uses and Sources of Funds; it includes footnotes, comparative history, and a wide array of soft information. New techniques will need to be developed to extract, categorize, and disseminate the qualitative information contained in financial statements.
Overcoming these deficiencies in the current reporting system requires taking advantage of radical changes in the technological basis of business and an equally important shift towards a process perspective of the firm.
2.2 The Real-time economy: The Technological Basis for Reengineered Business Reporting
The real-time economy requires dynamic adaptive models for its realization. The core objective of the real-time economy is the reduction of latency between and within processes. Latency reduction will cut capital occupancy costs by tying up assets (physical and labor) for less time. Technology now provides a public and common communication infrastructure, increasing information sensing for automatic measurement, and large integrative databases. The second major wave of Internet usage is beginning to take shape, and now that systems are inter-linked on a global and ubiquitous basis, the age of interoperable applications is emerging. Interoperability means that interacting applications do not need to be closely coupled but instead share common data specifications, which allows independent applications to work together without major adaptation.
The W3C has proposed XML (eXtensible Markup Language) as the tool for data standardization for interoperability.
Figure 4: Basic XML transaction
These information capsules, as described in Figure 4, will be routed through the value chain. For data transfer to be effective, it is essential that the data be self-explanatory and that the applications managing its use and transfer be able to understand its content ubiquitously. The XML derivative for managerial accounting is XBRL/GL (eXtensible Business Reporting Language / General Ledger), while the closely associated XBRL is focused on external business reporting.
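A minimal, self-describing transaction capsule of the kind Figure 4 sketches could be built as follows; the element names are illustrative and not drawn from the actual XBRL/GL taxonomy:

```python
from xml.etree.ElementTree import Element, SubElement, tostring

# A minimal self-describing transaction capsule; element names are
# illustrative, not an actual XBRL/GL taxonomy.
txn = Element("transaction")
SubElement(txn, "entry").text = "purchase"
SubElement(txn, "account").text = "5010-RawMaterials"
SubElement(txn, "amount", {"currency": "USD"}).text = "12500.00"
SubElement(txn, "timestamp").text = "2005-03-14T10:22:31Z"

# Because the capsule carries its own structure, any interoperable
# application along the value chain can parse it without being closely
# coupled to the sender.
print(tostring(txn).decode())
```

The design point is that the tags, not a pre-negotiated file layout, tell the receiving application what each value means.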
2.3 The Evolving Path for XBRL
While the adoption of standards for external business reporting is inevitable, this process is in essence a dynamic road.[4] The originally proposed structure will have to withstand the test of usage, while the standard itself changes over time to improve its usefulness. Most likely a series of problems will arise, including:
* Heterogeneous acceptance of the standard across countries and sectors
* Some regional differences in the interpretation of the standard
* Some features of the standard will become entangled in local legislation and practice, causing incompatibilities
* Some adoption will be statutory, some voluntary
* A Babel tower of taxonomies will emerge before some simplification and mapping occurs
* The expansion of the standard to less specific (semantic) regions of business reports will be slow and confusing; for example, the labeling of footnotes will evolve naturally
* After some positive standardization of balance sheet, income statement, and funds flow information, it is likely that there will be progressive agreement on key disclosure items and performance indicators, which will have specific tags. These agreements will, hopefully, be synchronized with the emergence of some consensus on the EBRM. Furthermore, key elements of common footnotes and other non-financial data will progressively be tagged with specificity.[5]
Financial intermediaries will be at the cusp of this evolution, adding structure to the evolving (and expanding) standard. For example, they will create databases of XBRL disclosures and integrate them with additional data sources to decrease transaction costs for users. They will also progressively incorporate the above-mentioned key disclosure items and performance indicators into these databases, saving users the work of data collection, manipulation, backward compatibility construction (creating time series), and model building. While the large financial shops will continue building their own models, smaller entities and committed investors will use templates provided by these financial intermediaries, substantially increasing the democratic nature of market information.
Furthermore, while the traditional domain of data will be expanded in search of transparency and made accessible and easily usable through XBRL, many sources of less traditional information will come into being. For example, the FDIC is formalizing the use of XBRL in the collection of call report data from banks, applying a large set of business rules during the collection process both to decrease the potential for errors and to allow analytic technologies to weed out fallacious reports.
US corporations are subject to many reporting regulations from bodies such as the FCC, PGC, NYSE, OSHA, etc. These regulators will eventually require reporting along some type of XBRL taxonomy, with substantive convergence toward common requirements in an attempt to decrease the compliance burden. Financial intermediaries will lead in the creation of these integrated databases and serve as a bridge toward common taxonomies and the creation of backwards-compatible data streams (going back and preparing data for periods prior to the regulatory requirements).
XBRL, as well as many of the other XML derivative standards, will create a much more fluid path for data exchange. Figure 5 displays the interchangeability of internal and external value chains and the free flow of transactions of different natures (say labor, materials, purchases, and services). These relationships, which are structural, can be modeled and controlled through the use of real-time adaptable relationship models (continuity equations). Companies will choose the processes in which they have a competitive advantage and will outsource (create alliances for, partner on) the ones in which they cannot provide improved margins. As a result, an entirely new set of data integrity and ownership concerns will emerge.
Figure 5: Data transfer chains
The flow of this data will allow a new form of automatic corporate reporting and management to evolve. The transactions flowing through the pipe will be constantly measured and accumulated to produce online, real-time balances of transaction flows that may or may not be disclosed[6] to the public. Figure 6 illustrates the arrival of three different types of transactions that are accumulated into continuous “income statement type” reports.
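The continuous accumulation of tagged transactions into "income statement type" balances can be sketched as follows; the transaction types and amounts are invented for illustration:

```python
from collections import defaultdict

# A simulated stream of tagged transactions of different natures.
stream = [
    {"type": "sale",     "amount": 1200.0},
    {"type": "labor",    "amount": -300.0},
    {"type": "material", "amount": -450.0},
    {"type": "sale",     "amount": 800.0},
]

balances = defaultdict(float)
for txn in stream:                # in practice this loop runs continuously
    balances[txn["type"]] += txn["amount"]
    balances["net"] += txn["amount"]

print(dict(balances))  # online, real-time balances of the transaction flows
```

Because every arriving transaction updates the balances immediately, there is no period-end close: the report is simply a read of the current accumulators.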
Ultimately, twenty-first-century reporting will focus on the monitoring and control layer, where measurements of corporate processes will be compared with process performance models to determine variances. If these variances turn out to be too large, some form of management reaction will be necessary. In a real-time society much of this comparison, and the management reaction that follows, will be automated through simple management bots (automated management actions), while unusual events will be relayed to human managers or to audit action. Transactions will be accumulated into detailed general ledger accounts (following the more detailed XBRL/GL taxonomy) and will be available for extraction in the company’s Enterprise Resource Planning system’s database. These extracts will typically be very numerous (on the order of tens of thousands of electronic reports) and standardized by the particular version of the ERPS. A small subset of these reports will be extracted and carefully staged to represent the corporation’s “official” business reports. These business reports will encompass the corporate Balance Sheet, Income Statement, and Funds Flow, which are currently easily tagged into XBRL, but will also serve as the basis of a wide cadre of footnotes, the body of the financial statement, and information releases to many different entities and stakeholders.
Further into the future, some degree of semantic processing technology, as well as the issuance of standards,[7] will allow progressively more business report content to be narrowly coded into XBRL tags. Consequently, an increase in transparency, such as clear comparability, will become possible in footnotes covering pensions, compensation, accounting policies, extraordinary events, contingencies, options and warrants, marketing plans, intellectual property assets, human resources deployed, intangible assets owned, etc.
Corporate management accounting is now the owner of a wide set of information. In the modern world, state-of-the-art companies already have a great deal of online, real-time information. For example, no bank could operate without its current daily financial balance closing, as it would be unable to apply funds overnight; no manufacturing concern could operate without real-time inventory information, as it would be unable to practice just-in-time manufacturing; and most companies would face great competitive difficulty if they did not have real-time payables and receivables information with which to collect or provide discounts based on time characteristics.
The monitoring and control process will eventually dominate corporate information processing, with many of its components automated as standards evolve to provide interoperability. The next two decades will witness the progressive development of management action algorithms using automatic (XML-derived) data standards to reduce the latency both of the processes themselves and of the transmission of data among processes. While current technology does not seem able to substantially accelerate the trucks and airplanes that deliver goods between locations, to shorten the lunch breaks of clerks, or to change the speed of consulting engagements, modeling and decision automation will dramatically accelerate management action and bureaucratic information processing.
Figure 6: The reporting layer
The business reporting cycle will also undergo substantial acceleration. Cisco and Motorola have recently announced their “virtual close.” This process brings the accounting close to a daily cycle and allows a substantial decrease in accounting adjustments and end-of-period earnings management. It will also increase the volatility of results, reflecting the realities of the real-life business process. While “continuous reporting” should be a process with no closings and a constant set of balances, the “virtual close” has approximated its timing. Although the technological foundations of real-time reporting are already available to many companies, business reporting conventions, legal liability, and management’s reticence toward accountability along many dimensions have effectively slowed the adoption process.
The W3C (World Wide Web Consortium) has proposed not only a Web infrastructure but also tools for Web development (SOAP) and the basic framework (XML) standards for data interchange. It has also formulated the philosophy of a progressive anonymity on the Web, where data flows through the universal data bus (the Internet) and applications can sniff it out and provide interoperable services. While this vision is still quite fuzzy, it can be visualized in many domains and is now the venue of many nascent commercial efforts.
An entire family of potential financial Web Services, covering the scope of many current services and prospective ones, is represented in Figure 7. While today accounting functions are performed inside software packages for large, medium, and small companies, in the future many expensively updated functions (such as locality and state tax tables) will be served by Web Services. It is easy to envisage depreciation services, asset valuation services, intangible valuation services, option valuation services, and transaction security and tracking services, among many others.
Today we already find many companies (say Boon) providing special reporting functions, for example for the SEC and for the FDIC. Many layers of special reporting are possible and will eventually evolve to support business.
In the assurance arena we currently have a major standoff due to the emergence of the PCAOB (Public Company Accounting Oversight Board) and the ensuing immobilization of the AICPA, the big accounting firms, and other related market players. However, the reality is that many different assurance needs are arising, some of which are being satisfied by the accounting profession while many others are either being ignored or are being addressed by other professions. The AICPA[8][9][10] has reacted by creating the WebTrust and SysTrust services, which have not yet developed substantial traction. Eventually, however, Web-based assurance services, more robust than the current WebTrust, Trust-E, etc., will emerge to support Web site trust, transaction trust, valuation trust, data trust, etc.
Three other high-potential financial services will involve analytic services (where the Web entity provides models for the lower and middle markets), fraud detection services (where transaction streams and balances are continuously scrutinized and compared with fraud profiles), and some form of data-level assurance where each datum carries tag(s) indicating its level of reliability, its path, and the reliability of its underlying control processes.
Figure 7: Web Services
2.4. Continuity Equations: The Conceptual Basis for Reengineered Business Reporting
In this economy business processes are measured on a continuous basis through different types of sensors that capture digital measurements of business metrics. These data are captured at a far finer granularity in time and detail than has ever been possible before.[11] Everything else, including the ability for more frequent reporting, is a by-product of this fundamental change in the capability of data capture. What that data stream makes possible is measurement with an unprecedented degree of correspondence to underlying business processes. Furthermore, the utilization of this data stream and its comparison with a new class of performance models that must be developed[12] will provide the basis for many automatic management decision models in which the slowest element of the process, the human being, is excluded by automation. Figure 8 describes a formalization of these processes of data capture, comparison standards, exception standards, and meta-processes for measurement, control, management, and assurance. Business processes, defined as “a set of logically related tasks performed to achieve a defined business outcome” (Davenport and Short, 1990), are considered today to be the fundamental atomic elements that make up a company.[13] Thus a company is now described by what it can do rather than by its assets. That changed mindset has yet to be incorporated into traditional management and its assurance. What is fundamental about the real-time economy is that it brings the process approach explicitly into management through the very prompt measurement of processes and the comparison of those metrics with dynamic benchmarks that represent prescribed levels of business performance.
Benchmarks that allow the comparison of business process metrics with a standard (or model) will assume much greater importance. The real-time economy discussed above, in which processes are constantly monitored and their measurements compared with benchmarks for control purposes, requires highly dynamic adaptive models that can adequately represent the normative values the metrics should assume. Furthermore, in addition to basic benchmarking for first harmonic data comparison, second harmonic variance is also necessary for control purposes. Figure 8 illustrates this issue, where processes are monitored and controlled by information systems, models, and management. When noteworthy exceptions occur, adjusting management actions are effected. Some of these exceptions are also of assurance interest and are alarmed for audit purposes and directed to the audit “control” system.
Figure 8: Meta-processes in measurement and assurance - data capture and control
The monitoring and control of an organization’s processes can be viewed as a five-level set of activities, as described in Figure 9. The structural level (processes) is measured, and metrics are extracted and captured for the data level. Data are stored at a high level of granularity, say, the basic transaction level. This data history may be examined under many distributions (cuts) such as time period, division, product, function, originator, etc.
The third level encompasses the relationships perceived or prescribed among metrics, against which the organization performs control functions. For example, all flows from one process that reach the next one would constitute a one-to-one relationship, and any differences would be exceptions. In general, to use metrics captured from level one in a control process it is necessary to have the measurement of the actual (metric), a model for comparison, and a model of variance (which specifies the acceptable variation). The control process will compare the metric with the model, calculate the variance, and then decide if the variance is acceptable. If not, an alarm is triggered that may call for management action and/or assurance. The models may range from very simple univariate levels to very complex multi-entity relationships such as continuity equations. Among the types of models in CA we find:
• A fixed number (normative or empirically derived)
• An adjusted number with some form of analytic related to seasonality, hierarchy, or structure relationship
The structure relationships can be represented by continuity equations and may represent:
1. Reconciliation structures
2. Semi deterministic relationships
3. Structures across processes
4. Empirical relationships across processes
5. Empirical relationships of a high level among KPIs
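The control comparison described above (a metric, a comparison model, and a variance model) can be sketched in a few lines of code. The following is a hypothetical illustration only; the function name, the fixed (normative) model value, and the tolerance are invented for the example, echoing the billing metrics discussed earlier.

```python
# Illustrative sketch of the level-three control process: a captured
# metric is compared against a model value, and the resulting variance
# is judged against a variance standard before an alarm is raised.
# All names and figures here are hypothetical.

def evaluate_metric(actual, model_value, variance_standard):
    """Compare a captured metric with its model; return (variance, alarm)."""
    variance = actual - model_value
    alarm = abs(variance) > variance_standard
    return variance, alarm

# Example: dollars billed compared with a fixed normative model of
# 100,000, with an acceptable variation of +/- 5,000.
variance, alarm = evaluate_metric(actual=93_500, model_value=100_000,
                                  variance_standard=5_000)
```

In a fuller implementation the fixed model value would be replaced by one of the model types listed above (a seasonally adjusted number, a continuity equation across processes, etc.), but the compare-then-alarm structure is the same.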
The fourth level is the level of analytic monitoring and links very high-level measures across processes. KPIs (key performance indicators) can be used to help understand process consistency as well as process performance. If measurements are not available at a lower level, this level serves to provide coarse alarms of major process difficulties.
The fifth level is a meta-process level where the actual control and monitoring functions are performed based on continuous measurement, monitoring and proactive exception handling.
Building on this model, the proposed solution is based on a view of a business in a real-time economy that would address some of the ailments discussed above and encompasses the following factors:
• Creation of a multivariate measurement model that does not focus exclusively on earnings per share and allows users to predict and evaluate a business’s performance on a multivariate basis even if these measurements are in different dimensions (apples and oranges)
• Creation of a measurement model that is oriented not only to investors but to other stakeholders of the business
• Creation of a measurement model that represents not only static measurements of the business but also the types of relationships that characterize the business. These relationships can be structural, relational, empirical or comparative in the form of sector benchmarks.
Figure 9: Galileo Enhanced Business Reporting Model
Based on an examination of the current reporting model (GAAP) under this framework, it can be concluded that a dynamic world cannot be well measured with static measurements, and that the technology exists for a more dynamic method of measurement to evolve. The disclosure model is very disjointed when the economic status of a firm has to be shown on a flat piece of paper and at very wide discrete intervals. Furthermore, while markets seem to value firms on a wide range of non-financial assets, the GAAP-based model focuses on financial items. It is also of concern that the measurement process focuses on the physical assets of companies, more typical of the industrial age, while the valuable assets of an information economy are neglected.
In an age where companies outsource many of their processes, where suppliers carry the inventories of many companies, and where RFID technology allows for the specific identification of inventories, parts and assets, we still use FIFO and LIFO inventory valuation methods.
In an age where dynamic markets value products every minute, we still rely on forms of historical cost as a substantive part of our business reports. In an age when it is well known that there is substantial leeway[14] [15] of interpretation in every number that determines an entity’s income, we still focus on earnings per share.
Another irony is that over the last couple of years, and presumably the next few, the FASB and the IASB have been focusing on the convergence of standards, converging toward a set of standards that is irremediably obsolete.
If the measurement model is seriously compromised, progressively mapping less and less onto reality, the provision of assurance on these numbers is useless and is performed only for statutory purposes. It is not surprising, therefore, that accounting firms have progressively relied more on cursory analytical reviews and acted more like insurers than auditors. If the measures do not measure, even the best of audits would merely assure bad numbers that do not mean anything. Most litigation against auditors happens in failure situations; bad measures do not detect these, so good or bad auditing does little to change the auditing firms’ risk profile. Under these conditions, any downturn will expose the underbelly of weak firms that have stretched their reporting to the limit, and their demise will punish CPA firms for purportedly “bad audits” of irrelevant numbers that had little representativeness of the firms’ economic health.
2.4.1 Levels & Basic Concepts
The Galileo enhanced business representation model in Figure 9 entails five levels: 1) the structural level, 2) the data level, 3) the relationship level, 4) the analytic monitoring level, and 5) the continuous reporting and assurance level. Furthermore, we define five main types of concepts[16]:
* Metrics – Metrics are defined as direct measurements of the system, drawn from reports, in the measurement stage. These metrics are compared against system standards. If a standard is exceeded, an alarm appears on the screen. For example, in the auditing of a billing system, the number of bills to be invoiced is extracted from a user report. The number of bills not issued due to a high severity error in the data is captured as well as the total dollar amount of bills issued. These three numbers are metrics that relate to the overall billing process.
* Analytics - Analytics are defined as functional (natural flow), logical (key interaction), and empirical (e.g. it has been observed that ....) relationships among metrics. Specific analytics, related to a particular system module, can be derived from auditor, management, or user experience, or from historical data from the system. Each analytic may have a minimum of three dimensions: 1) its algebraic structure, 2) the relationships and contingencies that determine its numeric value at different times and situations, and 3) rules-of-thumb or optimal rules on the magnitude and nature of variance that may be deemed “real variance” to the point of triggering alarms. For example, a billing analytic would state that dollars billed should be equal to invoices received, minus values of failed edits, plus (or minus) the change in the number of dollars in retained invoices. The threshold number of expected invoices for that particular day or week (allowing for seasonality) must be established to determine whether an alarm should be fired.
* Alarms – exception conditions in which a measure is compared with its standard and the ensuing variance is larger than the variance standard.
Actual experience with these issues indicates that several levels of alarms are desirable: 1) minor alarms dealing with the functioning of the auditing system, 2) low-level operational alarms to call the attention of operating management, 3) higher-level alarms to call the attention of the auditor and trigger “exception audits,” and 4) high-level alarms to warn auditing and top management of a serious crisis. Establishing these alarm thresholds is a second harmonic development. The data and experience needed to understand the phenomena being measured to the level of specifying alarm standards are probably not available in most organizations.
* Standards or models represent the ideal state-of-the-world in a particular process. Any monitoring process requires the comparison of a metric to a model or standard to determine abnormal conditions. Furthermore, the magnitude of this condition is evaluated by a “standard of variance” in the decision on whether an alarm should be activated. Models of variable behavior over time in real-time systems must be developed in a way that would represent real-time behavior of dynamic systems. The evolution of real time monitoring needs adaptive models that take into consideration: seasonality, business trends, relationships between processes, timing between the processes, and flow of anomalous but legitimate transactions process to process.
* Method of Measurement: the method of data capture and classification is an important variable in the future system representation scenario. Continuously captured data can drive monitoring processes to real-time exception measurement and alarming.
The CPAS (Continuous Process Audit System) captured data through report scraping from electronic reports (Vasarhelyi & Halper, 1991). Different monitoring processes are progressively capturing data in more direct manners, such as data sensing, queries to databases, or the utilization of intermediate data between batch processes (Hume, xxx).
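As an illustration of the adaptive standards called for above, the following sketch uses simple exponential smoothing so that the comparison standard tracks recent observations rather than remaining static. The smoothing factor and the invoice counts are assumed values for illustration, not drawn from any actual system.

```python
# Hypothetical adaptive benchmark: the expected value of a metric is
# updated as each new observation arrives, so the standard of
# comparison drifts with trend instead of staying fixed.

def update_benchmark(benchmark, observation, alpha=0.2):
    """Blend the prior benchmark with the newest observation
    (exponential smoothing with assumed smoothing factor alpha)."""
    return (1 - alpha) * benchmark + alpha * observation

benchmark = 1000.0                      # assumed initial expected daily invoices
for observed in [1020, 1050, 990, 1100]:
    benchmark = update_benchmark(benchmark, observed)
# benchmark now sits between the starting value and the latest observation
```

A production model would also fold in seasonality, inter-process timing, and the other factors listed above; this sketch shows only the core idea of a standard that adapts over time.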
At the most basic level, the structural level, a number of transactions are taking place in various areas of the business, with time lags between each (illustrated by the hourglass shapes). In the new real-time economy there is decreased latency between these processes, which makes it possible to achieve real-time or near real-time reporting; automation decreases the latency of processes by orders of magnitude. The structural level represents a set of non-financial and financial processes that are interlinked in the generic process of wealth creation. There are physical, logical and statistical relationships between the processes and between the different metrics of these processes. Figure 10 shows the lower-level process where intrinsic relationships exist. Marketing drives advertising, which drives sales. Once a sale is performed, part of the transactions (40%) generate immediate cash while the rest (60%) tend to become receivables, of which 60% are paid within 30 days, 20% within 60 days and 15% within 90+ days.
The remaining five percent of the receivables become bad debt. Figure 11 represents a three-period cash flow that results from these transactions.
Figure 10: sales to cash
Figure 11: Cash flow modeling
While these transactions can become complex, the effects are very measurable and their study can help create models that are structurally based. If sales are assumed constant, a Markov chain model can be used and the flow levels will converge to an ergodic (steady) state. However, the structural linkages are more complex, and the structural modeling can be extended to the ensuing boxes.
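The sales-to-cash structure of Figures 10 and 11 can be sketched directly from the proportions given in the text (40% immediate cash; 60% receivables, collected 60/20/15 percent within 30/60/90+ days, with the remaining 5% becoming bad debt). The function name and the sales figure are illustrative.

```python
# Structural sketch of the sales-to-cash flow using the proportions
# stated in the text; all figures are illustrative.

def cash_collections(sales):
    """Return cash collected in periods 0..3 from one period's sales."""
    receivables = 0.60 * sales
    return [
        0.40 * sales,            # period 0: immediate cash
        0.60 * receivables,      # collected within 30 days
        0.20 * receivables,      # collected within 60 days
        0.15 * receivables,      # collected within 90+ days
    ]                            # remaining 5% of receivables: bad debt

flows = cash_collections(100_000)
total_collected = sum(flows)     # 3% of each period's sales is never collected
```

With constant sales the pipeline fills after three periods and each subsequent period collects the same total, the steady (ergodic) state mentioned above.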
The model is expanded, still structural in nature, by including the role of inventories and the role of provisioning in the process.
Figure 12: sales to cash inventory and bad debts
This representation can be modeled by now including the role of inventory payments in the depletion of cash, and it can be an input to the provisioning equations that drive inventory management and other functions. While this modeling focused on inflows of cash in a multi-period setting, assuming three-period intervals, many different assumptions can be made. Figure 13 displays a more realistic set of flows for cash, a core variable that is worth modeling.
Figure 13: more complete cash structural model
However, structural level 1 also includes processes that are not financial and not necessarily structurally linked, such as inventory and provisioning (where physical factors such as obsolescence, shrinkage, and delays may have an effect), or even more distant but still related processes such as marketing and CRM. For these, stochastic continuity equations are to be built based on experience parameters. For example, experience may say that every dollar of advertising in the south region generates 7 dollars of sales, while in the northeast it generates only 5.
Figure 14: Level 1 - structural
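A hypothetical sketch of such a stochastic continuity equation, using the advertising-to-sales experience parameters mentioned above (7 dollars of sales per advertising dollar in the south, 5 in the northeast); the 10% tolerance and the spend figures are invented for illustration.

```python
# Experience-based continuity equation linking advertising to sales,
# with an exception flag when observed sales stray too far from the
# expectation.  Parameters and figures are illustrative only.

YIELD_PER_AD_DOLLAR = {"south": 7.0, "northeast": 5.0}  # experience parameters

def expected_sales(ad_spend_by_region):
    """Expected sales implied by regional advertising spend."""
    return sum(YIELD_PER_AD_DOLLAR[region] * spend
               for region, spend in ad_spend_by_region.items())

expected = expected_sales({"south": 10_000, "northeast": 8_000})
observed = 101_000
# Flag an exception if observed sales deviate more than an assumed 10%
exception = abs(observed - expected) / expected > 0.10
```

Because the relationship is empirical rather than structural, the parameters themselves would be re-estimated periodically and the tolerance tuned to the observed variance.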
The next level is the data level, where measurement of financial and non-financial indicators takes place, and individual pieces of data are reported with the ability to drill down into historical performance and compare data across business lines, products, managers, etc. Most companies do this internally today through some form of spreadsheet analysis, but given the capabilities made possible by new systems and the decreased latency between processes discussed before, it is now possible through constant measurement to move to the relationship level.
The spreadsheet analogy
A SPREADSHEET program is a good metaphor for describing the IT architecture (and the measurement of business) of a real-time enterprise. But such programs also demonstrate the extent to which companies and their employees are often still stuck in batch mode. The data they use in spreadsheets are often out of date and must be put in by hand. ….
In contrast, modern spreadsheet software is as real-time as it gets. To a layman these programs look like tables with many rows and columns of “cells”. (the data level in the Galileo model) But their most important feature—how these cells are related to each other—is invisible. (the relationship and analytic monitoring levels in Galileo, and now necessary to present a non-obfuscable view of business) Often they are connected by simple operations such as addition or multiplication. Investment banks in particular, however, use more sophisticated spreadsheets in which the cells are linked by dozens of “macros”, sometimes quite elaborate sub-programs. If a user changes the data in one cell, many others are automatically recalculated.
To advocates of the concept, the real-time enterprise is a giant spreadsheet of sorts, in which new information, such as an order, is automatically processed and percolates through a firm's computer systems and those of its suppliers. Thus a simple inquiry such as, “When is my order being shipped?” can be answered immediately. Many consumers have already encountered real-time business without realizing it, for instance when they order a Dell computer. The firm's website allows customers to check the status of their order at any time.
Juice Software, based in New York, has developed a set of programs that allow users to turn their spreadsheets into living documents. With a few mouseclicks they can link a spreadsheet cell to a data source, for instance a corporate database. (such an interconnection and tracing capability is very important in the reporting and assurance of the modern enterprise) Smart software on a server in the network ensures that this cell is automatically updated whenever the information changes. Users can also connect their spreadsheets among themselves, so if one member of a project team changes a cell, the changes automatically appear in all the team members' files. (extracted from the Economist[17], annotations in bolded italics added)
The data level in the modern enterprise, as described in Figure 15, entails many measurements taken from the processes described above and listed in the list of POCs for non-financial variables. Furthermore, with the advent of databases, OLAP tools and style sheets, the “spreadsheet of measurement” of the modern enterprise incorporates the capability of drill-down (into finer details of the data structure, at the extreme into certain characteristics of a transaction such as amount or geography), accumulation of history not only of reported variables but also of desired aggregates (at the extreme, say, sales for a certain store), and distributional characteristics (the ability to cut across parameters such as geography, product or division).
Figure 15: Level 2 data
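The drill-down capability of the data level can be sketched with a small, standard-library-only example: transaction-grain records are aggregated along whichever cut (geography, product, division) the user requests. The records and field names are invented.

```python
# Sketch of data-level drill-down: transaction-grain records can be
# aggregated on demand along any distributional cut.  Data invented.

from collections import defaultdict

transactions = [
    {"region": "south", "product": "A", "amount": 120.0},
    {"region": "south", "product": "B", "amount": 80.0},
    {"region": "northeast", "product": "A", "amount": 200.0},
]

def drill_down(records, dimension):
    """Aggregate transaction amounts along one dimension."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec[dimension]] += rec["amount"]
    return dict(totals)

by_region = drill_down(transactions, "region")    # cut by geography
by_product = drill_down(transactions, "product")  # cut by product
```

An OLAP engine performs the same operation over pre-built cubes rather than raw lists, but the user-facing idea, one data store, many cuts, is the one shown here.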
The relationship level captures the links between key variables. It allows the modern manager in a real-time economy to make decisions based on current relationship models in addition to historic information. Under the Galileo model this permits a deeper level of disclosure that explains how the measurements of the data level are related to each other. The analogy is the formulae in a spreadsheet that exist in the background of the report. These relationships can be structural or stochastic, as described above. In Figure 16 the relationships involve sales and marketing, care queries and number of sales, and potential delay relationships.
Figure 16: Level 3 – relationship
To further explain this disclosure level: a balance sheet could be transformed into a Galileo report (à la sustainability report) and presented in sheet 1 of a spreadsheet, while a model relating some of the variables would be in the second sheet, and the user could calculate the variances in the third sheet.
The disclosure of these relationships, in addition to being valuable in increasing reporting transparency and deterring reporting obfuscation, would have a valuable feed-forward effect, motivating better modeling of the business and improved self-insight into the causes and consequences of business numbers. Figure 17 and Figure 18 introduce different representations of the relationship level.
Figure 17: processes, measures and relationships
In Figure 17, relationship 1 relates marketing to e-care. This is an obvious relationship whose parameters must be examined and estimated with care. In this relationship, increased marketing leads to increased sales, which ultimately increases the demand for e-care, contingent on the effectiveness of advertising and sales efforts, the quality of the products, and the accessibility of the care. The care effort also leads to secondary sales.
Relationships 2 & 3 are narrower and more direct.
Figure 18: the spreadsheet disclosure model
While eventually most corporate systems will have levels of detail and statistics extensive enough to sustain substantial relationship-based monitoring, the Galileo model also has a higher level of relationship monitoring. This level, called the analytic monitoring level, relies heavily on industry- and company-specific key performance indicators (KPIs). Level 4 (Figure 19) is aimed both at third-party monitoring of corporate performance and at internal monitoring, in particular where information is not sufficient.
Companies monitoring their processes step by step may miss significant macro trends in their performance (missing the forest for the trees) and will also benefit from having the KPI monitoring level, where a better understanding of the business is obtained. Strategic-planning-level managers will tend to focus on level 4, while management-control and operational-control managers, in Anthony’s notation (see Figure 19), will focus on level 3.
[MAV21] In analytic monitoring, significant deviations from the norm for key performance indicators can be identified. This may indicate that a process is out of sync (such as…) even if detailed support may not exist. The next step would entail detailed analysis to capture the reason of misbalance. And of course you still have drill down capabilities at these levels, which can be extremely powerful.
Figure 19: Level 4 - monitoring level
Finally, continuous reporting and assurance (Figure 20) ensure the reliability of systems and data through transaction assurance, estimate assurance (on management projections), compliance assurance (compliance with GAAP), and so on, enabling important business information to be reported externally as well as internally with confidence. The end result is a much more robust, automated reporting process that reveals much more about the effectiveness of management, specific divisions, etc., providing accurate and useful data on a real-time or near real-time basis.
Furthermore, XML tagging will enable interoperability, making it possible for connections across internal and external partnering entities.
Figure 20: Level 5 continuous reporting and assurance
Figure 21 displays three types of XML-tagged transactions flowing into the organization, which can be metered by some form of continuous reporting that would display cumulative levels of flows in a chosen time period: for example, all labor purchases (even if not yet paid) for the first 44 days of the year. The data delivered to the system carry some form of data-level assurance (for example, a measure of the reliability of the generating systems, or an encrypted tag with an auditor’s assurance) or rely on other forms of assurance of system integrity (e.g. SysTrust). These data are delivered to the corporation’s ERPS under some form of XBRL/GL schema with a reasonably fine chart of accounts. The accumulated data can, at any time, be queried for some form of level reporting (e.g. a balance sheet) on a continuous or variable time basis. The ERPS support a large multitude of internal reports, semi-internal reports and external reporting schemas. Corporate processes under continuous assurance support: 1) transaction assurance (as described earlier), 2) estimate assurance, 3) rule assurance and 4) key judgment on process control assurance.
Figure 21: Continuous reporting and assurance
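The continuous-reporting meter of Figure 21 can be sketched as follows. The class, tags and amounts are hypothetical; the tag stands in for an XBRL/GL element and the assurance flag for a data-level assurance tag.

```python
# Sketch of a continuous-reporting meter: tagged transactions stream
# in, each carrying a category tag and an assurance flag, and the
# cumulative level for any tag can be queried at any time for a chosen
# window.  All names and figures are illustrative.

class ContinuousReporter:
    def __init__(self):
        self.ledger = []                       # (day, tag, amount, assured)

    def record(self, day, tag, amount, assured=True):
        self.ledger.append((day, tag, amount, assured))

    def cumulative(self, tag, through_day):
        """Cumulative level of assured flows for a tag through a given day."""
        return sum(amount for day, t, amount, assured in self.ledger
                   if t == tag and day <= through_day and assured)

reporter = ContinuousReporter()
reporter.record(day=10, tag="labor", amount=5_000)
reporter.record(day=40, tag="labor", amount=7_000)
reporter.record(day=50, tag="labor", amount=2_000)   # outside a 44-day window
labor_44_days = reporter.cumulative("labor", through_day=44)
```

The same query structure supports variable-time level reports (e.g. an on-demand balance), since the reporting interval is chosen at query time rather than fixed in advance.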
In order to create a process that reports on a wide set of financial and non-financial variables, key POCs need to be defined. For example, these could be:
(talk about pocs of accounting variables)
[1] Greenstein and Vasarhelyi
[2] Vasarhelyi, M.A. & Cohen, E.C., “A Note on the emergence of data level assurance,” Rutgers Accounting Research Center, working paper, 2005.
[3] For example it is clear that the consulting and audit firm businesses will be much more dependent on human resources valuation than a highly automated manufacturer.
[4] EOL whitepaper on XBRL.
[5] The AICPA’s Special Committee on Enhanced Business Reporting is evolving into a societal consortium ( www.ebrconsortium.org ) where many of these expanding reports have been proposed. Four illustrations of the direction of business reporting were provided by this group and can be found at the above web site, at http://www.lintun.org/ , and at https://raw.rutgerrs.edu/raw/galileo .
[6] Disclosure is not limited by technological factors but by competitive intelligence and fears of evaluation of management on multiple dimensions.
[7] Eventually there will be tremendous pressure on standard setters to issue “digitalizable standards” that can be automatically converted into computer code.
[8] http://www.aicpa.org
[9] http://www.aicpa.org/webtrust
[10] http://www.aicpa.org/systrust
[11] More details about this aspect of CA are provided in Vasarhelyi et al. (2004) and Vasarhelyi and Greenstein (2003).
[12] Alles M., Kogan A., and Vasarhelyi, M. A., Continuity Equations, working paper, CARLAB, Rutgers Business School, Newark, NJ, September 2004.
[13] Porter (1996).
[14] Businessweek
[15] Swieringa, R., Accounting Magic …xxxxx
[16] Vasarhelyi, M.A. & Halper, F.B., “Continuous Process Auditing,” Auditing: A Journal of Practice & Theory, Fall 1991, pp. xxx-yy.
[17] Economist, The real time economy, January 31, 2002.
3. The Proposed Solution
The “Galileo” model is being developed in order to accomplish some of the above objectives and address the types of problems raised here. This model does not aim to be an incremental solution; rather, it aims to show the potential of applying the computer and reporting technologies of the modern age to the problems of corporate measurement. This “extreme accounting” model is not expected to be implemented as proposed but to be used as an extreme benchmark, while also suggesting the types of social compromises that are necessary to enhance the business reporting model.
4. Illustration
4.1 Basic Stakeholder Driven Disclosure Technology
The Galileo approach follows a pull rather than a push approach to business reporting. Under this scheme, stakeholders of the company are able to access frequent audited data and pull specific information about the company. This approach calls for more frequent reporting, more disaggregated data, and business reporting rather than limited financial reporting. Specifically, each company will provide access to part of its database (limited-access permission to audited data), and stakeholders could tap into this resource. Some of the information should be disaggregated to the extent that it allows users to view the granular data or aggregate the data based on a number of given standards, assumptions and estimates. The following section provides a description of the technology described in Figure 25.
The Galileo model has a number of layered technological components. The lowest layer comprises the OLTP (Online Transaction Processing) system, which is layered on top of a central enterprise relational database management system. An ERP system is an example of such an application, providing cross-functional integration for companies. The term Enterprise Resource Planning (ERP) refers to systems that typically span the entire enterprise and address all of the enterprise's resources. In addition to being able to handle multiple currencies and languages, a key feature of ERP systems is cross-functional integration. ERP systems are based on the so-called client-server architecture, which is comprised of three tiers (or layers) that segregate: 1) the user interface (Presentation Layer), 2) the application processing component (Application Layer), and 3) the database system (Database Layer). Every ERP system has one central database that is accessed by all application servers. This central database is accessed by all ERP users, regardless of which module they use.
The enterprise ERP system and the relational database provide companies with the ability to generate and use real-time data. This operational data is continuously assured using continuous auditing techniques such as embedded audit modules, parallel simulation and control tags. Once the assurance is done, this data is periodically migrated into the corporate data warehouse. A data warehouse is a repository storing integrated information for efficient querying and analysis. Due to its non-transaction-oriented nature, a data warehouse allows for efficient storage and extraction of information. Because the data warehouse is separate from the corporate operating environment, it is possible to use sophisticated indexing techniques to facilitate efficient data retrieval. Using OLAP (Online Analytical Processing) it is possible to extract summarized data and to drill down to the needed level of detail.
Therefore, the next layer in the stack is the OLAP engine, which allows for data aggregation and disaggregation. The OLAP engine can create cubes of data based on predefined attributes and enable detailed representation based on numerous sub-attributes and categories. An OLAP interface engine would allow unsophisticated users to use this powerful tool based on their level of permission. In other words, an authorization and authentication layer would exist on top of the OLAP engine layer to provide users with the appropriate access to the data warehouse. An access control matrix will be devised on a need-to-know basis and according to the level of publicity of the data. Specifically, an anonymous permission will be given to any unidentified user wishing to access the least restricted form of financial information. Other forms of access controls will be assigned based on predefined criteria[1]. The OLAP engine is going to be masked so that users will not have to interface with the underlying infrastructure. The next layer is the aggregation layer; this layer interacts with the OLAP engine, assisting in defining attributes and the level of aggregation of the data. The aggregation layer will comprise, but is not limited to, rules and standards such as GAAP and IGAAP by which data can be aggregated, and the models and estimates that the company uses. Again, users will not interface directly with the aggregation layer; users will log on to a secured website and will have a number of functionalities based on their authorization level. Users will use active web pages to request specific data. All the data items that are displayed will be made available in XBRL format as well. Data can comprise an entire consolidated financial statement or a breakdown of detailed information about a particular account in any of the virtual entities.
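The authorization and authentication layer described above can be sketched as a permission-depth check over progressively finer data cuts. The user classes, permitted depths and figures are all invented for illustration.

```python
# Sketch of permission-based access to disclosure data: each user class
# may see data only down to a permitted level of detail.  All names,
# depths and figures are hypothetical.

PERMITTED_DEPTH = {"anonymous": 1, "analyst": 2, "regulator": 3}

# (depth, item, value): depth 1 = statement line, deeper = finer detail
DATA = [
    (1, "total revenue", 1_000_000),
    (2, "revenue: division A", 600_000),
    (3, "revenue: division A, product X", 350_000),
]

def query(user_class):
    """Return every item the user's permission depth allows."""
    depth = PERMITTED_DEPTH[user_class]
    return [(item, value) for d, item, value in DATA if d <= depth]

anon_view = query("anonymous")        # sees only the statement line
regulator_view = query("regulator")   # sees all levels of detail
```

In an actual implementation the depth check would be replaced by a full access-control matrix over the OLAP cube dimensions, but the gatekeeping pattern is the same.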
4.2 Valuation
Valuation has always proven to be a challenging task. One of the objectives in extending the existing accounting model is to make it possible to depart from historical-cost-based accounting and facilitate market valuation of assets. For this purpose the Galileo model proposes to disclose assets on both a historical-cost and a market-value basis. It is proposed that independent third-party valuation services be established. Independent third-party value assessors will have to establish valuation techniques based on objective, publicly available data. Each company should obtain valuations from at least two independent providers, and valuation estimates for each asset category should be provided in a range format, e.g., machinery for subsidiary A is valued between xx,xxx and xxx,xxx.
In today’s environment a substantial amount of information is publicly disclosed. There are enormous databases that contain recent real estate sales. Thousands of transactions take place on eBay on a daily basis. Historically, the valuation of assets was a controversial issue: the objectivity of asset valuators was impaired by virtue of their receiving compensation for these valuations. In today’s digital world, it is possible to use publicly available data to value assets objectively. Models can be developed to extract data from such sources and apply predefined valuation techniques to many assets that companies own. This process can be done with no human intervention and consequently provides an objective, reliable way to supplement historical-cost-based accounting.
As an illustration, a company might own the item “Catalyst 6500 Cisco Switch” as part of its inventory or as part of its operations. An independent service can digitally receive price quotes for such an item from numerous vendors (price comparison websites such as “MySimon” provide similar data) and observe recent sales on eBay. Subsequently, a valuation algorithm can be applied to compile this data and calculate the estimated market value of that asset. In a similar manner many inventory items and other fixed assets can be valued.
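One possible form of such a valuation algorithm is sketched below: price observations gathered from public sources are aggregated into a value range, in the spirit of the range-format disclosure proposed above. The trimming rule and the quotes are assumptions for illustration.

```python
# Sketch of a third-party valuation algorithm: gather price quotes for
# an asset from public sources and report a value range rather than a
# point estimate.  The trimming rule and quotes are illustrative.

def valuation_range(quotes, trim=1):
    """Drop the `trim` highest and lowest quotes; return (low, high)."""
    ordered = sorted(quotes)
    if len(ordered) > 2 * trim:
        ordered = ordered[trim:len(ordered) - trim]
    return min(ordered), max(ordered)

# e.g. hypothetical quotes for the switch above, gathered from vendor
# sites and recent auction sales
quotes = [8_200, 7_900, 9_500, 8_400, 6_100]
low, high = valuation_range(quotes)
```

Trimming the extremes makes the range robust to a single anomalous quote; two independent providers running different algorithms over the same public data would then yield two ranges for the same asset category, as the model requires.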
[1] These predefined criteria can differ across industries and users. However, permission-based reporting does not necessarily contradict Regulation FD, which is intended to democratize the propagation of information.
5. Conclusions
5.1 The environment
The original business measurement model was developed for the industrial organization of the 19th and 20th centuries under a regime of limited to non-existent information technology. While business organizations have changed substantively toward an information organization whose main assets are not physical in nature, the measurement model has not evolved, and so it is poorly equipped to deal with the emerging information organization. Furthermore, a poor measurement environment and audit technologies consisting of pencil and paper do not help the assurance process. Consequently, new and improved accounting and reporting requirements and mechanisms are needed to meet our commitment to the user/public marketplace of the 21st century. This document is one of several models prepared to support the Public Company Task Force of the AICPA's Special Committee on the Enhanced Business Reporting Model (the Starr committee). Feedback from members of the Jenkins committee on a new business reporting model, combined with the very negative current view of the accounting profession held by the public in general and by regulators, clearly revealed to the committee that the accounting profession was in no condition to create a new business reporting model by itself. Consequently, the Starr Committee is evolving into a societal consortium (the EBRC) that embraces a much larger set of players and addresses a wider audience in order to respond to these concerns.
The GDM (Galileo Disclosure Model) aims to take advantage of an entirely new set of technologies, and their ensuing economics, to measure organizations in the information economy.
These new technologies are the core of the new real-time economy and encompass four main elements: 1) a ubiquitous carrier layer, 2) integrated corporate application software, 3) pliable and accessible user interfaces, and 4) powerful database technology. With these technologies a new method of business measurement can emerge, which we call continuous reporting. It takes advantage of a continuous data flow to display corporate levels and flows at variable time frames that are contingent on the natural rhythm of the application and on the decision frame of the user.
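The idea of displaying the same continuous data flow at variable time frames can be sketched as follows. This is an illustrative assumption of how a reporting layer might work, not a specification: the `aggregate` function and the day/month/year buckets are hypothetical, standing in for whatever decision frames a user selects.

```python
from collections import defaultdict
from datetime import datetime

def aggregate(transactions, frame):
    """Roll a continuous transaction stream up to the time frame chosen
    by the user ('day', 'month', or 'year')."""
    fmt = {"day": "%Y-%m-%d", "month": "%Y-%m", "year": "%Y"}[frame]
    totals = defaultdict(float)
    for timestamp, amount in transactions:
        totals[timestamp.strftime(fmt)] += amount
    return dict(totals)

# The same underlying flow, viewed at two different decision frames.
stream = [
    (datetime(2004, 7, 1, 9, 15), 120.0),
    (datetime(2004, 7, 1, 16, 40), -45.0),
    (datetime(2004, 7, 2, 11, 5), 300.0),
]
print(aggregate(stream, "day"))    # {'2004-07-01': 75.0, '2004-07-02': 300.0}
print(aggregate(stream, "month"))  # {'2004-07': 375.0}
```

The point of the sketch is that no new measurement is taken when the user changes frames; the continuous flow is simply re-summarized.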
Continuous reporting:
* already exists internally in certain forms
* is paradigmatically different from traditional reporting
5.2 The Axioms
A series of axioms was presented to start the debate and development of the GDM. They state: 1) the proposed model is a basic “extreme” model used as a basis for discussion, not a proposal for statutory rules; 2) when valuation is difficult, disclose the basic facts that can lead to a user's judgmental valuation; 3) support any estimate with its calculation basis; 4) provide raw, not massaged, data with a panoramic level of detail; 5) standardize the states-of-the-world to avoid financial/organizational engineering; 6) if a model is used to summarize or extrapolate information, provide more than one model or the basis for these computations; 7) statutes should be composed of digital standards that can immediately be impounded into software, not vague principles-based standards; 8) provide data-level assurance in addition to other forms of statement/process based assurance; and 9) information provisioning is a continuum from internal to external information.
5.3 The Proposed Model
The proposed model focuses on three main dimensions: 1) impounding technology into the reporting model, 2) providing an updated information economics improved reporting model and 3) changing many principles of reporting disclosure.
Technology:
Changing the medium of the report; capturing transactions at the atomistic XML level; using databases for reporting, OLAP technology, drill-downs, hyperlinks, style sheets, etc.
Information:
Substantial changes in external and internal data content and in the attributes of data, in particular frequency, timeliness, level of aggregation, and the publicity of data.
Rules of Measurement
Business reporting portal, virtual entities, points of comparison, non-financials, relationship reporting, continuity equations, real-time analytics, dynamic valuations, KPIs and analytical monitoring, future-oriented information, formalization of business artifacts, formalization of the MD&A.
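The technology dimension above calls for capturing transactions at the atomistic XML level. A minimal sketch of what one such atomistically tagged transaction might look like is given below; the element and attribute names are invented for illustration (a real deployment would draw them from an XBRL or similar taxonomy).

```python
import xml.etree.ElementTree as ET

# One sales transaction tagged at the atomistic level, with hypothetical
# element names; real taxonomies (e.g. XBRL) define their own vocabulary.
txn = ET.Element("transaction", id="T-1001", type="sale")
ET.SubElement(txn, "date").text = "2004-07-15"
ET.SubElement(txn, "customer").text = "C-042"
ET.SubElement(txn, "amount", currency="USD").text = "2500.00"

xml_text = ET.tostring(txn, encoding="unicode")
print(xml_text)
```

Because each fact carries its own tags, style sheets can render the same tagged data as different pre-set reports, and drill-downs can descend all the way to the individual transaction.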
5.4 Takeaways
The GDM is basically electronic and user-driven, with style sheets as pre-set structures. It recognizes multiple stakeholders by providing a multi-source based set of pre-set reports and a wider set of drill down granularity.
While it is difficult to create statutes in this direction, the GDM aims to report without public relations adjustments or embellishment, and without management of data by executives. The continuous nature of the data flow, and the recommendation that reporting structures be extracted directly from ERP systems through style sheets, change the nature of management adjustments.
The GDM can be seen as the dashboard of all external reports, one that includes virtual entities, segment reporting, subsidiary breakdowns, and the consolidation of commitments to suppliers.
A complex layer of technology facilitators that is still evolving in business usage already exists. The supporting technology of the GDM has to evolve accordingly.
5.5 Miscellaneous Issues
Many research issues arise from this study. It is envisaged that this document will be frequently enhanced by comments and by additional studies using the same types of technologies that are proposed for the GDM:
* Relationships: logical, disclosures, modeling, information content
* Real-Time analytics
* Extract XBRL tagged data and perform real-time analytics and performance evaluation.
* Models that are adaptive and responsive to conditions such as spikes, cyclicality and process relationships
* Models that can be updated constantly and are executed automatically
* Models that are applied at a non-“cookable” level (say, validated transactions)
* Disclosure of key risk factors
* Quantification of some of these factors
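One of the research directions above, real-time analytics built on continuity equations, can be sketched concretely. Assuming a hypothetical inventory continuity equation (beginning + inflow − outflow = ending), a monitoring routine could flag any reported figures that violate it; the function name, tolerance, and figures below are illustrative assumptions.

```python
def continuity_violation(beginning, inflow, outflow, ending, tolerance=0.01):
    """Check the continuity equation
        beginning + inflow - outflow = ending
    and return the discrepancy if it exceeds the tolerance, else None."""
    gap = beginning + inflow - outflow - ending
    return gap if abs(gap) > tolerance else None

# Hypothetical reported inventory levels and flows.
print(continuity_violation(1000.0, 400.0, 350.0, 1050.0))  # None: consistent
print(continuity_violation(1000.0, 400.0, 350.0, 900.0))   # 150.0 discrepancy
```

Because such a check runs automatically on each new data point, it operates at the non-“cookable” level of validated transactions rather than on period-end summaries.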