2.1 The Intractable Problems of the Current Accounting and Reporting System
2.1.1 Consolidation
Over the last two decades, standard setters have struggled with representing businesses with multiple segments that are not fully integrated, produce different products, and operate in different geographies with differing currencies and methods of accounting. Large companies with heavy industrial and financial components (e.g. GE and GM) blend very distinct types of numbers into one measure, which can obscure financial performance rather than make it more transparent. The perennial problem of determining whether two entities are one and need to be consolidated, or whether they are separate, has been exacerbated by the development of Special Purpose Entities (SPEs). These were originally entities of very specific and narrow purpose, for which even 97% ownership did not create any co-dependency for the firm. This original definition was followed by much wider usage by many businesses, to the extreme of the abuse that Enron demonstrated. While statistics on the existence and nature of SPEs are not available, they are much more widely used than generally understood and are applied by many of the most reputable organizations in the financial markets.
The evolution of organizations in the 21st century will lead to substantial deconstruction of business[1], where internetworking technology will allow many functions to be outsourced, partnered, or turned over to the competition. While outsourcing can be a straightforward arrangement when implemented by a single independent firm, many forms of outsourcing will exist. Simply adding the component entities in a consolidation creates a very false sense of reality, as these relationships are often more than their simple formalization. The core issues are ownership, inter- and intra-entity transactions, and obligations for residuals and commitments over time, even if not contractual. One of the reasons that the solution of the consolidation problem has eluded standard setting is that one of the motivations for consolidation is the obfuscation of individual unit performance. Consequently, standard setters never had the stomach to force substantial disclosure at the business unit level, nor the desire to force narrow business units of standardized form and standardized activity to report with the same accuracy and detail required from the consolidated entity. While the Jenkins report strongly suggested narrow and complete reporting at the line-of-business level, the changes effected by the FASB were limited and did not satisfy the real need that is emerging in the 21st century: creating dynamic standards and industry benchmarks for online real-time business monitoring. Comparisons among organizations should be made at the sector level, not at the aggregate level, where the addition of non-similar parts creates substantive obfuscation. A new type of aggregate entity should be invented and enough disclosure detail provided to allow for income calculation and asset allocation across and along the value chain.
2.1.2 Intangibles
The recent literature has started to pay increasing attention to intangible items. As discussed earlier, the new business measurement model must take into consideration a much wider set of assets, such as intellectual property, human resources, brands, marketing investments, reputation, and other items. However, many of these items should not be added to the traditional total assets figure, as their addition could result in a very misleading total. While we can estimate the value of cash fairly easily and with a small expected variance, a much larger variance attaches to the estimated value of, say, inventory or property, and a huge lack of reliability comes with the intangibles item of the balance sheet. Figure 1 illustrates the different levels of precision of the different items in the balance sheet and supports the argument that they should not be added together, as this would lead to a total assets figure that is likely to be inaccurate.
Figure 1: Relative precision of balance sheet items
New types of reports, which aggregate data with similar levels of accuracy, must be created. Research is needed to help the reporting agencies develop methodologies that adequately incorporate heterogeneous reliability measures and specify their components. Many of these “intangible” items, some of which are currently disclosed in the balance sheet and others merely described in the body of a financial statement or in a report to agencies that are not financial in nature, must not only be disclosed but should also be presented in some form of comparable metrics. In this work the idea of POCs (points of comparison) is suggested for these non-financial variables. These POCs will serve as the basis for disclosed relationships that link the financial and non-financial variables of different companies.
Process or variable       Metric / point of comparison
Human resources           Pension retirement matrix; summary of training and investment in HR
Brand                     Brand value assessment and method
Intellectual capital      Number of patents granted and applied; expenses in R&D; external valuation of IP; method of valuation and estimates
Marketing                 Market share; industry ranking

Table 1: Points of Comparison
The creation of metrics that describe non-financial variables is fraught with concerns and potential inconsistencies. As can be seen above, many of the measures are estimates of a very soft nature, which will share the same problems that current financial estimates possess. Intangibles will have different measurement and valuation bases, and an entire non-financial GAAP must be developed for their disclosure.
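A POC could be represented as a simple structured record pairing each non-financial variable with its metric, value, and a reliability grade, so that values can be matched across companies. The sketch below is purely illustrative; all names, the grade scale, and the companies are assumptions, not a proposed standard.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PointOfComparison:
    """One non-financial point of comparison (POC) -- illustrative only."""
    process: str        # e.g. "Intellectual capital"
    metric: str         # e.g. "Number of patents granted"
    value: float
    unit: str           # e.g. "patents", "% market share"
    reliability: str    # assumed scale: "A" (hard count) .. "D" (soft estimate)

def comparable(a: PointOfComparison, b: PointOfComparison) -> bool:
    # Two POCs from different companies are comparable only when they
    # describe the same process and metric in the same unit.
    return (a.process, a.metric, a.unit) == (b.process, b.metric, b.unit)

acme = PointOfComparison("Intellectual capital", "Number of patents granted", 412, "patents", "A")
rival = PointOfComparison("Intellectual capital", "Number of patents granted", 287, "patents", "A")
print(comparable(acme, rival))  # prints True
```

The reliability grade carries the point made above: a patent count (grade "A") and a brand value estimate (grade "C" or "D") should not be mingled as if equally hard.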
2.1.3 Materiality
The accounting profession has struggled for years with the concept of materiality. The audit opinion states that financial statements “fairly represent” the financial health of an organization. The materiality threshold is, in engineering jargon, an indication of “allowable error in measurement.” Current audit practice relative to materiality has been in place for three decades. It represents a compromise between the cost of audit investigation in manual records and the benefit to stockholders of this investigation. Information technology has changed dramatically, allowing for cheaper and more effective controls and investigation and unbalancing this archaic compromise. While the tradeoff between the accuracy of measurement and the cost of assertion continues to be real, the break-even point has changed in reality but not in practice.
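The shift in the break-even point can be illustrated with a toy cost-benefit model: the optimal threshold is the allowable-error level at which the cost of further investigation balances the expected cost of undetected error. All numbers and functional forms below are invented for illustration; they are not audit theory.

```python
def optimal_threshold(unit_cost: float, error_cost: float,
                      thresholds=range(1, 101)) -> int:
    """Pick the allowable-error threshold (say, in basis points of revenue)
    minimizing total cost: investigation cost rises as the threshold
    tightens, while the expected cost of undetected error falls.
    Both cost curves are illustrative assumptions."""
    def total_cost(t):
        investigation = unit_cost / t      # tighter threshold -> more audit work
        undetected = error_cost * t        # looser threshold -> more missed error
        return investigation + undetected
    return min(thresholds, key=total_cost)

# With costly manual investigation, a loose materiality threshold wins;
# with cheap automated controls, the optimal threshold tightens sharply.
manual = optimal_threshold(unit_cost=10_000, error_cost=10)
automated = optimal_threshold(unit_cost=100, error_cost=10)
```

The point of the sketch is only directional: as `unit_cost` collapses with technology, the optimum moves toward much tighter thresholds, which is exactly the break-even shift the text argues practice has not yet recognized.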
A likely and desirable change, leveraging technological change, is the justification of the audit turning toward the improvement of data quality at the client (Vasarhelyi and Cohen, 2005[2]) and providing a variable level of assertion depending on the asserted process. Clients and auditors would agree on the assertion needed for different processes, subject to minimal requirements set by statute. Business entities that have real needs for data quality and validation would decide where the optimum tradeoff lies and pay accordingly. This would create a much larger economic threshold for assurance services, as companies already pay much attention to data quality. In the future, a universal data bus and balkanized information transferred among interoperable Web Services will create even larger concerns for data quality.
While the concept of the financial statement audit will continue for a while, a new set of assurance types will emerge in which auditors, or other assurors, place an imprimatur on data at the tag level. This imprimatur can be at the data accuracy level (“this data is 98% correct”) or at the process level, where the effective controls that act on the data would be either listed or rated. Obviously these two approaches dovetail and can be used simultaneously. Furthermore, it must not be lost that a wider set of assurance services may emerge, with classes such as wider audits, intervening audits, ubiquitous audits, control rating audits, causal audits, etc.
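A tag-level imprimatur could be carried as assurance metadata attached to each data item, recording an accuracy assertion, a rating of the controls acting on the data, or both. The structure below is a sketch under assumed names; no existing assurance standard defines these fields.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AssuredTag:
    """A data item carrying its own assurance metadata (illustrative)."""
    name: str
    value: float
    accuracy: Optional[float] = None       # e.g. 0.98 -> "this data is 98% correct"
    control_ratings: dict = field(default_factory=dict)  # control name -> rating
    assuror: str = ""

    def imprimatur(self) -> str:
        # Render the assurance assertion(s) attached to this tag.
        parts = []
        if self.accuracy is not None:
            parts.append(f"accuracy {self.accuracy:.0%}")
        if self.control_ratings:
            parts.append("controls " + ", ".join(
                f"{c}={r}" for c, r in sorted(self.control_ratings.items())))
        if not parts:
            return f"{self.name}: unassured"
        return f"{self.name}: " + "; ".join(parts)

tag = AssuredTag("AccountsReceivable", 4_200_000, accuracy=0.98,
                 control_ratings={"three-way match": "effective"}, assuror="Firm X")
print(tag.imprimatur())
```

Note how the two approaches dovetail in one record: a tag may carry an accuracy level, a list of rated controls, or both, exactly as the paragraph above suggests.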
As continuous audit techniques become more prevalent, the entire economics of auditing and financial report preparation will change. As the cost of automatic procedures becomes negligible in an ERP environment, analytic procedures can be conducted on a real-time basis. The tradeoff between sampling and full population testing will shift, akin to the change in the materiality threshold. More generally, the evolution and ubiquity of ERPs will fundamentally lower the costs of compliance and reporting. The basic cost of preparing a report that obeys a particular auditing/accounting standard will become slight, as it is prepared by the ERP provider and pulled out as a standard product. Setup costs, however, may vary among installations, as the basic data for the new requirement may not be available.
2.1.4 Stale, erroneous, and opaque information
Annual reports have turned into major tools of public relations. Currently the idea of just publishing the “standard packaged in the ERP” report is unthinkable: annual reports are tools of “spin.” While this is not a palatable thought for many, it is clear that the future world is one of more and more regulatory compliance and, consequently, of organizations provisioning substantively more information. The spin mentality must give way to multi-dimensional, realistic reporting that is drawn directly off corporate systems and deposited or delivered to users without expensive (PR) manipulation. Specific reputational penalties must ensue from issuing stale, erroneous, and opaque information. Today’s paranoid concern for breaches in competitive intelligence, where competitors discover important economic facts about the reporting business, must give way to a more data-cooperative attitude in which both society and the corporation benefit from the existence of comparison benchmarks in the many facets of business. Just as entire sectors today cooperate in the development of XML-derivative standards to create interoperability between applications and data transitivity in the value chain, these sectors must cooperate in the development of disclosure standards that can be compared and used for industry benchmarking. Competitiveness has to be preserved by fast, ever-improving processes, timely research, and aggressive data sharing, not by self-serving, paranoid opacity that slows the progress of science and interferes with the natural economic optimizing process of capital allocation.
2.1.5 The specification of contractual terms in the measurement model
One clear shortcoming of today’s reporting model is its focus on realized operations and its disregard of a large set of tacit and contractual obligations that often determine much of future economic activity. Organizations, their clients, their business partners, and their suppliers are linked by a network of formal and informal contracts. Many of these contracts represent larger liabilities for future operations than most reportable events. For example:
* a power utility may have a fuel supply contract that is 10% over current market price for the next 10 years
* a business concern may outsource most of its supply chain and as a result have consensual obligations even if these are not contractual
* a business concern may have “return” agreements with its clients for inventory that is obsolete or cannot be used or sold
* a company may have a long-term practice of supporting local and communal projects to enhance the environment
* a company may have many social welfare practices relative to its employees that cannot be stopped
* a company may have passive obligations for environmental cleanup that are not recognized
These types of instances and the non-reported legal contingencies are often much larger than the liabilities typically reported in annual reports under contingencies. Only a probability-based system of contingency reporting can provide a description that is useful and realistic in an information society. Where clear obligations (and benefits) are not available, a deeper standard of disclosure applies, under which disclosures such as the following must be prepared:
* legal, operational, and contractual contingencies
* management compensation contracts at a much deeper level (including a taxonomy of types of compensation)
* hyperlinks to fuzzy contracts or non-standard financially engineered contracts
* description of corporate litigation
* description of government investigations
* etc.
2.1.6 Valuation
The accounting profession, due to a highly litigious environment and the inherent difficulties of probabilistic measurement, has resorted to the more confirmable and less valid forms of modified historical-based reporting. Furthermore, with the increased consideration of non-financial measurements, where organizations try to assess the value of their workforce, their intellectual property, their sustainable resources, etc., the temptation is to fall back again on the historical values invested in these items: for example, valuing an employee based on the company’s investment in his or her education, professional training, etc. This is one example of a very intractable set of problems. The standard to apply here is whether the information user will be better or worse served by being supplied verifiable (say, historical cost based) investment information rather than estimates that may be more indicative of future value but are less verifiable. If the estimate is used, will this information be more or less reliable than the old method? And can a structure be developed that lets users download this data and perform their own analytics on it?
The modern world is developing a wide set of live markets whose by-product is online real-time valuation of many assets. Research is needed to understand how prevalent this type of information is and how expensive it is to harness. Clearly the new economy has troves of transaction prices, valuation prices, indices, price lists, and live exchange data available on a minute-by-minute basis. While the type of asset concentration changes substantially from sector to sector,[3] current values may exist for a substantive set of assets, and temporal estimates (say, weekly or monthly) of values may exist for many others.
* Some assets are to be measured at high-fluctuation, transaction-based values following real-time indices
* An account for valuation changes must be created so that valuation changes do not flow through income
* Income flow-through should occur only when asset realization occurs, and this calculation should use some form of inflation adjustment
* Where appropriate, even future indices may be applied, as long as the documentation is clear
* Just as we keep depreciation schedules today for major property items, the new model should have valuation schedules for, say, the 100 largest assets of the corporation
* The economics of information today are such that constant evaluations of asset values should be doable, disclosable without prejudice to competitiveness, and usable by the user’s analytic tools
* The present value of any future income flow should be reported, with allowance for best estimators (and their variance)
* Processes, the nature of the account, inter-process controls, and other lesser items determine the reliability of numbers at the transaction, reporting aggregate, and general ledger levels, among many others
* Assurance/audit processes change these values on a continuous basis (real-time seals, alarms, control tickers, points of comparison)
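The valuation-schedule and no-flow-through-until-realization ideas above can be sketched together: each major asset is revalued against an index, the change accumulates in a separate valuation-adjustment account, and only realization moves a gain to income. All account names and mechanics below are assumptions for illustration, not GAAP.

```python
class ValuationSchedule:
    """Valuation schedule for one major asset (sketch, not GAAP)."""

    def __init__(self, name: str, cost: float):
        self.name = name
        self.cost = cost
        self.carrying_value = cost
        self.valuation_adjustment = 0.0   # equity-side account; not income
        self.realized_income = 0.0

    def revalue(self, index_ratio: float) -> None:
        # Restate carrying value against a real-time index; the change
        # accumulates in the valuation-adjustment account, not in income.
        new_value = self.cost * index_ratio
        self.valuation_adjustment += new_value - self.carrying_value
        self.carrying_value = new_value

    def realize(self, sale_price: float, inflation_factor: float = 1.0) -> float:
        # Income flows through only on realization, with an inflation
        # adjustment applied to original cost (an assumed mechanic).
        gain = sale_price - self.cost * inflation_factor
        self.realized_income += gain
        self.valuation_adjustment = 0.0
        self.carrying_value = 0.0
        return gain

plant = ValuationSchedule("Plant A", cost=100.0)
plant.revalue(1.10)   # index up 10%: adjustment grows, income untouched
plant.revalue(1.25)   # cumulative adjustment of 25, still no income
gain = plant.realize(sale_price=130.0, inflation_factor=1.05)
```

A real system would keep such a schedule for each of, say, the 100 largest assets, revalued continuously against sector indices as the list above proposes.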
2.1.7 Deterministic representation of stochastic phenomena
The litigious nature of American society has led to poor compromises in the disclosure of data. The profession, stung by criticism and litigation, has often decided on and set standards for single-number disclosure of stochastic assessments. The profession has not issued attestations stating that a particular financial statement is reliable at the 95% level, has not allowed management earnings forecasts to be stated in ranges, and has not stated that most mineral reserves are of a certain value based on commodity prices over the last 12 months. However, it is pretty clear that statements of this type would be preferable for sophisticated users.
While many statistical estimates pervade annual reports (e.g. pensions, bad debt, etc.), these are stated in a deterministic format, emphasizing the basic weakness of traditional reporting. When the distance between the report and its underlying stochastic reality grows too large, the credibility of business reporting disappears. If the variances around the values of estimates are very large, the credibility of point estimates is very small.
The new business reporting model will have to rely on a wide set of disclosed probabilistic assessments for past results, current actions and future estimates. It is better to be about right than exactly wrong.
Figure 2 proposes a set of probability-oriented reports whereby all items in the traditional statements (BS, IS, and FF) are reported as point estimates with a variance measure. For example: the corporate cash level at 12/31/xx was 20m plus or minus 5%; our best estimate for the value of inventories is 60m plus or minus 15%; our current estimate for PP&E is 71m plus or minus 25%; and the intangibles in our balance sheet originated by the merger with ABC corporation are 75m plus or minus 100%. Each of these numbers is composed of numbers from each division, and each of those numbers has its intrinsic variance. While the public today glazes over at the thought of point estimates and variances, a targeted educational effort could help significantly, and the investing public can ultimately use the point estimate as a deterministic estimate if it so desires.
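Aggregating such divisional point estimates is straightforward if, and only if, the estimates are independent and the plus-or-minus bound is read as one standard deviation; both are simplifying assumptions made here purely for illustration:

```python
import math

def aggregate(items):
    """items: list of (point_estimate, pct_bound) pairs, where pct_bound is
    the +/- percentage from the report, treated as one standard deviation
    (an assumption; a standard could define the bound differently).
    Returns the total and its +/- percentage, assuming independence."""
    total = sum(point for point, _ in items)
    # Independent variances add; percentages do not.
    variance = sum((point * pct / 100.0) ** 2 for point, pct in items)
    return total, 100.0 * math.sqrt(variance) / total

# The example from the text: cash 20m +/-5%, inventories 60m +/-15%, PP&E 71m +/-25%
total, pct = aggregate([(20, 5), (60, 15), (71, 25)])
# Diversification makes the total's bound (about 13%) tighter than the
# worst component's 25% -- one reason aggregate figures can mislead.
```

The same arithmetic run in reverse shows why a consolidated total hides information: the 13% aggregate bound says nothing about the 100% bound on an intangibles line inside it.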
Extending the reporting range to non-financial variables, key numbers in financial and non-financial units would describe non-financial items and hyperlink to the bases of the estimates, using point-of-comparison indices developed by the specific industry relevant to the particular line of business. Assessments from probability-based quality control scorecards would complement this picture.
Figure 2: Probabilistic reporting
2.1.8 The disclosure of predictive information
Congressional hearings during the malfeasance crisis demonstrated deep skepticism about earnings projections by management. However, management is clearly the party that can provide the best predictions of company performance, and so the issue should be how to present and constrain this information to avoid spin and self-serving stock manipulation. If managers have stock options or stock holdings that become available in a short period of time, they can overstate earnings to create a spike in valuation until results come in. The above supports the argument that a new process and new requirements for predictive information must be developed.
Figure 3 breaks down information relative to its time frame. Future information is focused on 1) leading indicators and basic relationships and 2) forecasting and models. Consequently, the emphasis is both on specific numbers and on the structure that is driving these numbers.
Figure 3: Time frames of reporting
2.1.9 Semantic versus quantitative description of accounting phenomena
A company’s annual report contains traditional financial statements, footnotes, and a wide set of textual materials. An entire information intermediation industry has emerged to extract, standardize, and organize information for the final user. Large companies can acquire S&P’s Compustat, which contains most public US companies’ financial data, normalized for use. The emergence of the XBRL standard may facilitate the utilization of data and comparison among companies at a more democratic level, where individual users have an Excel add-on and harvest the information themselves without any data transformation. However, most information contained in an annual report or an SEC filing is not the formal information of the Balance Sheet, Income Statement, or Uses and Sources of Funds. It includes footnotes, comparative history, and a wide array of soft information available from the annual report. New techniques will need to be developed to extract, categorize, and disseminate the qualitative information contained in financial statements.
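The kind of self-service harvesting that tagged data enables can be suggested with a toy example: once facts are tagged, a user can pull them into a table with a few lines of code instead of relying on an intermediary. The snippet uses a deliberately simplified tag layout, not the real XBRL taxonomy, namespaces, or context mechanism.

```python
import xml.etree.ElementTree as ET

# A deliberately simplified, XBRL-like instance document (not real XBRL).
instance = """
<report company="ABC Corp" period="FY20xx">
  <fact name="CashAndCashEquivalents" unit="USDm">20</fact>
  <fact name="Inventories" unit="USDm">60</fact>
  <fact name="PropertyPlantAndEquipment" unit="USDm">71</fact>
</report>
"""

def harvest(xml_text: str) -> dict:
    """Extract tagged facts into a plain dict a spreadsheet user could consume."""
    root = ET.fromstring(xml_text)
    return {fact.get("name"): float(fact.text) for fact in root.iter("fact")}

facts = harvest(instance)
print(facts["Inventories"])  # prints 60.0
```

Real XBRL adds taxonomies, contexts, and units that this sketch omits, but the democratizing effect is the same: the extraction step that Compustat performs as a paid service becomes a trivial computation for any user.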
Overcoming these deficiencies in the current reporting system requires taking advantage of radical changes in the technological basis of business and an equally important shift towards a process perspective of the firm.