
2. An Evolving Scenario

2.1 The Intractable Problems of the Current Accounting and Reporting System

2.1.1 Consolidation

Over the last two decades standard setters have struggled with representing businesses with multiple segments that are not fully integrated, produce different products, and operate in different geographies with differing currencies and methods of accounting. Large companies with heavy industrial and financial components (e.g. GE and GM) blend very distinct types of numbers into one measure, which tends to obscure financial performance rather than make it more transparent. The perennial problem of determining whether two entities are one and need to be consolidated, or whether they are separate entities, has been exacerbated by the development of Special Purpose Entities (SPEs). These were originally entities of very specific and narrow purpose, for which even 97% ownership did not create any co-dependency for the sponsoring firm. This original definition was followed by much broader usage among businesses, extending to the extreme abuse revealed at Enron. While statistics on the existence and nature of SPEs are not available, they are much more widely used than generally understood and are applied by many of the most reputable organizations in the financial markets. The evolution of organizations in the 21st century will lead to substantial deconstruction of business[1] in which internetworking technology will allow many functions to be outsourced, partnered, or turned over to the competition. While outsourcing can be a straightforward arrangement when implemented by a single independent firm, many forms of outsourcing will exist. Simply adding the component entities in a consolidation creates a very false sense of reality; these relationships are often more than their simple formalization. The core issues are ownership, inter- and intra-entity transactions, obligations for residuals, and commitments over time even if not contractual.

One of the reasons that the solution of the consolidation problem has eluded standard setters is that one of the motivations for consolidation is the obfuscation of individual unit performance. Consequently, standard setters never had the stomach to force substantial disclosure at the business unit level, nor the desire to force narrow business units of standardized form and standardized activity to report with the same accuracy and detail required from the consolidated entity. While the Jenkins report strongly suggested narrow and complete reporting at the line-of-business level, the changes effected by the FASB were limited and did not satisfy the real need that is emerging in the 21st century: creating dynamic standards and industry benchmarks for online real-time business monitoring. Comparisons among organizations should be at the sector level, not at the aggregate level where the addition of non-similar parts creates substantive obfuscation. A new type of aggregate entity should be invented and enough disclosure detail provided to allow for income calculation and asset allocation across and along the value chain.

2.1.2 Intangibles

The recent literature has started to pay increasing attention to intangible items. As discussed earlier, the new business measurement model must take into consideration a much wider set of assets such as intellectual property, human resources, brands, marketing investments, reputation, and other items. However, many of these items should not be added to the traditional total assets figure, as their addition could result in a very misleading total. While we can estimate the value of cash fairly easily with a small expected variance, a much larger variance surrounds the estimated value of, say, inventory or property, and a huge lack of reliability comes with the intangible items of the balance sheet. Figure 1 illustrates the different levels of precision of the different items in the balance sheet and supports the argument that these should not be added together, as this would lead to a total assets figure that is likely to be inaccurate.

Figure 1: relative precision of balance sheet items

New types of reports, which aggregate data with similar levels of accuracy, must be created. Research is needed to help the reporting agencies come up with methodologies that adequately incorporate heterogeneous reliability measures and specify their components. Many of these "intangible" items, some of which are currently disclosed in the balance sheet and others just described in the body of a financial statement or in a report to other agencies that are not financial in nature, must not only be disclosed but should also be presented in some form of comparable metric. In this work the idea of POCs (points of comparison) is suggested for these non-financial variables. These POCs will serve as the basis for disclosed relationships that link financial and non-financial variables of different companies.

Process or variable | Metric / point of comparison
Human resources | Pension retirement matrix; summary of training and investment in HR
Brand | Brand value assessment and method
Intellectual capital | Number of patents granted and applied for; expenses in R&D; external valuation of IP; method of valuation and estimates
Marketing | Market share; industry ranking

Table 1: Points of Comparison

The creation of metrics that describe non-financial variables is fraught with concerns and potential inconsistencies. As can be seen above, many of the measures are estimates of a very soft nature which will share the same problems that current financial estimates possess. Intangibles will have different measurement and valuation bases, and an entire non-financial GAAP must be developed for their disclosure.
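To make the idea of aggregating only items of similar precision concrete, the sketch below groups balance sheet items into reliability tiers and reports a subtotal per tier rather than one grand total. The item names, variance figures, and tier cut-offs are hypothetical illustrations, not proposed standards.

```python
# Sketch: aggregate balance sheet items by reliability tier instead of one total.
# All figures and tier boundaries are hypothetical illustrations.

items = [
    # (name, reported value in $m, estimated relative variance)
    ("Cash",                          20.0, 0.01),
    ("Receivables",                   45.0, 0.05),
    ("Inventory",                     60.0, 0.15),
    ("Property, plant & equipment",   71.0, 0.25),
    ("Acquired intangibles",          75.0, 1.00),
]

def tier(rel_var: float) -> str:
    """Assign an item to a precision tier (hypothetical cut-offs)."""
    if rel_var <= 0.05:
        return "high precision"
    if rel_var <= 0.25:
        return "medium precision"
    return "low precision"

report: dict[str, float] = {}
for name, value, rel_var in items:
    report[tier(rel_var)] = report.get(tier(rel_var), 0.0) + value

for t, subtotal in report.items():
    print(f"{t}: {subtotal:.1f}m")
# Items of very different reliability are never collapsed into one "total assets" figure.
```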

2.1.3 Materiality

The accounting profession has struggled for years with the concept of materiality. The audit opinion states that financial statements "fairly represent" the financial health of an organization. The materiality threshold is, in engineering jargon, an indication of "allowable error in measurement." Current audit practice relative to materiality has been in place for three decades. It represents a compromise between the cost of audit investigation in manual records and the benefit for stockholders of this investigation. Information technology has changed dramatically, allowing for cheaper and more effective controls and investigation and unbalancing this archaic compromise. While the tradeoff between the accuracy of measurement and the cost of assertion continues to be real, the break-even point has changed in reality but not in practice. A likely and desirable change, leveraging technological change, is the justification of the audit turning towards the improvement of data quality at the client (Vasarhelyi and Cohen, 2005[2]) and providing a variable level of assertion depending on the asserted process. Clients and auditors would agree on the assertion needed for different processes, subject to minimal requirements set by statute. Business entities that have real needs for data quality and validation would decide where the optimum tradeoff lies and pay accordingly. This would create a much larger economic threshold for assurance services, as companies already pay much attention to data quality. In a future world of a universal data bus, balkanized information being transferred among interoperable Web Services will create even larger concerns for data quality. While the concept of the financial statement audit will continue for a while, a new set of assurance types will emerge where auditors, or other assurors, will place an imprimatur on data at the tag level. This imprimatur can be at the data accuracy level (this data is 98% correct) or at the process level, where effective controls that act on the data would be either listed or rated. Obviously these two approaches dovetail and can be used simultaneously. Furthermore, it must not be lost that a wider set of assurance services may emerge, with classes such as wider audits, intervening audits, ubiquitous audits, control rating audits, causal audits, etc. As continuous audit techniques become more prevalent, the entire economics of auditing and financial report preparation will change. With the cost of automatic procedures becoming negligible in an ERP environment, so will the cost of conducting analytic procedures on a real-time basis. The tradeoff between sampling and full population testing will shift, akin to the change in the materiality threshold. More generally, the evolution and ubiquity of ERPs will fundamentally lower the costs of compliance and reporting. The basic cost of preparing a report that obeys a particular auditing/accounting standard will become slight, as it is prepared by the ERP provider and pulled out as a standard product. Setup costs, however, may vary among installations as the basic data for the new requirement may not be available.

2.1.4 Stale, erroneous, and opaque information

Annual reports have turned into major tools of public relations. Currently the idea of simply publishing the "standard packaged in the ERPS" report is unthinkable; annual reports are tools of "spin." While this is not a palatable thought for many, it is clear that the future world is one of more and more regulatory compliance and, consequently, of the organization provisioning substantively more information. The spin mentality must give way to multi-dimensional realistic reporting that is drawn directly off corporate systems and deposited or delivered to users without expensive (PR) manipulation. Specific reputational penalties must ensue from issuing stale, erroneous, and opaque information. Today's paranoid concern for breaches in competitive intelligence, where competitors discover important economic facts about the reporting business, must give way to a more cooperative data attitude where both society and the corporation benefit from the existence of comparison benchmarks in the many facets of business. Just as entire sectors today cooperate in the development of XML derivative standards to create interoperability between applications and data transitivity in the value chain, these sectors must cooperate in the development of disclosure standards that can be compared and used for industry benchmarking. Competitiveness has to be preserved by fast, ever-improving processes, timely research, and aggressive data sharing, not by self-serving paranoid opacity that slows the progress of science and interferes with the natural economic optimizing process of capital allocation.

2.1.5 The specification of contractual terms in the measurement model

One clear shortcoming of today's reporting model is its focus on realized operations and its ignorance of a large set of tacit and contractual obligations that often determine much of future economic activity. Organizations, their clients, their business partners, and suppliers are linked by a network of contracts that are formal and informal. Many of these contracts present larger liabilities for future operations than most reportable events. For example:

* a power utility may have a fuel supply contract that is 10% over current market price for the next 10 years
* a business concern may outsource most of its supply chain and as a result may have consensual obligations even if these are not contractual
* a business concern may have "return" agreements with its clients for inventory that is obsolete or cannot be used or sold
* a company may have a long-term practice of supporting local and communal projects to enhance the environment
* a company may have many social welfare practices relative to employees that cannot be stopped
* a company may have passive obligations for environmental cleanup that are not recognized

These types of instances and the non-reported legal contingencies are often much larger than the liabilities typically reported in annual reports under contingencies. Only a probability-based system of contingency reporting can provide a description that is useful and realistic in an information society (a sketch of such probability-weighted disclosure follows the list below). Where clear obligations (and benefits) are not available, a deeper standard of disclosure applies, under which disclosures such as the following must be prepared:

* legal, operational, and contractual contingencies
* management compensation contracts at a much deeper level (including a taxonomy of types of compensation)
* hyperlinks to fuzzy contracts or non-standard financially engineered contracts
* description of corporate litigation
* description of government investigations
* etc.
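A minimal sketch of probability-based contingency reporting is shown below: each obligation is disclosed with a management-assessed probability and amount, from which an expected exposure and a worst case are derived. The obligations echo the examples above, but every probability and amount is a hypothetical illustration.

```python
# Sketch: probability-based disclosure of contractual and tacit obligations.
# Every probability and amount below is a hypothetical management estimate.

contingencies = [
    # (description, probability of becoming payable, amount in $m)
    ("Above-market fuel supply contract (10 yr)", 0.90, 120.0),
    ("Return agreements on obsolete inventory",   0.40,  35.0),
    ("Passive environmental cleanup obligation",  0.25,  80.0),
    ("Pending litigation",                        0.15,  60.0),
]

expected_exposure = sum(p * amount for _, p, amount in contingencies)
worst_case = sum(amount for _, _, amount in contingencies)

for name, p, amount in contingencies:
    print(f"{name}: p={p:.0%}, amount={amount:.0f}m, expected={p * amount:.1f}m")
print(f"Expected exposure: {expected_exposure:.1f}m (worst case {worst_case:.0f}m)")
```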

2.1.6 Valuation

The accounting profession, due to a highly litigious environment and the inherent difficulties of probabilistic measurement, has resorted to the more confirmable and less valid forms of modified historical-based reporting. Furthermore, with the increased consideration of non-financial measurements, where organizations try to assess the value of their workforce, of their intellectual property, of their sustainable resources, etc., the temptation is to go back again to the historical amounts invested in these items; for example, valuing an employee based on the company's investment in his/her education, professional training, etc. This is one example of a very intractable set of problems. The standard to apply here is whether the information user will be better or worse served by being supplied verifiable (say, historical cost based) investment information rather than estimates which may be more indicative of future value but are less verifiable. If the estimate is used, will this information be more or less reliable than the old method? And can a structure be developed so that users can download this data and perform their own analytics on it? The modern world is developing a wide set of live markets whose by-product is online real-time valuation of many assets. Research is needed to understand how prevalent this type of information is and how expensive it is to harness. Clearly the new economy has troves of transaction prices, valuation prices, indices, price lists, and live exchange data available on a minute-by-minute basis. While the type of asset concentration changes substantially from sector to sector,[3] current values may exist for a substantive set of assets, and temporal estimates (say weekly or monthly) of values may exist for many others. The elements of such a valuation approach would include:

* Some assets measured in some form of high-fluctuation transaction-based values following real-time indices
* An account for valuation changes that allows valuation changes not to flow through income
* Income flow-through only when asset realization occurs, with this calculation using some form of inflation adjustment
* Where appropriate, even future indices may be applied as long as the documentation is clear
* Just as today we keep depreciation schedules for major property items, the new model should have valuation schedules for, say, the largest 100 assets of the corporation
* The economics of information today are such that constant evaluations of asset values should be doable, disclosable without prejudice to competitiveness, and usable by the user's analytic tools
* Present value of any future income flow, with allowance for best estimators and their variance (see the sketch after this list)
* Processes, nature of account, inter-process controls, and other lesser items determine the reliability of numbers at the transaction, reporting aggregate, and general ledger levels, among many
* Assurance / audit processes change these values on a continuous basis (real-time seal, alarms, control tickers, points of comparison)
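A minimal sketch of the present-value bullet above: discount a best-estimate income stream and propagate the stated variances into a band around the resulting value. The cash flows, relative variances, and discount rate are hypothetical, and independence of the period estimates is assumed purely for simplicity.

```python
# Sketch: present value of an estimated income stream with a variance band.
# Assumes independent period estimates; all inputs are hypothetical.
import math

rate = 0.08                                              # discount rate (assumed)
cash_flows = [(10.0, 0.10), (12.0, 0.15), (15.0, 0.25)]  # ($m, relative std dev) per year

pv = 0.0
pv_variance = 0.0
for t, (cf, rel_sd) in enumerate(cash_flows, start=1):
    discounted = cf / (1 + rate) ** t
    pv += discounted
    pv_variance += (discounted * rel_sd) ** 2            # variances of independent terms add

pv_sd = math.sqrt(pv_variance)
print(f"Best-estimate value: {pv:.1f}m +/- {pv_sd:.1f}m (one standard deviation)")
```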

2.1.7 Deterministic representation of stochastic phenomena

The litigious nature of American society has led to poor compromises in the disclosure of data. The profession, stung by criticism and litigation, has often decided on and set standards for single-number disclosure of stochastic assessments. The profession has not issued attestations stating that a particular financial statement is reliable at the 95% level, has not allowed management earnings forecasts to be stated as ranges, and has not stated that most mineral reserves are of a certain value based on the commodity prices of the last 12 months. However, it is pretty clear that statements of this type would be preferable for sophisticated users. While many statistical estimates pervade annual reports (e.g. pensions, bad debt, etc.), these are stated in a deterministic format, emphasizing the basic weakness of traditional reporting. When the distance between the report and its underlying stochastic reality gets too big, the credibility of business reporting disappears. If the variances around the values of estimates are very large, the credibility of point estimates is very small. The new business reporting model will have to rely on a wide set of disclosed probabilistic assessments for past results, current actions, and future estimates. It is better to be about right than exactly wrong. This work proposes a set of probabilistically oriented reports whereby all items in the traditional statements (BS, IS, & FF) are reported as point estimates with a variance measure. For example, the corporate cash level at 12/31/xx was 20m plus or minus 5%, our best estimate for the value of inventories is 60m plus or minus 15%, our current estimate for PP&E is 71m plus or minus 25%, and the intangibles in our balance sheet originated by the merger with ABC corporation are 75m plus or minus 100%. Each of these numbers is composed of numbers from each division, and each of these numbers has its intrinsic variance. While the public today glazes over at the thought of point estimates and variances, a targeted educational effort could help significantly, and the investing public can ultimately use the point estimate as a deterministic estimate if so desired. Extending the reporting range to non-financial variables, key numbers in financial and non-financial units would describe non-financial items and hyperlink to the bases of the estimates, using point-of-comparison indices developed by the specific industry relevant to the particular line of business. Assessments of quality control and probability-based scorecards would complement this picture.

Figure 2: Probabilistic reporting
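Using the illustrative figures above, the sketch below shows one way point estimates and their variance bands might be combined into a reported total with an overall band. Independence of the component estimates is assumed here only to keep the arithmetic simple; in practice the dependence structure would itself need disclosure.

```python
# Sketch: combining point estimates and variance bands (figures from the text above).
# Independence of the components is assumed for illustration only.
import math

estimates = {
    "Cash":                       (20.0, 0.05),
    "Inventories":                (60.0, 0.15),
    "PP&E":                       (71.0, 0.25),
    "Intangibles (ABC merger)":   (75.0, 1.00),
}

total = sum(value for value, _ in estimates.values())
total_sd = math.sqrt(sum((value * rel) ** 2 for value, rel in estimates.values()))

for name, (value, rel) in estimates.items():
    print(f"{name}: {value:.0f}m +/- {rel:.0%}")
print(f"Combined: {total:.0f}m +/- {total_sd:.0f}m ({total_sd / total:.0%})")
```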

2.1.8 The disclosure of predictive information

Congressional hearings during the malfeasance crisis demonstrated deep skepticism about earnings projections by management. However, management is clearly in the best position to provide predictions of company performance, and so the issue is how to present and constrain this information to avoid spin and self-serving stock manipulation. If managers have stock options or stock holdings that become available in a short period of time, they can overstate earnings to create a spike in valuation until results come in. The above supports the argument that a new process and new requirements for predictive information must be developed. Figure 3 breaks down information relative to its time frame. Future information there is focused on 1) leading indicators and basic relationships and 2) forecasting and models. Consequently, the emphasis is both on specific numbers and on the structure that is driving these numbers.

Figure 3: time frames of reporting

2.1.9 Semantic versus quantitative description of accounting phenomena

A company's annual report contains traditional financial statements, footnotes, and a wide set of textual materials. An entire information intermediation industry has emerged to extract, standardize, and organize information for the final user. Large companies can acquire S&P's Compustat, which contains most public US companies' financial data normalized for use. The emergence of the XBRL standard may facilitate the utilization of data and comparison among companies at a more democratic level, where individual users have an Excel add-on and harvest the information themselves without any data transformation. However, most information contained in an annual report or an SEC filing is not the formal information from the Balance Sheet, Income Statement, or Uses and Sources of Funds. It includes footnotes, comparative history, and a wide array of soft information available from the annual report. New techniques will need to be developed to extract, categorize, and disseminate the qualitative information contained in financial statements. Overcoming these deficiencies in the current reporting system requires taking advantage of radical changes in the technological basis of business and an equally important shift towards a process perspective of the firm.

2.2 The Real-time economy: The Technological Basis for Reengineered Business Reporting

The real-time economy requires dynamic adaptive models for its realization. The core objective of the real-time economy is the reduction of latency between and within processes. Latency reduction will reduce capital occupancy costs by occupying assets (physical and labor) for less time. Technology now provides a public and common communication infrastructure, increasing information sensing for automatic measurement, and large integrative databases. The second major wave of Internet usage is beginning to take shape, and now that there is inter-linkage of systems on a global and ubiquitous basis, the age of interoperable applications is emerging. Interoperability means that applications that interact do not need to be closely coupled but share common data specifications, which allows independent applications to work together without major adaptations. The W3C has proposed XML (extensible markup language) as the tool for data standardization for interoperability.

Figure 4: Basic XML transaction

These information capsules, as described in Figure 4, will be routed through the value chain. For data transfer to be effective it is essential that data be self-explanatory and that the applications managing the data's use and transfer be able to ubiquitously understand its content. The XML derivative for managerial accounting is XBRL GL (eXtensible Business Reporting Language / General Ledger), while the closely associated XBRL is focused on external business reporting.
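A minimal sketch of such a self-describing information capsule, built with Python's standard XML library, is shown below. The tag names, account code, and values are hypothetical illustrations and are not drawn from any official XBRL GL taxonomy.

```python
# Sketch: building a self-describing XML "information capsule" for one transaction.
# Tag names and values are hypothetical, not an official XBRL GL taxonomy.
import xml.etree.ElementTree as ET

txn = ET.Element("transaction", attrib={"id": "2004-000123", "currency": "USD"})
ET.SubElement(txn, "type").text = "materialPurchase"
ET.SubElement(txn, "amount").text = "15200.00"
ET.SubElement(txn, "counterparty").text = "Supplier A"
ET.SubElement(txn, "timestamp").text = "2004-03-15T14:05:00Z"
ET.SubElement(txn, "glAccount").text = "5010"   # hypothetical chart-of-accounts code

# Serialize the capsule so any interoperable application can interpret its content.
print(ET.tostring(txn, encoding="unicode"))
```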

2.3 The Evolving Path for XBRL

While the adoption of standards for external business reporting is inevitable, this process is by its essence a dynamic road.[4] Its originally proposed structure will have to withstand the test of usage, while the standard itself changes over time to improve its usefulness. Most likely a series of problems will arise, which include:

* Heterogeneous acceptance of the standard across countries and sectors
* Some regional differences in the interpretation of the standard
* Some features of the standard will become entangled in local legislation and practice, causing incompatibilities
* Some adoption will be statutory, some voluntary
* A Babel tower of taxonomies will emerge before some simplification and mapping occurs
* The expansion of the standard to less specific (semantic) regions of business reports will be slow and confusing; for example, the labeling of footnotes will evolve naturally
* After some positive standardization of balance sheet, income statement, and funds flow information, it is likely that there will be progressive agreement on key disclosure items and performance indicators, which will have specific tags

These agreements hopefully will be synchronized with the emergence of some consensus on an enhanced business reporting model (EBRM). Furthermore, key elements of common footnotes and other non-financial data will be progressively tagged with specificity.[5] Financial intermediaries will be on the cusp of this evolution, adding structure to the evolving (and expanding) standard. For example, they will create databases of XBRL disclosures and add data integration with additional sources to decrease transaction costs to the users. Also, they will progressively incorporate the above-mentioned key disclosure items and performance indicators into these databases, saving users the need for data collection, manipulation, backward compatibility construction (creating time series), and model building. While the large financial shops will continue building their own models, smaller entities and committed investors will use templates provided by these financial intermediaries, increasing substantially the democratic nature of market information. Furthermore, while the traditional domain of data will be expanded in search of transparency and made accessible and easily usable by XBRL, many sources of less traditional information will come into being. For example, the FDIC is formalizing the usage of XBRL in the collection of call report data from banks and applying a large set of business rules during the collection process, both to decrease the potential for errors and to allow analytic technologies to weed out fallacious reports. US corporations are subject to many reporting regulations (FCC, PGC, NYSE, OSHA, etc.). These regulations will eventually require reporting along a type of XBRL taxonomy and substantive convergence towards common requirements in an attempt to decrease the compliance burden. Financial intermediaries will lead in the creation of these integrated databases and serve as a bridge towards common taxonomies and the creation of data streams that are backwards compatible (going back and preparing data for periods prior to regulatory requirements). XBRL, as well as many of the other XML derivative standards, will create a much more fluid path for data exchange. Figure 5 displays the interchangeability of internal and external value chains and the free flow of transactions of different natures (say labor, material, purchases, and services).
These relationships, which are structural, can be modeled and controlled by the use of real-time adaptable relationship models (continuity equations). Companies will choose the processes where they have competitive advantage and will outsource (create alliances, partner) the ones where they cannot provide improved margins. As a result, an entire new set of data integrity and ownership concerns will emerge.

Figure 5: Data transfer chains

The flow of this data will allow a new form of automatic corporate reporting and management to evolve. The transactions flowing through the pipe will be constantly measured and accumulated to maintain online real-time balances of transaction flows that may or may not be disclosed[6] to the public. Figure 6 illustrates the arrival of three different types of transactions that are accumulated into continuous "income statement type" reports. Ultimately twenty-first century reporting will focus on the monitoring and control layer, where measurements of corporate processes will be compared with process performance models for the determination of variances. If these variances turn out to be too large, some form of management reaction will be necessary. In a real-time society much of this comparison and the ensuing management reaction will be automated through simple management bots (automated management actions), while unusual events will be relayed to real managers or to audit action. Transactions will be accumulated into detailed general ledger accounts (following an XBRL GL, i.e. more detailed, taxonomy) and will be available in the company's Enterprise Resource Planning System's database for extraction. These extracts will typically be very numerous (in the form of tens of thousands of electronic reports) and standardized by the particular version of the ERPS. A small subset of these reports will be extracted and carefully staged to represent the corporate "official" business reports. These business reports will encompass the corporate Balance Sheet, Income Statement, and Funds Flow, which are currently easily tagged into XBRL, but will also serve as the basis of a wide cadre of footnotes, the body of the financial statements, and information releases to many different entities and stakeholders. Further into the future, some degree of semantic processing technology as well as the issuance of standards[7] will allow progressively more business report content to be narrowly coded into XBRL tags. Consequently, and finally, an increase in transparency, such as clear comparability, will be possible in footnotes such as pensions, compensation, accounting policies, extraordinary events, contingencies, options and warrants, marketing plans, intellectual property assets, human resources deployed, intangible assets owned, etc. Corporate management accounting is already the owner of a wide set of information. In the modern world, state-of-the-art companies have much online / real-time information. For example, no bank could live without its current daily financial balance closing, as it would not be able to apply funds overnight; no manufacturing concern could live without real-time inventory information, as it would not be able to practice just-in-time manufacturing; and most companies would have great competitive difficulties if they did not have real-time payables and receivables information to collect or provide discounts based on time characteristics.
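A minimal sketch of the accumulation idea behind Figure 6: tagged transactions stream in, are classified by type, and running balances are available for query at any moment. The transaction types and amounts are hypothetical.

```python
# Sketch: continuous accumulation of tagged transaction flows into running balances
# that can be queried at any instant (cf. Figure 6). All data are hypothetical.
from collections import defaultdict

running_balances = defaultdict(float)

def post(transaction_type: str, amount: float) -> None:
    """Accumulate one incoming tagged transaction into its running balance."""
    running_balances[transaction_type] += amount

# A few transactions arriving through the value chain.
post("labor", 12000.0)
post("materialPurchase", 15200.0)
post("sales", 43100.0)
post("labor", 8000.0)

# An "income statement type" snapshot can be drawn at any moment.
snapshot = dict(running_balances)
print(snapshot)   # {'labor': 20000.0, 'materialPurchase': 15200.0, 'sales': 43100.0}
```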
The monitoring and control process will eventually dominate corporate information processing, with many of its components automated, as standards evolve to provide interoperability. The next two decades will witness the progressive development of management action algorithms using automatic (XML-derived) data standards to reduce the latency of the processes themselves and of the transmission of data among processes. While current technology does not seem able to substantially accelerate trucks and airplanes delivering goods between locations, to decrease the lunch breaks of clerks, or to change the speed of consulting engagements, modeling and decision automation will accelerate management action and bureaucratic information processing by orders of magnitude.

Figure 6: the reporting layer

The business reporting cycle will also undergo substantial acceleration. Recently Cisco and Motorola have announced their "virtual close." This process brings the accounting closing to the daily cycle and allows for a substantial decrease in accounting adjustments and end-of-period earnings management. This process will also increase the volatility of results, reflecting the realities of the real-life business process. While "continuous reporting" should be a process with no closings and a constant set of balances, the "virtual close" has approximated its timing. Although the technological foundations of real-time reporting are available now for many companies, business reporting practice, legal liability, and management's reticence about accountability along many dimensions have effectively slowed down the adoption process. The W3C (World Wide Web Consortium) has proposed not only a web infrastructure but also tools for Web development (SOAP) and the basic framework (XML) standards for data interchange. It has also formulated the philosophy of progressive anonymity on the Web, where data flows through the universal data bus (the Internet) and applications can sniff it out and provide interoperable services. While this vision is still quite fuzzy, it can be visualized in many domains and is now the venue of many nascent commercial efforts. An entire family of potential Financial Web Services that will cover the scope of many current services and prospective ones is represented in Figure 7. While today accounting functions are performed inside a series of software packages for large, medium, and small companies, in the future many expensively updated functions (such as locality and state tax tables) will be served by Web Services. It is easy to envisage depreciation services, asset valuation services, intangible valuation services, option valuation services, and transaction security and tracking services, among many others. Today we already find many companies (say Boon) providing special reporting functions, for example for the SEC and for the FDIC. Many layers of special reporting are possible and will eventually evolve to support business. In the assurance arena we currently have a major standoff due to the emergence of the PCAOB (Public Company Accounting Oversight Board) and the ensuing immobilization of the AICPA, the big accounting firms, and other related market players. However, the reality is that many different assurance needs are arising, some of which are being satisfied by the accounting profession while many others are either being ignored or are being addressed by other professions.
The AICPA[8][9][10] has reacted by creating the WebTrust and SysTrust services, which have not yet developed substantial traction. Eventually, however, Web-based assurance services, more robust than the current WebTrust, TRUSTe, etc., will emerge to support Web site trust, transaction trust, valuation trust, data trust, etc. Three other high-potential financial Web services will involve analytic services (where the Web entity will provide models for the lower and middle markets), fraud detection services (where transaction streams and balances will be continuously scrutinized and compared with fraud profiles), and some form of data-level assurance where each datum will carry a tag (or tags) indicating its level of reliability, its path, and the reliability of its underlying control processes.

Figure 7: Web Services
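A minimal sketch of what tag-level, data-level assurance might look like: each reported element carries a reliability score and references to the controls relied upon and the assuror. The field names and values below are hypothetical and do not correspond to any existing assurance standard.

```python
# Sketch: a data element carrying its own assurance metadata (an "imprimatur" at the tag level).
# Field names and values are hypothetical; no existing assurance standard is implied.
from dataclasses import dataclass

@dataclass
class AssuredDatum:
    name: str
    value: float
    reliability: float      # e.g. 0.98 means "this data is 98% correct"
    control_rating: str     # rating of the processes that produced the value
    assuror: str            # who placed the imprimatur on the tag

receivables = AssuredDatum(
    name="tradeReceivables",
    value=45_000_000.0,
    reliability=0.98,
    control_rating="effective",
    assuror="Independent Assuror LLP",
)

print(f"{receivables.name}: {receivables.value:,.0f} "
      f"(reliability {receivables.reliability:.0%}, controls {receivables.control_rating})")
```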

2.4. Continuity Equations: The Conceptual Basis for Reengineered Business Reporting

In this economy business processes are measured on a continuous basis through different types of sensors that capture digital measurements of business metrics. These data are captured at a far finer granularity in time and detail than has ever been possible before.[11] Everything else provided by this ability for more frequent reporting is a by-product of this fundamental change in the capability of data capture. What that data stream makes possible is measurement with an unprecedented degree of correspondence to underlying business processes. Furthermore, the utilization of this data stream and its comparison with a new class of performance models that must be developed[12] will provide the basis for many automatic management decision models in which the slowest element of the process, the human being, is excluded by automation. Figure 8 describes a formalization of these processes of data capture, comparison standards, exception standards, and meta-processes for measurement, control, management, and assurance. Business processes, which are defined as "a set of logically related tasks performed to achieve a defined business outcome" (Davenport and Short, 1990), are considered today to be the fundamental atomic elements that make up a company.[13] Thus a company is now described by what it can do rather than by its assets. That changed mindset has yet to be incorporated into traditional management and its assurance. What is fundamental about the real-time economy is that it brings the process approach explicitly into management through the very prompt measurement of processes and the comparison of these metrics with dynamic benchmarks that represent prescribed levels of business performance. Benchmarks that allow for the comparison of business process metrics with a standard (or model) will assume a much larger importance. The real-time economy discussed above, where processes are constantly monitored and their measurements compared with a benchmark for control purposes, requires highly dynamic adaptive models that can adequately represent the normative values that metrics must assume. Furthermore, in addition to basic benchmarking for first harmonic data comparison, second harmonic variance is also necessary for control purposes. Figure 8 illustrates this issue, where processes are monitored and controlled by information systems, models, and management. When noteworthy exceptions occur, adjusting management actions are effected. Some of these exceptions are (perhaps also) of assurance interest and are alarmed for audit purposes and directed to the audit "control" system.

Figure 8: Meta-processes in measurement and assurance - data capture and control

The monitoring and control of an organization's processes can be viewed as a five-level set of activities as described in Figure 9. The structural level (processes) is measured, and metrics are extracted and captured for the data level. Data are stored at a high level of granularity, say, the basic transaction level. This data history may be examined under many distributions (cuts) such as time period, division, product, function, originator, etc. The third level encompasses the relationships perceived or prescribed among metrics, against which the organization performs control functions. For example, all flows from one process that reach the next one would constitute a one-to-one relationship, and any differences would be exceptions.
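A minimal sketch of such a one-to-one flow control is given below: the metric captured from the downstream process is compared with its model (here, the upstream flow), and an alarm is raised when the variance exceeds an acceptable threshold. The process counts and the tolerance are hypothetical.

```python
# Sketch: a simple continuity-equation style control between two processes.
# Everything reaching process B should have left process A; differences are exceptions.
# The counts and the tolerance below are hypothetical.

def flow_control(upstream_count: int, downstream_count: int, tolerance: float = 0.01) -> None:
    """Compare the downstream metric with its model (the upstream flow) and alarm if needed."""
    model = upstream_count                      # expected value of the downstream metric
    variance = abs(downstream_count - model)
    if model and variance / model > tolerance:  # variance standard exceeded
        print(f"ALARM: {variance} items unaccounted for between processes "
              f"({downstream_count} received vs {model} sent)")
    else:
        print("Flows reconcile within tolerance")

flow_control(upstream_count=10_000, downstream_count=9_874)   # fires an alarm
flow_control(upstream_count=10_000, downstream_count=9_995)   # within tolerance
```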
In general, to use metrics captured from level one in a control process it is necessary to have the measurement of the actual value (the metric), a model for comparison, and a model of variance (which specifies the acceptable variation). The control process will compare the metric with the model, calculate the variance, and then decide if the variance is acceptable. If not, an alarm is triggered that may call for management action and/or assurance. The models may range from very simple univariate levels to very complex multi-entity relationships like continuity equations. Among the types of models in continuous auditing we find:

* A fixed number (normative or empirically derived)
* An adjusted number with some form of analytic related to seasonality, hierarchy, or structural relationship

The structural relationships can be represented by continuity equations and may represent:

1. Reconciliation structures
2. Semi-deterministic relationships
3. Structures across processes
4. Empirical relationships across processes
5. High-level empirical relationships among KPIs

The fourth level is the level of analytic monitoring and links very high level measures across processes. KPIs (key performance indicators) can be used to help understand process consistency as well as process performance. If measurements are not available at a lower level, this level serves to provide coarse alarms of major process difficulties. The fifth level is a meta-process level where the actual control and monitoring functions are performed based on continuous measurement, monitoring, and proactive exception handling. Building on this model, the proposed solution is based on a view of a business in a real-time economy that would address some of the ailments discussed earlier and encompasses the following factors:

* Creation of a multivariate measurement model that does not focus exclusively on earnings per share and allows users to predict and evaluate a business's performance on a multivariate basis, even if these measurements are in different dimensions (apples and oranges)
* Creation of a measurement model that is oriented not only to investors but to other stakeholders of the business
* Creation of a measurement model that represents not only static measurements of the business but also the types of relationships that characterize the business; these relationships can be structural, relational, empirical, or comparative in the form of sector benchmarks

Figure 9: Galileo Enhanced Business Reporting Model

Based on the examination of the current reporting model (GAAP) under this framework, it can be concluded that a dynamic world cannot be well measured with static measurements, and that the technology exists for a more dynamic method of measurement to evolve. The disclosure model is very disjointed when the economic status of a firm has to be shown on a piece of paper (flat) and at very wide discrete intervals. Furthermore, while markets seem to value firms on a wide range of non-financial assets, the GAAP-based model focuses on financial items. It is also concerning that the measurement process focuses on the physical assets of companies, more typical of the industrial age, while the valuable assets of an information economy are neglected. In an age where companies outsource many of their processes, suppliers carry the inventories of many companies, and RFID technology allows for specific identification of inventories, parts, and assets, we still use FIFO and LIFO inventory valuation methods.
In an age where dynamic markets exist in which products are valued every minute, we still focus on forms of historical cost as a substantive part of our business reports. In days when it is well known that there is substantial leeway[14][15] of interpretation in every number that determines an entity's income, we still focus on earnings per share. Another irony is that in the last couple of years, and supposedly the next few, the FASB and the IASC have been focusing on the convergence of standards, converging towards a set of standards that is irremediably obsolete. If the measurement model is seriously compromised, progressively presenting less and less mapping to reality, the provisioning of assurance on these numbers is useless and is performed only for statutory purposes. It is not surprising, therefore, that accounting firms have progressively relied more on cursory analytical reviews and acted more like insurers than auditors. If the measures do not measure, even the best of audits would just assure bad numbers that do not mean anything. Most litigation against auditors happens in failure situations; bad measures do not detect these, so good or bad auditing does not change the auditing firms' risk profile much. Under these conditions, any downturn will show the underbelly of weak firms that have stretched their reporting to the limit, and their demise will punish CPA firms for purportedly "bad audits" or for irrelevant numbers that had little representativeness of the firm's economic health.

2.4.1 Levels & Basic Concepts

The Galileo enhanced business representation model in Figure 9 entails five levels: 1) the structural level, 2) the data level, 3) the relationship level, 4) the analytic monitoring level, and 5) the continuous reporting and assurance level. Furthermore, we will define five main types of concepts[16]:

* Metrics – Metrics are defined as direct measurements of the system, drawn from reports, in the measurement stage. These metrics are compared against system standards. If a standard is exceeded, an alarm appears on the screen. For example, in the auditing of a billing system, the number of bills to be invoiced is extracted from a user report. The number of bills not issued due to a high-severity error in the data is captured, as well as the total dollar amount of bills issued. These three numbers are metrics that relate to the overall billing process.

* Analytics – Analytics are defined as functional (natural flow), logical (key interaction), and empirical (e.g. "it has been observed that ....") relationships among metrics. Specific analytics, related to a particular system module, can be derived from the auditor, management, user experience, or historical data from the system. Each analytic may have a minimum of three dimensions: 1) its algebraic structure, 2) the relationships and contingencies that determine its numeric value at different times and situations, and 3) rules-of-thumb or optimal rules on the magnitude and nature of variance that may be deemed "real variance" to the point of raising alarms. For example, a billing analytic would state that dollars billed should be equal to invoices received, minus the value of failed edits, plus (or minus) the change in the dollars of retained invoices. The threshold number of expected invoices for that particular day or week (allowing for seasonality) must be established to determine whether an alarm should be fired.

* Alarms – Alarms are exception conditions where a measure and its standard are compared and the ensuing variance is larger than the variance standard. Actual experience with these issues indicates that several levels of alarms are desirable: 1) minor alarms dealing with the functioning of the auditing system, 2) low-level operational alarms to call the attention of operating management, 3) higher-level alarms to call the attention of the auditor and trigger "exception audits", and 4) high-level alarms to warn auditing and top management of serious crises. Establishing these alarm thresholds is a second harmonic development. The data and experience needed to understand the phenomena being measured to the level of specification of alarm standards are probably not available in most organizations.

* Standards or models – Standards or models represent the ideal state of the world in a particular process. Any monitoring process requires the comparison of a metric to a model or standard to determine abnormal conditions. Furthermore, the magnitude of this condition is evaluated against a "standard of variance" in the decision on whether an alarm should be activated. Models of variable behavior over time in real-time systems must be developed in a way that represents the real-time behavior of dynamic systems. The evolution of real-time monitoring needs adaptive models that take into consideration: seasonality, business trends, relationships between processes, timing between the processes, and the flow of anomalous but legitimate transactions from process to process.

* Method of measurement – The method of data capture and classification is an important variable in the future system representation scenario.
Continuously captured data can drive monitoring processes toward real-time exception measurement and alarming. The CPAS (Continuous Process Audit System) captured data through report scraping of electronic reports (Vasarhelyi & Halper, 1991). Different monitoring processes are progressively capturing data in more direct manners, such as data sensing, queries to databases, or the utilization of intermediate data (Hume, xxx) between batch processes.

At the most basic level, the structural level, a number of transactions are taking place in various areas of the business, and there are time lags between each (illustrated by the hourglass shapes). In the new real-time economy there is decreased latency between these processes, which makes it possible to achieve real-time or near real-time reporting. Automation decreases the latency of processes by orders of magnitude. The structural level represents a set of non-financial and financial processes that are interlinked in the generic process of wealth creation. There are physical, logical, and statistical relationships between the processes and between the different metrics of these processes. Figure 10 shows the lower-level process where intrinsic relationships exist. Marketing drives advertising, which drives sales. Once a sale is performed, part of the transactions (40%) generate immediate cash, while part (60%) tend to become receivables; of these, 60% are paid within 30 days, 20% within 60 days, 15% within 90+ days, and five percent become bad debt. Figure 11 represents a three-period cash flow that comes from these transactions.

Figure 10: sales to cash

Figure 11: Cash flow modeling

While these transactions can get complex, the effects are very measurable and their study can help create models that are structurally based. If sales are assumed constant, a Markov chain model can be used and input levels will approach an ergodic (steady) state. However, the structural linkages are more complex and the structural modeling can be extended to the ensuing boxes. The model is expanded, still structural in nature, by including the role of inventories and the role of provisioning in the process.

Figure 12: sales to cash, inventory and bad debts

This representation can be modeled by now including the role of inventory payments in the depletion of cash, and it can be an input to the provisioning equations which drive inventory management and other functions. While this modeling focused on inflows of cash in a multi-period setting, assuming three-period intervals, many different assumptions can be made. Figure 13 displays a more realistic set of flows for cash, a core variable that is worth modeling.

Figure 13: more complete cash structural model

However, the structural level (level 1) includes processes that are not financial and not necessarily structurally linked, such as inventory and provisioning (where physical factors such as obsolescence, shrinkage, and delays may have an effect), or even farther but still related processes such as marketing and CRM. For these, stochastic continuity equations are to be built based on experience parameters. For example, experience may say that for every dollar of advertising in the south region you generate 7 dollars of sales, and in the northeast only 5.
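A minimal sketch of the sales-to-cash structure described above, projecting cash inflows by period. The sales figures are hypothetical, and the collection percentages are interpreted as fractions of the receivables pool (60% / 20% / 15% collected at increasing lags, 5% bad debt).

```python
# Sketch of the sales-to-cash structural model described above.
# Sales figures are hypothetical; collection shares are read as fractions of receivables.

CASH_SHARE = 0.40          # of each period's sales, collected immediately
RECEIVABLE_SHARE = 0.60    # becomes receivables
COLLECTION = {1: 0.60, 2: 0.20, 3: 0.15}   # collected 30 / 60 / 90+ days later
BAD_DEBT = 0.05            # of receivables, never collected

def cash_inflows(sales_by_period: list[float]) -> list[float]:
    """Project cash received in each period from the given sales stream."""
    horizon = len(sales_by_period) + max(COLLECTION)
    cash = [0.0] * horizon
    for t, sales in enumerate(sales_by_period):
        cash[t] += CASH_SHARE * sales
        receivables = RECEIVABLE_SHARE * sales
        for lag, share in COLLECTION.items():
            cash[t + lag] += receivables * share
        # BAD_DEBT * receivables is never collected and would feed the provisioning model
    return cash

print([round(c, 1) for c in cash_inflows([100.0, 100.0, 100.0])])
```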
Figure 14: Level 1 - structural

The next level is the data level, where measurement of financial and non-financial indicators takes place, and individual pieces of data are reported with the ability to drill down, to look at historical performance, and to compare data across business lines, products, managers, etc. Most companies do this internally today through some form of spreadsheet analysis, but given the capabilities made possible by new systems and the decreased latency between processes discussed before, it is now possible through constant measurement to move to the relationship level.

The spreadsheet analogy

A spreadsheet program is a good metaphor for describing the IT architecture (and the measurement of business) of a real-time enterprise. But such programs also demonstrate the extent to which companies and their employees are often still stuck in batch mode. The data they use in spreadsheets are often out of date and must be put in by hand. … In contrast, modern spreadsheet software is as real-time as it gets. To a layman these programs look like tables with many rows and columns of "cells" (the data level in the Galileo model). But their most important feature—how these cells are related to each other—is invisible (the relationship and analytic monitoring levels in Galileo, now necessary to present a non-obfuscable view of business). Often they are connected by simple operations such as addition or multiplication. Investment banks in particular, however, use more sophisticated spreadsheets in which the cells are linked by dozens of "macros", sometimes quite elaborate sub-programs. If a user changes the data in one cell, many others are automatically recalculated. To advocates of the concept, the real-time enterprise is a giant spreadsheet of sorts, in which new information, such as an order, is automatically processed and percolates through a firm's computer systems and those of its suppliers. Thus a simple inquiry such as, "When is my order being shipped?" can be answered immediately. Many consumers have already encountered real-time business without realizing it, for instance when they order a Dell computer. The firm's website allows customers to check the status of their order at any time. Juice Software, based in New York, has developed a set of programs that allow users to turn their spreadsheets into living documents. With a few mouse clicks they can link a spreadsheet cell to a data source, for instance a corporate database (such an interconnection and tracing capability is very important in the reporting and assurance of the modern enterprise). Smart software on a server in the network ensures that this cell is automatically updated whenever the information changes. Users can also connect their spreadsheets among themselves, so if one member of a project team changes a cell, the changes automatically appear in all the team members' files. (Extracted from the Economist[17]; annotations in parentheses added.)

The data level in the modern enterprise, as described in Figure 15, entails many measurements drawn from the processes described above and listed in the set of POCs for non-financial variables.
Furthermore, with the advent of databases, OLAP tools, and style sheets, the "spreadsheet of measurement" of the modern enterprise incorporates the capability of drill-down (into finer details of the data structure, at the extreme into certain characteristics of a transaction such as amount or geography), accumulation of history not only of reported variables but also of desired aggregates (at the extreme, say, sales for a certain store), and distributional characteristics (the ability to cut across parameters such as geography, product, or division).

Figure 15: Level 2 - data

The relationship level links key variables and allows the modern manager in a real-time society to make decisions based on current relationship models in addition to historic information. Under the Galileo model this allows a deeper level of disclosure that explains how the measurements of the data level are related to each other. The analogy is the formulae in a spreadsheet that exist in the background of the report. These relationships can be structural or stochastic, as described above. In Figure 16 the relationships involve sales and marketing, care queries and number of sales, and potential delay relationships.

Figure 16: Level 3 - relationship

To further explain this disclosure level, a balance sheet could be transformed into a Galileo report (a la sustainability report) and presented in sheet 1 of a spreadsheet, while a model relating some of the variables would be in the second sheet, and the user could calculate the variances in the third sheet. The disclosure of these relationships, in addition to being valuable in increasing reporting transparency and deterring reporting obfuscation, would have a valuable feed-forward effect, motivating better modeling of business and improved self-insight into the causes and consequences of business numbers. Figure 17 and Figure 18 introduce different representations of the relationship level.

Figure 17: processes, measures and relationships

In Figure 17 relationship 1 relates marketing to e-care. This is an obvious relationship whose parameters must be examined and estimated with care. In this relationship increased marketing leads to increased sales, which ultimately increases the demand for e-care, contingent on the effectiveness of advertising and sales efforts, the quality of the products, and the accessibility of the care. The care effort also leads to secondary sales. Relationships 2 & 3 are narrower and more direct.

Figure 18: the spreadsheet disclosure model

While eventually most corporate systems will have extensive levels of detail and enough statistics to sustain substantial relationship-based monitoring, the Galileo model also has a higher level of relationship monitoring. This level is called the analytic monitoring level and relies heavily on industry- and company-specific key performance indicators (KPIs). Level 4 (Figure 19) is aimed both at third-party monitoring of corporate performance and at internal monitoring, in particular where information is not sufficient. Companies monitoring their processes step by step may miss significant macro trends in their performance (missing the forest for the trees) and will also benefit from having the KPI monitoring level, where a better understanding of the business is obtained. Strategic planning level managers will tend to focus on level 4, while management and operational control managers, in Anthony's notation (see Figure 19), will focus on level 3.
In analytic monitoring, significant deviations from the norm for key performance indicators can be identified. This may indicate that a process is out of sync even if detailed support does not exist. The next step would entail detailed analysis to capture the reason for the imbalance. Drill-down capabilities still exist at these levels, and they can be extremely powerful.

Figure 19: Level 4 - monitoring level

Finally, continuous reporting and assurance (Figure 20) ensure the reliability of systems and data through transaction assurance, estimate assurance (on management projections), compliance assurance (compliance with GAAP), and so on, which enables important business information to be reported externally as well as internally with confidence. What results, in the end, is a much more robust, automated reporting process that tells much more about the effectiveness of management, specific divisions, etc., providing accurate and useful data on a real or near real-time basis. Furthermore, XML tagging will enable interoperability, making connections across internal and external partnering entities possible.

Figure 20: Level 5 - continuous reporting and assurance

Figure 21 displays three types of XML-tagged transactions flowing into the organization, which can be metered by some form of continuous reporting that displays cumulative levels of flows in a chosen time period, for example all labor purchases (even if not yet paid) for the first 44 days of the year. The data delivered to the system carries some form of data-level assurance (for example a measure of the reliability of its generating systems, or an encrypted tag with an auditor's assurance) or relies on other forms of assurance of system integrity (e.g. SysTrust). The data is delivered to the corporation's ERPS under some form of XBRL GL schema with a reasonably fine chart of accounts. The accumulated data can, at any time, be queried for some form of level reporting (e.g. a balance sheet) on a continuous or variable time basis. The ERPS supports a large multitude of internal reports, semi-internal reports, and external reporting schemas. Corporate processes under continuous assurance support: 1) transaction assurance (as described earlier), 2) estimate assurance, 3) rule assurance, and 4) key judgment on process control assurance.

Figure 21: Continuous reporting and assurance

In order to create a process that reports on a wide set of financial and non-financial variables, key POCs need to be defined.

[1] Greenstein and Vasarhelyi.
[2] Vasarhelyi, M.A. & Cohen, E.C., "A Note on the Emergence of Data Level Assurance," Rutgers Accounting Research Center, working paper, 2005.
[3] For example, it is clear that the consulting and audit firm businesses will be much more dependent on human resources valuation than a highly automated manufacturer.
[4] EOL whitepaper on XBRL.
[5] The AICPA's Special Committee on Enhanced Business Reporting is evolving towards a societal consortium (www.ebrconsortium.org) where many of these expanding reports have been proposed. Four illustrations of the direction of business reporting were provided by this group and can be found at the above web site, at http://www.lintun.org/, and at https://raw.rutgerrs.edu/raw/galileo.
[6] Disclosure is not limited by technological factors but by competitive intelligence and fears of evaluation of management on multiple dimensions.
[7] Eventually there will be tremendous pressure on standard setters to issue "digitalizable standards" that can be automatically converted into computer code.
[8] http://www.aicpa.org
[9] http://www.aicpa.org/webtrust
[10] http://www.aicpa.org/systrust
[11] More details about this aspect of CA are provided in Vasarhelyi et al. (2004) and Vasarhelyi and Greenstein (2003).
[12] Alles, M., Kogan, A., and Vasarhelyi, M.A., Continuity Equations, working paper, CARLAB, Rutgers Business School, Newark, NJ, September 2004.
[13] Porter (1996).
[14] Businessweek.
[15] Swieringa, R., Accounting Magic, …
[16] Vasarhelyi, M.A. & Halper, F.B., "Continuous Process Auditing," Auditing: A Journal of Practice & Theory, Fall 1991, pp. xxx-yy.
[17] The Economist, "The Real Time Economy," January 31, 2002.

