Published on RAW (https://raw.rutgers.edu)


2.4. Continuity Equations: The Conceptual Basis for Reengineered Business Reporting

In this economy, business processes are measured on a continuous basis through different types of sensors that capture digital measurements of business metrics. These data are captured at a far finer granularity in time and detail than has ever been possible before.[11] Everything else enabled by this capacity for more frequent reporting is a by-product of this fundamental change in data capture capability. What the resulting data stream makes possible is measurement with an unprecedented degree of correspondence to underlying business processes. Furthermore, the utilization of this data stream, and its comparison with a new class of performance models that must be developed,[12] will provide the basis for many automatic management decision models in which the slowest element of the process, the human being, is excluded by automation. Figure 8 describes a formalization of these processes of data capture, comparison standards, exception standards, and meta-processes for measurement, control, management and assurance.

Business processes, defined as "a set of logically related tasks performed to achieve a defined business outcome" (Davenport and Short, 1990), are considered today to be the fundamental atomic elements that make up a company.[13] A company is thus now described by what it can do rather than by its assets. That changed mindset has yet to be incorporated into traditional management and its assurance. What is fundamental about the real-time economy is that it brings the process approach explicitly into management through the very prompt measurement of processes and the comparison of these metrics with dynamic benchmarks that represent prescribed levels of business performance. Benchmarks that allow for the comparison of business process metrics with a standard (or model) will therefore assume a much larger importance.
The real-time economy discussed above, where processes are constantly monitored and their measurements compared with benchmarks for control purposes, requires highly dynamic adaptive models that can adequately represent the normative values the metrics must assume. Furthermore, in addition to basic benchmarking for first harmonic data comparison, second harmonic variance is also necessary for control purposes. Figure 8 illustrates this issue: processes are monitored and controlled by information systems, models, and management. When noteworthy exceptions occur, adjusting management actions are effected. Some of these exceptions are also of assurance interest and are alarmed for audit purposes and directed to the audit "control" system.

Figure 8: Meta-processes in measurement and assurance - data capture and control

The monitoring and control of an organization's processes can be viewed as a five-level set of activities, as described in Figure 10. The structural level (processes) is measured, and metrics are extracted and captured for the data level. Data is stored at a high level of granularity, say, the basic transaction level. This data history may be examined under many distributions (cuts) such as time period, division, product, function, originator, etc. The third level encompasses the relationships perceived or prescribed among metrics, against which the organization performs control functions. For example, all flows from one process that reach the next one would constitute a one-to-one relationship, and any differences would be exceptions. In general, to use metrics captured from level one in a control process it is necessary to have the measurement of the actual (the metric), a model for comparison, and a model of variance (which specifies the acceptable variation). The control process will compare the metric with the model, calculate the variance, and then decide if the variance is acceptable.
If not, an alarm is triggered that may call for management action and/or assurance. The models may range from very simple univariate levels to very complex multi-entity relationships like continuity equations. Among the types of models in CA we find:

• A fixed number (normative or empirically derived)
• An adjusted number with some form of analytic related to seasonality, hierarchy, or a structural relationship

The structural relationships can be represented by continuity equations and may represent:

1. Reconciliation structures
2. Semi-deterministic relationships
3. Structures across processes
4. Empirical relationships across processes
5. High-level empirical relationships among KPIs

The fourth level is the level of analytic monitoring and links very high level measures across processes. KPIs (key performance indicators) can be used to help understand process consistency as well as process performance. If measurements are not available at a lower level, this level serves to provide coarse alarms of major process difficulties. The fifth level is a meta-process level where the actual control and monitoring functions are performed based on continuous measurement, monitoring and proactive exception handling.

Building on this model, the proposed solution is based on a view of a business in a real-time economy that would address some of these ailments through the following factors:

• Creation of a multivariate measurement model that does not focus exclusively on earnings per share and allows users to predict and evaluate a business' performance on a multivariate basis even if these measurements are in different dimensions (apples and oranges)
• Creation of a measurement model that is oriented not only to investors but to other stakeholders of the business
• Creation of a measurement model that represents not only static measurements of the business but also the types of relationships that characterize it.
These relationships can be structural, relational, empirical, or comparative in the form of sector benchmarks.

Figure 9: Galileo Enhanced Business Reporting Model

Based on the examination of the current reporting model (GAAP) under this framework, it can be concluded that a dynamic world cannot be well measured with static measurements, and that the technology exists for a more dynamic method of measurement to evolve. The disclosure model is very disjointed when the economic status of a firm has to be shown on a piece of paper (flat) and at very wide discrete intervals. Furthermore, while markets seem to value firms on a wide range of non-financial assets, the GAAP-based model focuses on financial items. It is also concerning that the measurement process focuses on the physical assets of companies, more typical of the industrial age, while the valuable assets of an information economy are neglected. In an age where companies outsource many of their processes, suppliers carry the inventories of many companies, and RFID technology allows for specific identification of inventories, parts and assets, we still use FIFO and LIFO inventory valuation methods. In an age where dynamic markets exist in which products are valued every minute, we still focus on forms of historical cost as a substantive part of our business reports. In days when it is well known that there is substantial leeway[14] [15] of interpretation in every number that determines an entity's income, we still focus on earnings per share. Another irony is that in the last couple of years, and supposedly the next few, the FASB and the IASC will be focusing on the convergence of standards, converging toward a set of standards that is irremediably obsolete. If the measurement model is seriously compromised, presenting progressively less mapping to reality, the provision of assurance over these numbers is useless and is performed only for statutory purposes.
It is not surprising, therefore, that accounting firms have progressively relied more on cursory analytical reviews and acted more like insurers than auditors. If the measures do not measure, even the best of audits would merely assure bad numbers that do not mean anything. Most litigation against auditors happens in failure situations; bad measures do not detect these, and consequently good or bad auditing does not change the auditing firms' risk profile much. Under these conditions, any downturn will show the underbelly of weak firms that have stretched their reporting to the limit and, in their demise, will punish CPA firms for purportedly "bad audits" of irrelevant numbers that had little representativeness of the firm's economic health.

2.4.1 Levels & Basic Concepts

The Galileo enhanced business representation model in Figure 9 entails five levels: 1) structural level, 2) data level, 3) relationship level, 4) analytic monitoring level, and 5) continuous reporting and assurance level. Furthermore, we will define five main types of concepts[16]:

* Metrics – Metrics are defined as direct measurements of the system, drawn from reports, in the measurement stage. These metrics are compared against system standards. If a standard is exceeded, an alarm appears on the screen. For example, in the auditing of a billing system, the number of bills to be invoiced is extracted from a user report. The number of bills not issued due to a high-severity error in the data is captured, as well as the total dollar amount of bills issued. These three numbers are metrics that relate to the overall billing process.

* Analytics – Analytics are defined as functional (natural flow), logical (key interaction), and empirical (e.g. "it has been observed that ...") relationships among metrics. Specific analytics related to a particular system module can be derived from the auditor, management, user experience, or historical data from the system. Each analytic may have a minimum of three dimensions: 1) its algebraic structure, 2) the relationships and contingencies that determine its numeric value at different times and in different situations, and 3) rules-of-thumb or optimal rules on the magnitude and nature of variance that may be deemed "real variance" to the extreme of alarms. For example, a billing analytic would state that dollars billed should equal invoices received, minus the value of failed edits, plus (or minus) the change in the dollar value of retained invoices. The threshold number of expected invoices for that particular day or week (allowing for seasonality) must be established to determine whether an alarm should be fired.
* Alarms – Alarms are exception conditions where a measure and its standard are compared and the ensuing variance is larger than the variance standard. Actual experience with these issues indicates that several levels of alarms are desirable: 1) minor alarms dealing with the functioning of the auditing system, 2) low-level operational alarms to call the attention of operating management, 3) higher-level alarms to call the attention of the auditor and trigger "exception audits," and 4) high-level alarms to warn auditing and top management of serious crises. Establishing these alarm thresholds is a second harmonic development. The data and experience needed to understand the phenomena being measured to the level of specification of alarm standards are probably not available in most organizations.

* Standards or models – Standards or models represent the ideal state-of-the-world in a particular process. Any monitoring process requires the comparison of a metric to a model or standard to determine abnormal conditions. Furthermore, the magnitude of this condition is evaluated against a "standard of variance" in the decision on whether an alarm should be activated. Models of variable behavior over time in real-time systems must be developed in a way that represents the real-time behavior of dynamic systems. The evolution of real-time monitoring needs adaptive models that take into consideration: seasonality, business trends, relationships between processes, timing between the processes, and the flow of anomalous but legitimate transactions from process to process.

* Method of measurement – The method of data capture and classification is an important variable in the future system representation scenario. Continuously captured data can drive monitoring processes to real-time exception measurement and alarming. The CPAS (Continuous Process Auditing System) process captured data through report scraping of electronic reports (Vasarhelyi & Halper, 1991).
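The control comparison running through these concepts (a metric compared against a model or standard, with the variance judged against a variance standard, and graduated alarms when it is exceeded) can be sketched as a short function. The function name, the relative thresholds, and the alarm tiers chosen here are illustrative assumptions, not prescriptions from the text:

```python
# Sketch of the metric / model / variance-standard comparison with tiered
# alarms.  All names and threshold multiples are illustrative.

def check_metric(actual, model_value, variance_standard):
    """Return an alarm level: None, 'operational', 'audit', or 'crisis'."""
    variance = abs(actual - model_value)
    # Express the variance relative to the model so thresholds are unitless.
    relative = variance / model_value if model_value else float("inf")
    if relative <= variance_standard:
        return None                      # within acceptable variation
    if relative <= 2 * variance_standard:
        return "operational"             # low-level alarm for operating management
    if relative <= 4 * variance_standard:
        return "audit"                   # triggers an "exception audit"
    return "crisis"                      # warns auditing and top management

# Example: dollars billed vs. the model's expectation, 5% acceptable variance.
print(check_metric(actual=1_040_000, model_value=1_000_000,
                   variance_standard=0.05))   # within tolerance, no alarm
```

In practice the variance standard itself would be a model (seasonal, adaptive), which is exactly the second harmonic development referred to above.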
Different monitoring processes are progressively capturing data in many more direct manners, such as data sensing, queries to databases, or the utilization of intermediate data (Hume, xxx) between batch processes.

At the most basic level, the structural level, a number of transactions are taking place in various areas of the business, and there are time lags between each (illustrated by the hourglass shapes). In the new real-time economy, there is decreased latency between these processes, which makes it possible to achieve real-time or near real-time reporting. Automation decreases the latency of processes by orders of magnitude. The structural level represents a set of non-financial and financial processes that are interlinked in the generic process of wealth creation. There are physical, logical and statistical relationships between the processes and between the different metrics of these processes. Figure 10 is the lower-level process where intrinsic relationships exist. Marketing drives advertising, which drives sales. Once a sale is performed, part of the transactions (40%) generate immediate cash while the rest (60%) tend to become receivables, 60% of which are paid within 30 days, 20% within 60 days and 15% within 90+ days. Five percent of the transactions become bad debt. Figure 11 represents a three-period cash flow that comes from these transactions.

Figure 10: sales to cash
Figure 11: Cash flow modeling

While these transactions can get complex, the effects are very measurable and their study can help create models that are structurally based. If sales are assumed constant, a Markov chain model can be used and input levels will assume an ergodic state. However, the structural linkages are more complex, and the structural modeling can be extended to the ensuing boxes. The model is expanded, still structural in nature, by including the role of inventories and the role of provisioning in the process.
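Before inventories and provisioning are added, the basic sales-to-cash structure above can be sketched as a small multi-period model using the collection percentages given in the text (40% immediate cash; of the 60% that become receivables, 60% paid within 30 days, 20% within 60, 15% within 90+, and 5% bad debt). The sales figures are hypothetical:

```python
# Sketch of the structural sales-to-cash model: per-period sales are mapped
# to per-period cash receipts using the text's collection percentages.

def cash_flows(sales_by_period):
    """Map per-period sales to per-period cash receipts (30-day periods)."""
    horizon = len(sales_by_period) + 3        # receivables trail up to 3 periods
    cash = [0.0] * horizon
    for t, sales in enumerate(sales_by_period):
        cash[t] += 0.40 * sales               # immediate cash sales
        receivables = 0.60 * sales
        cash[t + 1] += 0.60 * receivables     # collected within 30 days
        cash[t + 2] += 0.20 * receivables     # within 60 days
        cash[t + 3] += 0.15 * receivables     # within 90+ days
        # the remaining 5% of receivables becomes bad debt and never arrives
    return cash

flows = cash_flows([100.0, 100.0, 100.0])     # three periods of constant sales
print([round(c, 1) for c in flows])           # [40.0, 76.0, 88.0, 57.0, 21.0, 9.0]
```

With constant sales the per-period receipts settle toward a steady level, which is the ergodic behavior of the equivalent Markov chain formulation mentioned above.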
Figure 12: sales to cash inventory and bad debts

This representation can be modeled by now including the role of inventory payments in the depletion of cash, and it can be an input to the provisioning equations which drive inventory management and other functions. While this modeling focused on inflows of cash in a multi-period setting, assuming three-period intervals, many different assumptions can be made. Figure 13 displays a more realistic set of flows for cash, a core variable that is worth modeling.

Figure 13: more complete cash structural model

However, structural level 1 includes processes that are not financial and not necessarily structurally linked, such as inventory and provisioning (where physical factors such as obsolescence, shrinkage, and delays may have an effect), or even more distant but still related processes such as marketing and CRM. For these, stochastic continuity equations are to be built based on experience parameters. For example, experience may say that for every dollar of advertising in the south region you generate 7 dollars of sales, and in the northeast only 5.

Figure 14: Level 1 - structural

The next level is the data level, where measurement of financial and non-financial indicators takes place, and individual pieces of data are reported with the ability to drill down to look at historical performance and compare data across business lines, products, managers, etc. Most companies do this internally today through some form of spreadsheet analysis, but given the capabilities made possible through new systems and the decreased latency between processes discussed before, it is now possible through constant measurement to move to the relationship level.

The spreadsheet analogy

A SPREADSHEET program is a good metaphor for describing the IT architecture (and the measurement of business) of a real-time enterprise. But such programs also demonstrate the extent to which companies and their employees are often still stuck in batch mode.
The data they use in spreadsheets are often out of date and must be put in by hand. ... In contrast, modern spreadsheet software is as real-time as it gets. To a layman these programs look like tables with many rows and columns of "cells" (the data level in the Galileo model). But their most important feature, how these cells are related to each other, is invisible (the relationship and analytic monitoring levels in Galileo, now necessary to present a non-obfuscable view of business). Often they are connected by simple operations such as addition or multiplication. Investment banks in particular, however, use more sophisticated spreadsheets in which the cells are linked by dozens of "macros", sometimes quite elaborate sub-programs. If a user changes the data in one cell, many others are automatically recalculated. To advocates of the concept, the real-time enterprise is a giant spreadsheet of sorts, in which new information, such as an order, is automatically processed and percolates through a firm's computer systems and those of its suppliers. Thus a simple inquiry such as "When is my order being shipped?" can be answered immediately. Many consumers have already encountered real-time business without realizing it, for instance when they order a Dell computer. The firm's website allows customers to check the status of their order at any time. Juice Software, based in New York, has developed a set of programs that allow users to turn their spreadsheets into living documents. With a few mouse clicks they can link a spreadsheet cell to a data source, for instance a corporate database (such an interconnection and tracing capability is very important in the reporting and assurance of the modern enterprise). Smart software on a server in the network ensures that this cell is automatically updated whenever the information changes.
Users can also connect their spreadsheets among themselves, so if one member of a project team changes a cell, the changes automatically appear in all the team members' files. (extracted from the Economist[17], annotations in bolded italics added)

The data level in the modern enterprise, as described in Figure 15, entails many measurements taken from the processes described above and listed in the list of POCs for non-financial variables. Furthermore, with the advent of databases, OLAP tools and style sheets, the "spreadsheet of measurement" of the modern enterprise incorporates the capability of drill-down (into finer details of the data structure, at the extreme into certain characteristics of a transaction such as amount or geography), accumulation of history not only of reported variables but also of desired aggregates (at the extreme, say, sales for a certain store), and distributional characteristics (the ability to cut across parameters such as geography, product or division).

Figure 15: Level 2 data

The relationship level allows the modern manager in a real-time society to make decisions based on current relationship models in addition to historic information. Under the Galileo model, this allows a deeper level of disclosure that explains how the measurements of the data level are related to each other. The analogy is the formulae in a spreadsheet that exist in the background of the report. These relationships can be structural or stochastic, as described above. In Figure 16 the relationships involve sales and marketing, care queries and number of sales, and potential delay relationships.

Figure 16: Level 3 – relationship

To further explain this disclosure level: a balance sheet could be transformed into a Galileo report (a la sustainability report) and presented in sheet 1 of a spreadsheet, while a model relating some of the variables would be in the second sheet, and the user could calculate the variances in the third sheet.
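The three-sheet idea can be made concrete with the billing analytic defined earlier (dollars billed = invoices received, minus failed edits, plus or minus the change in retained invoices). In this sketch the variable names and figures are hypothetical, and the sign on the retained-invoices term is one illustrative choice:

```python
# Sheet 1: reported measurements (the data level).
data = {
    "invoices_received": 1_050.0,
    "failed_edits": 30.0,
    "change_in_retained": 20.0,   # retained invoices grew, so billing falls
    "dollars_billed": 990.0,
}

# Sheet 2: the relationship, a continuity equation linking the variables.
def model_dollars_billed(d):
    return d["invoices_received"] - d["failed_edits"] - d["change_in_retained"]

# Sheet 3: the variance between the reported figure and the model.
variance = data["dollars_billed"] - model_dollars_billed(data)
print(variance)   # -10.0: billed $10 less than the continuity equation implies
```

Disclosing sheet 2 alongside sheet 1 is what distinguishes the Galileo report from a conventional statement: the reader can recompute sheet 3 rather than take the bottom line on faith.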
The disclosure of these relationships, in addition to being valuable in increasing reporting transparency and deterring reporting obfuscation, would have a valuable feed-forward effect, motivating better modeling of business and improved self-insight into the causes and consequences of business numbers. Figure 17 and Figure 18 introduce different representations of the relationship level.

Figure 17: processes, measures and relationships

In Figure 17, relationship 1 relates marketing to e-care. This is an obvious relationship whose parameters must be examined and estimated with care. In this relationship, increased marketing leads to increased sales, which ultimately increases the demand for e-care, contingent on the effectiveness of advertising and sales efforts, the quality of the products, and the accessibility of the care. The care effort also leads to secondary sales. Relationships 2 & 3 are narrower and more direct.

Figure 18: the spreadsheet disclosure model

While eventually most corporate systems will have extensive levels of detail and statistics sufficient to sustain substantial relationship-based monitoring, the Galileo model also has a higher level of relationship monitoring. This level is called the analytic monitoring level and relies heavily on industry- and company-specific key performance indicators (KPIs). Level 4 (Figure 19) is aimed both at third-party monitoring of corporate performance and at internal monitoring, in particular where information is not sufficient. Companies monitoring their processes step by step may miss significant macro trends in their performance (missing the forest for the trees) and will also benefit from having the KPI monitoring level, where a better understanding of the business is obtained. Strategic planning level managers will tend to focus on level 4, while management and operational control managers, in Anthony's notation (see Figure 19), will focus on level 3.
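A minimal way to operationalize KPI monitoring at this level is to flag observations that stray from the KPI's recent history. The series and the three-sigma threshold below are illustrative assumptions, not parameters from the text:

```python
# Sketch of analytic (level 4) monitoring: compare a new KPI observation
# with the mean and standard deviation of its recent history.
from statistics import mean, stdev

def kpi_deviates(history, observation, sigmas=3.0):
    """True when the observation falls outside mean +/- sigmas * stdev."""
    mu, sd = mean(history), stdev(history)
    return abs(observation - mu) > sigmas * sd

history = [100, 102, 98, 101, 99, 100, 103, 97]   # hypothetical KPI series
print(kpi_deviates(history, 120))   # far from the norm: coarse alarm
print(kpi_deviates(history, 101))   # within the norm: no alarm
```

Real deployments would replace the static mean with the adaptive, seasonal models discussed earlier; the point is only that level 4 works on coarse aggregates rather than transaction detail.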
In analytic monitoring, significant deviations from the norm for key performance indicators can be identified. This may indicate that a process is out of sync (such as...) even if detailed support does not exist. The next step would entail detailed analysis to capture the reason for the imbalance. Drill-down capabilities remain available at these levels, and they can be extremely powerful.

Figure 19: Level 4 - monitoring level

Finally, continuous reporting and assurance (Figure 20) ensure the reliability of systems and data through transaction assurance, estimate assurance (on management projections), compliance assurance (compliance with GAAP), and so on, which enables important business information to be reported externally as well as internally with confidence. What results, in the end, is a much more robust, automated reporting process that tells much more about the effectiveness of management, specific divisions, etc., providing accurate and useful data on a real or near real-time basis. Furthermore, XML tagging will enable interoperability, making connections possible across internal and external partnering entities.

Figure 20: Level 5 continuous reporting and assurance

Figure 21 displays three types of XML-tagged transactions flowing into the organization, which can be metered by some form of continuous reporting that would display cumulative levels of flows in a chosen time period, for example all labor purchases (even if not yet paid) for the first 44 days of the year. The data being delivered to the system carries some form of data-level assurance (for example a measure of the reliability of its generating systems, or an encrypted tag with an auditor's assurance) or relies on other forms of assurance of system integrity (e.g. SysTrust). This data is delivered to the corporation's ERPS under some form of XBRL/GL schema with a reasonably fine chart of accounts.
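The flow metering just described can be sketched as a query over tagged transactions. The tag names and transaction data below are hypothetical and are not an actual XBRL/GL schema; they only illustrate how tagging makes cumulative queries such as "labor purchases for the first 44 days" mechanical:

```python
# Sketch of continuous flow metering over XML-tagged transactions.
from xml.etree import ElementTree as ET

feed = """
<transactions>
  <txn type="labor"    day="3"  amount="500"/>
  <txn type="material" day="10" amount="700"/>
  <txn type="labor"    day="40" amount="250"/>
  <txn type="labor"    day="60" amount="400"/>
</transactions>
"""

def cumulative_flow(xml_text, txn_type, through_day):
    """Sum tagged transactions of one type up to (and including) a day."""
    root = ET.fromstring(xml_text)
    return sum(float(t.get("amount"))
               for t in root.iter("txn")
               if t.get("type") == txn_type and int(t.get("day")) <= through_day)

print(cumulative_flow(feed, "labor", 44))   # labor purchases, first 44 days: 750.0
```

Data-level assurance would attach to each `txn` element (for example as a signed attribute), so the meter's output inherits the reliability of its inputs.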
The accumulated data can, at any time, be queried for some form of level reporting (e.g. a balance sheet) on a continuous or variable-time basis. The ERPS support a large multitude of internal reports, semi-internal reports and external reporting schemas. Corporate processes under continuous assurance support: 1) transaction assurance (as described earlier), 2) estimate assurance, 3) rule assurance, and 4) key judgment on process control assurance.

Figure 21: Continuous reporting and assurance

In order to create a process that reports on a wide set of financial and non-financial variables, key POCs need to be defined.

[1] Greenstein and Vasarhelyi
[2] Vasarhelyi, M.A. & Cohen, E.C., "A Note on the Emergence of Data Level Assurance," Rutgers Accounting Research Center, working paper, 2005.
[3] For example, it is clear that the consulting and audit firm businesses will be much more dependent on human resources valuation than a highly automated manufacturer.
[4] EOL whitepaper on XBRL.
[5] The AICPA's Special Committee on Enhanced Business Reporting is evolving toward a societal consortium (www.ebrconsortium.org [1]) where many of these expanded reports have been proposed. Four illustrations of the direction of business reporting were provided by this group and can be found at the above web site, at http://www.lintun.org/ [2], and at https://raw.rutgerrs.edu/raw/galileo [3].
[6] Disclosure is not limited by technological factors but by competitive intelligence and fears of evaluation of management on multiple dimensions.
[7] Eventually there will be tremendous pressure on standard setters to issue "digitalizable standards" that can be automatically converted into computer code.
[8] http://www.aicpa.org [4]
[9] http://www.aicpa.org/webtrust [5]
[10] http://www.aicpa.org/systrust [6]
[11] More details about this aspect of CA are provided in Vasarhelyi et al. (2004) and Vasarhelyi and Greenstein (2003).
[12] Alles, M., Kogan, A., and Vasarhelyi, M.A., Continuity Equations, working paper, CARLAB, Rutgers Business School, Newark, NJ, September 2004.
[13] Porter (1996).
[14] Businessweek
[15] Swieringa, R., Accounting Magic ... xxxxx
[16] Vasarhelyi & Halper, Continuous Process Auditing, Auditing: A Journal of Practice and ... cxxx, Fall 1991, pp. xxx-yy.
[17] Economist, "The Real-Time Economy," January 31, 2002.

Source URL: https://raw.rutgers.edu/node/31

Links:
[1] http://www.ebrconsortium.org
[2] http://www.lintun.org/
[3] https://raw.rutgerrs.edu/raw/galileo
[4] http://www.aicpa.org
[5] http://www.aicpa.org/webtrust
[6] http://www.aicpa.org/systrust