I recently read a number of research papers on the subject of information overload in accounting (no researchers’ names or references released). I am overwhelmed by the level of obscure jargon that accounting researchers need to analyse such a simple subject.
Accounting information overload occurs when a manager has too much financial data to make good decisions. Researchers then go into more detail: the financial data may be too complex, the managers may not have enough time, or perhaps the data may not be relevant. They analyse the nature of the decisions to be made, their quality and accuracy, the experience of the managers making them, and the links between the data analysed. The list seems endless, giving researchers multiple opportunities to invent their jargon. The vaguest of the obscure has to be the mechanical judge, the most energetic snowballing, and the most elegant stylized facts. They need all three, and many others, to analyse accounting information overload.
To illustrate, I have made up a dense sentence:
“With information cues aggregated into high and low integration structures, the mechanical judge following decision rules uses stylized facts with relevance after elimination of data load and snowballing while using interactive drilldown functionality to understand accounting information overload.”
I admit it sounds like incomprehensible rubbish, but researchers use such terms to analyse accounting information overload. Each term on its own leaves ordinary accountants like me in distress; put together in a sentence, they baffle us; spread out across one research paper, they dazzle us into misunderstanding.
Information cues
At the simplest level, cues become ‘information cues’ in the study of accounting information overload and describe accounting indicators. To make them digestible, cues need to be put in clusters, and one must consider ‘cue usage’ and its cousin ‘relative cue usage’. Cue usage answers the question: is the cue actually used to make a decision? A cue might be relevant to the decision yet not be chosen to make it.
Then, to make life more complicated, researchers recommend measuring cue usage both objectively and subjectively. Decision-makers never measure usage; they simply use the cues they consider useful, and that choice differs from person to person. Moreover, they never care about relative cue usage. Only researchers analyse why decision-makers choose one cue over another.
Aggregated
When accountants present information they can summarise it or present it in detail. Accounting researchers don’t like simple wording; they develop more complex jargon. They cluster variables, for instance, or put variables in information sets to summarise. But their favourite is aggregation and its opposite, disaggregation, of accounting information. When researchers aggregate, they summarise. When they disaggregate, they break aggregated figures down into more detail. Normal accountants start with the detail and decide to summarise. Researchers do the opposite: they take the summarised information and put the detail back in.
Aggregated or disaggregated data sounds so much better to researchers. Aggregation merely aggravates me and disaggregation exacerbates.
Integration structures
But researchers do not limit themselves to aggregation and disaggregation. They have invented sophisticated intellectual variations to organise information. They call them high or low integration structures, also known as abstract or concrete conceptual levels. High integration structures ‘can deal with complex patterns’ of organising information through differentiation, combination, and comparison of information dimensions. For ordinary accountants to understand these complex structures, patterns, or levels: high integration can be summarised as aggregation, and low integration as disaggregation. So integration structures and conceptual levels go further than mere aggravation and exacerbation; they infuriate me.
Mechanical judge
The mechanical judge has no mechanics, and the judge judges nothing. It involves no physical person. Instead, the mechanical judge helps managers make decisions by using algorithms and models to process information. To make the name easier to understand, researchers could have called it a judgemental tool or an algorithmic model, but they decided on the human imagery of a robot in robes.
Decision rule
The decision rule guides the mechanical judge by setting out the rules it follows in its judgement. The decision rule itself decides nothing. An example might be ‘select expenses over £10,000’, and the mechanical judge would then review only those. A rudder for the robot in robes.
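For readers who prefer plain code to plain English, the pair can be sketched in a few lines. This is only an illustration of the idea, not anything from the research papers; the function names, the £10,000 threshold, and the expense figures are all my own invention.

```python
# A minimal sketch: the decision rule is a simple predicate,
# and the "mechanical judge" merely applies it. No robes required.

def decision_rule(expense):
    """Select expenses over £10,000 (the example rule from the text)."""
    return expense > 10_000

def mechanical_judge(expenses, rule):
    """Apply the rule mechanically; no human judgement involved."""
    return [e for e in expenses if rule(e)]

expenses = [2_500, 14_000, 9_999, 31_000]
flagged = mechanical_judge(expenses, decision_rule)
print(flagged)  # [14000, 31000]
```

Swap in a different rule and the judge obediently follows it, which is rather the point: all the judgement sits in the rule.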
Stylized facts
Stylized in stylized facts has nothing to do with fashion, and the facts don’t represent facts either, but observations, patterns, or findings identified by researchers. As a result, researchers have made stylized facts a distraction. To follow their imagery of elegance and luxury in the fashion industry, ‘ornate observations’, ‘polished patterns’ or ‘fashionable findings’ would describe the meaning better than their stylized facts.
However, what researchers find important in overload is the ‘methodology’ of stylized facts, not the observations on their own. Instead of calling it an analysis or comparison of stylized facts, researchers use the word methodology, not to confuse normal accountants, but to keep up the obscure jargon. They define the methodology as lying between a review and a meta-analysis: more detailed than a review, less strict than a meta-analysis. I know what a review consists of but not a meta-analysis, which shows up my lack of statistical education; a meta-analysis, it turns out, combines the results of more than one study.
Relevance and redundancy
Relevance and redundancy are the two key terms in the study of accounting information overload. Accountants believe they need only relevant information to make a decision, but researchers go further, stating that relevance refers to the predictive ability of a cue. A cue which cannot predict is irrelevant. But then they complicate it further, saying that relevance can be measured by regression models or expert judgments. Who needs to measure relevance? Accountants choose the relevant data and decide.
Redundancy in this context does not refer to making someone redundant but to having two or more ‘information cues’ (two or more indicators) which give accountants the same information when they need only one to make the decision. Researchers consider the additional indicators redundant. They say that too many redundant indicators can lead to information overload without improving accuracy. Many accountants might disagree: more than one indicator giving the same information might also make the decision easier. Researchers do not seem to have studied that yet!
Data load
To most accountants, data load means the volume of data, but not to all accountants in research. Some disagree with the broad definition. For example, I found this beautiful jargon-infested definition of data load by well-known researchers I refuse to name:
“the total number of cues minus the number of the relevant cues …”
To peel away the jargon: irrelevant data equals data load. Or to go back to the jargon: irrelevant data increases cognitive noise, which causes accounting overload and leads to decision degradation. So beware: sometimes data load means all data, sometimes only the irrelevant data.
But information overload also comes from too much relevant data. If accountants analysed irrelevant data as well as relevant data, they would stay in a permanent state of information overload.
Snowballing
As you might imagine, snowballing has no relation to playing in the snow, nor do backward and forward snowballing signify throwing snowballs towards the back or the front. When researchers mention snowballing, they mean analysing the reference lists of research papers. They could have called it waterskiing to give it more nuance; after all, one can water-ski backwards.
Interactive drilldown functionality
I had to put my favourite jargon-infested phrase, interactive drilldown functionality, into my dense sentence even though I am unable to understand what it means. Drilldown on its own I do comprehend; I have used it myself. ‘Drilldown functionality’ I can live with, but all three together, ‘interactive drilldown functionality’, drives me straight into a state of jargon overload.
In the end, what starts as a straightforward issue, too much information, gets buried under an avalanche of dense terminology. Researchers, in their attempt to understand accounting information overload, create two levels of complexity: exquisite detail and elaborate jargon, giving us accounting practitioners jargon overload just from reading the research. And I am certain they did not do it on purpose.