# Applying Shannon's information theory to the measurement and analysis of financial statements


This article was downloaded by: [Akdeniz Universitesi] on 20 December 2014, at 13:24. Publisher: Taylor & Francis. Informa Ltd, registered in England and Wales, Registered Number 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.

Journal of Discrete Mathematical Sciences and Cryptography. Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/tdmc20

Applying Shannon's information theory to the measurement and analysis of financial statements. Nezameddin Faghih & Mohammad Namazi, Department of Management, Tappeh Eram, Shiraz University, Shiraz 71945, Iran. Published online: 03 Jun 2013.

To cite this article: Nezameddin Faghih & Mohammad Namazi (1998) Applying Shannon's information theory to the measurement and analysis of financial statements, Journal of Discrete Mathematical Sciences and Cryptography, 1:1, 49-62, DOI: 10.1080/09720529.1998.10697864

To link to this article: http://dx.doi.org/10.1080/09720529.1998.10697864


Applying Shannon's information theory to the measurement and analysis of financial statements

Nezameddin Faghih

Mohammad Namazi

Department of Management, Tappeh Eram, Shiraz University, 71945 Shiraz, Iran

ABSTRACT: This paper innovates the application of Shannon's information theory, as a novel approach, to the measurement and analysis of financial statements. A brief description of the information and entropy concepts and computations is rendered; then it is shown how to apply the computation procedure to the financial statements. Accordingly, as a numerical example, a typical financial statement is employed for the application of Shannon's information theory to its measurement and analysis. The information contributions of various elements in the financial statement, and also the entropies, are computed for successive years. Hence, the results obtained by the application of Shannon's theory are duly compared and discussed. Consequently, areas of further research work to be undertaken, in continuing this interesting and promising study, are raised and suggested.

1. INTRODUCTION

One of the greatest revolutions in the scientific world outlook in the 20th century is the turn from Laplacian determinism to a probabilistic picture of phenomena. A natural extension of this point of view is an understanding that our knowledge is of a probabilistic nature, too. Any information that we obtain affects the probabilities of possible alternatives of action, rather than indicating uniquely one particular outcome. Therefore, it seems to be not just sheer coincidence that information theory finds its fundamental basis in the concept of entropy, entropy appearing as a measure of uncertainty and disorder [1], [2].

Journal of Discrete Mathematical Sciences & Cryptography, Vol. 1 (1998), No. 1, pp. 49-62. Academic Forum

50 N. FAGHIH AND M. NAMAZI

Mathematically, information theory is a branch of the theory of probabilities and stochastic processes. Nevertheless, information theory not only presents a quantitative measure of information common to both deterministic and probabilistic cases, but it also shows the qualitative equivalence of these two kinds of knowledge. In fact, Shannon was the first to show this important property [3], [4], [5]. Since then, however, the importance and generality of information theory concepts and approaches have expanded far beyond the area of communication engineering (where Shannon's model was initiated). The ideas of information theory have been applied in a variety of diverse fields including physics, linguistics, biology, psychology, economics, management and accounting [6], [7], [8], [9], [10], [11], [12]. All these have proved to be productive and innovative. Thus, information theory has become not just another special branch, but an indispensable part of the modern scientific disciplines. Also, information-theoretical analysis has performed a significant role in the theory of computational and structural complexities and in the design of effective decision algorithms [2], [12-18].

Thus, introducing information theory to measure the financial items and their information contents will produce a fruitful approach to financial analysis and decision making. This study undertakes such a task, hoping to provide a modern direction in analyzing financial statements in the area of financial management and accounting. Hence, in the first section, a brief description of information and entropy concepts and computations is rendered.

2. INFORMATION AND ENTROPY

The concept of entropy is the central part of information theory. The entropy of a random variable is defined in terms of its probability distribution, and can be shown to be an appropriate measure of randomness or uncertainty. The importance of entropy in practical applications has much to do with its relationship with long sequences of random variables. It turns out that repeated trials of a probabilistic experiment give rise to outcomes that can, under quite general circumstances, be divided into two categories. The first category consists of sequences that have a very small probability of occurrence; in fact, the probability of occurrence of this category as a whole approaches zero with the increasing length of the sequences. The second category consists of the remaining sequences; these sequences, which are known as likely or typical sequences, all have more or less the same probability and, as their length increases, they become closer and closer to being equally likely. The number of typical


sequences and their individual probabilities are a function of the entropy, and it is this relationship that provides a major role for the entropy in information theory.

Let x be a random variable with sample space x_1, x_2, ..., x_N and probability measure P(x_n) = p_n. The entropy of x is defined as:

$$H(x) = -\sum_{n=1}^{N} p_n \log(p_n), \qquad (n = 1, \ldots, N). \tag{1}$$

The base of the logarithm is arbitrary and amounts to a constant multiplicative factor, bearing in mind that:

$$\log_a m = (\log_a b)(\log_b m). \tag{2}$$

If the logarithm base is 2, the entropy is said to be in "bits", for binary digits. If the base is Napier's number, e, then the entropy is said to be in "nats", for natural units [1], [2], [4].
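As an illustration (my own sketch, not from the paper), equation (1) can be computed for either base choice; the function name `entropy` and the example distribution are assumptions for this sketch:

```python
import math

def entropy(probs, base=2.0):
    """Shannon entropy of a probability distribution, eq. (1).

    base=2 gives the result in bits; base=math.e gives nats.
    Zero-probability terms contribute nothing, so they are skipped.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Four equally likely outcomes: H = log2(4) = 2 bits, or ln(4) nats.
uniform4 = [0.25, 0.25, 0.25, 0.25]
print(entropy(uniform4))          # 2.0 (bits)
print(entropy(uniform4, math.e))  # ~1.386 (nats)
```

The two results differ only by the constant factor of equation (2), log2(e).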

The entropy as defined by equation (1) has several properties which make H(x) a good measure of the uncertainty about the outcome of a probabilistic phenomenon. It is interesting to note that if p_1 = 1 and p_n = 0 for n ≠ 1, then H(x) = 0; that is, the uncertainty about an event with a deterministic outcome is zero. Moreover, if p_1 = p_2 = ... = p_N = 1/N, then equation (1) gives H(x) = log N, implying that as the number of equiprobable outcomes increases, the entropy of the event also increases. If the random variable x has sample space x_1, x_2, ..., x_n and the random variable y has sample space y_1, y_2, ..., y_n, then the joint random variable z = (x, y) will have sample space (x_1, y_1), (x_2, y_2), ..., (x_n, y_n). It can be shown that the entropy (uncertainty) of the joint event z = (x, y) is the sum of the entropies (uncertainties) of the independent events x and y; i.e.:

$$H(z) = H(x) + H(y). \tag{3}$$

It should be added that Shannon originally suggested that information content be defined as:

$$I = -\log p. \tag{4}$$

It follows that, due to logarithm properties, as the probability of a message reduces, its information content increases, and vice-versa. Also, for p = 1 the information content is zero, and for a very rare or seldom event the information content is very high. Now, substituting equation (4) in equation (1) gives:

$$H(x) = \sum_{n=1}^{N} p_n I_n. \tag{5}$$
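Equation (5) says that entropy is the probability-weighted average of the information contents of equation (4); a quick numeric check, with an example distribution of my own choosing:

```python
import math

p = [0.5, 0.25, 0.125, 0.125]       # an example distribution
info = [-math.log2(q) for q in p]   # I = -log2(p), eq. (4): 1, 2, 3, 3 bits

# Eq. (5): H = sum(p_n * I_n), which equals -sum(p_n * log2 p_n) of eq. (1)
H = sum(q * i for q, i in zip(p, info))
print(H)  # 1.75 bits
```

Note that the rarest outcomes (p = 0.125) carry the most information (3 bits each), but are weighted least in the average.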


It is observed that entropy may also be interpreted as the average of the information content of a source [1], [2], [4].

3. APPLICATION TO FINANCIAL STATEMENTS

In order to apply Shannon's theory to financial statements, consider a financial statement as depicted in simplified form in Figure 1, containing m rows and n columns. Financial data processing may be performed according to various routines. As an example, suppose that the elements a_ij of each column may be summed up to obtain the total S_j for the jth column, i.e.,

$$S_j = \sum_{i=1}^{m} a_{ij}; \qquad j = 1, \ldots, n. \tag{6}$$

Now, each element a_ij of any column j may be divided through by S_j, to convert the statement into the normalized arrangement depicted in Figure 2, with new elements p_ij as:

$$p_{ij} = \frac{a_{ij}}{S_j}; \qquad i = 1, \ldots, m; \; j = 1, \ldots, n. \tag{7}$$

    a11   a12   ...   a1n
    ...   ...         ...
    am1   am2   ...   amn
    ----------------------
    sums:  S1    S2   ...   Sn

Figure 1. A simplified depiction of a financial statement


    p11   p12   ...   p1n
    ...   ...         ...
    pm1   pm2   ...   pmn
    ----------------------
    sums:  1     1    ...   1

Figure 2. The financial statement in normalized form

Then, using equation (4), the information content of each element in a column may be shown as:

$$I_{ij} = -\log_2 p_{ij} \quad \text{(bits)}. \tag{8}$$

Also, the entropy of each column may be evaluated, by applying equation (1), as:

$$H_j = -\sum_{i=1}^{m} p_{ij} \log_2 p_{ij} \quad \text{(bits)}; \qquad j = 1, \ldots, n. \tag{9}$$

Finally, the entropy of the financial statement can be obtained, for independent columns, by the application of equation (3), as:

$$H = \sum_{j=1}^{n} H_j \quad \text{(bits)}. \tag{10}$$

The results are shown in Figure 3.
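The whole procedure of equations (6)-(10) can be sketched in a few lines of Python; the small matrix below is a made-up example, not the paper's statement:

```python
import math

# A hypothetical m-by-n statement: rows are items, columns are periods (Fig. 1)
a = [[4.0, 1.0],
     [4.0, 1.0],
     [8.0, 2.0]]

m, n = len(a), len(a[0])
sums = [sum(a[i][j] for i in range(m)) for j in range(n)]            # eq. (6)
p = [[a[i][j] / sums[j] for j in range(n)] for i in range(m)]        # eq. (7)
info = [[-math.log2(p[i][j]) for j in range(n)] for i in range(m)]   # eq. (8)
Hcols = [-sum(p[i][j] * math.log2(p[i][j]) for i in range(m))
         for j in range(n)]                                          # eq. (9)
H = sum(Hcols)                                                       # eq. (10)

# Both columns normalize to (0.25, 0.25, 0.5), so each has entropy 1.5 bits
print(Hcols, H)
```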


    I11   I12   ...   I1n
    I21   I22   ...   I2n
    ...   ...         ...
    Im1   Im2   ...   Imn
    --------------------------
    entropy:  H1   H2   ...   Hn

Figure 3. The information content and entropies of a financial statement with entropy H

4. NUMERICAL EXAMPLE

Consider the financial statement shown in Figure 4, which is a typical financial statement of a company for three successive years (1993-95). This financial statement can be converted into the normalized form by dividing each element of the top section of every column by its corresponding value of the Total Assets; i.e., each element in the top section of the first column (year 1993) is divided through by 7136, the second column (year 1994) by 10300 and the third column (year 1995) by 11904. Similarly, for the middle section of each column, the elements are divided through by the corresponding value of the Total Liabilities & Net Worth, which are again 7136 for 1993, 10300 for 1994 and 11904 for 1995. The bottom section of each column is also divided through by the corresponding sales value. Hence, each element in the bottom section of the first column (year 1993) is divided through by 11863, the second column (year 1994) by 14952 and the third column (year 1995) by 16349. The corresponding normalized statement is shown in Figure 5.
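The normalization step can be sketched directly from the Figure 4 totals; this is my own illustrative code (variable names are not from the paper), checking the 1993 top-section proportions:

```python
# Figure 4 values for year 1993 (in million rials), top (assets) section
assets_1993 = {"Cash": 561, "Receivables": 1963, "Inventories": 2031,
               "Net Fixed Assets": 2581}
total_assets_1993 = 7136

# Divide each element through by Total Assets, as described above (eq. 7)
normalized = {k: round(v / total_assets_1993, 3)
              for k, v in assets_1993.items()}
print(normalized)  # agrees with the Figure 5 column to rounding
```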


(in million rials)

| | year 1993 | year 1994 | year 1995 |
|---|---|---|---|
| Cash | 561 | 387 | 202 |
| Receivables | 1963 | 2870 | 4051 |
| Inventories | 2031 | 2613 | 3287 |
| Total Current Assets | 4555 | 5870 | 7540 |
| Net Fixed Assets | 2581 | 4430 | 4364 |
| Total Assets | 7136 | 10300 | 11904 |
| Payables | 1862 | 2944 | 3613 |
| Accruals | 301 | 516 | 587 |
| Bank Loan | 250 | 900 | 1050 |
| Total Current Liabilities | 2413 | 4360 | 5250 |
| Long Term Debt | 500 | 1000 | 950 |
| Net Worth | 4223 | 4940 | 5704 |
| Total Liabilities & Net Worth | 7136 | 10300 | 11904 |
| Sales | 11863 | 14952 | 16349 |
| Cost of goods sold | (8537) | (11124) | (12016) |
| Selling & Administrative Expenses | (2349) | (2659) | (2993) |
| Profit Before Taxes | 977 | 1169 | 1340 |

Figure 4. A typical financial statement of a company during 1993-95


| | year 1993 | year 1994 | year 1995 |
|---|---|---|---|
| Cash | 0.078 | 0.037 | 0.017 |
| Receivables | 0.275 | 0.279 | 0.340 |
| Inventories | 0.285 | 0.254 | 0.276 |
| Total Current Assets | 0.638 | 0.570 | 0.633 |
| Net Fixed Assets | 0.362 | 0.430 | 0.367 |
| Total Assets | 1 | 1 | 1 |
| Payables | 0.261 | 0.286 | 0.304 |
| Accruals | 0.042 | 0.050 | 0.049 |
| Bank Loan | 0.035 | 0.087 | 0.088 |
| Total Current Liabilities | 0.338 | 0.423 | 0.441 |
| Long Term Debt | 0.070 | 0.097 | 0.080 |
| Net Worth | 0.592 | 0.480 | 0.479 |
| Total Liabilities & Net Worth | 1 | 1 | 1 |
| Sales | 1 | 1 | 1 |
| Cost of goods sold | (0.720) | (0.744) | (0.735) |
| Selling & Administrative Expenses | (0.198) | (0.178) | (0.183) |
| Profit Before Taxes | 0.082 | 0.078 | 0.082 |

Figure 5. The financial statement for years 1993-95 in the normalized form


Now, in order to calculate the information contribution of each element, the values in Figure 5 can be substituted for p in equation (8), and the binary logarithm base may also be used to obtain the information content in bits. Thus, every element in Figure 5 is replaced by its corresponding information value, to obtain the table of values shown in figure 6, rendering the information quantities. In fact, in figure 6, the numerical values signify the information contributions of the various composition terms in the financial statement, in different years.

Furthermore, the entropies can also be computed by using equation (9); for example, for the year 1993, the entropy of the top section of the column may be computed as:

H_1 = 0.078 × 3.680 + 0.275 × 1.862 + 0.285 × 1.811 + 0.362 × 1.466 + 1 × 0 = 1.520 bits.

The entropy of the middle section of this column can also be calculated as:

H_2 = 0.261 × 1.938 + 0.042 × 4.573 + 0.035 × 4.836 + 0.070 × 3.836 + 0.592 × 0.756 + 1 × 0 = 1.585 bits.

The entropy of the bottom section of the first column is also calculated as:

H_3 = 1 × 0 + 0.720 × 0.474 + 0.198 × 2.336 + 0.082 × 3.608 = 1.099 bits.

Similarly, the entropies of the various parts of the columns may be computed as the sums of the products of the corresponding values in figures 5 and 6. The results are shown in figure 7, which indicates the entropies of various parts of the financial statement for the years 1993-95.

Moreover, equation (10) can be used to calculate the entropies of each year and of the whole statement, as:

H_1993 = 1.520 + 1.585 = 3.105 bits
H_1994 = 1.715 + 1.874 = 3.589 bits
H_1995 = 1.672 + 1.844 = 3.516 bits
H_statement = 3.105 + 3.589 + 3.516 = 10.210 bits.
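As a sketch (my own code, not the paper's), section entropies of this kind can be recomputed directly from the Figure 5 proportions with equation (9); here for the middle and bottom sections of the 1993 column, where the "total" rows carry probability 1 and contribute zero:

```python
import math

def section_entropy(probs):
    # eq. (9) applied to one section of a column, result in bits
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Figure 5, year 1993 proportions (list names are my own labels):
middle_1993 = [0.261, 0.042, 0.035, 0.070, 0.592]  # Payables .. Net Worth
bottom_1993 = [0.720, 0.198, 0.082]                # CoGS, S&A, Profit

print(round(section_entropy(middle_1993), 3))  # ~1.583 (1.585 in the text)
print(round(section_entropy(bottom_1993), 3))  # ~1.100 (1.099 in the text)
```

The small differences against the text arise because the text sums products of the already-rounded three-decimal values of figures 5 and 6.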


| | year 1993 | year 1994 | year 1995 |
|---|---|---|---|
| Cash | 3.680 | 4.756 | 5.878 |
| Receivables | 1.862 | 1.842 | 1.556 |
| Inventories | 1.811 | 1.977 | 1.857 |
| Total Current Assets | 0.648 | 0.811 | 0.659 |
| Net Fixed Assets | 1.466 | 1.217 | 1.446 |
| Total Assets | 0 | 0 | 0 |
| Payables | 1.938 | 1.806 | 1.718 |
| Accruals | 4.573 | 4.322 | 4.352 |
| Bank Loan | 4.836 | 3.523 | 3.506 |
| Total Current Liabilities | 1.565 | 1.241 | 1.181 |
| Long Term Debt | 3.836 | 3.366 | 3.644 |
| Net Worth | 0.756 | 1.059 | 1.062 |
| Total Liabilities & Net Worth | 0 | 0 | 0 |
| Sales | 0 | 0 | 0 |
| Cost of goods sold | (0.474) | (0.427) | (0.444) |
| Selling & Administrative Expenses | (2.336) | (2.490) | (2.450) |
| Profit Before Taxes | 3.608 | 3.680 | 3.608 |

Figure 6. The information content of the financial statement for years 1993-95, in bits


| | year 1993 | year 1994 | year 1995 |
|---|---|---|---|
| Assets section (Cash through Total Assets) | 1.520 | 1.715 | 1.672 |
| Liabilities & Net Worth section (Payables through Total Liabilities & Net Worth) | 1.585 | 1.874 | 1.844 |
| Income section (Sales through Profit Before Taxes) | 1.099 | 1.047 | 1.070 |

Figure 7. The entropies of the financial statement for years 1993-95, in bits

5. DISCUSSION

A novel approach to the analysis and measurement of financial statements was innovated, by applying Shannon's information theory. The information and entropy concepts were adopted for the financial statements, and the method for their corresponding computations was outlined. The procedure was applied to a typical financial statement of a company for three successive years, and it was shown how to measure the information content of the statement.


Considering the financial statement as a communicative and informative device, one can examine the information content column-wise (for each year) to realize the significance and information contribution of each term composing the financial statements. From figure 6 it can be seen that some terms remain more or less the same and constant, as far as their information contributions are concerned, despite the fact that their monetary values in the original financial statement (figure 4) are very much different. For example, profit shows a constant information contribution of about 3.6 bits, while it has increased from 977 (in 1993) to 1169 (in 1994) and to 1340 (in 1995). Some other terms, e.g., Payables, have remained relatively constant or decreased slightly (from about 1.9 to about 1.7 bits) while increasing significantly, in monetary terms, from 1862 (in 1993) to 2944 (in 1994) and to 3613 (in 1995). Some other terms, e.g., Cash, show a relatively significant increase in their information contribution, with respect to the decrease in their monetary values; or vice-versa (e.g., Bank Loan).

Beyond analyzing a financial statement as above, one can measure the entropy of a statement in different years. The entropy of the financial statement exemplified in this paper appeared to be more than 3 bits and less than 4 bits for each year. The entropy of the whole statement, for three years, was also just above 10 bits. Such measurements can also be employed to compare different years or different types of statements; noting that, as observed from equation (5), entropy may be interpreted as the average of the information content of a source.

6. CONCLUSIONS

It seems promising that the application of Shannon's information theory can provide ways, and also pave the way for further work and research to be undertaken, as far as quantitative measurement and analysis of the financial statements and other financial practices are concerned. Evidently, this is very important as far as financial decision making is concerned, especially by providing simulation approaches (based on information and entropy measures) for adopting the best financial policies and the most effective decisions. Moreover, for the financial statements, envisaged as communication devices, entropies may be determined. This can also, in turn, be useful (as in communication systems) to assess the capacity of information transmission and, ultimately, it can provide sophisticated approaches to the analysis and design of financial statements and other related financial decision instruments.

Acknowledgement. This research project was supported by Shiraz University Research Council, which is gratefully acknowledged.

REFERENCES

1. R. Ash (1965), Information Theory, Wiley Interscience.


2. A. M. Yaglom and I. M. Yaglom (1983), Probability and Information, D. Reidel Publishing Co.

3. C. E. Shannon (1948), A Mathematical Theory of Communication, Bell System Technical Journal, Vol. 27.

4. C. E. Shannon and W. Weaver (1949), The Mathematical Theory of Communication, University of Illinois Press.

5. C. E. Shannon (1949), Communication Theory of Secrecy Systems, Bell System Technical Journal, Vol. 28.

6. C. E. Shannon (1951), Prediction and Entropy of Printed English, Bell System Technical Journal, Vol. 30.

7. B. Mandelbrot (1953), An Informational Theory of the Statistical Structure of Language, in W. Jackson (ed.), Communication Theory, Academic Press.

8. R. S. Ingarden (1975), Quantum Information Theory, Torun Press.

9. H. Quastler (1955), Information Theory in Psychology, Free Press.

10. H. P. Yockey, R. L. Platzman and H. Quastler (1958), Information Theory in Biology, Pergamon Press.

11. A. R. Horwitz and I. Horwitz (1976), The Real and Illusory Virtues of Entropy-Based Measures for Business and Economic Analysis, Decision Sciences.

12. Hartmann et al. (1982), Application of Information Theory to Construction of Efficient Decision Trees, IEEE Trans., Vol. IT-28, No. 4.

13. R. Gray (1990), Entropy and Information Theory, Springer-Verlag.

14. G. J. Chaitin (1990), Information, Randomness and Incompleteness, World Scientific.

15. J. S. Nicolis (1991), Chaos and Information Processing, World Scientific.

16. P. Van Geert (1994), Dynamic Systems of Development, Harvester Wheatsheaf Publishers.

17. N. Gilbert and J. Doran (1994), Simulating Societies, UCL (University College London) Press.

18. A. J. Berry, J. Broadbent and D. Otley (1995), Management Control, Macmillan Press.

Received December, 1997
