Applying Shannon’s information theory to the measurement and analysis of financial statements

Nezameddin Faghih and Mohammad Namazi
Department of Management, Tappeh Eram, Shiraz University, 71945 Shiraz, Iran

Journal of Discrete Mathematical Sciences and Cryptography, Vol. 1 (1998), No. 1, pp. 49-62. DOI: 10.1080/09720529.1998.10697864

ABSTRACT

This paper introduces the application of Shannon's information theory, as a novel approach, to the measurement and analysis of financial statements. A brief description of the information and entropy concepts and computations is rendered; then it is shown how to apply the computation procedure to financial statements. Accordingly, as a numerical example, a typical financial statement is employed for the application of Shannon's information theory to its measurement and analysis.
The information contributions of various elements in the financial statement, and also the entropies, are computed for successive years. Hence, the results obtained by the application of Shannon's theory are duly compared and discussed. Consequently, areas of further research work to be undertaken, in continuing this interesting and promising study, are raised and suggested.

1. INTRODUCTION

One of the greatest revolutions in the scientific world outlook in the 20th century is the turn from Laplacian determinism to a probabilistic picture of phenomena. A natural extension of this point of view is an understanding that our knowledge is of a probabilistic nature, too. Any information that we obtain affects the probabilities of possible alternatives of action, rather than indicating uniquely one particular outcome. Therefore, it seems to be no mere coincidence that information theory forms its fundamental basis on the concept of entropy, with entropy appearing as a measure of uncertainty and disorder [1], [2].

Mathematically, information theory is a branch of the theory of probabilities and stochastic processes. Nevertheless, information theory not only presents a quantitative measure of information common to both deterministic and probabilistic cases, but also shows the qualitative equivalence of these two kinds of knowledge; in fact, Shannon was the first to show this important property [3], [4], [5]. Since then, the importance and generality of information-theoretic concepts and approaches have expanded far beyond the area of communication engineering (where Shannon's model was initiated). The ideas of information theory have been applied in a variety of diverse fields including physics, linguistics, biology, psychology, economics, management & accounting [6], [7], [8], [9], [10], [11], [12], and all these applications have proved productive and innovative. Thus, information theory has become not just another special branch, but an indispensable part of the modern scientific disciplines. Information-theoretic analysis has also performed a significant role in the theory of computational and structural complexities and in the design of effective decision algorithms [2], [12-18].

Thus, introducing information theory to measure financial items and their information contents will produce a fruitful approach to financial analysis and decision making. This study undertakes such a task, hoping to provide a modern direction in analyzing financial statements in the area of financial management and accounting. Hence, in the following section, a brief description of information and entropy concepts and computations is rendered.

2. INFORMATION AND ENTROPY

The concept of entropy is the central part of information theory. The entropy of a random variable is defined in terms of its probability distribution, and can be shown to be an appropriate measure of randomness or uncertainty. The importance of entropy in practical applications has much to do with its relationship with long sequences of random variables.
It turns out that repeated trials of a probabilistic experiment give rise to outcomes that can, under quite general circumstances, be divided into two categories. The first category consists of sequences that have a very small probability of occurrence; in fact, the probability of occurrence of this category as a whole approaches zero with the increasing length of the sequences. The second category consists of the remaining sequences; these sequences, which are known as likely or typical sequences, all have more or less the same probability and, as their length increases, they become closer and closer to being equally likely. The number of typical sequences and their individual probabilities are functions of the entropy, and it is this relationship that gives the entropy its major role in information theory.

Let x be a random variable with sample space x_1, x_2, ..., x_N and probability measure P(x_n) = p_n. The entropy of x is defined as:

    H(x) = -\sum_{n=1}^{N} p_n \log(p_n).    (1)

The base of the logarithm is arbitrary and amounts only to a constant multiplicative factor, bearing in mind the change-of-base identity:

    \log_b x = (\log_b a)(\log_a x).    (2)

If the logarithm base is 2, the entropy is said to be in "bits", for binary digits. If the base is the Napier number, e, then the entropy is said to be in "nats", for natural units [1], [2], [4].

The entropy as defined by equation (1) has several properties which make H(x) a good measure of the uncertainty about the outcome of a probabilistic phenomenon. It is interesting to note that if p_1 = 1 and p_n = 0 for n ≠ 1, then H(x) = 0; that is, the uncertainty about an event with a deterministic outcome is zero. Moreover, if p_1 = p_2 = ... = p_N = 1/N, then equation (1) gives H(x) = \log N, implying that as the number of equiprobable outcomes increases, the entropy of the event also increases. If the random variable x has sample space x_1, x_2, ..., x_n and the random variable y has sample space y_1, y_2, ..., y_n, then the joint random variable z = (x, y) will have sample space (x_1, y_1), (x_2, y_2), ..., (x_n, y_n). It can be shown that the entropy (uncertainty) of the joint event z = (x, y) is the sum of the entropies (uncertainties) of the independent events x and y; i.e.:

    H(z) = H(x) + H(y).    (3)

It should be added that Shannon originally suggested that information content be defined as:

    I = -\log p.    (4)

It follows that, due to logarithm properties, as the probability of a message decreases, its information content increases, and vice versa. Also, for p = 1 the information content is zero, while for a very rare or seldom event the information content is very high. Now, substituting equation (4) in equation (1) gives:

    H(x) = \sum_{n=1}^{N} p_n I_n.    (5)

It is observed that entropy may also be interpreted as the average of the information content of a source [1], [2], [4].
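To make the computations in equations (1)-(5) concrete, here is a minimal Python sketch (the function names are illustrative choices, not the paper's notation) that evaluates entropy in bits and checks the properties noted above: zero entropy for a deterministic outcome, log2(N) for N equiprobable outcomes, entropy as average information content, and additivity for independent events.

```python
import math

def entropy_bits(probs):
    """Equation (1) with base-2 logarithm: H = -sum p*log2(p), in bits.
    Outcomes with p == 0 contribute nothing to the sum."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def information_bits(p):
    """Equation (4): information content I = -log2(p) of an outcome."""
    return -math.log2(p)

# Deterministic outcome (p1 = 1): zero uncertainty
print(entropy_bits([1.0, 0.0, 0.0]))        # 0.0

# N equiprobable outcomes: H = log2(N)
print(entropy_bits([0.25] * 4))             # 2.0 bits

# Equation (5): entropy equals the average information content
probs = [0.5, 0.25, 0.25]
print(sum(p * information_bits(p) for p in probs), entropy_bits(probs))  # 1.5, 1.5

# Equation (3): for independent x and y, H(z) = H(x) + H(y)
px, py = [0.5, 0.5], [0.25, 0.75]
pz = [a * b for a in px for b in py]        # joint distribution of z = (x, y)
print(entropy_bits(pz), entropy_bits(px) + entropy_bits(py))  # equal
```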
3. APPLICATION TO FINANCIAL STATEMENTS

In order to apply Shannon's theory to financial statements, consider a financial statement as depicted in a simplified form in Figure 1, containing m rows and n columns. Financial data processing may be performed according to various routines. As an example, suppose that the elements a_{ij} of each column are summed up, to obtain the total S_j for the jth column, i.e.,

    S_j = \sum_{i=1}^{m} a_{ij},    j = 1, ..., n.    (6)

Now, each element a_{ij} of any column j may be divided through by S_j, to convert the statement into the normalized arrangement depicted in Figure 2, with new elements p_{ij} given by:

    p_{ij} = a_{ij} / S_j,    i = 1, ..., m;  j = 1, ..., n.    (7)

Figure 1. A simplified depiction of a financial statement: an m × n array of elements a_{11}, ..., a_{mn}, with column sums S_1, S_2, ..., S_n

Figure 2. The financial statement in normalized form: elements p_{11}, ..., p_{mn}, with each column summing to 1

Then, using equation (4), the information content of each element in a column is:

    I_{ij} = -\log_2 p_{ij}  (bits).    (8)

Also, the entropy of each column may be evaluated, by applying equation (1), as:

    H_j = -\sum_{i=1}^{m} p_{ij} \log_2 p_{ij}  (bits),    j = 1, ..., n.    (9)

Finally, the entropy of the financial statement can be obtained, for independent columns, by the application of equation (3) as:

    H = \sum_{j=1}^{n} H_j  (bits).    (10)

The results are shown in Figure 3.

Figure 3. The information contents I_{11}, ..., I_{mn} and column entropies H_1, H_2, ..., H_n of a financial statement with total entropy H

4. NUMERICAL EXAMPLE

Consider the financial statement shown in Figure 4, which is a typical financial statement of a company for three successive years (1993-95). This financial statement can be converted into the normalized form by dividing each element of the top section of every column by its corresponding value of the Total Assets; i.e., each element in the top section of the first column (year 1993) is divided through by 7136, the second column (year 1994) by 10300, and the third column (year 1995) by 11904. Similarly, the elements of the middle section of each column are divided through by the corresponding value of the Total Liabilities & Net Worth, which is again 7136 for 1993, 10300 for 1994, and 11904 for 1995. The bottom section of each column is likewise divided through by the corresponding Sales value: each element in the bottom section of the first column (year 1993) is divided through by 11863, the second column (year 1994) by 14952, and the third column (year 1995) by 16349. The corresponding normalized statement is shown in Figure 5.
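Before examining the figures, the column-wise procedure of equations (6)-(10) can be sketched in a few lines of Python. This is a sketch under two stated assumptions: the helper names and the small example matrix are hypothetical, and all normalized elements are taken to be positive (the text does not specify how the parenthesized, i.e. negative, expense items are signed before logarithms are taken).

```python
import math

def normalize_columns(a):
    """Equations (6)-(7): divide each element a_ij by its column sum S_j."""
    m, n = len(a), len(a[0])
    sums = [sum(a[i][j] for i in range(m)) for j in range(n)]   # S_j
    return [[a[i][j] / sums[j] for j in range(n)] for i in range(m)]

def information_contents(p):
    """Equation (8): I_ij = -log2(p_ij), in bits; assumes p_ij > 0."""
    return [[-math.log2(pij) for pij in row] for row in p]

def column_entropies(p):
    """Equation (9): H_j = -sum_i p_ij * log2(p_ij), in bits."""
    m, n = len(p), len(p[0])
    return [-sum(p[i][j] * math.log2(p[i][j]) for i in range(m))
            for j in range(n)]

def statement_entropy(p):
    """Equation (10): H = sum_j H_j, assuming independent columns."""
    return sum(column_entropies(p))

# Illustrative 3 x 2 statement (hypothetical numbers, not the paper's data)
a = [[100.0, 200.0],
     [300.0, 300.0],
     [600.0, 500.0]]
p = normalize_columns(a)        # columns now sum to 1 (Figure 2)
I = information_contents(p)     # per-element information (Figure 3 entries)
print(column_entropies(p), statement_entropy(p))
```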
Figure 4. A typical financial statement of a company during 1993-95

(in million rials)                   year 1993   year 1994   year 1995

Cash                                       561         387         202
Receivables                               1963        2870        4051
Inventories                               2031        2613        3287
Total Current Assets                      4555        5870        7540
Net Fixed Assets                          2581        4430        4364
Total Assets                              7136       10300       11904

Payables                                  1862        2944        3613
Accruals                                   301         516         587
Bank Loan                                  250         900        1050
Total Current Liabilities                 2413        4360        5250
Long Term Debt                             500        1000         950
Net Worth                                 4223        4940        5704
Total Liabilities & Net Worth             7136       10300       11904

Sales                                    11863       14952       16349
Cost of Goods Sold                      (8537)     (11124)     (12016)
Selling & Administrative Expenses       (2349)      (2659)      (2993)
Profit Before Taxes                        977        1169        1340

Figure 5. The financial statement for years 1993-95 in the normalized form

                                     year 1993   year 1994   year 1995

Cash                                     0.078       0.037       0.017
Receivables                              0.275       0.279       0.340
Inventories                              0.285       0.254       0.276
Total Current Assets                     0.638       0.570       0.633
Net Fixed Assets                         0.362       0.430       0.367
Total Assets                                 1           1           1

Payables                                 0.261       0.286       0.304
Accruals                                 0.042       0.050       0.049
Bank Loan                                0.035       0.087       0.088
Total Current Liabilities                0.338       0.423       0.441
Long Term Debt                           0.070       0.097       0.080
Net Worth                                0.592       0.480       0.479
Total Liabilities & Net Worth                1           1           1

Sales                                        1           1           1
Cost of Goods Sold                     (0.720)     (0.744)     (0.735)
Selling & Administrative Expenses      (0.198)     (0.178)     (0.183)
Profit Before Taxes                      0.082       0.078       0.082

Now, in order to calculate the information contribution of each element, the values in Figure 5 can be substituted for p in equation (8), with the binary logarithm base used to obtain the information content in bits. Thus, every element in Figure 5 is replaced by its corresponding information value, to obtain the table of values shown in Figure 6, rendering the information quantities. In fact, in Figure 6, the numerical values signify the information contributions of the various composition terms in the financial statement in different years.

Furthermore, the entropies can also be computed, by using equation (9)...
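As an illustration of equations (7)-(9) applied to the data above, the sketch below recomputes the 1993 asset-section column of Figure 5 and the corresponding information contents that Figure 6 would render. Restricting the entropy to the four composition terms (Cash, Receivables, Inventories, Net Fixed Assets), which together sum to Total Assets, is an assumption on our part, since the excerpt does not show which rows enter the computation; the printed values may also differ from the paper's figures in the last rounded digit.

```python
import math

# 1993 asset section of Figure 4 (million rials), normalized by Total Assets
items = {"Cash": 561, "Receivables": 1963,
         "Inventories": 2031, "Net Fixed Assets": 2581}
total_assets = 7136

for name, value in items.items():
    p = value / total_assets          # Figure 5 entry (equation 7)
    info = -math.log2(p)              # Figure 6 entry (equation 8), in bits
    print(f"{name:18s} p = {p:.3f}  I = {info:.2f} bits")

# Column entropy over the mutually exclusive composition terms (equation 9)
probs = [v / total_assets for v in items.values()]
H = -sum(p * math.log2(p) for p in probs)
print(f"H(1993 assets) = {H:.2f} bits")
```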
