The Mathematics of Living Systems


  • 7/25/2019 The Mathematics of Living Systems

    1/49

    Living Algorithm Articles (Key below) Stage Headings Edit Pages

    1. Data Stream Mathematics: Requirements 3/4 K 9

    2. Living Algorithm's Predictive Cloud 3/4 K 7

    2a. Living Algorithm's Evolutionary Potentials 3/4 L 13

    3. The Batting Average: Living Algorithm vs. Probability 3/4 K 8

    4. Mathematics of the Moment (vs. Probability) 3/4 L 6

    5. General Patterns vs. Individual Measures 3/4 L 4

    6. Dynamic Causation vs. Static Description 3/4 L 11

    7. Mathematics of Relationship 3/4 L 3

    8. Precision vs. Fungible Meaning 3/4 L 5

    9. Living Algorithm Algorithm 3/4 L 9

    10. Mathematics of Informed Choice (vs. Deterministic Physics) 3/4 L 11

    Expository Totals 11 86

    Bio Totals 3 7

    Narrative Totals 9 27

    Grand Total 26 120

    Living Algorithm Narratives (Key below) Stage Headings Edit Pages

    1-2: Life searching for a Mathematics of the Moment 4 K 2

    3-4: Is the Living Algorithm just an insignificant subset of Probability? 4 L 2

    4-5: Probability challenges Living Algorithm's scientific credentials. 4 L 3

    5-6: Probability's Numbers vs. Living Algorithm Patterns 4 L 3

    6-7: From Causation to Relationship 4 L 5

    7-8: Can the Living Algorithm provide Life with Fungible Meaning? 4 L 2

    8-9: Could Life employ Living Algorithm to digest Data Streams? 4 L 2

    9-10: Comfortable with Living Algorithm's Algorithm, Life wonders about Choice. 4 L 3

    10-Dyn1. Digestible Information & the Living Algorithm's birth 4 D 5

    Totals 9 27


    Living Algorithm Bios (Key below) Stage Headings Edit Pages

    1. Data Stream's special Significance Evolution of Understanding 4 K 3

    2: Living Algorithm's special Significance Evolution of Understanding 4 K 3

    9. Just following Directions 4 D 1

    Totals 3 7

    Living Algorithm Articles All (Key below) Stage Headings Edit Pages

    Article List (29 November 2015, 08:09 AM)

    The Mathematics of Living Systems Page 1


    1B. Data Stream's special Significance Evolution of Understanding 4 K 3

    1-2: Life searching for a Mathematics of the Moment 4 K 2

    2. Living Algorithm System's Predictive Clouds 3/4 K 6

    2B: Living Algorithm's special Significance Evolution of Understanding 4 K 3

    3. The Batting Average: Living Algorithm vs. Probability 3/4 K 8

    3-4: Is the Living Algorithm just an insignificant subset of Probability? 4 L 2

    4. Mathematics of the Moment (vs. Probability) 3/4 L 6

    4-5: Life yearns for a Mathematics of Relationship 4 L 3

    5. Mathematics of Relationship 3/4 L 15

    5-6: Can the Living Algorithm provide Life with Fungible Meaning? 4 L 2

    6. Precision vs. Fungible Meaning 3/4 L 6

    6-7: Could Life employ Living Algorithm to digest Data Streams? 4 L 2

    7. Living Algorithm Algorithm 3/4 L 8

    7-8: Comfortable with Living Algorithm's Algorithm, Life wonders about Choice 4 L 3

    8. Mathematics of Informed Choice (vs. Deterministic Physics) 3/4 L 11

    9. Dynamic Causation vs. Static Description 3/4 L 10

    9B. Just following Directions 4 D 1

    Totals 18 99

    Key

    This is the article list for The Living Algorithm System, the second monograph in the series associated with the study of Behavioral Dynamics. On this page the Reader will find 5 columns. The first column (Table of Contents) is a list of the linked articles. The second column indicates whether the articles are finished or are still a work in progress (WIP). The 3rd column (Headings) provides a link to two levels of outline detail. If the Reader is interested in Section headings, click on 3; if interested in Paragraph headings, click on 4. The fourth column (Stage) indicates whether the articles have been edited (L/K) or not (-). The final column (Pages) indicates the length of each article.



    Have you ever considered how we translate the impersonal digital information of 1s and 0s into personal knowledge that is relevant to our existence? For instance, how are we able to derive 'music' from our favorite CD? How is it that we dance wildly or cry uncontrollably when we hear a sequence of 1s and 0s that can't even touch each other? What is the translation process that bridges the infinite chasm between these two simple numbers?

    I am excited to present a plausible theory that accounts for our personal connection to the impersonal digital sequences contained on our CDs, DVDs, computers and iPhones. The process that seems to enable connectivity is contained in a mathematical system labeled Information Dynamics. The theory concerns how living systems digest digital information to transform it into a form that is meaningful to Life. While the information processing epitomized by a computer is of necessity static, exact and fixed, living information digestion is of necessity dynamic, approximate and transformational. Think of the difference between a baby and a computer.

    Our initial monograph illustrated some of the many patterns of correspondence between the mathematical processes of Information Dynamics and empirical reality. These include the harmful effects of Interruptions to the Creative Process, the negative impact of Sleep Deprivation, the Necessity of Sleep, and even the Biology of Sleep.

    These striking correspondences evoke some distinct questions. Why does the mathematical model behave in similar fashion to experimentally verified behavioral and biological reality? Could these correspondences be a mere coincidence, some kind of odd artifact? Or perhaps the striking patterns are due to some as yet undiscovered molecular/subatomic mechanism? Or could these odd correlations between mathematical and living processes be due to the process by which living systems digest information?

    We chose to explore the last theory. The first question we posed ourselves: What kind of information digestion process would a living system require? What are the entry-level requirements?

    The following essay addresses three questions. Why do living data streams best characterize the dynamic nature of living systems? Why do data streams require a new mathematics? And what requirements must this data stream mathematics fulfill if it is also to be the mathematics of dynamic living systems?

    Living Systems require New Mathematics of Data Streams

    The highly respected Dr. Lotfi Zadeh is considered to be the father of the well-established fuzzy logic approach to engineering. Dr. Zadeh co-authored the first book on linear systems theory in 1963, which immediately became a standard text for every engineering school. His prestige in the scientific community was sealed with this publication. Yet, he was already moving in a contrary direction. He was grappling with the difference between living systems and material systems. Most of his colleagues, in their attempt to accurately characterize the features of living systems, were pursuing a course that attempted to apply the mathematics of inanimate systems to what were becoming known as animate systems. The effort to capture the unique features of animate systems with conventional mathematics resulted in ever-greater levels of complexity and precision. He sensed that this effort by his colleagues to capture 'animate' systems with the mathematics of 'inanimate' systems was conceptually misguided. He had begun to realize that living systems are qualitatively different from non-living systems; and this difference cannot be captured by the mathematics of inanimate sets, no matter how complex. He recognized that a new mathematics was required to articulate this qualitative difference. He published a paper in 1962 entitled From Circuit Theory

    to System Theory that foreshadowed his new perspective.

    Life's requirements for an Information Digestion System (29 November 2015, 08:31 AM)

    There are some who feel this gap [between 'animate' and 'inanimate' systems] reflects the fundamental inadequacy of the conventional mathematics, the mathematics of precisely-defined points, functions, sets, probability measures, etc., for coping with the analysis of biological systems, and that to deal effectively with such systems, which are generally orders of magnitude more complex than man-made systems, we need a radically different kind of mathematics, the mathematics of fuzzy or cloudy quantities which are not describable in terms of probability distributions. (Dr. Bart Kosko, Fuzzy Thinking, p. 145, quoting from Dr. Zadeh's paper.)

    Dr. Zadeh reminds us that biological systems are an order of magnitude more complex than inanimate systems. This is neither to diminish the complexity, nor the conventional explanatory power of math or physics. But when we want to make meaningful statements about these biological systems with their attendant complexity, what can be said with the traditional standard of precision is limited. For instance, when we think about the creative act of writing, we realize that the invention and expression of complex ideas has a reality that goes far beyond the ability to electrically map the brain.

    We may be able to identify the electrical patterns of the brain with a high level of precision and still know very little about the nature of the ideas. The nature of ideas would certainly include elements such as invention, expression, relation and evaluation. Any satisfactory explanation of these elements must go beyond mapping electrical patterns. We need to grasp very nuanced meanings that come from a nuanced understanding of context. Yet, this order of magnitude of complexity only addresses the individual writer. When we consider some of the other relevant social variables connected with writing (editors, agents, publishers and the reading public), the magnitude of complexity of interaction of these living systems is staggering. To gain insight into how a particular author invents, expresses and interacts with others, we need to know about the particular subtleties that are at work in that particular case. This example is one illustration of Zadeh's notion that biological systems have a level of complexity that far exceeds that of inanimate systems.

    Dr. Zadeh makes an important distinction between 'animate' and 'inanimate' systems. Fixed data sets typify 'inanimate' systems: systems which do not change with time and whose members are precisely defined. The probability distributions of conventional mathematics work extremely well when dealing with these fixed data sets. The mathematics of probability processes this data to produce familiar measures, such as mean and standard deviation, that accurately characterize the features of such a set. The analysis of these static and fixed data sets is a perfect way to discuss the general features of a fixed and static population, an 'inanimate' system.

    In contrast, when we consider the dynamic nature of biological systems, the precisely defined data set, with its attendant conventional mathematics of probability, is inappropriate. The appropriate approach must address the fact that an organism, by definition, is in a continually changing state that is neither fixed, nor can it be precisely defined. Living systems move through time and space, constantly monitoring and adjusting in order to enhance the possibility of survival (as well as the achievement of any other higher order goals). It is this aspect of biological systems that scientists hope to capture through an analysis of animate systems. The new mathematics of animate systems must somehow articulate this qualitative difference between the static and dynamic features of existence.

    There is another way of looking at data that better reflects the dynamic nature of living systems. This new method requires living data sets, which we choose to call data streams. The mathematics of living data streams must somehow address the information flow of an inherently growing data set. Only in this manner will the new mathematics be able to describe the complex ongoing relationship between organism and environment. Any animate system that hopes to simulate Life's dynamic nature must incorporate this new mathematics of living data streams.

    Data Stream Mathematics must address Life's Immediacy

    This new mathematics that addresses the data streams of living systems must go beyond traditional methods of characterizing data. Conventional probability computes a single mean or standard deviation to characterize an entire set of data, where each set member is weighted equally. The new mathematics of living data streams should not weight all members of the data stream equally. The most relevant information regarding the well-being of an organism is generally the most current information. Although past experience certainly has relevance, more recent changes in conditions are likely to have an immediate impact upon the well-being of the organism. For instance, the average temperature for the day might be 65°, but the more relevant information to an organism is that the current temperature is 32°. A mathematics that weights recent input more highly is required to make meaning out of living data streams.
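The recency weighting described above can be made concrete with a short sketch. This is an illustrative implementation only: the exponential decay rule, the decay constant of 0.5, and the temperature readings are assumptions for the example, not formulas from the monograph.

```python
def plain_mean(stream):
    """Conventional fixed-set average: every member weighted equally."""
    return sum(stream) / len(stream)

def decaying_average(stream, decay=0.5):
    """Running average that weights recent input more highly: each new
    point pulls the estimate a fixed fraction of the way toward itself,
    so older points fade geometrically."""
    avg = stream[0]
    for x in stream[1:]:
        avg = avg + decay * (x - avg)
    return avg

# A day that stayed warm for most readings, then turned freezing:
temps = [65, 66, 64, 65, 33, 32]
print(round(plain_mean(temps), 1))       # 54.2 -- dominated by the warm majority
print(round(decaying_average(temps), 1)) # 40.5 -- pulled toward the recent cold
```

The decay constant sets how quickly the past fades: a decay of 1 would track only the newest point, while smaller values retain more history.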

    The nature of mathematical meaning as applied to this new interpretation of data streams is going to be qualitatively different from the nature of mathematical meaning for a fixed data set. Traditional probability distributions are an excellent articulation of the general meaning of fixed data sets. But this excellent articulation of overall averages inherently undervalues whatever significance may lie in the pattern of immediately preceding events. Therefore the power of the conventional mathematics breaks down when it is not addressing a fixed set with equivalent members. In fact, the only way that conventional mathematics can address a living data stream is by adding the new data to the existing fixed set, and then treating the new set as an enlarged, yet fixed set, where all members are equivalent. This traditional approach trivializes the significance of particular moments in a dynamic data stream.
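The contrast drawn in the paragraph above can be sketched in code. The function names and the decaying update rule are illustrative assumptions, not the monograph's own algorithm: the conventional route folds each new point into an enlarged fixed set of equivalent members and recomputes, while a stream-oriented update carries a single running descriptor forward.

```python
def fixed_set_mean(data_set, new_point):
    """Conventional route: enlarge the fixed set, recompute over all members."""
    enlarged = data_set + [new_point]
    return enlarged, sum(enlarged) / len(enlarged)

def stream_update(running_avg, new_point, decay=0.5):
    """Stream route: one running descriptor, updated per point."""
    return running_avg + decay * (new_point - running_avg)

history, mean = [], 0.0
running = 10.0                    # seeded with the first expected reading
for x in [10, 10, 10, 40]:        # a steady stream with a sudden late jump
    history, mean = fixed_set_mean(history, x)
    running = stream_update(running, x)

print(mean)     # 17.5 -- the jump to 40 is diluted across all four members
print(running)  # 25.0 -- the running descriptor moves sharply toward the newest point
```

The fixed-set route also needs the whole history in memory, while the stream route keeps only one number, which mirrors the ongoing, unbounded character of a data stream.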

    Probability theory inherently ignores the immediacy of events. Probability is a big picture specialist. It characterizes the nature of the entire

    set, and therefore undervalues the significance of the most recent environmental input. In contrast, living systems typically find it

    pragmatic to weight recent experience more heavily (see above), rather than taking all the members of the set equally into ac count when

    making computations. In short, the probability distributions of conventional mathematics are appropriate for dealing with fixed and

    permanent data sets (where every member is weighted equally). However, traditional data set mathematics is inappropriate for dealing with

    dynamic ongoing data streams (where the members are weighted in proportion to the proximity to the most recent data point). A s such, a

brand new type of mathematics is needed: one that specializes in living data streams. This data stream mathematics must

somehow take life's immediacy into account by weighting the most recent points more heavily.
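The contrast between the two kinds of averages can be sketched in a few lines of code. The update rule below is one standard recency-weighted scheme (an exponential moving average), offered purely as an illustration; the decay factor and the temperature figures are assumptions for the example, not values from this monograph.

```python
def mean(xs):
    """Fixed-set average: every member weighted equally."""
    return sum(xs) / len(xs)

def recency_weighted_avg(xs, d=2.0):
    """One standard recency-weighted scheme (an exponential moving
    average): each new point moves the running average 1/d of the way
    toward itself, so older points fade geometrically.  The decay
    factor d is an arbitrary illustrative choice."""
    avg = xs[0]
    for x in xs[1:]:
        avg += (x - avg) / d
    return avg

# A day that starts warm and ends freezing:
temps = [78, 76, 74, 72, 68, 60, 45, 32]
print(mean(temps))                  # 63.125 -- the "mild day" the fixed set reports
print(recency_weighted_avg(temps))  # 43.609375 -- pulled toward the current 32
```

The fixed-set mean reports a mild day; the recency-weighted average is dragged toward the freezing current reading, which is the information the organism actually needs.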

    The Mathematics of Living Systems Page 4


    Data Stream Mathematics must include ongoing Predictive Descriptors

    A mathematics that addresses immediacy will inevitably produce new measures that focus our attention on a particular moment, or series

    of moments, in the data stream. Because this mathematics applies to data streams not data sets, these new measures must be qualitatively

different from traditional measures. To address the importance of recent events, these measures must describe the nature of the most

current moments. These measures sacrifice big picture averages in order to more accurately describe the momentum of the moment.

    Probability inherently sacrifices the uniqueness of a particular moment by incorporating the individual data into an overall average of the

    entire data set. Traditional measures, such as the Standard Deviation and the Mean Average, are therefore inadequate descriptors of a living

    data stream. Rather than measures that describe the entire population, Data Stream Mathematics requires descriptive measures that focus

upon the immediacy of the moment.

To be useful to the organism, these descriptive measures must include a predictive component. Probability's descriptive measures (the mean

    average and the Standard Deviation) allow scientists to make well-defined predictions regarding general populations. For instance,

    Probability theory can predict the behavior of billions of subatomic particles with an amazingly high degree of precision. Similarly the data

    stream descriptors must allow us to make meaningful predictions about the next point in the stream. Making accurate descriptions about

    the most recent moments in the data stream enables the organism to make probable statements about the future. Probable statements

    provide timely information to the organism. Absent this information about the nature of a particular moment(s) in time, the organism

    would be flying blind as it confronted the environmental data stream. In essence, these probable statements serve two functions: 1) to

predict environmental behavior with greater accuracy, and 2) to utilize this information in determining a more appropriate response. Any

data stream mathematics that hopes to simulate living systems must include descriptors of particular moments in time that also allow us to

    make useful predictions about the next point in the stream.

    The predictive statements derived from these desired data stream measures are, however, likely to look fuzzier or cloudier than the

predictive statements derived from traditional data set measures. The meaning is likely to be 'fuzzy or cloudy' in that the predictions will be suggestive rather than definitive. Practical considerations limit the predictive accuracy of the data stream's ongoing measures. The

    predictive statements of Probability mathematics are going to be definitive because they are applied to fixed data sets, which, by definition,

    never change. The predictive statements of data stream mathematics are likely to be fuzzier or suggestive because they are applied to a

    living data stream, which, by definition, possesses the capacity for constant change.

    The new data stream mathematics will not satisfy the traditional predictive rigor demanded by Probability. Yet, this new mathematics will

    complement traditional approaches by providing more powerful predictors about the immediate behavior of the data stream. The predictive

    power provided by these data stream measures may well be more significant than overall statements about a growing (yet fixed) data set. A

    mathematics that weights the immediacy of moment(s) can be a more useful predictor than traditional Probability mathematics (where all

    members of a fixed set are weighted equally). This new mathematical meaning may not fulfill the criteria for predictive precision

    demanded by the mathematics of probability; but what it lacks in predictive precision, it more than makes up for by focusing upon the

    relevance of more recent events.
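What "members weighted in proportion to their proximity to the most recent data point" means can be made concrete. Under a simple exponential update of the form avg += (x - avg)/d (an assumed scheme, since this monograph defers its own equations to a later article), the k-th most recent point carries an implicit weight of (1/d)(1 - 1/d)^k, whereas a fixed-set mean gives every point the same weight 1/n.

```python
def exponential_weights(n, d=4.0):
    """Implicit weights that the update avg += (x - avg)/d assigns to
    the last n points: the newest point gets 1/d, and each older point
    is discounted by a further factor of (1 - 1/d).  Both the update
    rule and d = 4.0 are illustrative assumptions."""
    r = 1.0 - 1.0 / d
    return [(1.0 / d) * r ** k for k in range(n)]  # k = 0 is the newest point

weights = exponential_weights(5)
print([round(w, 4) for w in weights])  # [0.25, 0.1875, 0.1406, 0.1055, 0.0791]
```

The newest point outweighs every older one, and the weights of the distant past fade toward zero; a fixed-set mean would instead hold every member at a flat 1/n forever.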

Are these data stream predictors relevant to living systems?

Clearly there are times when an organism benefits from a heightened awareness of the immediate environment. Living systems are often

required to make instant responses to ongoing environmental input, in essence, data stream(s). The changing conditions inherent

    in the data streams of living systems often require an urgent response. The potential urgency of any reaction requires a

    flexibility of interpretation and response that is sensitive to the momentum of the moment, or series of moments. There is

particular relevance to the organism's ability to predict the probable momentum of an ongoing series of experiences.

Probability's preoccupation with the general features of a fixed data set fails to capture the momentum of recent events. Any

    Data Stream Mathematics that hopes to simulate this feature of living systems must somehow provide predictive descriptors

    that characterize the probable momentum of the moment. As we shall see, this characterization should also reveal the

    emerging pattern(s) in a series of moments in the life of an organism.

    This analysis of two complementary mathematical systems raises some significant questions. What level of precision is reasonable to

    expect from the predictive measures of data stream mathematics? Does the distinction between data sets and data streams suggest the

possibility of two different standards of precision when analyzing data? Can we tolerate two different standards of precision when analyzing data? If the level of predictive accuracy for data stream mathematics is viewed as substandard by conventional mathematics, can

it still be useful? Is accuracy that is suggestive, yet not definitive, a useful predictor? For some preliminary answers to these questions, read

    on.

    Precision & Relevance Incompatible

    Conventional mathematics may frown skeptically when confronted with the suggestion that not all significant patterns in Life can be

characterized by the precise standards of Probability. However, this presumed imprecision of data stream mathematics turns out to be an

    asset, not a liability.

In terms of the organism's predictive powers, precision and meaning are inversely proportional: the more precision, the less meaning,

and vice-versa. This notion may seem counterintuitive, yet in his study of system theory the aforementioned Dr. Zadeh turns

    this concept into a principle - the principle of incompatibility:

[Zadeh] saw that as the system got more complex, precise statements had less meaning. He later called this the principle of incompatibility: Precision up, relevance down. (Kosko, Fuzzy Thinking, p. 145, 1993)

    In 1972, Zadeh articulated the principle even more clearly. Kosko quotes Zadeh:

    "As the complexity of a system increases, our ability to make precise and significant statements about its behavior diminishe s until a


    threshold is reached beyond which precision and significance (or relevance) become almost mutually exclusive characteristics. A

    corollary principle may be stated succinctly as, "The closer one looks at a real-world problem, the fuzzier becomes the solution." (Fuzzy

    Thinking, p. 148, 1993)

If it is true, as Dr. Zadeh argues, that real-world problems require fuzzy solutions, data stream mathematics may provide a method to explore

this fuzziness.

The concept behind Zadeh's Principle of Incompatibility helps explain why the traditional laws of Probability find Life's Immediacy

perplexing. Probability is far more comfortable dealing with what is familiar, his specialty. He feels most at ease with fixed,

    unchanging data sets where all members are functionally equivalent. Perhaps his desire for the comfortable, yet rigid,

precision of conventional mathematics presents an insurmountable challenge to understanding the complexity of Life's immediate meaning. For Life to have her spontaneous immediacy appreciated, she may have to search elsewhere for a

    mathematical partner.

Perhaps understanding the immediacy of the moment requires a partner that relates better to a data stream. While this tradeoff sacrifices the

comfortable predictability of the more traditional relationship, it offers her a freshness that comes from more accurately understanding the

meaning of the moment(s), her most subtle nature. As we shall see, the suggestive predictors of Data Stream Mathematics with

their relative imprecision are an ideal match for characterizing her meaning of the moment: Life's Immediacy.

    Summary, Questions & Links

The esteemed Dr. Zadeh makes the claim that conventional mathematics, with its precisely defined probability measures, is inadequate

    for coping with the analysis of biological systems. He further states that a new mathematics of fuzzy or cloudy quantities is required to

    deal effectively with such systems. We suggest that a productive step in this direction is the study of living data streams, rather than fixed

data sets. Living data streams reflect life's ongoing and immediate nature. A mathematics of living data streams may uniquely capture these characteristic aspects of biological systems.

This new approach, however, has strict requirements. Is it reasonable to expect that a mathematics of data streams will satisfy all of the

    criteria discussed above? Is it reasonable to expect this method to be: 1) current, 2) self-reflective of immediately preceding experience, 3)

responsive to pattern, 4) sensitive to any change in context, and 5) pragmatically predictive? What metaphor, other than mathematics, is

    capable of systematically relating all these variables of living systems? What will this new mathematical metaphor look like? And if a

    mathematical metaphor can effectively relate these criteria, could this metaphor actually be a mechanism that living systems employ to

    process data streams? Is mathematics the information processing language of living systems?


Fuzzy Set Mathematics doesn't address Life's Data Streams

    The prior article developed the notion that living systems must extract meaning from ongoing data streams to survive. Because of the

    quantitative nature of data streams, we suggest that this meaning is mathematical in nature. This mathematical meaning has a few crucial

features. The mathematics must address the immediacy of living systems as well as provide predictive descriptors for each moment. As of

yet, traditional mathematical systems haven't been able to meet this challenge.

The esteemed Dr. Zadeh, the father of Fuzzy Logic, recognizes this deficiency and offers his own mathematical system as a solution. Let's

see how successful this approach is. To set the context, our exploration begins by revisiting a previously cited quotation

    from Dr. Zadeh:

"(Due to) the fundamental inadequacy of the conventional mathematics (the mathematics of precisely-defined points, functions, sets,

probability measures, etc.) for coping with the analysis of biological systems, we need a radically different kind of mathematics, the

mathematics of fuzzy or cloudy quantities which are not describable in terms of probability distributions." (Dr. Bart Kosko, Fuzzy Thinking,

p. 145, quoting from Dr. Zadeh's paper)

    Inspired by this insight, Dr. Zadeh went on to formulate the concept of fuzzy sets. His followers turned this notion into the mathematics of

fuzzy logic. Engineers have successfully applied the insights of fuzzy logic to significant real-life problems, such as how to stop bullet trains smoothly. Cognitive scientists have also employed the insights from fuzzy logic to simulate the neural networks of the brain. In

    essence, fuzzy logic successfully introduces the both-and approach to data sets, a complement to the either-or approach of conventional

    mathematics.

Zadeh's call for a radically different kind of mathematics implies the need for some new kind of arcane operations or esoteric measures to

cope with biological systems: perhaps a biological string theory, or a calculus of living systems, or the quantum mechanics of

    behavior, or even fuzzy logic. However, with the exception of neural networks, fuzzy logic has not proved to be the new

mathematics that is necessary for coping with the analysis of biological systems.

    It may be that the quest to discover a radically different kind of mathematics to analyze biological systems should not be limited to the

    esoteric nature of high-level theoretical mathematics. We believe it would be useful to shift the mathematical focus to the inherent nature of

the subject matter at hand: the data stream. Rather than pursue ever-more complex abstractions, we believe that the intelligent

    application of existing mathematical tools can yield powerful, practical insights into the nature of data streams. This intelligent

application of existing mathematical tools must focus on the immediate significance of moments in the data stream.

We have argued previously that the dynamic nature of living systems is best characterized by data streams. We went on to detail the

requirements of a mathematics of data streams that would address Life's immediacy. Hunting for a new set of mathematical abstractions

    that would fulfill the necessary data stream requirements would be a difficult, if not hopeless, quest. Data Streams offer an extraordinary

    challenge because of the inherently changeable nature of an open living system. Living systems can be extremely sensitive to every

    interaction with an environment that includes not only the closed systems of inanimate matter, but also includes interactions with other open

    animate systems. The complex web of interactions between living systems and the myriad data streams of existence suggests the arcane

    approach of theoretical mathematics is highly impractical.

    Rather than a higher and more complex level of abstraction, we are looking for pragmatic and useful tools that can help us think about

    living data streams. Our search is for a practical mathematics. We appreciate the power of high-level abstractions. However, we believe

there is a place for the use of existing mathematical tools. The key is to apply these tools with a sensitivity to the unique nature of data

    streams. The intelligent application of these tools can reveal insights into the nature of data streams, which are accessible to the informed

reader. These practical insights can provide a pragmatic balance to the rarefied language of theoretical mathematics.

The Quest for an 'Animate' System

    We begin with this question: What is the relationship between organism and data stream? In other words, what actually occurs when a

    biological system encounters a data stream? It is a compelling working hypothesis that organisms both ingest and digest the ongoing flow

    of information. If an organism is merely ingesting data, then the data stream has little, if any, relevance. Common sense tells us that

    processing data into some usable form must be a central purpose of an organism. The ability of organisms to monitor and adjust to the

inherently changeable nature of data streams depends upon an effective digestive system. Organisms do more than merely ingest data;

    they digest data. The question now becomes: How does this digestive system function?

    Data Streams best characterize the dynamic nature of living systems. Organisms are in a continual state of interaction with data streams.

    What could this process of interaction be, other than a perpetual state of ingesting and digesting the flow of information? The question is

    not: How do data streams behave? But rather, we ask: What method does an organism utilize to digest the flow of information? The quest is

    to find a method that simulates a system that digests data according to the criteria that we have established for a Data Stream Mathematics

    of Living Systems. This method will define an 'animate' system of information processing. The test for this system is simple. How well does

    it fulfill the prescribed requirements?

    The requirements for the Data Stream Mathematics of Living Systems are straightforward, yet daunting. To be a successful candidate for

    the position, the mathematics of the 'animate' system must effectively address the immediacy of dynamic living systems. This includes: 1)

weighting the elements of the data stream in proportion to their immediacy: a sort of sliding scale that weights the present moment



    more heavily; 2) providing descriptive measures that relate data points to each other in a manner that is sensitive to pattern

    recognition; and 3) providing suggestive predictors that serve a pragmatic anticipatory function. These are the requirements

    that a successful candidate must fulfill to be considered for the position. If the requirements are not fulfilled, the position will be

    left open.

    We would like to recommend a candidate for this long vacant, highly coveted and esteemed position. She's an excellent choice. She is a

    simple form of information processing. Her sole function is to digest data streams. Further, her method of information processing generates

    an 'animate' system, which fulfills the requirements of data stream mathematics. Her mathematics could be called the mathematics of the

    moment, in that she effectively addresses Life's Immediacy. This includes providing a suggestive interpretative mechanism that articulates

pattern. The name of our candidate? You may have guessed it. Drum roll, please ... The Living Algorithm's Info System. The following

discussion provides evidence that supports our claim that this animate system, the Living Algorithm System, fits this demanding job

    criteria and should be considered for the position. If her qualifications interest you, read on.

    Living Algorithm fulfills Requirements

    The conventional mathematics of Probability, whose specialty is fixed data sets, is unable to capture the immediacy of living systems. This

    inability is due to a preoccupation with the average features of the entire set, rather than the unique features of particular moments. As such,

    we must reject this applicant for the position. The dynamic nature of living systems requires a mathematics that encompasses the

    immediacy of Life's data streams.

To accomplish this feat, this new data stream mathematics must: 1) weight more recent data points more heavily and 2) provide ongoing

    predictive descriptors. It is easy to explain how the innate nature of the Living Algorithm System fulfills these two preliminary

requirements. The Living Algorithm's sole function is to digest Data Streams. Ongoing raw data enters this mathematical system of

    information processing. The Living Algorithm Family immediately: 1) weights the most recent data points in proportion to the current

moment in the data stream; and 2) transforms this external input into ongoing predictive descriptors. Accordingly, each 'moment' in the data stream has its own predictive descriptors. (For the mathematics behind this verbal description, check out The Living Algorithm.)

    The third criterion for the position requires a mathematics that provides suggestive predictors that serve a pragmatic anticipatory function.

We believe that the Living Algorithm's descriptive measures fulfill this difficult requirement. Justifying this claim is the quest of the

    remainder of this article.

    Living Algorithm System's Predictive Clouds

    The Living Algorithm's sole purpose is generating the rates of change (the derivatives) of any data stream. This method of information

    digestion entails turning precise data (instants) into an ongoing series of moments. These moments are characterized by their derivatives.

    These derivatives reveal the trajectories of each moment by describing the current moment in relation to the preceding moments. Each

    derivative has its own unique function. The Living Average (the 1st derivative) describes the relative position of each moment in the data

    stream in relation to prior moments. We choose the phrase Living Average to represent a proportional weighting of moments whose impact

decreases over time. The Deviation (the 2nd derivative as a scalar, an undirected quantity) describes the relative range of each moment by articulating the expected limits of the variation of pattern in the data stream. The Directional (the 2nd derivative as a vector, a directed

    quantity) describes the relative tendency of each moment by articulating the expected direction of the momentum of a pattern. As a trio,

    these descriptors characterize each individual moment in the data stream in relation to the preceding moments. In contrast, due to a

    preoccupation with the general features of fixed sets, Probability actually ignores the existence of these moments in the data stream. A way

    of visualizing this trio is shown in the following diagram.

    These three descriptors simultaneously provide a prediction that amounts to rough approximations about the next data point: 1) the expected

position (the dot in the center), 2) the range of variation (the circle), and 3) the direction of momentum (the arrow). Accordingly, each of the Living Algorithm's ongoing derivatives is a descriptor that contains a significant predictive feature. A simple combination of these

predictive averages creates a composite predictive cloud. We choose the term cloud to represent the approximation of the expected features

    of the next data point, which in summary, includes position, a range of probable values and recent tendencies of direction.
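The trio of descriptors can be sketched in code. The exact update equations are given in the linked Living Algorithm article; the version below models each descriptor as a decaying average (an assumed but standard recency-weighted update), so it illustrates the idea rather than reproducing the author's formulas.

```python
def digest(stream, d=4.0):
    """Digest a data stream into an ongoing 'predictive cloud' per moment:
    expected position (Living Average), undirected range (Deviation), and
    directed tendency (Directional).  The decaying-average updates and the
    default d = 4.0 are illustrative assumptions, not the source equations."""
    avg = dev = direc = 0.0
    clouds = []
    for x in stream:
        delta = x - avg                 # surprise relative to prior moments
        avg += delta / d                # 1st derivative: expected position (dot)
        direc += (delta - direc) / d    # 2nd derivative as vector: momentum (arrow)
        dev += (abs(delta) - dev) / d   # 2nd derivative as scalar: range (circle)
        clouds.append({"position": avg, "range": dev, "direction": direc})
    return clouds

cloud = digest([0, 0, 0, 10, 10, 10], d=2.0)[-1]
print(cloud)  # {'position': 8.75, 'range': 3.75, 'direction': 3.75}
```

After a run of 10s, the cloud's center sits near the recent values, its direction is positive (upward momentum), and its range records the recent volatility; a fixed-set mean of the same stream would still report 5, blind to the moment.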

    The Living Algorithm generates a trio of ongoing descriptors in response to the ongoing flow of information in the data stream. These

    descriptors create predictive clouds. These meaningful composite elements, these predictive clouds, may be the type of predictive tools that

    Dr. Zadeh suggested would be necessary for coping with the analysis of biological systems. Dr. Zadeh argued that these pred ictive tools

    would be, of necessity, 'fuzzy or cloudy quantities which are not describable in terms of probability distributions'. Dr. Zadeh pursues a

    solution that applies new mathematical abstractions to what he calls fuzzy sets. In contrast, we pursue an approach that applies existing

    mathematical tools to the notion of a living data stream.

    These predictive statements inherit their cloudy nature from the constant state of evolution inherent in a data stream. The Living Algorithm

    System digests data to provide a predictive cloud, whose shape shifts with each new entry. Each new data point represents change; and the

constant possibility of change requires an ongoing approximation of pattern that is central to the responsiveness of living systems. As with Life, these predictive clouds are context sensitive, constantly evolving via the dynamic input from a living data stream. The urgency of

    response typically required of living systems demands context sensitivity. These predictive clouds reflect the immediate nature of living

    systems, as they move through time. As such, the ongoing and suggestive nature of the Living Algorithm's predictive cloud is ideal for

    describing the changeable and immediate nature of living systems.

    After accomplishing her last ordeal, the Living Algorithm has now fulfilled all of the previously stated job requirements. The Living


    Algorithm is the ideal candidate for the position of representing the personal and dynamic nature of living systems. Her mastery of data

    stream mathematics renders her approach a powerful simulation of an animate system.

    Nature of Evidence: Living Algorithm as Life's Information Digestion System?

Let us summarize our findings and then examine some of the intriguing implications. One essential fact about living systems is that they are

in a constant state of digesting data streams. The responsiveness of an organism to a data stream requires some form of information

digestion that serves these notable functions:

1. relates the present moment to the immediately preceding moments,
2. reveals patterns (or the lack thereof) that represent these related moments,
3. generates rough, yet practical, predictors about the immediate future.

The Living Algorithm's unique process of digesting data streams generates the Predictive Cloud. The Predictive Cloud satisfies the three

    notable functions outlined above. Accordingly, the Living Algorithm System mirrors these essential qualities that living systems require and

    as such is a compelling model of biological information digestion.

    Could the Living Algorithm be more than a model? Could the Living Algorithm model actually be the method by which living systems

    process data streams? Does this method of information processing exhibit any noteworthy patterns? If noteworthy patterns do emerge, what

    rules are capable of generating these forms? Is it reasonable to assume that patterns must conform to some manner of rule-governed

    behavior?

    What form might these rules take? What language is being spoken? We sense that the ideal language must be mathematical. What other

    language is capable of fulfilling the unique requirements of this challenging job description? Is the mathematics of data streams a language

    of living systems?

    In order to answer these intriguing questions, we must start by discovering if there are any patterns worth noting. If we discover such

sequences, we can then begin to search for the rules that govern them. We can then ask: what sort of language can express the rules that are required to generate these noteworthy patterns? And finally: are humans, specifically, or living systems, in general, subject to the grammar

    of this language?

    Let's take a stab at some preliminary answers. Does the Living Algorithm exhibit any noteworthy patterns? Both the Pulse of Attention and

the Triple Pulse are innate patterns of the Living Algorithm's method of information processing. These mathematical patterns obey some

distinct rules, as discussed in Triple Pulse Results. Is there any evidence that living systems are subject in any way to these rules? In the

    prior article stream, we examined multiple examples of how the mathematical rules of the Triple Pulse sync up with multiple

    experimentally verified sleep-related phenomena. Humans, at least, seem to be subject to the rules of the Triple Pulse. Is all of this evidence

    coincidental? Are there other factors at work? Is there a confounding variable that is waiting to be discovered? Or do these correspondences

    indicate that we might employ the Living Algorithm to digest information?

    If the Living Algorithm is really one of the ways in which living systems digest data streams, the Living Algorithm must have evolutionary

potentials as well. Why else would this computational ability be passed on from generation to generation? For a discussion of these issues,

check out the next article in the stream: The Living Algorithm's Evolutionary Potentials.


Could the Living Algorithm possess Evolutionary Potentials?

    Living Algorithm?

    A special equation whose sole function is digesting data streams.

    The Living Algorithm's digestive process provides the rates of change (derivatives) of any data stream. These metrics/measures contain

    meaningful information that living systems could easily employ to fulfill potentials, such as survival.

    Living Algorithm System?

    A mathematical system based in the Living Algorithm's method of digesting data streams.

In the prior monograph, the Triple Pulse of Attention, we saw that the mathematical behavior of the Living Algorithm System exhibited

patterns of correspondence with many aspects of human behavior. Specifically, the Living Algorithm's Triple Pulse paralleled many sleep-

related phenomena. This intriguing synergy between math and scientific 'fact' led us to ask the Why question. Why does the linkage exist? Is

    there a conceptual model that could help explain this math/data synergy?

Life and the Living Algorithm are compatible in many ways. As such, the Living Algorithm is the ideal type of equation to model living systems. Further, the Living Algorithm has many features that are useful to Life. Taking this line of reasoning a step further, we ask the

    question: could it be that the Living Algorithm doesn't just model Life, but that living systems actually employ the Living

    Algorithm's algorithm to digest sensory data streams? In other words, could the Living Algorithm be Life's computational tool?

    Is there any evidence that Life employs the Living Algorithm to digest data streams?

The initial article in this monograph developed the notion that living systems require a Data Stream Mathematics that provides ongoing up-

    to-date descriptions of a flow of environmental information. Life needs these descriptors to approximate the future. These approximations

    enable living systems to make the necessary adjustments to maximize the chances of fulfilling potentials, including survival. This ability to

    approximate the future applies to a wide range of behaviors everything from the regulation of hormonal excretions to the ability to

    capture prey or escape from predators.

The prior article, The Living Algorithm System, argued that the Living Algorithm's Predictive Cloud provides viable estimates about future performance. As mentioned, Life requires a mathematical system that provides these future estimates. In this way, the Living Algorithm System fulfills this particular requirement for a mathematics of living systems. The existence of these talents provides preliminary support for the notion that the Living Algorithm could be the method by which living systems digest data streams.

    If the Living Algorithm is really one of the ways in which living systems digest data streams, could the Living Algorithm have evolutionary

    potentials as well? Why else would this computational ability be passed on from generation to generation? If it is indeed a computational tool

    of living systems, the Living Algorithm should also provide an essential mathematical backdrop that is crucial for the evo-emergence of

    many of Life's complex features.

The Living Algorithm's Predictive Cloud is the collection of derivatives (rates of change) that surround each data point, i.e. each moment. This feature is of particular significance because the Predictive Cloud consists of predictive descriptors. In the prior article, we saw that these predictive descriptors could be very useful to Life on a moment-to-moment level. Could these predictions concerning environmental behavior confer an evolutionary advantage as well? Is it possible that knowledge of the Living Algorithm's Predictive Cloud could further the chances of survival for the myriad biological forms? Does the Predictive Cloud provide an indication of the evolutionary potentials of the Living Algorithm System?

    We don't know. You must read on to find out.

    Living Algorithm's Predictive Descriptors as applied to the Environment

Why might the Living Algorithm's predictive descriptors provide an evolutionary advantage?

The Living Algorithm's predictive power is based upon the ongoing contextual features of the data stream. The content-based approach (the raw data alone, uncombined with memory) is certainly the simplest system. It requires no storage memory and has no computational requirements. However, because this ongoing instantaneous information is devoid of context, employing the most recent instant to make predictions about the future is like shooting in the dark. The likelihood of a hit is so small as to approach the improbable.

The Living Algorithm's digestion process provides crucial knowledge about the features of any data stream. This digestion process generates ongoing, up-to-date rates of change (derivatives) that provide context. These contextual features provide predictive powers that are far superior to those provided by sheer content alone (the raw data by itself).

The most basic of these features is the trio of central measures referred to as the Predictive Cloud. The Cloud's predictive power has many uses. On the most basic level, the Cloud provides information as to the probable location, range and direction of prey or predator. The Living Average indicates the most probable location of the next piece of data; the Deviation, the range of variation; and the Directional, the tendency and probable direction of the data stream. Together, this trio of central measures provides considerable predictive power regarding the next data point.
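The trio can be sketched in code. The article names the measures but not their formulas, so everything below is an assumption: each measure is produced by re-applying the same decaying-average rule (Decay Factor D) to a different aspect of the stream, and the function name `predictive_cloud` is illustrative.

```python
# Hedged sketch of the Predictive Cloud's trio, ASSUMING each measure comes
# from replicating the decaying-average rule (Decay Factor D):
#     Living Average <- the raw values        (probable next location)
#     Deviation      <- |surprise| per value  (probable range of variation)
#     Directional    <- signed surprise       (probable direction/tendency)

def predictive_cloud(stream, D=10.0):
    avg = dev = direc = 0.0
    for x in stream:
        surprise = x - avg            # how far the new point lies from the digest
        avg += surprise / D           # Living Average: probable location
        dev += (abs(surprise) - dev) / D    # Deviation: probable range
        direc += (surprise - direc) / D     # Directional: probable tendency
    return avg, dev, direc

# A steadily rising stream (a predator closing in, say): the Directional comes
# out positive, signalling the next point probably lies above the Living Average.
avg, dev, direc = predictive_cloud([float(i) for i in range(1, 21)])
print(direc > 0)  # True
```

The design point worth noticing is that all three measures come from replicating one update rule, which fits the article's later claim that a single algorithm can yield an endless array of predictive measures.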

    Could knowledge of the Predictive Cloud's metrics/measures regarding the ongoing flow of environmental data provide an essential

    evolutionary advantage? Could an organism, whether cell or human being, make more accurate predictions about the future with an

    understanding of these ongoing mathematical features of the myriad environmental data streams?

Let's explore some examples of the predictive power of the trio of measures that constitute the Predictive Cloud. An ongoing knowledge of this trio of central measures would provide invaluable information to the prey in terms of the probable range, direction, acceleration, and actual location of a moving predator. Vice versa, these measures would provide invaluable information to the predator in terms of the probable location of an escaping prey. (In the diagram, the dot indicates the probable location; the circle, the range; and the arrow, the direction of the next data point.)

An organism could make a more efficient and effective response to environmental input with a knowledge of the probable outcome. For example, the ability to better predict location, range and direction of motion would allow the predator/prey to capture/escape more frequently. It seems safe to say that the better an organism's predictions are, the greater its chance of survival. This would apply to any organism. In short, knowledge of the ongoing contextual features of any data stream enables an organism to make conscious, subconscious or hard-wired choices that further the chances of survival.

The knowledge of probable outcome supplied by the Predictive Cloud also enables the organism to conserve energy. Instead of wasting energy in the unguided attempt to procure food or sexual partners, the organism would only expend valuable energy when the Predictive Cloud indicates a greater chance of success. Of course, energy conservation is a key evolutionary talent.

In addition to physical capabilities such as size and strength, it seems evident that the predator/prey evolutionary arms race would have to include the computational ability to make probabilistic predictions about the future. Further, the refinement of this essentially mathematical skill has no end. While strength and size have limits imposed by physical requirements, neural development is virtually unlimited, as witnessed by these words. The continuous refinement of this computing advantage, whether through experience, evolution, or emergence, would enable the organism to maximize the impact of the response while minimizing energy expenditure, an essential evolutionary ability. Of course, this refinement of computational abilities could apply to the Living Algorithm's multitude of potentials.

    Predictive Cloud: Computational Backdrop for Expectation-based Emotions?

As mentioned, an ongoing knowledge of the Living Algorithm's Predictive Cloud, the aforementioned trio of measures, could be employed as an invaluable predictive tool. On more complex levels, these same measures could easily supply the essential computational backdrop for the development of our emotions. Let us offer some cursory remarks in this regard. The Cloud provides information that could enable an organism to anticipate and prepare for the future. Anticipation morphs into expectation.



In brief, the Cloud's measures of data stream change are emotionally charged because they determine expectations concerning the future. The investment of emotion into information, whether memories or data, has an evolutionary purpose: to reinforce memory. This emotional content renders the information easier to remember. This is not mere speculation: cognitive studies have shown that memory and emotions are linked. Information's meaning is invested with emotion because it is relevant to our existence; a random set of numbers, carrying no such relevance, is difficult to remember.

It seems evident that the Living Algorithm's Predictive Cloud could provide an evolutionary advantage to living systems. The accuracy of future estimates is increased via the application of a simple and replicable algorithm. The Cloud's estimates of future performance could be employed to predict environmental behavior. Better predictions increase the efficiency and effectiveness of energy usage, and this conservation of energy provides an evolutionary advantage. Further, the Cloud's trio of predictors could easily generate the expectations that are the base of many emotions. Emotions, in turn, serve an evolutionary purpose, as they are associated with heightened retention and recall in memory.

This discussion suggests that it is in Life's best evolutionary interests to have knowledge of the three ongoing and up-to-date measures that the Living Algorithm provides. But to have access to this predictive power, Life must employ the Living Algorithm to digest data streams.

    Living Algorithm provides Sense of Time that Life requires.

Besides predictive power, the Living Algorithm's method of digesting information also supplies a sense of the passage of time. The Living Algorithm's repetitive, iterative process relates past data to the current data, with the present being weighted the most heavily. This relating process, which merges the past with the present, confers a sense of the passage of time, much as a cartoon's sequence of related images confers a sense of motion.

    The sense of time passing is important to living systems for multiple reasons. A primary reason is that the flow of digested sensory

    information only makes sense over time. Time duration is required to derive meaning from individual sensations. Isolated sensory input

    makes no sense by itself. For instance, an isolated sound without temporal context is neither music nor a word. Even a picture takes time to

    digest, no matter how brief. A sustained image over time is required to identify objects. The sense of smell, supposedly the first sense,

    requires a duration of some kind to differentiate the potentially random noise of a single scent from the organized meaning of a sustained

    fragrance.

A sense of time is required to experience the meaning of a signal. If an organism existed in a state of sheer immediacy, it would automatically respond to environmental stimuli tit-for-tat, just as matter does. But to make any kind of sense out of a sensory message, the organism requires an elemental sense of the passage of time. The organism must be able to pay attention to the sensory translation for a sufficient length of time to determine whether the message indicates food, foe, or sexual partner. Otherwise the raw sensory information is just random garble. It is evident that the ability to sense the passage of time is essential if we are to experience the information behind sensory input.

    Further, when an organism must make choices based upon sensory input to maximize the chances of survival, a sense of time is required to

    even begin comparing alternatives. It seems that a sense of time is not just an evolutionary talent, but a requisite talent for all living systems.

    For an organism to both have a sense of time and make educated guesses about the future, it seems that living systems must have emerged

    with some sort of computational talent. This computational talent could be employed to digest the sensory data streams that are in turn

    derived from the continuous flow of environmental information. The Living Algorithm's method of digesting data streams provides both

    future estimates and a sense of time.

If the ability to digest information and transform it into a meaningful form is indeed an essential characteristic of living systems, could the Living Algorithm and Life have emerged from the primordial slime together?

    How Living Algorithm mathematics supplies a sense of time.

    How does the Living Algorithm provide a sense of time?

    The Living Algorithm's digestion process generates a sense of time by merging and relating the present moment with past moments.

    How is this blending of past and present accomplished?

The impact of each data byte decays over time. This process is illustrated in the graph at right: each color swatch represents the impact of an individual data byte as it decays over time. The x-axis represents 180 repetitions of the Living Algorithm's mathematical process. Notice how each moment includes many colors, indicating that prior as well as current data bytes shape the current moment.
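The decay pictured in the graph can be sketched numerically. Assuming the decaying-average rule avg += (x - avg)/D (an assumption, since the passage gives no formula), a data byte that arrived n repetitions ago still carries a weight of (1/D)(1 - 1/D)^n in the current digest:

```python
# Sketch of the decaying impact pictured in the graph, ASSUMING the
# decaying-average rule avg += (x - avg)/D. Under that rule a data byte that
# arrived n repetitions ago still carries weight (1/D) * (1 - 1/D)**n in the
# current digest, so every moment blends many prior moments, just as each
# x-position in the graph blends many color swatches.

D = 10.0
weights = [(1 / D) * (1 - 1 / D) ** n for n in range(180)]  # 180 repetitions

print(round(weights[0], 3))    # 0.1   -> the newest byte weighs the most
print(round(weights[20], 3))   # 0.012 -> a 20-step-old byte still lingers
print(round(sum(weights), 3))  # 1.0   -> the impacts together make up the digest
```

No byte's impact ever drops to exactly zero under this rule; it simply fades beneath notice, which is why each moment in the graph contains traces of many colors.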

    How is decay incorporated into the mathematical process?

The Living Algorithm's Decay Factor supplies this function. Let's see how.

The senses digest continuous environmental input to transform it into digital form. However, this digital form has no meaning: each byte of information is isolated from the rest. At this point in the digestion process, the Decay Factor is 1, which means that there is no decay. With no decay there is no relationship between the data points. With no relationship, there is no sense of time. Without a duration of time, the sensory output (the translated environmental information) makes no sense. In summary, when the Decay Factor is one (D = 1), there is no time, hence no meaning.
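The D = 1 case can be made concrete. Under the assumed decaying-average rule avg += (x - avg)/D, setting D = 1 collapses the update to avg = x: each byte simply replaces the digest, so, in the article's terms, the data points bear no relationship to one another. The function name `digest_final` is illustrative.

```python
# Worked illustration of the D = 1 case, ASSUMING the decaying-average rule
# avg += (x - avg)/D. With D = 1 the update collapses to avg = x: each byte
# replaces the digest outright, the past never mixes with the present, and
# no sense of duration (hence no meaning) can arise.

def digest_final(stream, D):
    avg = 0.0
    for x in stream:
        avg += (x - avg) / D
    return avg

stream = [3.0, 7.0, 2.0, 9.0]
print(digest_final(stream, D=1.0))  # 9.0 -> sheer immediacy: only the latest byte survives
print(digest_final(stream, D=2.0))  # 6.0625 -> with D > 1, past bytes still echo in the digest
```

With D greater than 1, the final digest (6.0625 here) equals none of the individual bytes: it is a blend of all of them, which is exactly the relating of isolated instants described above.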

    To provide a sense of time, hence meaning, the sensory data streams require another level of digestion. The senses transform environmental

    information into sensory data streams. The Living Algorithm's digestion process relates the isolated points in the sensory data stream to

    create a sense of time. This relating process automatically occurs when the Decay Factor is greater than 1 (D>1). The Living Algorithm's

    digestion process generates a sense of time passing, which simultaneously imparts the potential for meaning.

To aid retention, let us summarize this important process. Our senses digest continuous environmental input, transforming it into sensory data streams. The isolated instants of these sensory data streams don't have any inherent meaning, because they are not related to each other in any way; there is no decay (D = 1). The Living Algorithm digests sensory data streams. This digestion process relates the isolated instants by introducing decay (D > 1), which generates a sense of time. And a sense of time is the essence of meaning.

    This analysis suggests that it is at least a plausible proposition that the Living Algorithm's digestion process could create the sense of time

    that is required for meaning. Because living systems must derive meaning from data streams, this lends further support for the notion that

    Life employs the Living Algorithm to digest data streams.

As an aside, material systems do not derive meaning from data streams. As such, material systems only deal with information that is inert. A data stream's information is inert when the Living Algorithm's Decay Factor is one (D = 1). Conversely, a data stream's information is dynamic when the Decay Factor is greater than one (D > 1). Living systems require dynamic information because it yields meaning. This significant difference between material and living systems is thus inherent to the Living Algorithm's method of digesting data streams.


    Living Algorithm's Data Stream Acceleration as a Noise Filter

    The prior section illustrated how the Living Algorithm could very well provide the computational backdrop for our sense of time. A sense of

    time is required to derive meaning from sensory input and enable us to compare alternatives. Besides providing the sense of time that imparts

    meaning to our senses, the Living Algorithm also calculates a data stream's acceleration. In fact, two of the Predictive Cloud's measures are

accelerations. Besides providing plausible estimates concerning future performance, knowledge of a data stream's acceleration could enable an organism to filter out random data streams as meaningless noise. Put another way, data stream acceleration enables an organism to differentiate random noise from a meaningful signal.

    The ability to differentiate a random from an organized signal is due to a simple mathematical fact. The random data streams associated with

    background noise possess an innate and stable velocity, but no acceleration. Conversely, an organized data stream (a string of relatively

    stable values consistent with ordered environmental input) has a distinct acceleration.

The graph at right exhibits this distinct difference between a random and an organized data stream. The big red curve represents the acceleration of an ordered data stream, the classic Pulse of Attention (120 ones). The erratic green curve represents the acceleration of a random stream of zeros and ones. It is immediately apparent that the acceleration of the organized data stream overshadows (rising three times higher than) the acceleration of the random data stream.
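The separation the graph shows can be sketched numerically. The article's derivative formulas are not reproduced here, so the "velocity" below (the decayed average of each point's surprise, under the assumed rule avg += (x - avg)/D) stands in for the Cloud's derivative measures, and a deterministic alternating 0/1 stream stands in for a trendless noise signal; both stand-ins are assumptions.

```python
# Hedged numerical sketch of the noise-filter claim: digest an ordered pulse
# (120 ones) and a trendless alternating stream with the ASSUMED rule
# avg += (x - avg)/D, track a digested rate of change ("velocity"), and
# compare peaks. The ordered input builds a pronounced pulse; the jitter,
# lacking any sustained trend, never does.

def peak_velocity(stream, D=10.0):
    avg = vel = 0.0
    peak = 0.0
    for x in stream:
        surprise = x - avg
        avg += surprise / D          # running Living Average
        vel += (surprise - vel) / D  # digested rate of change ("velocity")
        peak = max(peak, abs(vel))
    return peak

ordered = [1.0] * 120     # the classic Pulse of Attention input
jitter = [1.0, 0.0] * 60  # an unorganized, trendless stream

print(round(peak_velocity(ordered), 3))               # 0.387
print(peak_velocity(ordered) > peak_velocity(jitter))  # True
```

This first-derivative comparison is simpler than the article's acceleration curves, but it illustrates the same point: sustained, organized input drives the digested derivatives far higher than trendless jitter can.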

    Why is identifying random streams a significant talent? After the senses translate continuous environmental information into sensory data

    streams, it is essential to first pare out superfluous data streams from consideration. Differentiating random from organized signals is the

    initial step in the process. This filtering process prevents information overload. The ability to identify and ignore random data streams

    eliminates an abundance of environmental information from consideration. Minimizing the number of data streams under consideration

    maximizes the speed and efficiency of response, which of course conserves energy.

The focus upon data stream acceleration as a way of filtering out random signals has other advantages as well. Paying attention to data stream acceleration enables frogs to conserve their energy by shooting their tongues only at bugs, rather than plants: insects move erratically, and hence with more data stream acceleration, than plants do. On more complex levels, focusing upon data stream acceleration allows complex life forms to detect changes in their environment. Perceiving environmental changes, whether auditory, visual, olfactory or tactile, is essential for any organism that must detect an approaching predator, prey, or sexual encounter. Organisms with this sense must somehow have the ability to perform calculations that differentiate the random noise of the background environment from the significant accelerations of predator and prey. The simple Living Algorithm supplies the ability to perform these calculations relatively effortlessly.

It seems that a Living Algorithm-derived filter for random data streams could be employed to diminish the amount of incoming data and hence prevent information overload. This same filter could also be employed to identify environmental changes. Both of these computational talents provide an evolutionary advantage.

    Living Algorithm System has the streamlined operations that Evolution prefers.

    The Living Algorithm System provides predictive capabilities. Further, the Living Algorithm's method of merging/relating the past and the

    present generates a sense of the passage of time. Living systems require a sense of time to derive meaning from the sensory data streams.

Finally, the Living Algorithm computes data stream acceleration. Knowledge of data stream acceleration could enable an organism to


    differentiate random from meaningful signals and identify changes in the environment. Each of these talents provides an evolutionary

    advantage.

    Besides providing these advantages, the Living Algorithm satisfies evolution's simplicity requirement. Evolutionary processes select for

    efficiency and simplicity in order to streamline operations and avoid the corruption of complexity.

1) There is only one formula or algorithm. The Living Algorithm's pattern of commands can be employed on multiple levels to calculate an endless array of predictive measures. In other words, simple replication of one algorithm provides an abundant source of predictive data streams.

2) The necessary computations are quite simple, employing only the operations of basic math.

3) The Living Algorithm's measures require very little storage space. Rather than storing exact memories of any kind, only the ongoing, digested measures are stored. The present is continually incorporated into the digested past, with the most recent environmental input having the greatest impact. Rather than a complete motion picture, only the most recent composite quantity is stored: discrete pieces of digested data versus a continuous data flow. Movies or music require far more storage capacity than numbers and letters, as anyone

    knows who attempts to download CDs and DVDs onto their personal computer. Even static pictures require far less storage s