Stock Market Prediction using Hidden Markov Models and Investor Sentiment
Stock Prediction using Hidden Markov Models &
Investor Sentiment
Patrick Nicolas, patricknicolas.blogspot.com
Silicon Valley Machine Learning for Trading Strategies meetup, April 25, 2015
Introduction
Hidden Markov models (HMM) have been used for years to decipher the internal workings of financial markets.
More recently, HMMs have been applied to predict stock/ETF movements using crowdsourcing and social media.
Objectives
The goal is to evaluate the predictive capability of HMM using investor sentiment as input and to answer 3 questions:
1. Can HMM predict the price of securities in stationary states?
2. Can HMM detect regime shift or structural breaks?
3. What are the alternative solutions?
Stock Prediction using HMM in stationary states
Detection of regime changes using Buried Markov models
Alternative models
Problem I
Problem: Use the AAII weekly sentiment survey to predict the market trend (1 – 3 months).
Solution: Use a hidden Markov model to predict the SPX value, with the AAII survey as observations.
The sentiment survey of the American Association of Individual Investors (AAII) is regarded as a contrarian indicator of the future direction of market indices.
AAII sentiment survey (www.aaii.com)
Why HMM?
1. Observations (bullishness) are independent and discrete
2. Observations depend on the internal state of the market
3. The historical data does not show any significant break (~stationary process) due to macro-economic distortion
4. The latent behavior of the stock market can be quantized into a finite number of states
5. Asset/security pricing does not follow a normal distribution, but this restriction is relaxed for mixtures
Overview of the graphical model
HMM is a generative classifier that predicts the SPY price through a latent, internal state of the stock market S, using investor sentiment x as the observation O.
• Market internal state z_t over hidden states ... S_i, S_j, S_k ...
• AAII weekly sentiment (symbols) x_t over O_k, O_m, O_n

p(z_t+1 = S_j | z_t = S_i) = a_ij
p(x_t = O_k | z_t = S_j) = b_jk

a_ij is the state transition probability matrix {S_i}_t-1 -> {S_j}_t
b_jk is the emission probability matrix {S_j}_t -> {O_k}_t
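The λ(π_i, a_ij, b_jk) model can be captured in a few lines of code. A minimal sketch in Python (the deck's implementation is in Scala/Apache Spark; the class and attribute names here are illustrative):

```python
# Minimal lambda-model: initial probabilities pi, transition matrix A (a_ij),
# emission matrix B (b_jk). pi and each row of A and B must sum to 1.
class Lambda:
    def __init__(self, pi, a, b):
        assert abs(sum(pi) - 1.0) < 1e-9, "pi must sum to 1"
        assert all(abs(sum(row) - 1.0) < 1e-9 for row in a), "rows of A must sum to 1"
        assert all(abs(sum(row) - 1.0) < 1e-9 for row in b), "rows of B must sum to 1"
        self.pi, self.a, self.b = pi, a, b

    @property
    def num_states(self):   # n hidden states S_i
        return len(self.pi)

    @property
    def num_symbols(self):  # m observation symbols O_k
        return len(self.b[0])
```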
Trellis model representation
[Trellis diagram: hidden states S_0 ... S_n-1 unfolded over time, with initial probabilities π_0 ... π_n-1, transition probabilities a_ij between consecutive time steps, and emission probabilities b_jk to the observation symbols O_0, O_1, ... O_t.]
The 3 key components are:
• z: market behavior with n hidden states, S_t = {increase, ...}
• x: observed sentiment with m symbols, O_t = {bullish, ...} (Gaussian mixture)
• λ(π_i, a_ij, b_jk): the model
Canonical forms
• Decoding: given a sequence of observations and the λ-model, find the most likely sequence of hidden states (Viterbi path)
• Training: given a sequence of observations, estimate the λ-model (EM / Baum-Welch)
• Evaluation: given the λ-model, compute the probability of a sequence of observations (α/β forward-backward passes)
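The decoding form can be sketched directly from the definition. A minimal Viterbi pass in Python, working in log-space for numerical stability (an illustration, not the deck's Scala/Spark implementation):

```python
import math

def viterbi(pi, a, b, obs):
    """Most likely hidden-state sequence for the observation symbols `obs`,
    given the lambda-model (pi, a_ij, b_jk)."""
    ln = lambda x: math.log(x) if x > 0 else float("-inf")
    n = len(pi)
    # delta[i]: best log-probability of any state path ending in state i
    delta = [ln(pi[i]) + ln(b[i][obs[0]]) for i in range(n)]
    back = []  # back-pointers, one list per time step
    for o in obs[1:]:
        step = []
        for j in range(n):
            best_i = max(range(n), key=lambda i: delta[i] + ln(a[i][j]))
            step.append((delta[best_i] + ln(a[best_i][j]) + ln(b[j][o]), best_i))
        back.append([s[1] for s in step])
        delta = [s[0] for s in step]
    # backtrack from the best final state
    path = [max(range(n), key=lambda i: delta[i])]
    for pointers in reversed(back):
        path.append(pointers[path[-1]])
    return list(reversed(path))
```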
Preliminary analysis
[Chart: SPY, 03/15/2009 to 04/01/2014, used as the proxy for the hidden states, annotated with support and resistance levels and a bullish continuation flag pattern; the observed bullish/bearish sentiment is plotted below.]
Observed states
1. Single or multiple variables? x = [x1, x2] or x = f(x1, x2)
2. Quantization? x = {BULL, BEAR} or x = {bullishness intervals}
3. Smoothing? e.g. the moving average (1/m) Σ_{i=0..m-1} x_{t-i}

Candidate encodings, with x_t = bullish/bearish, x1_t = bullish/neutral, x2_t = bearish/neutral:
• (f1) x_t quantized into the intervals ]0, 0.9], ]0.9, 1.5], ]1.5, +∞[
• (f2) the 4-week moving average (1/4) Σ_{i=0..3} x_{t-i}
• (f3) the pair [x1_t, x2_t]
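The quantization and smoothing choices are easy to express in code. A sketch in Python using the interval bounds 0.9 and 1.5 from the deck (the handling of values exactly on a bound is our assumption):

```python
def quantize(ratio):
    """Map the weekly bullish/bearish ratio to one of 3 symbols,
    using the intervals ]0, 0.9], ]0.9, 1.5], ]1.5, +inf[."""
    if ratio <= 0.9:
        return 0   # bearish sentiment dominates
    if ratio <= 1.5:
        return 1   # roughly balanced
    return 2       # bullish sentiment dominates

def moving_avg(x, m=4):
    """m-week moving average used by the f2 encoding."""
    return [sum(x[i:i + m]) / m for i in range(len(x) - m + 1)]
```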
Issues: selection
The first step is to define the observation symbols (observed states): the number of variables, the quantization, and the smoothing.
Hidden states
Questions:
1. How many states?
2. How to initialize the λ-model (initial, transition and emission probabilities)?
3. Should the model be dynamically updated/re-trained?

Answers:
a) 4 states: significant/moderate decline/increase
b) Initial, transition and emission probabilities initialized by computing the average movement of SPY over 4 weeks of historical data
c) No dynamic training (roll-over)
Training
• Training set: 196 weeks
• Validation: 1-fold of 46 weeks
• Implementation in Scala/Apache Spark
[Plot: F1 score of the f2 model (y axis: 0.64 to 0.80) vs. training/validation set size (x axis: 48 to 240 weeks).]
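The validation curve reports an F1 score. For reference, a minimal single-class F1 computation in Python (the deck does not specify how F1 is averaged across states, so this sketch scores one class):

```python
def f1_score(predicted, expected, positive):
    """F1 for one class: harmonic mean of precision and recall,
    computed from predicted vs. expected state labels."""
    pairs = list(zip(predicted, expected))
    tp = sum(1 for p, e in pairs if p == positive and e == positive)
    fp = sum(1 for p, e in pairs if p == positive and e != positive)
    fn = sum(1 for p, e in pairs if p != positive and e == positive)
    return 0.0 if tp == 0 else 2.0 * tp / (2.0 * tp + fp + fn)
```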
Prediction results
• (f1) x_t = bullish/bearish
• (f2) the 4-week moving average (1/4) Σ_{i=0..3} x_{t-i}
• (f3) [x1_t, x2_t], with x1 = bullish/neutral and x2 = bearish/neutral
[Confusion matrices: predicted vs. expected SPY state for each encoding.]
Conclusion?
1. The text-book case worked. What about structural breaks?
2. Can the market response latency (set at 4 weeks) be optimized?
3. How does it compare to observed technical analysis data?
4. Can we combine investor sentiment with technical and fundamental analysis data?
Stock Prediction using HMM in stationary states
Detection of regime changes using Buried Markov models
Alternative models
Market technical indicators
Observations: ratio of the relative price change of the SPX within a trading session over the relative volatility during the same session.
The (in)ability of HMM to detect regime changes or structural breaks is illustrated by using a technical analysis signal as input observations and the same hidden states as in the previous test.
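The deck does not give the exact formula for this ratio. One plausible reading, sketched in Python, divides the relative intraday change by the relative intraday range, both normalized by the open (this normalization is our assumption):

```python
def observation(open_, high, low, close):
    """Relative intraday price change over relative intraday volatility
    for one SPX trading session (one possible reading of the deck)."""
    rel_change = (close - open_) / open_    # relative price change
    rel_volatility = (high - low) / open_   # relative volatility (range)
    return 0.0 if rel_volatility == 0.0 else rel_change / rel_volatility
```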
Structural breaks & regimes
[Chart: SPX, 03/15/2009 to 04/01/2014, showing Regime 1, a structural break (correction) used as the validation range, and Regime 2.]
Market technical indicators that rely on daily data are far more continuous than the AAII weekly sentiment survey. Can our HMM model detect the structural break and the two regimes it separates?
Model and tests
The data is extracted from Yahoo finance (.csv files).
• Training set: 870 trading sessions [1 – 420, 467 – 895]
• Validation: 25 trading sessions around the SPX correction period, sessions [421 – 466]
• Hidden states (4): increase > 1%, increase < 1%, decrease < 1%, decrease > 1%
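Labeling the 4 hidden states from historical price changes can be sketched as follows (Python; the treatment of the exact ±1% boundaries and of a zero change is our choice):

```python
def hidden_state(change):
    """Map a price change (as a fraction) to one of the 4 hidden states:
    increase > 1%, increase < 1%, decrease < 1%, decrease > 1%."""
    if change > 0.01:
        return 0   # significant increase (> 1%)
    if change >= 0.0:
        return 1   # moderate increase (< 1%)
    if change > -0.01:
        return 2   # moderate decline (< 1%)
    return 3       # significant decline (> 1%)
```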
Limitation of HMM with discrete states
The confusion matrix of predicted vs. actual SPX states, using the 4 hidden states, illustrates the poor quality of prediction of an HMM with discrete states.
[Confusion matrix: predicted vs. expected SPY state.]
Problem II: structural breaks
Problem: How to deal with multiple trends/regimes and structural breaks?
Solution: Model the observations sequence as a Gaussian mixture and define the dependence between observations as a Markov chain.
HMM is accurate for a "stationary" process in which observations are independent given a hidden state. Can we overcome HMM's inability to operate in a shifting environment?
Auto-regressive Markov model
The discrete transition and emission probabilities of the HMM

p(z_t+1 = S_j | z_t = S_i) = a_ij
p(x_t = O_k | z_t = S_j) = b_jk

are extended by an auto-regressive Gaussian emission that conditions each observation on its predecessor:

p(x_t | x_t-1, z_t = S_j) = N(x_t | w_j · x_t-1 + μ_j, σ_j)

This requires a continuous model of observations: the observation set is defined as a Gaussian mixture.
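The auto-regressive emission density can be evaluated directly from its definition. A sketch in Python (w_j, μ_j, σ_j are the per-state parameters):

```python
import math

def ar_log_emission(x_t, x_prev, w, mu, sigma):
    """Log-density of the AR-HMM emission: x_t | x_{t-1}, z_t = S_j is
    Gaussian with mean w_j * x_{t-1} + mu_j and std deviation sigma_j."""
    mean = w * x_prev + mu
    z = (x_t - mean) / sigma
    return -0.5 * z * z - math.log(sigma) - 0.5 * math.log(2.0 * math.pi)
```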
AR-HMM & BMM
Auto-regressive model with 2 Markov models:
• HMM for the hidden states
• 1st order (auto-regressive, AR-HMM) or higher order (buried Markov model, BMM) Markov chains for the observations
AR-HMM & BMM: cardinality
We use the cardinality (the number of observations associated with the same hidden state) to evaluate the stationary states (regimes) and the sudden shifts (structural breaks).
[Diagram: cardinality of the state transitions S_k->l and S_m->n across Regime 1 and Regime 2.]
Once the break is identified, the regimes are identified by the distribution of probability across the different hidden states. The cardinality is quite high for stationary states; this is especially the case if the observations are noisy.
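Computing cardinality from a decoded state sequence reduces to run-length encoding. A minimal sketch in Python:

```python
def cardinalities(states):
    """Run lengths of consecutive identical hidden states. Long runs
    suggest a stationary regime; a sudden drop in run length suggests
    a structural break."""
    runs = []
    for s in states:
        if runs and runs[-1][0] == s:
            runs[-1] = (s, runs[-1][1] + 1)   # extend the current run
        else:
            runs.append((s, 1))               # start a new run
    return runs
```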
AR-HMM & BMM: limitations
• These enhancements to the HMM for continuous or pseudo-continuous observations have been shown to provide higher quality predictions for homogeneous observations (from the same source).
• AR-HMM are less accurate for observations from different sources (e.g. weekly investor sentiment vs. daily market technical indicators). In this scenario, regularization has to be added to the model.
• We may look at different types of techniques/classifiers as an alternative to complex regularization of AR-HMM.
Stock Prediction using HMM in stationary states
Detection of regime changes using Buried Markov models
Alternative models
Alternatives to AR-HMM/BMM
The previous section describes the limitations of auto-regressive HMM and buried Markov models for observations from heterogeneous sources.
Here is a summary of a few models that you may consider besides HMM:
• Conditional Random Fields
• Riemann manifold learning
• Continuous-time Kalman filter
Riemann manifolds
Consider the transition probability matrix A = [a_ij] and the emission probability matrix B = [b_jk]. Define the n+m dimensional space of transition and emission probabilities, then find an embedded space (manifold) containing the most significant, constant hidden states.
A regime may be defined by the subset of the transition matrix that does not change over time.
Riemann manifolds for regimes
[Diagram: a Riemannian manifold with a tangent plane; the transition and emission probabilities are ~constant over a regime and change at a structural break.]
The manifold consists of the transition probability tensor A ⊗ B for the subset of constant states within a regime, with a cumulative probability > 80%.
Conditional random fields
Conditional random fields (CRF) are discriminative models derived from logistic regression.
• CRF computes the conditional probability of the sequence of states given the observations, p(z | x)
• CRF does not require the features to be independent
• CRF does not assume that the transition probabilities A are constant
Continuous-time Kalman filter
The Kalman filter is a recursive, adaptive, optimal estimator.
• Kalman allows transitory states (adaptive)
• Kalman does not need a training set
• Kalman supports continuous state values (continuous-time Kalman ODE)
• Kalman requires the specification of white noise for the process and the measurement
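For intuition, a single predict/update step of a scalar, discrete-time Kalman filter in Python (the continuous-time variant replaces this recursion with an ODE; the identity state transition is a simplifying assumption):

```python
def kalman_step(x, p, q, r, z):
    """One predict/update step of a scalar Kalman filter with identity
    state transition. x: state estimate, p: estimate variance,
    q: process noise variance, r: measurement noise variance,
    z: new measurement."""
    # predict
    x_pred, p_pred = x, p + q
    # update
    k = p_pred / (p_pred + r)              # Kalman gain
    return x_pred + k * (z - x_pred), (1.0 - k) * p_pred
```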
References
• Scala for Machine Learning, §7 Sequential Data Models - Hidden Markov Model, P. Nicolas, Packt Publishing, 2014
• Pattern Recognition and Machine Learning, §13.2.1 Maximum Likelihood for the HMM, C. Bishop, Springer, 2006
• American Association of Individual Investors (AAII), http://www.aaii.com
• Stock Market Forecasting Using Hidden Markov Model: A New Approach, R. Hassan, B. Nath, University of Melbourne, 2008
• Selective Prediction of Financial Trends with Hidden Markov Models, R. El-Yaniv, D. Pidan, Technion, Israel
• Machine Learning: A Probabilistic Perspective, §17.6.4 Auto-regressive and buried HMMs, K. Murphy, MIT Press, 2012

For further information:
• patricknicolas.blogspot.com
• www.slideshare.net/pnicolas
• github.com/prnicolas
• www.packtpub.com