Feature Selection: Categorize and Describe Various Algorithms for Feature Selection, with a Short View on Dimension Reduction (hatamikhah@chmail.ir, Winter 2014)

TRANSCRIPT

Slides 1-2: Title (hatamikhah@chmail.ir, Winter 2014)

Slide 3: Presentation Outline
- Feature Selection
- Categorize and Describe Various Algorithms for Feature Selection
- A Short View on Dimension Reduction
- My Paper

Slides 4-6: Dimension (Feature or Variable)
Example: two features of a person: weight and height.

Slide 7: The Curse of Dimensionality
Observe that the data become more and more sparse in higher dimensions: (a) 12 samples fall inside the unit-sized box, (b) 7 samples fall inside the box, (c) 2 samples fall inside the box. An effective solution to the problem of the curse of dimensionality is dimensionality reduction.

Slide 8: Dimension Reduction
General objectives of dimensionality reduction:
I. Improve the quality of data for efficient data-intensive processing tasks.
II. Reduce the computational cost and avoid data over-fitting.

Slide 9: Dimension Reduction
Dimensionality reduction approaches include feature selection and feature extraction.

Slide 10: Dimension Reduction
Feature extraction creates new features based on transformations or combinations of the original feature set. N: number of original features; M: number of extracted features; M < N.

FOCUS (feature selection method)
- Searches for the consistent subset with the least number of features.
- Search tree -> BFS (breadth-first search).

Slide 35: LVF (Las Vegas Filter)
- Searches for a minimal subset of features.
- N: number of features (attributes); M: number of samples (examples).
- Evaluation criterion: inconsistency.
- t_max: predetermined number of iterations.

Slide 36: Sequential Methods
- SFS (Sequential Forward Selection) and SBS (Sequential Backward Selection) suffer from the nesting effect.
- Remedies: plus-l-take-away-r, SFFS (Sequential Forward Floating Search), SBFS (Sequential Backward Floating Search).

Slide 37: Stochastic Methods
- GA (Genetic Algorithm): crossover, mutation.
- SA (Simulated Annealing).
- RMHC-PF1 (Random Mutation Hill Climbing - Prototype and Feature selection): finds sets of prototypes for nearest-neighbor classification; a Monte Carlo method that can be converted to a Las Vegas algorithm by running it many times.

Slides 38-39: Why a Game Formulation?
Three models are commonly used in feature selection:
- Filter model -> does not consider the interrelationships between features.
- Wrapper model -> high complexity.
- Embedded methods.
Remaining problems: feature redundancy and failure to select the appropriate number of features. Proposal: define the problem as a game.

Slide 40: The Problem as a One-Player Game
- Define the problem as a Markov Decision Process.
- Explore the environment with reinforcement-learning methods.
- Goal: a feature selection method that considers the interrelationships between features.
- Upper Confidence Graph method.

Slide 41: The Main Algorithms
- Dynamic programming
- Monte Carlo methods
- Temporal-difference learning

Slide 42: Elements of the Formulation
- The best policy possible in the situation.
- The reward that has already been achieved.
- The whole set of features; a subset of features; each allowed action.

Slide 43: Feature Score
- The average score collected by this feature.
- The number of times this feature has been selected.

Slide 44: Benchmarks
- Information Gain
- Chi-squared statistic
- FAST (Feature Assessment by Sliding Threshold)
- WEKA software

Slides 45-47: Any questions? Thanks for your attention. (May 2013)
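Slide 35 gives LVF only in outline. The following is a minimal Python sketch of the idea, assuming the usual inconsistency criterion (samples with identical values on the selected features but conflicting labels); function and parameter names are illustrative, not from the slides:

```python
import random

def inconsistency_rate(X, y, subset):
    """Fraction of samples whose projection onto `subset` appears with
    conflicting labels (the LVF evaluation criterion)."""
    groups = {}
    for xi, yi in zip(X, y):
        key = tuple(xi[j] for j in subset)
        groups.setdefault(key, []).append(yi)
    inconsistent = sum(len(labels) - max(labels.count(c) for c in set(labels))
                       for labels in groups.values())
    return inconsistent / len(y)

def lvf(X, y, n_features, t_max=1000, gamma=0.0, seed=0):
    """Las Vegas Filter: randomly sample feature subsets for t_max
    iterations, keeping the smallest one whose inconsistency rate
    does not exceed the threshold gamma."""
    rng = random.Random(seed)
    best = list(range(n_features))          # start from the full feature set
    for _ in range(t_max):
        k = rng.randint(1, len(best))       # never try subsets larger than best
        subset = rng.sample(range(n_features), k)
        if inconsistency_rate(X, y, subset) <= gamma:
            best = subset
    return sorted(best)
```

For example, on a toy dataset where feature 0 alone determines the label and feature 1 is noise, the search settles on `[0]`.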
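Slide 36 names SFS and the nesting effect without detail. A minimal sketch of SFS, assuming a caller-supplied subset-scoring function (all names here are illustrative):

```python
def sfs(n_features, score, k):
    """Sequential Forward Selection: greedily add the feature that most
    improves score(subset) until k features are chosen. A feature, once
    added, is never removed -- the 'nesting effect' the slides mention,
    which plus-l-take-away-r and SFFS were designed to relax."""
    selected = []
    while len(selected) < k:
        remaining = [f for f in range(n_features) if f not in selected]
        best_f = max(remaining, key=lambda f: score(selected + [f]))
        selected.append(best_f)
    return selected
```

SBS is the mirror image: start from the full set and greedily remove the feature whose removal hurts the score least.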
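Slides 42-43 list the ingredients of a bandit-style selection criterion (the average score collected by a feature and the number of times it was selected) without giving the formula. A sketch assuming the standard UCB1 combination of those two quantities, which the slides do not spell out:

```python
import math

def ucb_score(avg_reward, times_selected, total_selections, c=math.sqrt(2)):
    """UCB1-style score for a feature: exploitation (its average reward)
    plus an exploration bonus that shrinks the more often the feature
    has been selected."""
    if times_selected == 0:
        return float('inf')   # unexplored features are tried first
    return avg_reward + c * math.sqrt(math.log(total_selections) / times_selected)
```

At each step the game-playing agent would pick the feature maximizing this score, balancing features that scored well so far against features that are still under-explored.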
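Information Gain, the first benchmark on slide 44, is the standard entropy-based criterion IG(Y; X) = H(Y) - H(Y | X). A self-contained sketch for discrete features:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y) of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """IG(Y; X) = H(Y) - H(Y | X): how much knowing the feature value
    reduces uncertainty about the class label."""
    n = len(labels)
    cond = 0.0
    for v in set(feature_values):
        subset = [y for x, y in zip(feature_values, labels) if x == v]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond
```

A feature identical to the label yields the full label entropy; a constant feature yields zero gain.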
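The chi-squared benchmark on slide 44 ranks features by the Pearson chi-squared statistic between feature values and class labels. A minimal sketch for discrete features (names illustrative; a library implementation such as WEKA's would also report a p-value):

```python
from collections import Counter

def chi_squared(feature_values, labels):
    """Pearson chi-squared statistic of the feature/label contingency
    table; higher values indicate stronger dependence on the class."""
    n = len(labels)
    fv, cv = sorted(set(feature_values)), sorted(set(labels))
    obs = Counter(zip(feature_values, labels))
    fcount, ccount = Counter(feature_values), Counter(labels)
    stat = 0.0
    for f in fv:
        for c in cv:
            expected = fcount[f] * ccount[c] / n
            stat += (obs[(f, c)] - expected) ** 2 / expected
    return stat
```

An independent feature scores 0; a feature perfectly correlated with a binary label on four samples scores 4.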
