2. Linear Regression and Logistic Regression

Post on 21-Jan-2018


  1. Linear Regression / Logistic Regression
  2. Outline: Part 1 / Part 2 / Part 3 / Part 4
  4. Python: data types
     a = [0, 1]        # list
     a.append(2)       # a is now [0, 1, 2]
     len(a)            # 3
     a[0:2]            # [0, 1]  (slicing)
     b = {'sun': 0}    # dict
     b['mon'] = 1
     b['mon']          # 1
  5. Python: if, for
     if i == 10:
         print(10)
     elif i < 10:
         print(0)
     else:
         print(100)

     for a in lst:
         print(a)
     for k in dct:
         print(dct[k])
     for k, v in dct.items():
         print(k, v)
  6. TensorFlow builds a Graph of operations and executes it in a Session.
     a = tf.constant(2)
     b = tf.constant(3)
     x = tf.add(a, b)
     tf.Session().run(x)   # 5
  7. zeros() and ones() create tensors filled with 0 or 1.
     e = tf.zeros([2, 3])
     tf.Session().run(e)
     # array([[ 0., 0., 0.],
     #        [ 0., 0., 0.]], dtype=float32)
     f = tf.ones([2, 3], dtype=tf.int32)
     tf.Session().run(f)
     # array([[1, 1, 1],
     #        [1, 1, 1]], dtype=int32)
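The sessions above are TensorFlow 1.x API. As a quick, runnable analogue (not the deck's own code), the same arrays can be built with NumPy:

```python
import numpy as np

# 2x3 array of zeros, analogous to tf.zeros([2, 3])
e = np.zeros((2, 3), dtype=np.float32)

# 2x3 array of ones, analogous to tf.ones([2, 3], dtype=tf.int32)
f = np.ones((2, 3), dtype=np.int32)

print(e)  # [[0. 0. 0.] [0. 0. 0.]]
print(f)  # [[1 1 1] [1 1 1]]
```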
  8. tf.Variable() creates a variable. Variables must be initialized before use.
     a = tf.Variable(tf.constant(2))
     init = tf.global_variables_initializer()
     sess = tf.Session()
     sess.run(init)
     sess.run(a)   # 2
  9. A matrix with 2 rows and 3 columns (2x3):
     [[ 1, -2,  2],
      [ 3, -1,  1]]
     a = tf.Variable([[1, -2, 2], [3, -1, 1]])
     sess = tf.Session()
     sess.run(tf.global_variables_initializer())
     sess.run(a)   # [[ 1 -2  2], [ 3 -1  1]]
  10. Matrix addition is element-wise: 2x3 + 2x3 = 2x3
      [[ 1, -2, 2],    [[-1, 3, 2],    [[ 0, 1, 4],
       [ 3, -1, 1]]  +  [ 2, 4, 1]]  =  [ 5, 3, 2]]
      The dot product multiplies rows by columns: 2x3 . 3x2 = 2x2
      [[ 1, -2, 2],    [[ 2, -1],      [[-4, -3],
       [ 3, -1, 1]]  .  [ 4,  3],   =   [ 3, -4]]
                        [ 1,  2]]
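The two operations on this slide can be checked directly with NumPy (an illustration; the deck itself uses TensorFlow variables):

```python
import numpy as np

A = np.array([[1, -2, 2],
              [3, -1, 1]])
B = np.array([[-1, 3, 2],
              [ 2, 4, 1]])
C = np.array([[2, -1],
              [4,  3],
              [1,  2]])

print(A + B)   # element-wise sum: [[0 1 4] [5 3 2]]
print(A @ C)   # dot product, 2x3 . 3x2 = 2x2: [[-4 -3] [ 3 -4]]
```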
  11. tf.matmul() computes the matrix (dot) product.
      a = tf.Variable([[1, -2, 2], [3, -1, 1]])
      b = tf.Variable([[2, -1], [4, 3], [1, 2]])
      dot = tf.matmul(a, b)
      sess = tf.Session()
      sess.run(tf.global_variables_initializer())
      sess.run(dot)
      # array([[-4, -3],
      #        [ 3, -4]], dtype=int32)
  13. Regression Analysis: modeling the relationship between input variables and a continuous target value, e.g. predicting sales from advertising budgets.
  14. Simple linear regression (one input variable):
      ŷ = w·x + b
  15. Hyperplane: with two inputs the model becomes a plane.
      Sales = w1·TV + w2·Radio + b
  16. With n input variables:
      ŷ = w1·x1 + w2·x2 + … + wn·xn + b
      For m samples:
      ŷ1 = w1·x11 + w2·x12 + … + wn·x1n + b
      …
      ŷm = w1·xm1 + w2·xm2 + … + wn·xmn + b
      In matrix form, with X the m x n matrix of samples: ŷ = X·w + b
  17. Ordinary Least Squares minimizes the Mean Squared Error between the targets and the predictions:
      MSE = (1/n) Σ (y_i - ŷ_i)²
  18. Gradient Descent: repeatedly step in the direction that lowers the error.
      E = (1/2) Σ (y_i - ŷ_i)²
      ∂E/∂w = -Σ (y_i - ŷ_i)·x_i
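The descent rule above can be sketched in plain NumPy. This is a minimal illustration (data, names, and the learning rate are mine, not the deck's):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy data scattered around the line y = 2x + 1
x = rng.normal(0.0, 1.0, 200)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.05, 200)

w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    y_hat = w * x + b
    # Gradients of E = (1/2n) * sum((y - y_hat)^2)
    grad_w = -np.mean((y - y_hat) * x)
    grad_b = -np.mean(y - y_hat)
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # close to the true 2 and 1
```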
  21. Neuron: a single neuron computes
      ŷ = w·x + b
  22. Neuron: gradients of the error via the chain rule.
      E = (1/2) Σ (y_i - ŷ_i)²
      ∂ŷ/∂w = x,  ∂ŷ/∂b = 1
      ∂E/∂w = ∂E/∂ŷ · ∂ŷ/∂w = -(y - ŷ)·x
      ∂E/∂b = ∂E/∂ŷ · ∂ŷ/∂b = -(y - ŷ)
  23. Neuron: update w and b against the gradient, i.e. in the direction that reduces the error.
      w ← w + (y - ŷ)·x
      b ← b + (y - ŷ)
  24. Gradient descent on w, b can get stuck in a local minimum. The learning rate α controls the step size:
      w ← w + α·(y - ŷ)·x
      b ← b + α·(y - ŷ)
  25. A hyperparameter is a value that is not learned from the data but chosen by the user. Model parameters such as w and b are learned during training; the learning rate, or k in k-NN, are hyperparameters, typically tuned by trial and error.
  27. Generate 1000 x values from a normal distribution (mean 0, std 0.55); compute y = 0.1*x + 0.3 and add normal noise (mean 0, std 0.03).
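A sketch of this data generation in NumPy (the exact random calls in the deck's notebook may differ):

```python
import numpy as np

rng = np.random.default_rng(0)
num_points = 1000

# x ~ Normal(0, 0.55); y = 0.1 * x + 0.3 plus Normal(0, 0.03) noise
x_data = rng.normal(0.0, 0.55, num_points)
y_data = 0.1 * x_data + 0.3 + rng.normal(0.0, 0.03, num_points)
```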
  29. W and b are initialized to 0; the model computes y_hat = W*x + b; the loss is
      MSE = (1/2) Σ (y_i - ŷ_i)²
      and the train operation updates W and b to reduce the loss.
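The slide's graph is TensorFlow 1.x; an equivalent training loop in plain NumPy, assuming the synthetic data y = 0.1*x + 0.3 + noise described earlier (a sketch, not the deck's code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from the earlier slide: y = 0.1*x + 0.3 + noise
x = rng.normal(0.0, 0.55, 1000)
y = 0.1 * x + 0.3 + rng.normal(0.0, 0.03, 1000)

# W and b start at 0, as on the slide
W, b, lr = 0.0, 0.0, 0.5
for _ in range(200):
    y_hat = W * x + b
    W += lr * np.mean((y - y_hat) * x)   # w <- w + a*(y - y_hat)*x
    b += lr * np.mean(y - y_hat)         # b <- b + a*(y - y_hat)

print(W, b)  # close to the true 0.1 and 0.3
```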
  31. After training: w = 0.099, b = 0.298, close to the true values 0.1 and 0.3.
  32. Summary: linear regression fits a straight line (or hyperplane) to the data, measures the fit with the MSE (mean squared error), and lowers the error by updating the weights with gradient descent.
  34. Classification predicts a discrete class rather than a continuous value: Binary Classification (two classes) vs. Multiclass Classification (more than two).
  35. Logistic regression predicts True(1) or False(0). The model outputs a probability between 0 and 1; above 0.5 is classified as True, otherwise False.
      z = w·x + b
  36. The logistic (sigmoid) function squashes the range -∞ ~ +∞ into 0 ~ 1:
      ŷ = 1 / (1 + e^-(w·x+b)) = 1 / (1 + e^-z),   z = w·x + b
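A minimal sigmoid sketch (the function name is mine):

```python
import math

def sigmoid(z):
    # Squashes any real z into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid(0))    # 0.5
print(sigmoid(10))   # close to 1
print(sigmoid(-10))  # close to 0
```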
  37. Neuron with Sigmoid: the output is squashed into 0 ~ 1.
      z = w·x + b        (range -∞ ~ +∞)
      ŷ = σ(z) = 1 / (1 + e^-z)
  38. For classification, cross-entropy is used as the loss instead of MSE:
      E = -(1/n) Σ [ y_i·log(ŷ_i) + (1 - y_i)·log(1 - ŷ_i) ]
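The loss can be computed directly; a small numeric check with illustrative values (not from the deck):

```python
import numpy as np

def cross_entropy(y, y_hat):
    # E = -(1/n) * sum(y*log(y_hat) + (1-y)*log(1-y_hat))
    return -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

y = np.array([1.0, 0.0, 1.0])
good = np.array([0.9, 0.1, 0.8])   # confident and mostly correct
bad = np.array([0.2, 0.8, 0.3])    # confident and mostly wrong

print(cross_entropy(y, good))  # small loss
print(cross_entropy(y, bad))   # large loss
```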
  39. Neuron with Sigmoid + cross-entropy: the gradients take the same form as with MSE.
      ŷ = σ(z) = 1 / (1 + e^-z)
      ∂E/∂w = -(y - ŷ)·x
      ∂E/∂b = -(y - ŷ)
  40. Neuron with Sigmoid: the update rules are therefore unchanged.
      w ← w + α·(y - ŷ)·x
      b ← b + α·(y - ŷ)
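Putting the sigmoid, the loss gradient, and the updates together: a NumPy sketch of logistic regression on toy data (data and names are mine, not the deck's):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy binary data: one feature, class 1 when x exceeds 0.2
x = rng.normal(0.0, 1.0, 500)
y = (x > 0.2).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    y_hat = sigmoid(w * x + b)
    w += lr * np.mean((y - y_hat) * x)  # same update form as with MSE
    b += lr * np.mean(y - y_hat)

# Threshold the probabilities at 0.5 to classify
pred = (sigmoid(w * x + b) > 0.5).astype(float)
print((pred == y).mean())  # training accuracy
```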
  41. Summary: logistic regression outputs a probability in 0~1 through the sigmoid; 0.5 and above is classified True, below is False. It uses cross-entropy instead of MSE as the loss, yet the gradient updates keep the same form.
  43. scikit-learn returns its built-in datasets as Bunch objects.
  44. NumPy provides fast multidimensional arrays and is the foundation that scikit-learn and tensorflow build on.
  45. The cancer dataset has 30 features:
      ŷ = w1·x1 + w2·x2 + … + w30·x30 + b
  46. The cancer data: targets y and features X for 569 samples, cast to float32.
  47. Shapes of the computation (569 samples, 30 features):
      [569, 30] x [30, 1] = [569, 1]   (X·w produces the logits)
      [569, 1] + [1] = [569, 1]        (one bias b, broadcast to all 569 samples)
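The shape arithmetic on this slide can be verified with NumPy broadcasting (random placeholder values, not the real cancer data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Shapes from the slide: 569 samples x 30 features
X = rng.normal(size=(569, 30)).astype(np.float32)
w = rng.normal(size=(30, 1)).astype(np.float32)
b = np.zeros((1,), dtype=np.float32)

# [569,30] x [30,1] = [569,1]; the single bias is broadcast to every row
logits = X @ w + b
print(logits.shape)  # (569, 1)
```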
  49. prediction: each of the [569, 1] outputs is thresholded at 0.5 into True or False. After 5000 training iterations the model reaches about 92% accuracy.
  50. Summary: the first example used 1000 samples with 1 feature (one w, one b); the cancer data has 569 samples with 30 features (30 w values, one b). The same neuron handles both once the computation is written with matrix operations.
  51. Materials
      Github: https://github.com/rickiepark/tfk-notebooks/tree/master/tensorflow_for_beginners
      Slideshare: https://www.slideshare.net/RickyPark3/
  52. Thank you.