{"id":2588,"date":"2023-03-23T01:22:54","date_gmt":"2023-03-23T01:22:54","guid":{"rendered":"https:\/\/cafe2sach.com\/?p=2588"},"modified":"2023-03-30T01:27:00","modified_gmt":"2023-03-30T00:27:00","slug":"xay-dung-thuat-toan-logistic-regression-su-dung-c","status":"publish","type":"post","link":"https:\/\/cafe2sach.com\/index.php\/2023\/03\/23\/xay-dung-thuat-toan-logistic-regression-su-dung-c\/","title":{"rendered":"Building the Logistic Regression Algorithm in C#"},"content":{"rendered":"\n<p class=\"has-text-align-center\"><em>how do you know that these camellias are equally important? <\/em><\/p>\n\n\n\n<p>                                                                    &#8211; Hercule Poirot &#8211; <\/p>\n\n\n\n<p><\/p>\n\n\n\n<div id=\"toc_container\" class=\"no_bullets\"><p class=\"toc_title\">Contents<\/p><ul class=\"toc_list\"><li><a href=\"#Thuat_toan_hoc_may_Logistic_Regression\"><span class=\"toc_number toc_depth_1\">1<\/span> The Logistic Regression Machine Learning Algorithm<\/a><ul><li><a href=\"#Su_khac_nhau_cua_giai_thuat_Logistic_Regression_va_Naive_Bayes\"><span class=\"toc_number toc_depth_2\">1.1<\/span> The Difference Between Logistic Regression and Naive Bayes<\/a><\/li><li><a href=\"#Hoi_quy_tuyen_tinh_nhi_phan_va_hoi_quy_tuyen_tinh_da_thuc\"><span class=\"toc_number toc_depth_2\">1.2<\/span> Binary and Multinomial Logistic Regression<\/a><\/li><li><a href=\"#Ham_Sigmoid\"><span class=\"toc_number toc_depth_2\">1.3<\/span> The Sigmoid Function<\/a><\/li><li><a href=\"#Ham_Softmax\"><span class=\"toc_number toc_depth_2\">1.4<\/span> The Softmax Function<\/a><\/li><li><a href=\"#Ham_mat_mat_Loss_Function\"><span class=\"toc_number toc_depth_2\">1.5<\/span> The Loss Function<\/a><\/li><li><a href=\"#Toi_uu_ham_mat_mat\"><span class=\"toc_number toc_depth_2\">1.6<\/span> Optimizing the Loss Function<\/a><\/li><\/ul><\/li><li><a href=\"#Demo_Xay_dung_Logistic_Regression_su_dung_ngon_ngu_C\"><span class=\"toc_number toc_depth_1\">2<\/span> Demo: Building Logistic Regression in C#<\/a><ul><li><a href=\"#Tao_du_lieu_demo_chuong_trinh\"><span class=\"toc_number toc_depth_2\">2.1<\/span> Creating the Demo Data<\/a><\/li><li><a href=\"#Ham_Softmax-2\"><span class=\"toc_number toc_depth_2\">2.2<\/span> The Softmax Function<\/a><\/li><li><a href=\"#Ham_Loss\"><span class=\"toc_number toc_depth_2\">2.3<\/span> The Loss Function<\/a><\/li><li><a href=\"#Gradient_Descent\"><span class=\"toc_number toc_depth_2\">2.4<\/span> Gradient Descent<\/a><\/li><\/ul><\/li><\/ul><\/div>\n<h1 class=\"wp-block-heading\"><span id=\"Thuat_toan_hoc_may_Logistic_Regression\">The Logistic Regression Machine Learning Algorithm<\/span><\/h1>\n\n\n\n<p>Logistic regression is an important tool in the natural and social sciences. In machine learning, logistic regression is a supervised learning algorithm for classification, and it is closely related to neural networks. Logistic regression can be used for two-class or multi-class classification. 
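<\/p>\n\n\n\n<p>For the two-class case, the sigmoid can be sketched in C# as follows (a minimal sketch; this helper is not part of the demo program later in the article):<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>        \/\/ sketch: the sigmoid maps any real-valued score into (0, 1),\n        \/\/ which can be read as the probability of the positive class\n        static double Sigmoid(double z)\n        {\n            return 1.0 \/ (1.0 + Math.Exp(-z));\n        }\n        \/\/ Sigmoid(0.0) returns 0.5\n<\/code><\/pre>\n\n\n\n<p>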
The two-class problem is the simpler of the two and is built on the sigmoid function, while the more complex multi-class problem uses the softmax function; both are presented in this article.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><span id=\"Su_khac_nhau_cua_giai_thuat_Logistic_Regression_va_Naive_Bayes\">The Difference Between Logistic Regression and Naive Bayes<\/span><\/h2>\n\n\n\n<p>As we know, machine learning classification algorithms can be divided into two families: generative and discriminative algorithms.<\/p>\n\n\n\n<p>The fundamental difference between Naive Bayes and logistic regression is that logistic regression is a discriminative algorithm, while Naive Bayes is a generative one.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><span id=\"Hoi_quy_tuyen_tinh_nhi_phan_va_hoi_quy_tuyen_tinh_da_thuc\">Binary and Multinomial Logistic Regression<\/span><\/h2>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img fetchpriority=\"high\" decoding=\"async\" src=\"https:\/\/cafe2sach.com\/wp-content\/uploads\/2023\/03\/image-1.png\" alt=\"\" class=\"wp-image-2595\" width=\"842\" height=\"966\" srcset=\"https:\/\/cafe2sach.com\/wp-content\/uploads\/2023\/03\/image-1.png 667w, https:\/\/cafe2sach.com\/wp-content\/uploads\/2023\/03\/image-1-600x688.png 600w, https:\/\/cafe2sach.com\/wp-content\/uploads\/2023\/03\/image-1-262x300.png 262w, https:\/\/cafe2sach.com\/wp-content\/uploads\/2023\/03\/image-1-366x420.png 366w\" sizes=\"(max-width: 842px) 100vw, 842px\" \/><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><span id=\"Ham_Sigmoid\">The Sigmoid Function<\/span><\/h2>\n\n\n\n<p>In classification problems, the sigmoid function is used when the input must be assigned to one of two classes, and the softmax function is used when the input must be assigned to one of several classes. <\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><span id=\"Ham_Softmax\">The Softmax Function<\/span><\/h2>\n\n\n\n<p>The softmax function used in this logistic regression implementation is built in C# and listed in the demo section below.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><span id=\"Ham_mat_mat_Loss_Function\">The Loss Function<\/span><\/h2>\n\n\n\n<h2 class=\"wp-block-heading\"><span id=\"Toi_uu_ham_mat_mat\">Optimizing the Loss Function<\/span><\/h2>\n\n\n\n<p><\/p>\n\n\n\n<h1 class=\"wp-block-heading\"><span id=\"Demo_Xay_dung_Logistic_Regression_su_dung_ngon_ngu_C\">Demo: Building Logistic Regression in C#<\/span><\/h1>\n\n\n\n<h2 class=\"wp-block-heading\"><span id=\"Tao_du_lieu_demo_chuong_trinh\">Creating the Demo Data<\/span><\/h2>\n\n\n\n<pre class=\"wp-block-code\"><code> \n            \/\/ create dummy training data\n            double&#91;]&#91;] trainX = new double&#91;27]&#91;];\n\n            trainX&#91;0] = new double&#91;] { 0.1, 0.1 };  \/\/ lower left\n            trainX&#91;1] = new double&#91;] { 0.1, 0.2 };\n            trainX&#91;2] = new double&#91;] { 0.1, 0.3 };\n            trainX&#91;3] = new double&#91;] { 0.2, 0.1 
};\n            trainX&#91;4] = new double&#91;] { 0.2, 0.2 };\n            trainX&#91;5] = new double&#91;] { 0.2, 0.3 };\n            trainX&#91;6] = new double&#91;] { 0.3, 0.1 };\n            trainX&#91;7] = new double&#91;] { 0.3, 0.2 };\n            trainX&#91;8] = new double&#91;] { 0.3, 0.3 };\n\n            trainX&#91;9] = new double&#91;] { 0.4, 0.1 };  \/\/ lower right\n            trainX&#91;10] = new double&#91;] { 0.4, 0.2 };\n            trainX&#91;11] = new double&#91;] { 0.4, 0.3 };\n            trainX&#91;12] = new double&#91;] { 0.5, 0.1 };\n            trainX&#91;13] = new double&#91;] { 0.5, 0.2 };\n            trainX&#91;14] = new double&#91;] { 0.5, 0.3 };\n            trainX&#91;15] = new double&#91;] { 0.6, 0.1 };\n            trainX&#91;16] = new double&#91;] { 0.6, 0.2 };\n            trainX&#91;17] = new double&#91;] { 0.6, 0.3 };\n\n            trainX&#91;18] = new double&#91;] { 0.1, 0.4 };  \/\/ upper left\n            trainX&#91;19] = new double&#91;] { 0.1, 0.5 };\n            trainX&#91;20] = new double&#91;] { 0.1, 0.6 };\n            trainX&#91;21] = new double&#91;] { 0.2, 0.4 };\n            trainX&#91;22] = new double&#91;] { 0.2, 0.5 };\n            trainX&#91;23] = new double&#91;] { 0.2, 0.6 };\n            trainX&#91;24] = new double&#91;] { 0.3, 0.4 };\n            trainX&#91;25] = new double&#91;] { 0.3, 0.5 };\n            trainX&#91;26] = new double&#91;] { 0.3, 0.6 };\n\n            \/\/\/ Label Data \n            int&#91;]&#91;] trainY = new int&#91;27]&#91;];\n            trainY&#91;0] = new int&#91;] { 1, 0, 0 };  \/\/ class 0\n            trainY&#91;1] = new int&#91;] { 1, 0, 0 };\n            trainY&#91;2] = new int&#91;] { 1, 0, 0 };\n            trainY&#91;3] = new int&#91;] { 1, 0, 0 };\n            trainY&#91;4] = new int&#91;] { 1, 0, 0 };\n            trainY&#91;5] = new int&#91;] { 1, 0, 0 };\n            trainY&#91;6] = new int&#91;] { 1, 0, 0 };\n            trainY&#91;7] = new int&#91;] { 1, 0, 0 };\n   
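\n            \/\/ (sketch) each label row below is a one-hot vector: class k has a 1\n            \/\/ in position k. A small local helper could build these rows instead of\n            \/\/ writing them out by hand; the name OneHot is an assumption, not part\n            \/\/ of the original demo:\n            int&#91;] OneHot(int k, int n) { var v = new int&#91;n]; v&#91;k] = 1; return v; }\n            \/\/ OneHot(1, 3) gives { 0, 1, 0 }\n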
         trainY&#91;8] = new int&#91;] { 1, 0, 0 };\n\n            trainY&#91;9] = new int&#91;] { 0, 1, 0 };  \/\/ class 1\n            trainY&#91;10] = new int&#91;] { 0, 1, 0 };\n            trainY&#91;11] = new int&#91;] { 0, 1, 0 };\n            trainY&#91;12] = new int&#91;] { 0, 1, 0 };\n            trainY&#91;13] = new int&#91;] { 0, 1, 0 };\n            trainY&#91;14] = new int&#91;] { 0, 1, 0 };\n            trainY&#91;15] = new int&#91;] { 0, 1, 0 };\n            trainY&#91;16] = new int&#91;] { 0, 1, 0 };\n            trainY&#91;17] = new int&#91;] { 0, 1, 0 };\n\n            trainY&#91;18] = new int&#91;] { 0, 0, 1 };  \/\/ class 2\n            trainY&#91;19] = new int&#91;] { 0, 0, 1 };\n            trainY&#91;20] = new int&#91;] { 0, 0, 1 };\n            trainY&#91;21] = new int&#91;] { 0, 0, 1 };\n            trainY&#91;22] = new int&#91;] { 0, 0, 1 };\n            trainY&#91;23] = new int&#91;] { 0, 0, 1 };\n            trainY&#91;24] = new int&#91;] { 0, 0, 1 };\n            trainY&#91;25] = new int&#91;] { 0, 0, 1 };\n            trainY&#91;26] = new int&#91;] { 0, 0, 1 };\n  \/\/ Learning Rate, epoch \n\n            double lr = 0.01;\n            int maxEpoch = 1000;\n            Console.WriteLine(\"\\nStart online SGD train lr = \" +\n              lr.ToString(\"F3\") + \" maxEpoch = \" + maxEpoch);\n            double&#91;]&#91;] wts = Train(trainX, trainY, lr, maxEpoch);\n            \n            Console.WriteLine(\"Done\");\n\n            Console.WriteLine(\"\\nModel weights and biases:\");\n            ShowMatrix(wts);\n\n            Console.WriteLine(\"\\nPredicting class for &#91;0.45, 0.25] \");\n            double&#91;] x = new double&#91;] { 0.45, 0.25 };\n            double&#91;] oupts = ComputeOutput(x, wts, true);  \/\/ true: show pre-softmax\n            ShowVector(oupts);\n\n            Console.WriteLine(\"\\nEnd demo\");\n            Console.ReadLine();<\/code><\/pre>\n\n\n\n<h2 class=\"wp-block-heading\"><\/h2>\n\n\n\n<h2 
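class=\"wp-block-heading\"><\/h2>\n\n\n\n<p>The demo program above calls <em>Train<\/em> and <em>ComputeOutput<\/em>, which are not listed in the article. A minimal sketch of <em>ComputeOutput<\/em> consistent with that call could look like this; the weight layout (one row per class, bias stored as the last element of each row) and the signature are assumptions, and it reuses the <em>Softmax<\/em> function from the next section and the <em>ShowVector<\/em> display helper from the demo:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>        \/\/ sketch: compute a linear score per class, then apply softmax\n        static double&#91;] ComputeOutput(double&#91;] x, double&#91;]&#91;] wts, bool verbose)\n        {\n            int nc = wts.Length;                 \/\/ number of classes\n            double&#91;] z = new double&#91;nc];\n            for (int c = 0; c &lt; nc; ++c)\n            {\n                for (int j = 0; j &lt; x.Length; ++j)\n                    z&#91;c] += wts&#91;c]&#91;j] * x&#91;j];\n                z&#91;c] += wts&#91;c]&#91;x.Length];       \/\/ bias term (assumed layout)\n            }\n            if (verbose) ShowVector(z);          \/\/ show pre-softmax scores\n            return Softmax(z);\n        }\n<\/code><\/pre>\n\n\n\n<h2 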
class=\"wp-block-heading\"><span id=\"Ham_Softmax-2\">The Softmax Function<\/span><\/h2>\n\n\n\n<pre class=\"wp-block-code\"><code> static double&#91;] Softmax(double&#91;] vec)\n        {\n            \/\/ subtract the max before exponentiating (the max trick),\n            \/\/ so Math.Exp cannot overflow for large scores; the result\n            \/\/ is mathematically unchanged\n            double max = vec&#91;0];\n            for (int i = 1; i &lt; vec.Length; ++i)\n                if (vec&#91;i] &gt; max) max = vec&#91;i];\n            double&#91;] result = new double&#91;vec.Length];\n            double sum = 0.0;\n            for (int i = 0; i &lt; result.Length; ++i)\n            {\n                result&#91;i] = Math.Exp(vec&#91;i] - max);\n                sum += result&#91;i];\n            }\n            for (int i = 0; i &lt; result.Length; ++i)\n                result&#91;i] \/= sum;\n            return result;\n        }\n<\/code><\/pre>\n\n\n\n<h2 class=\"wp-block-heading\"><span id=\"Ham_Loss\">The Loss Function<\/span><\/h2>\n\n\n\n<h2 class=\"wp-block-heading\"><span id=\"Gradient_Descent\">Gradient Descent<\/span><\/h2>\n","protected":false},"excerpt":{"rendered":"<p>how do you know that these camellias are equally important? 
&#8211; Hercule Poirot &#8211; [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":2593,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[714,718,233],"tags":[716,715,717],"class_list":["post-2588","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-hoc-may-co-ban","category-ngon-ngu-c","category-tri-tue-nhan-tao","tag-giai-thuat-phan-lop","tag-hoc-may-co-ban","tag-hoc-ngon-ngu-c"],"_links":{"self":[{"href":"https:\/\/cafe2sach.com\/index.php\/wp-json\/wp\/v2\/posts\/2588","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/cafe2sach.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/cafe2sach.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/cafe2sach.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/cafe2sach.com\/index.php\/wp-json\/wp\/v2\/comments?post=2588"}],"version-history":[{"count":7,"href":"https:\/\/cafe2sach.com\/index.php\/wp-json\/wp\/v2\/posts\/2588\/revisions"}],"predecessor-version":[{"id":4074,"href":"https:\/\/cafe2sach.com\/index.php\/wp-json\/wp\/v2\/posts\/2588\/revisions\/4074"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/cafe2sach.com\/index.php\/wp-json\/wp\/v2\/media\/2593"}],"wp:attachment":[{"href":"https:\/\/cafe2sach.com\/index.php\/wp-json\/wp\/v2\/media?parent=2588"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/cafe2sach.com\/index.php\/wp-json\/wp\/v2\/categories?post=2588"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/cafe2sach.com\/index.php\/wp-json\/wp\/v2\/tags?post=2588"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}