

Write a Python 3 program called pa1.py that implements a three-class linear classifier using the following method:

Training (using the training data set):

  1. Compute the centroid of each class (e.g. A, B, and C).

  2. Construct a discriminant function between each pair of classes (e.g. A/B, B/C, and A/C), halfway between the two centroids and orthogonal to the line connecting the two centroids. This is the “basic linear classifier” that we have discussed.
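The two training steps above can be sketched as follows. This is a minimal sketch with hypothetical names (`centroid`, `boundary`, `side`); the threshold used here is the midpoint (p + n)/2 projected onto w = p − n, one common form of the basic linear classifier.

```python
import numpy as np

def centroid(points):
    """Centroid of a class = per-dimension mean of its points (step 1)."""
    return np.mean(points, axis=0)

def boundary(p, n):
    """Hyperplane halfway between centroids p and n, orthogonal to the
    line connecting them (step 2). Returns (w, t) so the boundary is
    w . x = t, with t the midpoint (p + n)/2 projected onto w = p - n."""
    w = p - n
    t = 0.5 * np.dot(p + n, p - n)
    return w, t

def side(w, t, x):
    """+1 if x falls on the p side, -1 on the n side, 0 exactly on the boundary."""
    return int(np.sign(np.dot(w, x) - t))
```

For example, with centroids p = (2, 0) and n = (0, 0), the boundary is the vertical line x = 1, and points left of it land on the n side.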






PA1-5extra.pdf: Accuracy = (TP + TN) / total. TPR = TP / (TP + FN). PA1-5data.html: shown below.

EXTRA INFORMATION: You can use numpy; numpy.dot(a, b) computes the dot product of two numpy vectors (1-D arrays). The dimension of the testing samples will be larger than 3 but no larger than 10. The PDF says a tie should give priority to A, then B, then C; a tie means w · x = t. The linear classifier on the slide works for solving part of the problem. t should definitely be a scalar: looking closely at the slide's formulas, (p + n)^T (p − n) yields a scalar product when (p − n) has dimension (n x 1). After you've found your w and t, compute w · x (a dot product, with x an entry in the dataset) and check whether the result is > t, < t, or = t (the tie case must also be handled). Depending on which class you chose as "p" and which as "n", this tells you which class the classifier assigned the data point.
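Putting those hints together, the three-class decision with the A-then-B-then-C tie priority could be sketched like this. The function names, the `boundaries` dict layout, and the majority vote over the three pairwise tests are my own assumptions; the source only specifies the per-pair w · x vs. t comparison and the tie priority.

```python
import numpy as np

def pairwise_label(w, t, x, pos, neg):
    """Label x using one pairwise boundary (w built as p - n for pos).
    A tie (w . x == t) goes to the alphabetically earlier class,
    matching the A, then B, then C priority."""
    s = np.dot(w, x)
    if s > t:
        return pos
    if s < t:
        return neg
    return min(pos, neg)

def classify_point(x, boundaries):
    """boundaries: {('A','B'): (w, t), ('A','C'): ..., ('B','C'): ...}.
    Majority vote over the three pairwise decisions; on equal votes,
    iterating labels in sorted order gives A priority, then B, then C."""
    votes = {'A': 0, 'B': 0, 'C': 0}
    for (pos, neg), (w, t) in boundaries.items():
        votes[pairwise_label(w, t, x, pos, neg)] += 1
    return max(sorted(votes), key=lambda c: votes[c])
```

With centroids A = (0, 0), B = (2, 0), C = (0, 2) and midpoint thresholds, each centroid is classified as its own class.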
evaluate.py expected output:

[ { "tpr": 0.8666666666666667, "fpr": 0.06666666666666667, "error_rate": 0.08888888888888889, "accuracy": 0.9111111111111111, "precision": 0.8651059085841695 } ]

PA1-5data:

3 25 25 25 2.2333 3.1963 1.6958 3.2805 2.5182 2.9478 1.4613 4.1342 2.9146 2.629 5.5558 3.1148 1.8768 3.3124 1.9582 0.93935 2.5862 3.3608 3.2365 2.7132 1.9667 3.0854 1.3067 2.3781 2.0509 3.6136 2.0858 3.5622 3.3726 3.3905 5.3497 1.4974 1.9125 2.2413 2.378 1.6453 2.5971 5.5155 3.16 2.9388 3.0849 3.1374 3.7569 2.9966 3.4305 2.2983 0.7109 2.3562 2.0491 3.1111 3.27 0.50886 2.9523 2.7733 2.5193 2.8645 3.6623 3.3847 3.4637 5.2832 1.5397 1.8226 2.4095 0.49581 3.4521 3.6316 0.31583 2.8069 3.6496 3.2032 2.906 2.6966 3.4027 2.9172 3.0794 -0.49877 0.20008 -2.3288 -0.38407 0.11897 -0.86074 -0.69129 -1.3694 1.15 1.7348 -0.43501 -1.3909 0.63364 -1.2131 0.30637 -0.85871 0.33227 0.70622 -0.94597 1.2292 -0.7944 -0.23152 0.0040859 -0.082499 0.54806 0.67681 2.1142 -0.36045 -1.285 1.0833 0.47191 -0.2538 0.21496 -0.39515 0.22654 0.69927 0.36725 -0.03678 0.73714 -1.163 -1.1578 -0.37072 -0.37532 0.28983 -1.1944 0.66475 0.29401 -0.31675 1.7243 2.935 -0.34001 -0.37137 0.65109 -1.5469 -0.19588 -2.1125 -0.63156 1.1083 1.2625 0.53642 0.4063 0.38296 0.27026 0.13344 3.0761 -0.26232 -1.8361 -0.62632 -2.6363 -0.80394 -0.12193 0.60427 1.2986 2.304 1.9209 1.0589 -2.5554 -1.7026 -1.2475 0.89502 -1.0241 1.6492 -0.80632 -2.1692 2.3263 -0.72978 -2.4386 1.016 -2.5176 0.041779 2.4817 -0.080647 -3.2746 0.46406 -0.67174 -2.2436 0.30779 -2.0006 -0.8753 0.080953 -2.0968 -1.3778 1.4864 -0.4808 1.6565 3.1819 -1.8531 -0.5429 -0.14502 -2.253 -0.97537 1.5623 -0.60277 -1.1208 -0.49643 -0.96886 -1.1716 1.1993 -1.1067 -0.65425 0.42464 -2.1711 -2.1007 1.4741 -1.0423 -1.2888 0.75776 -1.321 -1.5626 2.002 -1.8513 -1.9542 2.8576 -0.51826 -2.0214 0.81403 -1.7382 -1.1036 0.51225 1.3931 -0.52944 1.412 -0.81099 0.27707 -0.82684 0.63752 -1.7372 0.54929 -1.5363 -0.76789
3 100 100 100 3.0106 2.5628 2.9734 4.9375 3.2307
3.4243 2.68 2.435 3.5858 3.3632 3.9185 3.4666 4.5048 3.1728 2.8777 1.1495 2.811 2.6615 2.3771 2.8965 3.121 3.6052 3.9533 2.8864 3.3828 2.2894 3.221 2.3785 3.087 2.6785 3.4398 2.72 3.1426 2.9807 2.7511 2.6058 3.8415 0.33417 2.5053 3.624 2.8453 3.5027 3.8965 3.8797 3.0044 2.179 3.9428 3.6117 2.6402 4.451 3.2602 2.5946 1.5836 4.0246 3.3332 4.6269 3.111 2.686 3.3334 0.88094 1.8333 2.8464 2.1865 4.874 2.229 3.5063 4.0153 1.8803 3.6022 1.7235 4.1728 1.8811 4.1107 2.6388 1.9783 3.5135 3.1462 3.3845 3.1661 2.4559 3.5155 3.2865 3.0939 3.0083 3.3005 3.9321 0.87182 5.3413 3.6041 2.3726 3.8127 3.38 3.8388 3.5485 4.3491 3.218 4.1711 3.4865 4.089 4.0407 4.5873 2.0787 3.4645 2.4938 4.3095 3.6989 4.0994 1.9655 4.6169 1.6068 3.4571 4.0866 4.2617 1.9156 3.6919 2.1172 1.3123 2.7697 4.2676 1.7963 2.9245 2.6119 3.0537 2.3665 2.0487 2.8465 3.1574 1.3829 3.277 1.9129 4.6949 3.548 5.5275 2.28 2.7813 2.1163 4.3254 2.1988 2.0741 2.8369 4.2751 3.0394 1.7915 2.4054 2.727 2.9299 2.8541 3.9129 3.6978 4.1973 3.7379 3.1225 5.126 4.0309 3.5973 2.7164 3.886 3.4068 3.4878 3.9242 4.3261 3.9289 1.1823 3.9259 5.7816 3.4344 3.6857 2.7002 1.8905 4.4247 4.2844 3.2729 2.109 2.3928 2.4106 3.5331 2.9096 4.7643 2.6991 2.3293 1.2697 3.3519 1.6014 2.6926 2.3184 3.4762 3.7026 2.8672 2.9603 2.9222 3.0051 1.788 1.8094 3.9075 3.2506 2.6607 3.1036 1.4959 4.1349 1.9472 3.2686 1.1533 0.78116 2.2719 3.5486 3.015 2.1318 3.6789 2.4783 3.648 1.6904 3.3147 4.8304 4.3303 2.4166 4.5834 1.5723 2.0027 3.9441 2.4446 3.1196 3.0444 4.4742 1.9369 4.7349 3.7003 2.0686 4.3032 3.3677 5.0152 1.6717 3.2468 2.5023 3.1021 3.2779 3.2174 3.0452 1.7684 3.339 3.187 5.2109 2.3793 2.1847 3.8692 2.2596 3.8991 3.4 0.94672 4.2008 2.5675 4.4059 2.2178 3.8867 3.8617 1.6088 1.5986 3.4093 2.8882 3.2718 2.2005 0.83473 3.7321 3.9068 4.1337 1.8219 4.0683 4.342 3.1268 4.5658 3.9303 2.6816 4.0058 3.8565 2.8955 2.7188 3.2448 3.7873 2.778 2.1487 2.566 4.0344 3.6527 1.6403 2.6723 4.915 4.0667 4.0779 0.91304 3.8874 3.6612 3.0778 3.4941 3.9508 4.0166 2.4117 
2.2427 2.7427 2.0045 3.1438 3.8593 1.215 0.36767 -0.42177 -2.0059 -0.9681 1.6513 -0.031656 -1.3584 -0.4546 0.61358 0.93279 0.022097 0.63768 0.74234 0.031088 -2.2225 1.9133 0.18159 -1.0407 -0.46171 -0.21819 1.0562 -0.9566 -1.4085 -0.94691 -0.47227 0.53332 0.48687 -1.2343 0.48477 0.05146 0.2385 0.53417 -1.6396 1.1618 -0.55976 -1.5731 -0.60523 0.88909 -1.0496 -0.23026 0.56095 -1.6768 -1.1234 0.53111 0.76704 1.2978 0.90051 -0.84945 0.30333 0.37576 0.25813 -0.72356 -1.6372 0.97511 1.1292 0.44951 -1.2709 1.3913 2.6475 -0.98237 -0.72521 0.36322 0.0040278 0.58182 0.72594 -0.71109 -0.038144 -0.64845 1.2136 -1.7851 -0.67593 0.37035 -0.46397 -0.004055 1.1881 0.43271 -0.82944 -0.33405 -2.0872 -0.16543 -0.20202 2.8457 0.22038 0.53252 -0.32769 2.41 -0.27208 0.11281 1.3789 0.58263 1.8745 -0.28888 2.6548 0.17043 2.0732 1.4602 -0.48708 0.82398 -0.32779 -0.16362 0.0020356 1.8345 0.50718 0.66346 -1.0321 0.1158 -1.9051 -0.40959 -0.13967 1.0087 0.19743 0.68747 -0.33239 0.25997 -0.20563 1.1237 -0.2772 -0.52901 2.5102 0.15932 1.2778 0.78725 -0.64373 -1.1602 -3.0941 0.44976 2.5609 0.16294 -2.2375 0.4433 2.6942 -0.71018 0.24206 -0.50114 -0.86022 -1.6688 -1.1536 -1.1784 -1.163 -1.2866 1.8781 0.17089 -0.16768 1.2289 1.6828 -1.2246 1.6616 -1.8809 -0.25664 -0.60388 -0.2479 -0.38671 -1.1161 -1.3803 -0.3259 0.62663 -0.41697 0.83588 0.13301 -0.32903 2.185 0.27517 0.34094 -0.91593 -2.0839 1.1094 -1.0962 0.32008 0.55485 1.8402 -1.1772 0.37298 1.0781 -0.3479 -0.36018 -1.0289 0.2748 -0.10138 -0.80232 0.34583 -1.1422 -2.8304 2.0899 -0.074883 -1.3257 -0.29542 0.85796 -1.4717 -0.58037 0.62243 1.2097 1.5502 0.077016 0.50209 -0.64333 1.1646 -2.1726 -0.71626 2.2998 -0.59675 1.2411 -1.6311 -0.38766 -0.024518 1.2752 0.0014528 1.3971 1.2583 -1.291 0.025499 -2.0369 0.20475 1.4412 0.31391 -0.9107 -1.5073 -1.3608 -1.5724 -1.3622 0.34919 -1.3788 0.75436 0.13567 -0.28582 0.097988 0.81016 -1.2099 -0.29971 -0.93618 -0.38697 -1.448 -0.15201 0.41285 -0.12205 0.39849 1.125 -1.0525 0.79291 -0.067695 1.067 -0.48799 
0.097754 -2.1653 1.3226 -0.92557 0.37243 -1.8969 0.091856 -0.26342 0.19849 -0.97256 0.55141 2.1741 1.6888 0.5934 -0.54028 -1.0899 -0.13956 -0.35299 0.015733 -0.15419 0.079403 0.16464 0.94346 -0.50154 0.59846 -1.5437 -0.65386 -0.17933 -0.43178 1.5333 0.28744 -1.2729 0.82528 0.1646 0.52954 -2.7191 2.2338 -0.72332 0.42213 0.48704 0.94204 -0.16585 0.087272 -0.7512 0.40112 -0.42308 1.4697 -1.3828 2.3023 0.36615 -0.065405 1.159 1.035 -0.41367 0.16684 -0.70897 -1.5211 1.978 1.0926 -0.17399 -1.2362 1.0883 -1.1303 0.042636 1.9306 1.105 -0.59937 -0.71463
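The two formulas quoted at the top, Accuracy = (TP + TN) / total and TPR = TP / (TP + FN), generalize to the other fields in the evaluate.py expected output. A minimal sketch of those standard one-vs-rest definitions from raw confusion counts (the function name is my own; the assignment's evaluate.py may also average per-class values, which this sketch does not do):

```python
def binary_metrics(tp, fp, tn, fn):
    """Standard binary-classification metrics from confusion counts,
    keyed the same way evaluate.py's expected output is keyed."""
    total = tp + fp + tn + fn
    return {
        "tpr": tp / (tp + fn),             # true positive rate (recall)
        "fpr": fp / (fp + tn),             # false positive rate
        "error_rate": (fp + fn) / total,   # misclassified fraction
        "accuracy": (tp + tn) / total,     # as given: (TP + TN) / total
        "precision": tp / (tp + fp),
    }
```

For example, binary_metrics(8, 1, 10, 2) gives a TPR of 8/10 = 0.8 and an accuracy of 18/21.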
Answered Same Day: Apr 21, 2021

Answer To: the PA1-5extra.pdf / PA1-5data.html question above

Pushpendra answered on Apr 22 2021
143 Votes
#!/usr/bin/env python
# coding: utf-8
# In[89]:
# pwd  # IPython magic; only works inside a notebook, not in a plain .py script
# In[4]:
import pandas as pd
import numpy as np
# In[6]:
df1=pd.read_excel("training_LDA.xlsx")
df1.head()
# In[8]:
df2=pd.read_excel("testing_LDA.xlsx")
df2.head()
# # Centroid
# In[9]:
from sklearn.cluster import KMeans
from sklearn import metrics
print("*****Centroid for A******")
x1=df1.iloc[:,0:1].values
K =1
kmeans_model = KMeans(n_clusters=K).fit(x1)
print(kmeans_model.cluster_centers_)
centers = np.array(kmeans_model.cluster_centers_)
# In[10]:
print("*****Centroid for B******")
x2=df1.iloc[:,1:2].values
K =1
kmeans_model = KMeans(n_clusters=K).fit(x2)
print(kmeans_model.cluster_centers_)
centers = np.array(kmeans_model.cluster_centers_)
# In[11]:
print("*****Centroid for C******")
x3=df1.iloc[:,2:3].values
K =1
kmeans_model = KMeans(n_clusters=K).fit(x3)
print(kmeans_model.cluster_centers_)
centers = np.array(kmeans_model.cluster_centers_)
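A note on the three centroid cells above: fitting KMeans with n_clusters=1 converges to the arithmetic mean of the points, so each cluster_centers_ value is just the column mean. A sketch of the equivalence, using hypothetical stand-in data with the same per-column layout:

```python
import numpy as np

# KMeans with a single cluster returns the mean, so np.mean reproduces
# cluster_centers_ without the sklearn dependency.
data = np.array([[2.0, 3.0, 1.0],
                 [4.0, 5.0, 3.0],
                 [6.0, 7.0, 5.0]])   # hypothetical rows: samples; columns: A, B, C
centroid_A = np.mean(data[:, 0])
centroid_B = np.mean(data[:, 1])
centroid_C = np.mean(data[:, 2])
```

This is also cheaper and deterministic, whereas KMeans adds a needless iterative fit.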
# # Analysis
# In[5]:
X_train=df1.drop('Y',axis=1)
X_train
# In[6]:
Y_train=df1.pop('Y')
Y_train
# In[9]:
X_test=df2.drop('Y',axis=1)
X_test
# In[10]:
Y_test=df2.pop('Y')
Y_test
# In[11]:
#Import svm model
from sklearn import svm
#Create a svm Classifier
clf = svm.SVC(kernel='linear') # Linear Kernel
#Train the model using the training sets
clf.fit(X_train, Y_train)
#Predict the response for...
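The final cell is truncated. A plausible continuation, shown here as a self-contained hedged sketch with toy stand-in data (the real clf, X_test, and Y_test come from the Excel files above; these are standard sklearn calls, not the author's exact code):

```python
from sklearn import svm, metrics
import numpy as np

# Toy stand-ins for the training/testing splits (hypothetical values).
X_train = np.array([[0.0], [1.0], [4.0], [5.0]])
Y_train = np.array([0, 0, 1, 1])
X_test = np.array([[0.5], [4.5]])
Y_test = np.array([0, 1])

# Fit a linear-kernel SVM, predict the response for the test set,
# and score it with sklearn's accuracy helper.
clf = svm.SVC(kernel='linear')
clf.fit(X_train, Y_train)
y_pred = clf.predict(X_test)
acc = metrics.accuracy_score(Y_test, y_pred)
```

On this separable toy split the classifier labels both test points correctly.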