A critical review on machine learning algorithms and their applications in pure sciences
Author Affiliations
- ¹Seedling Modern High School, Jaipur, Rajasthan, India
- ²Seedling Modern Public School, Udaipur, Rajasthan, India
Res. J. Recent Sci., Volume 8, Issue (1), Pages 14-29, January (2019)
Abstract
Today it is difficult to imagine solving any class of problems without Artificial Intelligence, which has grown tremendously across fields ranging from management to the life sciences. The use of AI has made life simpler and better. In high-throughput screening, it offers several advantages, such as saving resources and expenditure. Machine learning has minimized the errors involved in correlating different kinds of attributes. Most importantly, it has transformed the Edisonian hit-and-trial approach into one grounded in logic and simulation: using simulations, we can now predict required properties and after-effects of many materials, saving considerable resources. In this review article, we explicitly present the types of machine learning and the principal algorithms, along with their uses in several different fields.
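To make the idea of learning a prediction from labelled examples concrete, here is a minimal sketch (not taken from the article) of the k-nearest-neighbour classifier, one of the supervised algorithms surveyed in the literature below. The toy data, function name, and class labels are our own illustrative assumptions:

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    # Classify `query` by majority vote among the k training points
    # closest to it in Euclidean distance.
    neighbors = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy data: two well-separated clusters with hypothetical labels.
train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"), ((0.2, 0.1), "A"),
         ((1.0, 1.0), "B"), ((0.9, 1.1), "B"), ((1.1, 0.9), "B")]

print(knn_predict(train, (0.15, 0.1)))  # queries near cluster A
print(knn_predict(train, (0.95, 1.0)))  # queries near cluster B
```

The majority vote over nearest neighbours is the entire "model"; no training phase is needed, which is why k-NN is often the first classifier introduced in surveys of supervised learning.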
References
- Stefik M.J. (1985)., Machine learning: An artificial intelligence approach., Artificial Intelligence, 25(2), 236-238. doi:10.1016/0004-3702(85)90005-0
- Angra S. and Ahuja S. (2017)., Machine learning and its applications: a review., In Big Data Analytics and Computational Intelligence (ICBDAC), 2017 International Conference on 57-60. IEEE. doi:10.1109/ICBDACI.2017.8070809
- Robert C. (2014)., Machine Learning, a Probabilistic Perspective., Chance, 27, 62-63.
- Mitchell T.M. (1997)., Machine Learning., McGraw-Hill. doi:10.1145/242224.242229
- Alpaydın E. (2014)., Introduction to machine learning. Methods in Molecular Biology., doi:10.1007/978-1-62703-748-8-7
- Ng A. (2012)., Supervised learning., Mach. Learn. doi:10.1111/j.1466-8238.2009.00506.x
- Kotsiantis S.B. (2007)., Supervised Machine Learning: A Review of Classification Techniques., Informatica, 31, 249-268.
- Nasrabadi N.M. (2007)., Pattern recognition and machine learning., Journal of electronic imaging, 16(4). doi:10.1117/1.2819119
- Kohavi R. and Provost F. (1998)., Glossary of Terms., Mach. Learn. doi:10.1023/A:1017181826899
- Caruana R. and Niculescu-Mizil A. (2006)., An empirical comparison of supervised learning algorithms., Proceedings of the 23rd international conference on Machine learning, 161-168. ACM. doi:10.1145/1143844.1143865
- Caruana R., Karampatziakis N. and Yessenalina A. (2008)., An empirical evaluation of supervised learning in high dimensions., in Proceedings of the 25th international conference on Machine learning - ICML '08. doi:10.1145/1390156.1390169
- Jordan M.I. and Rumelhart D.E. (1992)., Forward models: Supervised learning with a distal teacher., Cognitive science, 16(3), 307-354. doi:10.1016/0364-0213(92)90036-T
- Hastie T., Tibshirani R. and Friedman J. (2009)., The Elements of Statistical Learning., Bayesian Forecasting and Dynamic Models. doi:10.1007/b94608
- Zhang C., Patras P. and Haddadi H. (2018)., Deep Learning in Mobile and Wireless Networking: A Survey., IEEE Commun. Surv. TUTORIALS.
- Math Works (2016)., Applying Supervised Learning., Mach. Learn. with MATLAB. doi:10.1111/j.2041-210X.2010.00056.x
- Dougherty J., Kohavi R. and Sahami M. (1995)., Supervised and unsupervised discretization of continuous features., In Machine Learning Proceedings, 194-202. doi:10.1016/B978-1-55860-377-6.50032-3
- Rasmussen C.E. and Williams C.K. (2006)., Gaussian process for machine learning., MIT press. doi:10.1093/bioinformatics/btq657
- Aiken L.S. and West S.G. (1994)., Multiple regression: Testing and interpreting interactions., Journal-Operational Research Society, 45, 119. doi:10.2307/2583960
- Liu X. (2003)., Supervised Classification and Unsupervised Classification., Cfa.Harvard.Edu.
- Chelly S.M. and Denis C. (2001)., Applying Unsupervised Learning., Med. Sci. Sports Exerc. doi:10.1111/j.2041-210X.2010.00056.x
- Love B.C. (2002)., Comparing supervised and unsupervised category learning., Psychonomic bulletin & review, 9(4), 829-835. doi:10.3758/BF03196342
- Jain A.K., Murty M.N. and Flynn P.J. (1999)., Data clustering: a review., ACM computing surveys (CSUR), 31(3), 264-323. doi:10.1145/331499.331504
- Adil S.H. and Qamar S. (2009)., Implementation of association rule mining using CUDA., In Emerging Technologies, ICET 2009. International Conference, IEEE, 332-336. doi:10.1109/ICET.2009.5353149
- Barto A.G. and Dietterich T.G. (2004)., Reinforcement learning and its relationship to supervised learning., Handbook of learning and approximate dynamic programming. John Wiley and Sons, Inc. doi:10.1002/9780470544785.ch2
- Das S., Dey A., Pal A. and Roy N. (2015)., Applications of artificial intelligence in machine learning: review and prospect., International Journal of Computer Applications, 115(9), 31-41.
- Mnih V., Kavukcuoglu K., Silver D., Rusu A.A., Veness J., Bellemare M.G. and Petersen S. (2015)., Human-level control through deep reinforcement learning., Nature, 518(7540), 529. doi:10.1038/nature14236
- Giryes R. and Elad M. (2011)., Reinforcement Learning: A Survey., In Eur. Signal Process. Conf, 1475-1479. doi:10.1613/jair.301
- Sutton R.S. and Barto A.G. (2013)., Reinforcement learning : an introduction., Neural Networks IEEE Transactions. doi:10.1109/TNN.1998.712192
- Tran H.A. (2011)., Statistical Models., IEEE Transactions on Pattern Analysis and Machine Intelligence, 4.
- Abadi M., Barham P., Chen J., Chen Z., Davis A., Dean J. and Kudlur M. (2016)., TensorFlow: A system for large-scale machine learning., In 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI '16).
- Langley P. (1994)., Selection of Relevant Features in Machine Learning., Institute for the Study of Learning and Expertise, Proc. AAAI Fall Symp. Relev. doi:10.1.1.43.4648
- Spring M.L. (2015)., Machine Learning in Action., Engineering. doi:10.1007/978-0-387-77242-4
- Pedregosa F., Varoquaux G., Gramfort A., Michel V., Thirion B., Grisel O. and Duchesnay É. (2012)., Scikit-learn: Machine learning in Python., Machine Learning, 12, 2825-2830. doi:10.1007/s13398-014-0173-7.2
- Muller K.R., Mika S., Ratsch G., Tsuda K. and Scholkopf B. (2001)., An introduction to kernel-based learning algorithms., IEEE transactions on neural networks, 12(2), 181-201. doi:10.1109/72.914517
- Bibuli M., Caccia M. and Lapierre L. (2007)., Path-following algorithms and experiments for an autonomous surface vehicle., IFAC Proceedings, 40(17), 81-86. doi:10.3182/20070919-3-HR-3904.00015
- Freund Y. and Schapire R.E. (1996)., Experiments with a new boosting algorithm., In Icml, 96, 148-156. doi:10.1.1.133.1040
- Kotsiantis S.B., Zaharakis I.D. and Pintelas P.E. (2006)., Machine learning: a review of classification and combining techniques., Artificial Intelligence Review, 26(3), 159-190. doi:10.1007/s10462-007-9052-3
- Weisberg S. (2005)., Applied Linear Regression., New York. doi:10.1080/00049538408255324
- Schneider A., Hommel G. and Blettner M. (2010)., Linear Regression Analysis., Deutsches Ärzteblatt International, 107(44), 776-782.
- Almeida A.M.D., Castel-Branco M.M. and Falcao A.C. (2002)., Linear regression for calibration lines revisited: weighting schemes for bioanalytical methods., Journal of Chromatography B, 774(2), 215-222. doi:10.1016/S1570-0232(02)00244-1
- Kutner M.H., Nachtsheim C.J., Neter J. and Li W. (1996)., Applied Linear Statistical Models., Journal Of The Royal Statistical Society Series A General. doi:10.2307/2984653
- Zou K.H., Tuncali K. and Silverman S.G. (2003)., Correlation and simple linear regression., Radiology, 227(3), 617-628. doi:10.1148/radiol.2273011499
- Meng X., Bradley J., Yavuz B., Sparks E., Venkataraman S., Liu D. and Xin D. (2016)., Mllib: Machine learning in apache spark., The Journal of Machine Learning Research, 17(1), 1-7.
- Kleinbaum D.G. and Klein M. (2010)., Logistic regression: a self-learning text., Springer Science & Business Media. doi:10.1007/978-1-4419-1742-3
- Lemeshow S. and Hosmer D.W. (1982)., A review of goodness of fit statistics for use in the development of logistic regression models., Am. J. Epidemiol., 115(1), 92-106. doi:10.1093/oxfordjournals.aje.a113284
- Gortmaker S.L., Hosmer D.W. and Lemeshow S. (1994)., Applied Logistic Regression., Contemp. Sociol. doi:10.2307/2074954
- Tripepi G., Jager K.J., Dekker F.W. and Zoccali C. (2008)., Linear and logistic regression analysis., Kidney international, 73(7), 806-810. doi:10.1038/sj.ki.5002787
- Landwehr N., Hall M. and Frank E. (2005)., Logistic model trees., Machine learning, 59(1-2), 161-205. doi:10.1007/s10994-005-0466-3
- Menard S. (2002)., Applied logistic regression analysis., Sage, 106. doi:10.4135/9781412983433
- Peduzzi P., Concato J., Kemper E., Holford T.R. and Feinstein A.R. (1996)., A simulation study of the number of events per variable in logistic regression analysis., Journal of clinical epidemiology, 49(12), 1373-1379. doi:10.1016/S0895-4356(96)00236-3
- LaValley M.P. (2008)., Logistic regression., Circulation, 117(18), 2395-2399. doi:10.1161/CIRCULATIONAHA.106.682658
- Pampel F.C. (2000)., Logistic Regression: A Primer., Sage Quant. Appl. Soc. Sci., 132. doi:10.1177/007542429602400208
- Peng C.Y.J., Lee K.L. and Ingersoll G.M. (2002)., An introduction to logistic regression analysis and reporting., The journal of educational research, 96(1), 3-14. doi:10.1080/00220670209598786
- Chernick M.R. and Friis R.H. (2003)., Introductory biostatistics for the health sciences: modern applications including bootstrap., John Wiley & Sons.
- Murphy K.P. (2006)., Naive Bayes classifiers., Technical note. doi:10.1007/978-3-540-74958-5_35
- Rish I. (2001)., An empirical study of the naive Bayes classifier., In IJCAI 2001 workshop on empirical methods in artificial intelligence, New York, 3(22), 41-46. IBM.doi:10.1039/b104835j
- Kohavi R. (1996)., Scaling Up the Accuracy of Naive-Bayes Classifiers: A Decision-Tree Hybrid., Proc. Second Int. Conf. Knowl. Discov. Data Min.
- Taneja S., Gupta C., Goyal K. and Gureja D. (2014)., An Enhanced K-Nearest Neighbor Algorithm Using Information Gain and Clustering., 2014 Fourth Int. Conf. Adv. Comput. Commun. Technol., 325-329. doi:10.1109/ACCT.2014.22
- Keller J.M., Gray M.R. and Givens J.A. (1985)., A fuzzy k-nearest neighbor algorithm., IEEE transactions on systems, man, and cybernetics, 15(4), 580-585. doi:10.1109/TSMC.1985.6313426
- Dudani S.A. (1976)., The distance-weighted k-nearest-neighbor rule., IEEE Transactions on Systems, Man, and Cybernetics, 4, 325-327. doi:10.1109/TSMC.1976.5408784
- Yu X., Pu K.Q. and Koudas N. (2005)., Monitoring k-nearest neighbor queries over moving objects., in Proceedings - International Conference on Data Engineering, 631-642. doi:10.1109/ICDE.2005.92
- Sharma R., Kumar S. and Maheshwari R. (2015)., Comparative Analysis of Classification Techniques in Data Mining Using Different Datasets., Int. J. Comput. Sci. Mob. Comput., 4(12), 125-134.
- Guo G., Wang H., Bell D., Bi Y. and Greer K. (2003)., KNN model-based approach in classification., In OTM Confederated International Conferences "On the Move to Meaningful Internet Systems". Springer, Berlin, Heidelberg, 986-996. doi:10.1007/b94348
- Lafferty J., McCallum A. and Pereira F.C.N. (2001)., Conditional random fields: Probabilistic models for segmenting and labeling sequence data., ICML '01 Proc. Eighteenth Int. Conf. Mach. Learn. doi:10.1038/nprot.2006.61
- Palmer D.S., O'Boyle N.M., Glen R.C. and Mitchell J.B.O. (2007)., Random forest models to predict aqueous solubility., Journal of chemical information and modeling, 47(1), 150-158. doi:10.1021/ci060164k
- LaValle S.M. and Kuffner J.J. (2000)., Rapidly-exploring random trees: Progress and prospects., 4th Work. Algorithmic Comput. Robot. New Dir. doi:10.1017/CBO9781107415324.004
- Svetnik V., Liaw A., Tong C., Culberson J.C., Sheridan R. P. and Feuston B.P. (2003)., Random forest: a classification and regression tool for compound classification and QSAR modeling., Journal of chemical information and computer sciences, 43(6), 1947-1958. doi:10.1021/ci034160g
- Breiman L. (2001)., Random forests., Machine learning, 45(1), 5-32.
- Ho T.K. (1995)., Random decision forests., In Document analysis and recognition, proceedings of the third international conference, IEEE, 1, 278-282.
- Breiman L. (1999)., Random Forests., Mach. Learn. doi:10.1023/A:1010933404324
- Pal M. (2005)., Random forest classifier for remote sensing classification., International Journal of Remote Sensing, 26(1), 217-222. doi:10.1080/01431160412331269698
- Horrace W.C. and Oaxaca R.L. (2006)., Results on the bias and inconsistency of ordinary least squares for the linear probability model., Economics Letters, 90(3), 321-327. doi:10.1016/j.econlet.2005.08.024
- de Souza S.V. and Junqueira R.G. (2005)., A procedure to assess linearity by ordinary least squares method., Analytica Chimica Acta, 552(1-2), 25-35. doi:10.1016/j.aca.2005.07.043
- Studenmund A.H. (2005)., Ordinary Least Squares., Using Econometrics. A Practical Guide.
- Leng L., Zhang T., Kleinman L. and Zhu W. (2007)., Ordinary least square regression, orthogonal regression, geometric mean regression and their applications in aerosol science., Journal of Physics: Conference Series, 78(1), 1-5. 012084. IOP Publishing. doi:10.1088/1742-6596/78/1/012084
- Kilmer J.T. and Rodriguez R.L. (2017)., Ordinary least squares regression is indicated for studies of allometry., Journal of evolutionary biology, 30(1), 4-12. doi:10.1111/jeb.12986
- Dhillon P.S., Foster D.P., Kakade S.M. and Ungar L.H. (2013)., A risk comparison of ordinary least squares vs ridge regression., The Journal of Machine Learning Research, 14(1), 1505-1511.
- McCool M., Robison A. and Reinders J. (2012)., Structured parallel programming: patterns for efficient computation., Elsevier. doi:10.1016/B978-0-12-415993-8.00013-X
- Bron C. (1972)., Merge sort algorithm [M1]., Communications of the ACM, 15(5), 357-358. doi:10.1145/355602.361317
- Cormen T.H., Leiserson C.E., Rivest R.L. and Stein C. (2001)., Introduction to algorithms second edition., Computer. doi:10.2307/2583667
- Cole R. (1986)., Parallel Merge Sort., Annual Symposium on Foundations of Computer Science (Proceedings). doi:10.1109/SFCS.1986.41
- Byvatov E., Fechner U., Sadowski J. and Schneider G. (2003)., Comparison of support vector machine and artificial neural network systems for drug/nondrug classification., Journal of chemical information and computer sciences, 43(6), 1882-1889. doi:10.1021/ci0341161
- Tao X., Renmu H., Peng W. and Dongjie X. (2004)., Input dimension reduction for load forecasting based on support vector machines., Proceedings of the 2004 IEEE International Conference on Electric Utility Deregulation, Restructuring and Power Technologies (DRPT2004), 2, 510-514.
- Yeh C.Y., Huang C.W. and Lee S.J. (2011)., A multiple-kernel support vector regression approach for stock market price forecasting., Expert Systems with Applications, 38(3), 2177-2186. doi:10.1016/j.eswa.2010.08.004
- Brereton R.G. and Lloyd G.R. (2010)., Support vector machines for classification and regression., Analyst, 135(2), 230-267. doi:10.1039/B918972F
- Berwick R. (2003)., An Idiot's guide to Support vector machines (SVMs)., Retrieved on October, 21, 2011.
- Ekambaram R., Fefilatyev S., Shreve M., Kramer K., Hall L.O., Goldgof D.B. and Kasturi R. (2016)., Active cleaning of label noise., Pattern Recognition, 51, 463-480. doi:10.1016/j.patcog.2015.09.020
- Rai P. (2011)., Hyperplane-based Classification: Perceptron and (Intro to) Support Vector Machines.
- Suciu A.I. (2014)., Mathématiques., In Annales de la Faculté des Sciences de Toulouse, 23(2), 417-439. doi:10.5802/afst.1412
- Platt J.C. (1998)., Sequential Minimal Optimization: A Fast Algorithm for Training Support Vector Machines., Advances in kernel methods. doi:10.1.1.43.4376
- Rasmussen C. (2006)., Gaussian processes for machine learning., Int. J. Neural Syst. doi:10.1142/S0129065704001899
- Rodriguez S., Tang X., Lien J.M. and Amato N.M. (2006)., An obstacle-based rapidly-exploring random tree., In Proceedings 2006 IEEE International Conference on Robotics and Automation, 2006. ICRA 2006, 895-900. IEEE. doi:10.1109/ROBOT.2006.1641823
- Zhao Y. and Zhang Y. (2008)., Comparison of decision tree methods for finding active objects., Advances in Space Research, 41(12), 1955-1959. doi:10.1016/j.asr.2007.07.020
- Yu X., Hyyppä J., Vastaranta M., Holopainen M. and Viitala R. (2011)., Predicting individual tree attributes from airborne laser point clouds based on the random forests technique., ISPRS Journal of Photogrammetry and remote sensing, 66(1), 28-37. doi:10.1016/j.isprsjprs.2010.08.003
- Hermens L.A. and Shlimmer J.C. (1994)., A machine-learning apprentice for the completion of repetitive forms., IEEE Expert, 9(1), 28-33. doi:10.1109/64.295135
- Kung J.Y. and Shu M.H. (2015)., Some scheduling problems on a single machine with general job effects of position-dependent learning and start-time-dependent deterioration., Asia-Pacific Journal of Operational Research, 32(02). doi:10.1142/S0217595915500025
- Langley P. and Simon H.A. (1995)., Applications of machine learning and rule induction., Communications of the ACM, 38(11), 54-64.
- Durso F.T. and Manning C.A. (2008)., Air traffic control., Reviews of human factors and ergonomics, 4(1), 195-244. doi:10.1518/155723408X342853
- Djokic J., Lorenz B. and Fricke H. (2010)., Air traffic control complexity as workload driver., Transportation research part C: emerging technologies, 18(6), 930-936. doi:10.1016/j.trc.2010.03.005
- Hanh T.T.B. and Van Hung D. (2007)., Verification of an air-traffic control system with probabilistic real-time model-checking., UNU-IIST report.
- Saba T., Sulong G. and Rehman A. (2011)., Retracted Article: Document image analysis: issues, comparison of methods and remaining problems., Artificial Intelligence Review, 35(2), 101-118. doi:10.1007/s10462-010-9186-6
- Murman E.M., Walton M. and Rebentisch E. (2000)., Challenges in the better, faster, cheaper era of aeronautical design, engineering and manufacturing., The Aeronautical Journal, 104(1040), 481-489. doi:10.1017/S0001924000091983
- Morrow D. (1997)., Improving consultations between health-care professionals and older clients: Implications for pharmacists., The International Journal of Aging and Human Development, 44(1), 47-72. doi:10.2190/GQX9-F4UJ-5RQ2-N1YD
- Parcell G. and Collison C. ( )., Learning to Fly Summary., 1-9.
- Sengupta N., Sahidullah M. and Saha G. (2016)., Lung sound classification using cepstral-based statistical features., Computers in biology and medicine, 75, 118-129. doi:10.1016/j.compbiomed.2016.05.013
- Ream N. (1977)., Discrete-time signal processing., Electronics and Power, 23(2), 157. doi:10.1049/ep.1977.0078
- Al-Jumaily A.H.J., Sali A., Mandeep J.S. and Ismail A. (2015)., Propagation measurement on earth-sky signal effects for high speed train satellite channel in tropical region at Ku-band., Int. J. Antennas Propag. doi:10.1155/2015/270949
- Smith S.W. (1999)., The Scientist and Engineer's Guide to Digital Signal Processing., IEEE Signal Processing Magazine. doi:10.1109/79.826412
- Le Guernic P., Benveniste A., Bournai P. and Gautier T. (1986)., Signal--A data flow-oriented language for signal processing., IEEE transactions on acoustics, speech, and signal processing, 34(2), 362-374. doi:10.1109/TASSP.1986.1164809
- Zhang Q. and Benveniste A. (1992)., Wavelet networks., IEEE transactions on Neural Networks, 3(6), 889-898. doi:10.1109/72.165591
- Movaghar A., Jamzad M. and Asadi H. (2014)., Artificial Intelligence and Signal Processing: International Symposium, AISP 2013, Tehran, Iran, December 25-26, 2013, Revised Selected Papers., Commun. Comput. Inf. Sci. doi:10.1007/978-3-319-10849-0
- Hagan M.T., Demuth H.B. and Beale M.H. (1996)., Neural Network Design., Bost. Massachusetts PWS. doi:10.1007/1-84628-303-5
- Thrun S. (2010)., Toward robotic cars., Communications of the ACM, 53(4), 99-106. doi:10.1145/1721654.1721679
- Lassa T. (2013)., The Beginning of the End of Driving., Mot. Trend.
- Gehrig S.K. and Stein F.J. (1999)., Cartography and dead reckoning using stereo vision for an autonomous car., Proc. 1999 Int. Conf. Image Process. (Cat. 99CH36348), 4, 30-34. doi:10.1109/ICIP.1999.819461
- Luettel T., Himmelsbach M. and Wuensche H.J. (2012)., Autonomous Ground Vehicles-Concepts and a Path to the Future., Proceedings of the IEEE, 100(Centennial-Issue), 1831-1839. doi:10.1109/JPROC.2012.2189803
- Hamid U.Z.A., Pushkin K., Zamzuri H., Gueraiche D. and Rahman M.A.A. (2016)., Current Collision Mitigation Technologies for Advanced Driver Assistance Systems - A Survey., Perintis eJournal, 6(2). doi:10.1105/tpc.15.01050
- Chen C., Seff A., Kornhauser A. and Xiao J. (2015)., DeepDriving: Learning affordance for direct perception in autonomous driving., Proceedings of the IEEE International Conference on Computer Vision, 2722-2730. doi:10.1109/ICCV.2015.312
- LeCun Y.A., Bengio Y. and Hinton G.E. (2015)., Deep learning., Nature, 521, 436-444. doi:10.1038/nature14539.
- Bird S., Klein E. and Loper E. (2009)., Natural Language Processing with Python., analyzing text with the natural language toolkit. doi:10.1097/00004770-200204000-00018
- Collobert R., Weston J., Bottou L., Karlen M., Kavukcuoglu K. and Kuksa P. (2011)., Natural language processing (almost) from scratch., Journal of Machine Learning Research, 12(Aug), 2493-2537. doi:10.1.1.231.4614
- Soon W.M., Ng H.T. and Lim D.C.Y. (2001)., A machine learning approach to coreference resolution of noun phrases., Computational linguistics, 27(4), 521-544. doi:10.1162/089120101753342653
- Farghaly A. and Shaalan K. (2009)., Arabic natural language processing: Challenges and solutions., ACM Trans. Asian Lang. Inf. Process., 8(4), 14. doi:10.1145/1644879.1644881
- Westerdahl S. (2003)., Organising and managing work. Organisational, managerial and strategic behaviour in theory and practice., Scand. J. Manag., 3(19), 392-394. doi:10.1016/S0956-5221(03)00030-7
- Rokach L. and Maimon O. (2010)., Decision Trees., in Data Mining and Knowledge Discovery Handbook, 154-192. doi:10.1007/0-387-25465-X_9
- Kuncheva L.I. (2004)., Classifier ensembles for changing environments., Lect. Notes Comput. Sci. (including Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinformatics). doi:10.1007/978-3-540-25966-4_1.
- Usmani M., Adil S.H., Raza K. and Ali S.S.A. (2016)., Stock market prediction using machine learning techniques., In Computer and Information Sciences (ICCOINS), 2016 3rd International Conference, IEEE, 322-327. doi:10.1109/ICCOINS.2016.7783235
- Coupelon O. (2007)., Neural network modeling for stock movement prediction A state of the art., Network.
- Yoo P.D., Kim M.H. and Jan T. (2005)., Machine Learning Techniques and Use of Event Information for Stock Market Prediction: A Survey and Evaluation., Int. Conf. Comput. Intell. Model. Control Autom. Int. Conf. Intell. Agents, Web Technol. Internet Commer, 2, 835-841. doi:10.1109/CIMCA.2005.1631572
- Glahn H.R. and Lowry D.A. (1972)., The Use of Model Output Statistics (MOS) in Objective Weather Forecasting., J. Appl. Meteorol, 11(8), 1203-1211. doi:10.1175/1520-0450(1972)011<1203:TUOMOS>2.0.CO;2
- Holtslag A.A.M., De Bruijn E.I.F. and Pan H.L. (1990)., A High Resolution Air Mass Transformation Model for Short-Range Weather Forecasting., Mon. Weather Rev., 118(8), 1561-1575. doi:10.1175/1520-0493(1990)118<1561:AHRAMT>2.0.CO;2
- Nash J.E. and Sutcliffe J.V. (1970)., River flow forecasting through conceptual models part I - A discussion of principles., J. Hydrol., 10(3), 282-290. doi:10.1016/0022-1694(70)90255-6
- Gneiting T. and Raftery A.E. (2005)., Weather forecasting with ensemble methods., Science, 310(5746), 248-249. doi:10.1126/science.1115255
- Hapgood M. (2012)., Astrophysics: Prepare for the coming space weather storm., Nature, 484(7394), 311-313. doi:10.1038/484311a.
- Lee K., Cha Y. and Park J. (1992)., Short-term load forecasting using an artificial neural network., Power Syst. IEEE Trans., 7(1), 124-132. doi:10.1109/59.141695.
- Jolliffe I.T. (1982)., A Note on the Use of Principal Components in Regression., J.R. Stat. Soc. Ser. C (Applied Stat., 31(3), 300-303. doi:10.2307/2348005
- Holmstrom M., Liu D. and Vo C. (2016)., Machine Learning Applied to Weather Forecasting., Meteorol. Appl, 1-5.
- Salman A.G., Kanigoro B. and Heryadi Y. (2015)., Weather forecasting using deep learning techniques., in 2015 International Conference on Advanced Computer Science and Information Systems (ICACSIS), 281-285. doi:10.1109/ICACSIS.2015.7415154
- Li J., Ward J.K., Tong J., Collins L. and Platt G. (2016)., Machine learning for solar irradiance forecasting of photovoltaic system., Renew. Energy, 90, 542-553. doi:10.1016/j.renene.2015.12.069
- Ji H.F., Li X.J. and Zhang H.Y. (2009)., Natural products and drug discovery: Can thousands of years of ancient medical knowledge lead us to new and powerful drug combinations in the fight against cancer and dementia?., EMBO Rep., 10(3), 194-200. doi:10.1038/embor.2009.12.
- Pan S.Y., Zhou S.F., Gao S.H., Yu Z.L., Zhang S.F., Tang M.K. and Ko K.M. (2013)., New perspectives on how to discover drugs from herbal medicines: CAM'S outstanding contribution to modern therapeutics., Evidence-based Complement. Altern. Med. doi:10.1155/2013/627375
- Mitchell J.B.O. (2014)., Machine learning methods in chemoinformatics., Wiley Interdiscip. Rev. Comput. Mol. Sci., 4(5), 468-481. doi:10.1002/wcms.1183.
- Patwardhan B. (2005)., Ethnopharmacology and drug discovery., Journal of Ethnopharmacology, 100(1-2), 50-52. doi:10.1016/j.jep.2005.06.006
- Lee J.A., Uhlik M.T., Moxham C.M., Tomandl D. and Sall D.J. (2012)., Modern phenotypic drug discovery is a viable, neoclassic pharma strategy., Journal of Medicinal Chemistry, 55(10), 4527-4538. doi:10.1021/jm201649s
- Murphy R.F. (2011)., An active role for machine learning in drug development., Nature Chemical Biology, 7, 327-330. doi:10.1038/nchembio.576
- Bernick J.P. (2015)., The Role of Machine Learning in Drug Design and Delivery., J. Dev. Drugs, 4(3), 1-2.
- Hopkins A.L. (2009)., Drug discovery: Predicting promiscuity., Nature, 462, 167-168.
- Wale N. (2011)., Machine learning in drug discovery and development., Drug Dev. Res., 72, 112-119.
- Bowen A. and Casadevall A. (2015)., Increasing disparities between resource inputs and outcomes, as measured by certain health deliverables, in biomedical research., Proc. Natl. Acad. Sci., 112(36), 11335-11340. doi:10.1073/pnas.1504955112.
- Williams C.K.I. (2003)., Learning With Kernels: Support Vector Machines, Regularization, Optimization, and Beyond., J. Am. Stat. Assoc., 98, 489-489. doi:10.1198/jasa.2003.s269
- Lavecchia A. (2015)., Machine-learning approaches in drug discovery: Methods and applications., Drug Discovery Today, 20(3), 318-331. doi:10.1016/j.drudis.2014.10.012.
- American Cancer Society (2016)., Cancer Facts and Figures 2016.
- World Health Organisation (2018)., Cancer., http://www.who.int/en/news-room/fact-sheets/detail/cancer
- Alberts B. (2008)., Molecular Biology of the Cell., Amino Acids. doi:10.1002/1521-3773(20010316)40:6<9823::AID-ANIE9823>3.3.CO;2-C
- Schneider K. (2001)., Cell Biology and Cancer., Couns. about cancer Strateg. Genet. Couns., 1-17.
- Aktipis C.A. and Nesse R.M. (2013)., Evolutionary foundations for cancer biology., Evolutionary Applications, 6(1), 144-159. doi:10.1111/eva.12034
- Harris R.J.C. (1971)., Cancer and the environment., Int. J. Environ. Stud. doi:10.1080/00207237008709397
- Hanahan D. and Weinberg R.A. (2011)., Hallmarks of cancer: The next generation., Cell, 144(5), 646-674. doi:10.1016/j.cell.2011.02.013
- Bughin J. (2017)., Artificial Intelligence: The Next Digital Frontier?., doi:10.1016/S1353-4858(17)30039-9
- Kourou K., Exarchos T.P., Exarchos K.P., Karamouzis M.V. and Fotiadis D.I. (2015)., Machine learning applications in cancer prognosis and prediction., Computational and Structural Biotechnology Journal, 13, 8-17. doi:10.1016/j.csbj.2014.11.005
- Cruz J.A. and Wishart D.S. (2006)., Applications of machine learning in cancer prediction and prognosis., Cancer Informatics, 2. doi:10.1177/117693510600200030
- Sirintrapun S.J., Zehir A., Syed A., Gao J., Schultz N. and Cheng D.T. (2015)., Translational Bioinformatics and Clinical Research (Biomedical) Informatics., Surgical pathology clinics, 8(2), 269-288. doi:10.1016/j.path.2015.02.015
- Ataka K. and Heberle J. (2007)., Biochemical applications of surface-enhanced infrared absorption spectroscopy., Anal. Bioanal. Chem., 388(1), 47-54. doi:10.1007/s00216-006-1071-4
- Hume D. (2000)., A Treatise of Human Nature., A Treatise Hum. Nat. doi:10.2307/2216614
- Binda C., Mattevi A. and Edmondson D.E. (2002)., Structure-function relationships in flavoenzyme-dependent amine oxidations: A comparison of polyamine oxidase and monoamine oxidase., Journal of Biological Chemistry. doi:10.1074/jbc.R200005200
- Doi E. (1993)., Gels and gelling of globular proteins., Trends in Food Science and Technology, 4(1), 1-5. doi:10.1016/S0924-2244(05)80003-2
- Fraley S.I., Feng Y., Krishnamurthy R., Kim D.H., Celedon A., Longmore G.D. and Wirtz D. (2010)., A distinctive role for focal adhesion proteins in three-dimensional cell motility., Nature cell biology, 12(6), 598-604. doi:10.1038/ncb2062
- Sippl M.J. (1993)., Recognition of errors in three-dimensional structures of proteins., Proteins: Structure, Function, and Bioinformatics, 17(4), 355-362. doi:10.1002/prot.340170404
- Bowie J.U., Lüthy R. and Eisenberg D. (1991)., A method to identify protein sequences that fold into a known three-dimensional structure., Science, 253(5016), 164-170. doi:10.1126/science.1853201
- Lazaridis T. and Karplus M. (2000)., Effective energy functions for protein structure prediction., Current opinion in structural biology, 10(2), 139-145. doi:10.1016/S0959-440X(00)00063-4.
- Rost B. and Sander C. (1993)., Prediction of protein secondary structure at better than 70% accuracy., Journal of molecular biology, 232(2), 584-599. doi:10.1006/jmbi.1993.1413.
- Zhang Y. (2008)., Progress and challenges in protein structure prediction., Current Opinion in Structural Biology, 18(3), 342-348. doi:10.1016/j.sbi.2008.02.004.
- Ginsberg M.L. (1988)., Multivalued logics: a uniform approach to reasoning in artificial intelligence., Comput. Intell., 4(3), 265-316. doi:10.1111/j.1467-8640.1988.tb00280.x
- Porollo A. and Meller J. (2007)., Prediction‐based fingerprints of protein-protein interactions., Proteins: Structure, Function, and Bioinformatics, 66(3), 630-645. doi:10.1002/prot.21248.
- Pirovano W. and Heringa J. (2010)., Protein secondary structure prediction., Methods in molecular biology (Clifton, N.J.), 327-348. doi:10.1007/978-1-60327-241-4_19.
- Koehl P. and Levitt M. (1999)., A brighter future for protein structure prediction., Nature Structural Biology, 6, 108-111. doi:10.1038/5794.