Abstract
Measuring the critical temperature of a superconductor experimentally is an arduous process, since critical temperatures mostly lie at the extreme lower end of the Kelvin scale, and estimating them with high-end instruments and laboratory experiments incurs high costs. In this paper, we employ and compare several regression algorithms that predict the critical temperature of a superconductor from the physical and chemical properties of its constituent materials, which are derived from the superconductor's chemical formula. We compare state-of-the-art algorithms on prediction accuracy using metrics such as MAE, MSE, RMSE, and the R² score. Random forest and XGBoost provide the best results on this task of predicting the superconducting critical temperature.
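The four evaluation metrics named in the abstract can be sketched in plain Python as follows. This is an illustrative toy, not the paper's actual pipeline; the sample critical temperatures and model predictions below are invented for demonstration only.

```python
import math

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mse(y_true, y_pred):
    """Mean squared error."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error."""
    return math.sqrt(mse(y_true, y_pred))

def r2(y_true, y_pred):
    """R² score: 1 minus the ratio of residual to total sum of squares."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Hypothetical critical temperatures in kelvin and a model's predictions.
y_true = [4.2, 9.3, 39.0, 92.0, 110.0]
y_pred = [5.0, 8.5, 41.0, 88.0, 115.0]

for name, fn in [("MAE", mae), ("MSE", mse), ("RMSE", rmse), ("R2", r2)]:
    print(f"{name}: {fn(y_true, y_pred):.3f}")
```

In practice these quantities are computed by library routines (e.g., scikit-learn's metrics module); the hand-rolled versions here only serve to make the definitions concrete. Note that RMSE is always at least as large as MAE on the same predictions.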
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Kulkarni, A.K., Puranik, V., Kulkarni, R. (2022). Empirical Study of Predicting Critical Temperature of a Superconductor Using Regression Techniques. In: Gandhi, T.K., Konar, D., Sen, B., Sharma, K. (eds) Advanced Computational Paradigms and Hybrid Intelligent Computing. Advances in Intelligent Systems and Computing, vol 1373. Springer, Singapore. https://doi.org/10.1007/978-981-16-4369-9_39
Print ISBN: 978-981-16-4368-2
Online ISBN: 978-981-16-4369-9
eBook Packages: Intelligent Technologies and Robotics