
Empirical Study of Predicting Critical Temperature of a Superconductor Using Regression Techniques

  • Conference paper
Advanced Computational Paradigms and Hybrid Intelligent Computing

Part of the book series: Advances in Intelligent Systems and Computing ((AISC,volume 1373))


Abstract

Measuring the critical temperature of a superconductor experimentally is an arduous process, as the critical temperatures of most superconductors lie at the extreme lower end of the Kelvin scale, and estimating them with high-end instruments and laboratory experiments incurs high costs. In this paper, we employ and compare several regression algorithms that predict the critical temperature of a superconductor from the physical and chemical properties of its constituent materials, which are derived from the superconductor's chemical formula. We compare state-of-the-art algorithms on their prediction accuracy using metrics such as MAE, MSE, RMSE and the R2 score. Random forest and XGBoost give the best results on this task of predicting the superconducting critical temperature.
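The comparison the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic feature matrix stands in for the real features derived from each superconductor's chemical formula, the model set is a small scikit-learn subset (XGBoost is omitted to avoid an extra dependency), and all hyperparameters are illustrative.

```python
# Sketch: fit several regressors and score each with MAE, MSE, RMSE and R^2,
# as in the paper's comparison. Synthetic data replaces the real dataset.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Lasso, LinearRegression, Ridge
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Stand-in for features extracted from chemical formulas (illustrative only).
X, y = make_regression(n_samples=500, n_features=20, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

models = {
    "Linear": LinearRegression(),
    "Ridge": Ridge(alpha=1.0),
    "Lasso": Lasso(alpha=0.1),
    "RandomForest": RandomForestRegressor(n_estimators=100, random_state=0),
}

results = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    mse = mean_squared_error(y_test, pred)
    results[name] = {
        "MAE": mean_absolute_error(y_test, pred),
        "MSE": mse,
        "RMSE": mse ** 0.5,   # RMSE is the square root of MSE
        "R2": r2_score(y_test, pred),
    }

for name, scores in results.items():
    print(name, {k: round(v, 3) for k, v in scores.items()})
```

Each model is scored on a held-out test split, so the four metrics are directly comparable across models, mirroring the evaluation protocol the abstract describes.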




Author information


Correspondence to Anish K. Kulkarni.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Kulkarni, A.K., Puranik, V., Kulkarni, R. (2022). Empirical Study of Predicting Critical Temperature of a Superconductor Using Regression Techniques. In: Gandhi, T.K., Konar, D., Sen, B., Sharma, K. (eds) Advanced Computational Paradigms and Hybrid Intelligent Computing. Advances in Intelligent Systems and Computing, vol 1373. Springer, Singapore. https://doi.org/10.1007/978-981-16-4369-9_39

