Abstract
In this chapter, we review the current state of play at the intersection of learning analytics (LA), specifically data mining and exploratory analytics, and the field of measurement science. We review the logic of measurement science, as instantiated through the BEAR Assessment System (BAS), and illustrate it in the context of an LA example. The example shows how complex digital assessments can be designed through the BAS with attention to measurement science, while LA approaches can help to score some of the complex digital artifacts embedded in the design. With that background, we suggest ways in which the two approaches support and complement one another, leading to a larger perspective. The chapter concludes with a discussion of the implications of this emerging intersection and a survey of possible next steps.
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this entry
Wilson, M., Scalise, K. (2016). Learning Analytics: Negotiating the Intersection of Measurement Technology and Information Technology. In: Spector, M., Lockee, B., Childress, M. (eds) Learning, Design, and Technology. Springer, Cham. https://doi.org/10.1007/978-3-319-17727-4_44-1
Online ISBN: 978-3-319-17727-4