1 Introduction

Statistics has achieved a position of status in the Pre-K-12 curriculum in the United States and around the world (Australian Curriculum Assessment and Reporting Authority 2010; National Governors Association Center for Best Practices & Council of Chief State School Officers 2010; Conference Board of the Mathematical Sciences 2010; Franklin et al. 2007). Secondary mathematics teachers are now responsible for teaching statistics, yet they remain ill-prepared for the job (Batanero et al. 2011; Conference Board of the Mathematical Sciences 2010; Franklin et al. 2015; Madden 2008; Shaughnessy 2007). In contrast to the largely theoretical statistics courses teachers tend to take in mathematics departments, recent recommendations call for authentic, data-intensive exploration and modeling experiences in addition to theory-based coursework (Franklin et al. 2015). Teachers should develop facility with the statistical process (Wild and Pfannkuch 1999); techniques and tools for simulation, computation, and representation; and a generally elevated understanding of the statistical landscape appropriate to meet 21st-century curricular demands. Courses to prepare teachers for these new demands are still rare and largely unexamined (Franklin et al. 2015).

Related to pedagogical content knowledge (PCK) (Shulman 1986) and technological pedagogical content knowledge (TPCK) (Mishra and Koehler 2006), technological pedagogical statistical knowledge (TPSK) (Lee and Hollebrands 2011) addresses the importance of teachers understanding students’ learning and thinking about statistical ideas; conceptions of how technology tools and representations support statistical thinking; instructional strategies for developing statistics lessons with technology; and a critical stance toward the evaluation and use of curricula materials for teaching statistical ideas with technology (Groth 2007). TPSK informs a doer-to-designer approach (Kadijevich and Madden 2015) to teacher learning in which teachers first engage in statistical investigations as learners (doers) and later design, implement, and study the enactment of statistics lessons (designers). Constructionism (Papert 1991) is echoed in the doer-to-designer framework, with its emphasis on engaging teachers as statistical learners en route to supporting them to design, implement, and reflect on statistical learning opportunities with their own students.

With these perspectives as guides, a blended-format course (part face-to-face, part virtual) was developed to support and explore teachers’ evolving TPSK. This study begins to address the dearth of research exploring teachers’ TPSK development in relation to the curriculum as enacted in the classroom (Kadijevich and Madden 2015; Lee and Nickell 2014).

2 Description of the Blended Learning Environment

A three-credit experimental graduate course offered at a US university was designed by the author to facilitate middle and high school teachers’ learning of statistics and modeling in the secondary curriculum. The course was intended to impact teachers’ practices and their students’ opportunities to engage with statistical ideas. Design commitments included: active learning, technology-rich investigations, a community of practice orientation (Wenger 1998), exploration of curriculum materials, and attention to autonomy (Ryan and Deci 2000). The course consisted of five face-to-face (F2F) four-hour sessions plus five virtual modules between F2F meetings (Fig. 12.1). This structure accommodated teachers’ schedules, providing intense statistical and technological learning experiences during their summer break and supporting thoughtful implementation of statistical units of instruction with secondary students when the teachers returned to school in September.

Fig. 12.1 Course format, schedule, and content trajectory

Course content included model-based sampling investigations, experimental design investigations to motivate randomization testing, and other simulation-based statistical tools for supporting statistical argumentation. Face-to-face sessions focused largely on statistical investigations intended to support the statistical process and conceptual development, use of technology for exploring data, small- and whole-group processing of readings and experiences, and general community building (Fig. 12.2).

Fig. 12.2 Process-related design commitments

In addition, participants were invited to read approximately 30 articles, conduct a statistical curriculum analysis, and engage in an action research project in which they designed, implemented, and reflectively analyzed student learning in a statistical unit of study in which technology was utilized. Participants electronically submitted written assignments and discussion posts for each of the ten distinct segments of the course. Appendix A (http://bit.ly/2OAkugq) provides example instructions for one of the virtual modules. Appendix B (http://bit.ly/2OAkugq) provides a description of the curriculum analysis project and the associated scoring rubric. Appendix C (http://bit.ly/2OAkugq) contains the instructions for the curriculum implementation project. Course grades were determined as follows: 40% preparation and participation, 30% curriculum analysis project, and 30% action research project (curriculum implementation).

Design features of the course included: (1) developing facility with Fathom (Finzer 2005), TinkerPlots (Konold and Miller 2005), and CPMP-Tools (Keller 2006) software while conducting statistical investigations, much of this during virtual modules; (2) analyzing secondary-level curriculum materials to support statistical development as well as pedagogical sensibility; (3) designing, implementing, and studying a technologically relevant statistical unit in participants’ own classrooms; and (4) choosing articles to read and statistical content and curricula to investigate from a pool of recommendations. The author provided a library of curriculum materials and literature for these purposes.

3 Methods

Ten secondary teachers (eight mathematics, two science) participated in the study. Four were Teaching Fellows in a National Science Foundation (NSF)-funded Noyce Master Teaching Fellow/Teaching Fellow project, while six were volunteers from schools not associated with the Noyce project. All names are pseudonyms. Each participant completed an initial background and motivation survey as well as a post-course survey (see https://goo.gl/forms/gaBXAPkFOzTBW1XP2). All course assignments, discussions, emails, and associated artifacts were collected for analysis. Survey data were analyzed using descriptive statistics and standard quantitative methods. Document analysis techniques with open and axial coding were used for qualitative data. With a focus on teachers’ development of TPSK, initial codes included: statistical knowledge (SK), technological knowledge (TK), pedagogical knowledge (PK), STK, SPK, TPK, TPSK, tool use, impact of curriculum, impact of activity, impact of reading, impact of discussion, challenges, and miscellaneous. Each data source (e.g., discussion post or written assignment) was analyzed and summarized. Coding categories were further explored for themes across the data. Data were analyzed vertically by type and horizontally by person. A chronological case study analysis was conducted for each participant to capture the evolution of that participant’s learning over the period of the course and to answer the research question: To what extent and in what ways did the blended-format statistics and modeling course experiences impact participants’ TPSK?

A portion of the analysis is reported here. Results coordinate teachers’ self-reported data with data analyzed by the researcher. Changes in teachers’ perceptions of statistical and technological facility are summarized; descriptions showcasing the breadth of curricular investigations and implementation projects are presented; and two specific learning trajectories are provided to illustrate the development of TPSK for project participants.

4 Results

4.1 Analysis of Participants’ Self-reported Pre- and Post-intervention Data

An analysis of participants’ comfort level (1-low, 5-high) with statistical big ideas pre- and post-intervention suggests limited prior statistical knowledge for most participants and significant improvement in a number of areas (Table 12.1). Significant gains in descriptive statistics, experimental design, sampling distributions, overall comfort, and facility with TinkerPlots and Fathom coincide with the goals of the course (see Table 12.2). Understanding of statistical graphs showed improvement, but it was also the area rated most highly on the initial survey, and its gain scores were not significant. Correlation and regression were addressed only briefly at the end of the course, a decision predicated on the fact that many secondary teachers already have some familiarity with regression and correlation through their work teaching algebra. However, several participants elected to explore instructional units addressing correlation and regression during one of the modules in which they could choose from a variety of statistical units. The relatively high standard deviation associated with correlation and regression may reflect a bifurcation of experiences in which some participants benefitted from this independent work while others did not. Statistical inference was the area showing the least change, likely because the course took a more informal approach to inference that participants may not have associated with formal statistical inference. A sketch of one way such paired gains might be tested appears below.

Table 12.1 Participants’ self-reported comfort level with statistical big ideas and tools (1-low, 5-high)
Table 12.2 Ratings for the extent to which course objectives were met (1-low, 5-high), N = 10
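
The chapter reports significance without naming the test used; with N = 10 paired pre/post ratings, a paired t-test or a Wilcoxon signed-rank test is the natural choice. The following Python sketch uses hypothetical ratings, not the study’s data:

```python
# Minimal sketch: testing whether pre/post comfort ratings improved significantly.
# Ratings are hypothetical; the chapter reports summaries, not raw data or the test used.
import numpy as np
from scipy import stats

pre = np.array([2, 3, 1, 2, 3, 2, 4, 2, 3, 2])    # comfort, 1-low to 5-high
post = np.array([4, 4, 3, 3, 4, 4, 4, 3, 5, 3])

gains = post - pre
t, p_t = stats.ttest_rel(post, pre)    # paired t-test on the gain scores
w, p_w = stats.wilcoxon(post, pre)     # nonparametric alternative, safer at N = 10

print(f"mean gain = {gains.mean():.2f}, paired-t p = {p_t:.3f}, Wilcoxon p = {p_w:.3f}")
```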

Participants rated their personal engagement in the course (e.g., course readings, statistical tasks and investigations, discussion posts, curriculum units, TinkerPlots, Fathom, CPMP-Tools). Aggregate ratings (1-low, 4-high) ranged from 2.86 to 3.71 (M = 3.33, SD = 0.31) and were strongly, positively associated with perceived learning gains (Fig. 12.3).

Fig. 12.3 Participants’ self-reported statistical learning (scale 1-5) versus self-reported course engagement (scale 1-4)

Participants rated the extent to which course objectives were met. Mean ratings (1-low to 5-high) were 4.30 or above, with five of seven objectives receiving a median rating of 5 (Table 12.2), suggesting participants believed course objectives were met.

4.2 Curriculum as Lever to Promote TPSK

Curriculum played a major role in the course. Participants were introduced to curriculum frameworks such as GAISE and the Common Core State Standards for Mathematics (CCSSM). Innovative curriculum texts developed with NSF funding, such as Core-Plus Mathematics Project (CPMP) (Hirsch et al. 2015), Interactive Mathematics Program (IMP) (Fendel et al. 2012), and Connected Mathematics Project (CMP) (Lappan et al. 2009), were utilized to develop statistical ideas as well as to introduce participants to innovative instructional materials. These materials allowed for modeling classroom instruction in a manner that privileged investigation, discovery, and argumentation. Simply learning that instructional materials like those used during the course existed encouraged participants to examine them critically. The curriculum analysis project allowed participants to look carefully at the ways in which different curriculum materials have potential to engage students in statistical activity and to address state and national standards. Contrasts with more familiar materials became obvious. As Table 12.3 illustrates, all mathematics participants and one science participant elected to analyze some combination of curriculum materials that included NSF-funded materials. The other science participant selected a broad range of resources for Advanced Placement (AP) Biology (College Board 2015) to examine and critique. Completed projects were posted on Moodle for sharing, and brief presentations were made during a F2F session.

Table 12.3 Descriptions of participants’ curriculum analysis projects

By the end of the course, and as illustrated in Sects. 12.4.3 and 12.4.4, participants developed and demonstrated extensive familiarity with the GAISE Framework and several high-quality instructional resources for supporting students’ statistical learning. They increased their facility with dynamic statistical tools (e.g., TinkerPlots, Fathom, and CPMP-Tools) as they engaged in statistical activity as learners. The requirement to complete a curriculum implementation action research project at the end of the course signaled the expectation that lessons learned would be explicitly tied to classroom practice. Using action research methods, participants designed, implemented, and analyzed student learning from a statistical unit of study. Table 12.4 contains brief descriptions of each participant’s project focus and the technological tool(s) they elected to implement with students.

Table 12.4 Participants’ statistical curriculum implementation projects and associated technological tool

A wide range of statistical content was addressed and explored through the curriculum implementation projects; this breadth reflected the necessity that participants be able to select content appropriate to their particular teaching contexts. Participants briefly presented their projects on the final day of the course. Their unique and improved statistical, technological, and pedagogical knowledge was evidenced through these individual projects and is further described in the next two sections.

4.3 Tracing Learning Trajectories: Examining Two Cases for TPSK

Tracing participants’ learning journeys over the course illuminated a complicated but compelling storyline for each participant. Every participant attempted and completed all aspects of the course; however, the depth to which each aspect was completed varied considerably. Only a small fraction of these data can be presented here, so I illustrate the trajectories of two participants representing distinct patterns of engagement.

4.3.1 The Case of Claire

Claire is a third-year high school mathematics teacher who described her past, largely theoretical, statistical learning experiences in great detail and characterized them as procedurally dominated:

I took a 1.5 credit Prob Stats course on: Sample spaces, events, axioms for probabilities; conditional probabilities and Bayes’ theorem; random variables and their distributions, discrete and continuous; expected values, means and variances; covariance and correlation … Also, I’ve taken a 2 credit Intermediate Probability course on: Continuous random variables, distribution functions, joint density functions, … Chebyshev’s theorem … Most of the class time was spent taking notes in a “fill in the blank” format and then once in a while we had statistical investigations. The professor did not take time to know her students individually and I felt that I didn’t learn much in her class because of this.

She indicated a desire to “learn methods for teaching statistics in a meaningful and engaging manner.” Her pre-course statistics comfort level was 2.33.

Following the June F2F sessions and readings, her reflection, a portion of which appears below, indicated her growing understanding of the use of graphing calculators, TinkerPlots, and the simulation process model for generating empirical sampling distributions:

In Using Graphing Calculator Simulations in Teaching Statistics, Koehler gives a pretty detailed description of how to use the graphing calculators, and I realize that the graphing calculators are much more powerful than even I knew. However, I found that this tool is much more syntactically confusing and I would anticipate that students would have a lot of trouble understanding what was truly happening in situations being modeled. In contrast, Lane-Getaz describes that TinkerPlots really allows students to see the three layers of statistical modeling with a great figure on page 280 of the yearbook. I think I finally have this whole process clear in my mind! Finally, Lane cautions teachers that simulations can sometimes still produce passive learners, so they must be presented with a query-first method of teaching. I really want to remember this idea and try to pose a question of study to my students at the beginning of units and lessons of study.
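
The “three layers” Claire refers to (a modeled population, repeated samples drawn from it, and the resulting collection of sample statistics) can be made concrete in a few lines of code. Below is a minimal sketch of that simulation process model, akin to what TinkerPlots’ sampler automates; the population model and all numbers are illustrative assumptions, not course materials:

```python
# Sketch of the simulation process model behind empirical sampling distributions.
import numpy as np

rng = np.random.default_rng(1)
# Layer 1: the modeled population (an illustrative normal model).
population = rng.normal(loc=50, scale=10, size=100_000)

# Layer 2: repeatedly draw samples and record each sample's statistic.
sample_means = [rng.choice(population, size=25).mean() for _ in range(5_000)]

# Layer 3: the collection of statistics is the empirical sampling distribution.
print(np.mean(sample_means), np.std(sample_means))  # ≈ 50 and ≈ 10/√25 = 2
```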

In July, she assessed her own understanding after reading the Guidelines for Assessment and Instruction in Statistics Education (Franklin et al. 2007) using 1, 2, and 3 for levels A, B, and C:

I think I am probably around level 2.5, if we’re allowing halves. I’ve heard of some level 3 concepts, but do not have a firm grasp on, for example, the data analysis done on pp 67–70. The coolest new thing that I learned about was the Quadrant Count Ratio. I didn’t know there were more than one “correlation coefficient” although in retrospect it makes sense that there isn’t just one. I like that I now could explain how to find this one, whereas I still have no idea how Pearson’s correlation coefficient is calculated.
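
The Quadrant Count Ratio (QCR) Claire found compelling has a definition simple enough to compute by hand: divide the scatterplot into four quadrants with lines through the means of x and y, then compare the quadrant counts. A minimal sketch with made-up data follows; in this version, points lying exactly on a dividing line are counted in neither quadrant:

```python
# Sketch: Quadrant Count Ratio (QCR), the alternative "correlation coefficient"
# discussed in GAISE. QCR = (points in quadrants I and III - points in II and IV) / n,
# with quadrants formed by lines at the means of x and y.
import numpy as np

def qcr(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    dx, dy = x - x.mean(), y - y.mean()
    same = np.sum(dx * dy > 0)       # quadrants I and III: deviations share a sign
    opposite = np.sum(dx * dy < 0)   # quadrants II and IV: deviations differ in sign
    return (same - opposite) / len(x)

x = [1, 2, 3, 4, 5, 6]               # made-up data
y = [2, 1, 4, 5, 4, 7]
print(qcr(x, y), np.corrcoef(x, y)[0, 1])  # QCR vs. Pearson's r for contrast
```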

During her curriculum analysis project, she compared a unit from the Interactive Mathematics Program (IMP) to a unit developing similar content (standard deviation) from her school’s newly adopted Carnegie Learning program. She concluded that IMP provided more cognitively demanding tasks for students but that both texts fared equally well against GAISE recommendations.

In September, following a series of readings and tasks supporting understanding of the randomization test for comparing experimental treatment and control groups, she wrote about her own growth with TinkerPlots and Fathom and compared them with CPMP-Tools:

I think that TP and Fathom allow for a deeper understanding than CPMP-Tools because you are building more of the functionality yourself. You have to work directly with the resampling process, so you understand exactly what is happening and how the means are being calculated. I understand better now how to use formulas in Fathom, and am gaining ability with Fathom. I haven’t used it much before, but this is the second assignment I’ve completed with it. I’m improving at using the sampler in TP.
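
The resampling process Claire describes building in TinkerPlots and Fathom is a randomization test: pool the two groups, repeatedly re-randomize the group labels, and ask how often chance alone reproduces a difference as large as the one observed. A minimal sketch with hypothetical treatment/control data (not data from the course):

```python
# Sketch: randomization test for comparing treatment and control means,
# mirroring the resampling process participants built in TinkerPlots/Fathom.
import numpy as np

rng = np.random.default_rng(7)
treatment = np.array([24, 29, 31, 27, 35, 30, 28])   # hypothetical scores
control = np.array([22, 25, 27, 21, 26, 24, 28])
observed = treatment.mean() - control.mean()

pooled = np.concatenate([treatment, control])
diffs = []
for _ in range(10_000):
    rng.shuffle(pooled)                              # re-randomize group labels
    diffs.append(pooled[:len(treatment)].mean() - pooled[len(treatment):].mean())

# Two-sided p value: how often does chance alone produce a difference this large?
p = np.mean(np.abs(diffs) >= abs(observed))
print(f"observed diff = {observed:.2f}, randomization p = {p:.3f}")
```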

In October, she attributed improved understanding of binomial distributions to her reading selection:

‘Is Central Park Warming?’ This article describes an activity that students can do to find out the probability that the warm temperatures in Central Park happened randomly. They then compare this to the exact mathematical probability calculated from the binomial distribution. This provided some insight to me about what the binomial distribution actually is!
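
The comparison the article describes, a simulated probability checked against the exact binomial calculation, can be sketched as follows. The numbers are illustrative assumptions, not the Central Park data:

```python
# Sketch: estimate a probability by simulation, then check it against the exact
# binomial computation. Illustrative question: the probability of 8 or more
# "warm" years out of 10 if each year is independently warm with p = 0.5.
import numpy as np
from scipy import stats

n, k, p_warm = 10, 8, 0.5
rng = np.random.default_rng(3)

sims = rng.binomial(n, p_warm, size=100_000)       # simulate many 10-year spans
empirical = np.mean(sims >= k)

exact = 1 - stats.binom.cdf(k - 1, n, p_warm)      # exact P(X >= 8) from the model
print(f"simulated = {empirical:.4f}, exact = {exact:.4f}")
```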

For her curriculum implementation project, Claire partnered with a classmate to design and implement a statistical unit in her peer’s class. Together, they developed and reflected on the unit, its implementation, and its impact on student learning. Claire was unable to implement a statistics unit with her own classes due to curricular limitations within the window of the course, so the partner project still allowed her to design and study an implementation for the purposes of the course.

In the final survey, she remarked,

This course has exposed me to literature in the field of statistics education which I can bring to other educators in my school. I understand the flow of statistical learning that should happen in middle and high schools. I think the most important pedagogical idea that I have taken away is that it is more important for students to construct and use their own measures in statistics before learning about and applying conventional measures. I very much feel like I have more resources for the future.

Her comfort level with statistical ideas jumped to 3.83 at the end of the course, and her perceived facility with TinkerPlots and Fathom increased from 3 to 5 and from 1 to 4, respectively. Claire is a case of a teacher from a highly regarded undergraduate institution, with a bachelor’s degree in mathematics and a master’s degree in education, who, prior to taking the course described herein, had a very fragile understanding of statistics and few constructive ways in which to teach it. Throughout the course, she engaged thoroughly in tasks, investigations, and all assignments, and her written record indicates strong growth as a learner and teacher of statistics; that is, her TPSK improved dramatically. She communicated growing sensibilities about statistics as a discipline, about teaching statistics in a learner-centered, technologically oriented manner, and about alignment with professional guidelines for teaching.

4.3.2 The Case of Alexandra

Alexandra is a veteran high school science teacher who wrote, “I had a statistics course in college…many years ago. I have been teaching chi square and standard deviation to AP Biology students as part of the newest version of the course and feel I need more background.” Her overall pre-course statistical comfort level was 1.67. Following the initial F2F sessions and Module 1, Alexandra wrote,

Learning takes time, and good instruction loaded with experiences for students to develop their own understanding takes LOTS of time … I was impressed (overwhelmed?) by the topics listed in the Common Core for the Statistics & Probability strand. To me, even the Grades 6/7/8 expectations seemed very challenging. I thought the detailed descriptions and examples for Levels A/B/C as detailed in the GAISE Report were very helpful. I especially liked how in some cases the same activity or exercise was used at multiple levels, to distinguish the differences in understanding expected.

Following Module 2, she continued to express a sense of excitement, challenge, and pedagogical insight related to her activity:

After using TinkerPlots myself, I don’t need the experts to convince me of how helpful this software tool could be in my classroom. However, extensive time in a computer lab is difficult to schedule in my school, and finding extensive time for any new activity is a challenge! I will explore using TinkerPlots to some degree, but what I found most interesting and potentially useful in this set of readings was the exercise described by delMas and Liu in “Exploring students’ conceptions of the standard deviation.” I can see how I could use the pairs of graphs on page 62 (they call them test items) to help my students understand standard deviation. In the study, students were asked to decide if, for each pair, the second graph would have a higher or lower standard deviation than the first. By predicting, calculating/confirming, and discussing these pairs of graphs 1 or 2 at a time, I believe my students could develop a better understanding of standard deviation.
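
The predict-then-confirm routine Alexandra proposes is easy to emulate with any tool that computes standard deviations. A minimal sketch with made-up pairs of “graphs” (not the delMas and Liu test items):

```python
# Sketch of a delMas & Liu-style exercise: students predict which of a pair of
# distributions has the larger standard deviation, then confirm by computing.
import numpy as np

graph_a = [3, 4, 5, 5, 5, 6, 7]   # made-up values clustered near the center
graph_b = [1, 2, 5, 5, 5, 8, 9]   # same center, more mass far from the mean

for name, data in [("A", graph_a), ("B", graph_b)]:
    print(f"graph {name}: mean = {np.mean(data):.2f}, SD = {np.std(data, ddof=1):.2f}")
# Students predict SD(B) > SD(A), then the computation confirms it.
```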

For her curriculum analysis, Alexandra chose to explore two units from the Core-Plus Mathematics Project (CPMP) (Hirsch et al. 2015), one focusing on standard deviation and the other on the χ² test. Due to the mathematical demands of the χ² unit, she sought out additional AP Biology (College Board 2015) resources to support her learning, which she shared with the other science teacher in the course. The curriculum analysis project allowed her to build her capacity to understand and teach two important statistical ideas to her students.

Following Module 4, she demonstrated her grasp of randomization testing and her facility with TinkerPlots:

I really had to follow the videos closely to do the randomizations initially, and even then I needed additional assistance (Thanks person1 and person2!). But I just corrected a quiz for my AP class…2 versions, means of 13.0 and 13.5. I was able to run a randomization test using TinkerPlots to confirm that the difference in the quiz means has p value of 0.63, so I think I can tell the students that one quiz was not easier than the other! While this (Module 4) was time-consuming, between the exercises from the unit, the videos, the software practice, and the readings, I feel very confident about my understanding of and my potential use of/teaching of these concepts/tests.

Alexandra’s curriculum implementation was exemplary. She presented thoughtful plans to build ideas of standard deviation with her students, used the CPMP unit from her curriculum analysis project, and utilized CPMP-Tools with her students. She videotaped her classroom, collected student artifacts, and reflected on the experience with a colleague. Her reported insights showed her vulnerability as well as her strengths as a teacher and champion for students. Alexandra’s project illuminated her growth in statistical knowledge and technological statistical knowledge, her students’ growth in statistical knowledge, and ultimately a markedly improved TPSK. She indicated a disposition toward continuing to grow and learn in this arena.

On the post-course survey, she wrote:

I learned a lot about statistical concepts and tools. I learned a lot about how students learn statistics. This will have a direct impact on my classroom and my students, as I am better prepared to help them understand measures of central tendency, variation, standard deviation, p values, and chi square. I benefited from the exposure to technological tools, but could use a lot more practice to feel truly comfortable using them. I learned about issues, challenges, and successes that other teachers have in teaching statistical content to students. I learned a lot from being a student and working in groups with others in completing some of the exercises. I feel even more strongly that students need to understand the concepts behind the statistical tools (what do they mean?). I have a much better sense of how the tools can be applied to our own data sets. I found the exercises that we completed in class in groups to be excellent learning activities in terms of concepts but also as models of teaching strategies. I enjoyed working through the CPMP Lessons; I really like their approach in introducing concepts gradually and before the equations and/or technological aids. They include pertinent examples and plenty of practice problems. I enjoyed working with TinkerPlots and Fathom and am convinced of their power in illustrating many statistical concepts (randomization, value of large data sets). Great experience. I would not have signed on if it had been offered online only. The face-to-face sessions were particularly beneficial, and I believe the online modules worked better given that we knew who the other students were when posting comments, questions, etc.

Her post-course statistical comfort level was 2.5. This rating seems to confirm her awareness of the complexity of statistical learning but probably underestimates her actual learning. It may well illustrate this teacher’s acknowledgement of learning while also recognizing the need to learn more. She seemed to recognize a state of personal disequilibrium while at the same time developing agency in the statistical teaching and learning realm. Alexandra’s reaction to the blended format of the class suggests a real preference for face-to-face interaction to build community, and it foregrounds potential reasons why hybrid statistics courses with face-to-face and virtual components may support better learning for students than online-only experiences (Meyer and Lovett 2014).

4.4 Summary Perspectives Across Participants

The individual storylines of the other eight participants vary, yet each demonstrated improved TPSK. As Table 12.5 illustrates, eight of the ten participants assessed their statistical knowledge to have increased. One participant’s rating did not change from pre to post; however, from the perspective of the instructor, this participant demonstrated increased statistical knowledge. During his curriculum implementation project, he designed a set of lessons to introduce his students to randomization testing for comparing results from two groups in an experimental context. Randomization testing as a means for comparing treatment and control groups was unfamiliar to all participants prior to the course, so this represents significant growth in statistical knowledge. The participant with a negative gain score demonstrated remarkable engagement with all aspects of the course and a growing facility with statistical ideas and tools; however, she may not yet have felt completely competent. As she mentioned in her curriculum implementation project,

Table 12.5 Summary of participants’ self-reported pre- to post-intervention gain scores in statistical understanding, the self-reported average gain in facility with TinkerPlots and Fathom, and instructor assessment of TPSK demonstrated through curriculum implementation projects

It [the course] benefitted me by giving me an awareness of statistical learning and concepts at the high school level. Some of the concepts we learned about I don’t think I realized were of the statistical realm. It just made me realize that there is so much I don’t understand and I feel like a novice. The course just really gave me an awareness that statistics is different than math and I need to approach it differently with my students (Michelle).

The TSK (technological statistical knowledge) scores in Table 12.5 represent self-reported gain scores with Fathom and TinkerPlots. As the data show, every participant increased their TSK with at least one technology. The two participants whose gains were zero or negative had rated their facility highly on the initial survey and likely discovered there was much more to learn than they had realized. As indicated in Table 12.1, gain scores for both Fathom and TinkerPlots were significantly greater than zero.

Finally, each participant’s completed curriculum implementation project provided evidence of TPSK growth. Three levels of TPSK were evident in the projects. At the lowest level (✓), projects fell into one of three categories: (1) largely algebraic rather than statistical reasoning, though technology was used productively; (2) reliance on previously familiar technology and content, but with more student-centered activity; or (3) heavy reliance on a partner to do the technological or statistical heavy lifting. At the (+) and (++) levels, participants’ projects showed greater evidence of stretching toward engaging learners with less familiar content using tools and materials that were initially unfamiliar. Projects rated (++) were exceptional, representing thoughtful, thoroughly documented, and carefully analyzed products. Two of the three projects in this category were from science teachers.

Document analyses further supported the following claims: (1) science teachers in this environment appeared unusually receptive to learning statistics and adapting their learning to their practice; (2) the teachers with the highest self-reported statistical comfort levels tended to be those with significant statistics teaching experience and were the least receptive to new ideas; (3) modeling using resampling ideas such as randomization testing in technologically conducive environments is accessible and beneficial; (4) analyzing curriculum materials using GAISE (Franklin et al. 2007), National Council of Teachers of Mathematics (NCTM 2000, 2009), Next Generation Science Standards (National Research Council 2013), and Advanced Placement Biology (College Board 2015) guidelines is worthwhile for teachers; and (5) pushing teachers to design, implement, and reflect on students’ statistical learning is formidable yet impactful.

5 Discussion

Creating experiences with the potential to directly impact participants’ capacity to design, implement, and reflect on statistical units in their classrooms is a complicated matter. Finding ways to support and nurture while maintaining high expectations in a virtual environment is daunting; it requires individualization and a personal touch that are feasible when N = 10. Sequencing topics; amassing appropriate curricular units and readings for nourishment and exploration; and building and sustaining productive F2F and virtual communities of practice with teachers representing urban, rural, suburban, middle school, and high school mathematics and science contexts is a complex endeavor and requires a well-stocked arsenal of resources.

Teachers experienced shared activities during F2F sessions that challenged them to make sense of statistical concepts, with and without technology, and that provided pedagogical modeling to consider. These sessions developed a sense of community and fostered relationships that promoted productive virtual collaboration. Because the virtual modules and curriculum projects allowed participants to “choose their own adventure,” they could target the concepts and resources most relevant to their work or interests. This autonomy appeared welcome and novel to the teachers.

The ten teachers completed eight statistical curriculum implementation projects requiring them to reflect on their students’ learning: six worked independently and four worked in pairs. Every project incorporated dynamic statistical technology, some incorporating multiple tools. Each project demonstrated student learning through collected artifacts, including classroom video and student work samples. Given the written documentation of the plans, the descriptions of the implementations, and the reflections on each unit with at least one peer, it is clear that all of these teachers extended their TPSK. Their enactments were informed by literature and course experiences. They often referred to Core-Plus Mathematics (Hirsch et al. 2015) units and the GAISE (Franklin et al. 2007) document for guidance, and they courageously went live with real students, teaching new and challenging content while utilizing, and helping their students use, new tools. Evidence in the form of self-assessments, instructor assessment, and participants’ written artifacts suggests that the doer-to-designer-inspired blended course design with a summer/fall timeline was impactful for teachers’ personal learning of statistics and modeling relevant to the secondary curriculum, thus improving their TPSK. Furthermore, there is mounting evidence that teachers’ thinking about statistical instruction has evolved toward a more sense-making, activity-based, technology-oriented perspective, suggesting the approach is promising.