Reflective Waves: Evaluating SEISMIC with The Center for Education Design, Evaluation, and Research

By Ashley Atkinson

Edited by Nita Tarchinski

Since 2019, SEISMIC has partnered with the Center for Education Design, Evaluation, and Research (CEDER), a center within the University of Michigan Marsal School of Education that provides expertise on teaching, learning, leadership, and policy. To advance equity and excellence in education, CEDER offers educational design, evaluation, and research services through collaborations with its partners. CEDER has provided SEISMIC with external evaluation throughout the project, giving the collaboration critical feedback from participants that has informed change within SEISMIC.

Vicki Bigelow, CEDER Evaluation Coordinator

Recently, CEDER completed a summative evaluation of SEISMIC (which will soon be available on the SEISMIC website). To celebrate this and learn more about CEDER’s work for SEISMIC as a whole, I reached out to Vicki Bigelow, an Evaluation Coordinator within CEDER and the lead evaluator for SEISMIC. As an evaluation coordinator, Bigelow supports U-M administration, units, and individuals looking to evaluate their education programs and/or research. She plans and conducts evaluations, writes evaluation plans for grant proposals, and shares her evaluations across various platforms. She also hosts workshops on creating logic models and designing programs. When I met with Bigelow, she told me more about CEDER’s work with SEISMIC and how the collaboration has adapted to the feedback it has received.

During the first year of their partnership, CEDER worked with SEISMIC to collect and analyze data on how SEISMIC was functioning in areas such as recruitment, activities, and effectiveness. This type of evaluation is known as process evaluation, which focuses on how a program’s activities and processes are being carried out and whether they support its goals and mission. The evaluation team looked at how money was being spent, the impact of events such as the Speaker Exchange Program and the first Summer Meeting, and structures within SEISMIC like the Collaboration Council and Working Groups. “Qualitative research has three components: interviews, observations, and document analysis. I’d say we’ve done all of those to some degree,” Bigelow says. CEDER then identified links between their findings and appropriate short-term program outcomes, which led SEISMIC to develop themes for Working Groups to focus on.

Because SEISMIC has adapted in response to CEDER’s feedback, the scope of SEISMIC’s evaluation has evolved over time. This is known as a developmental evaluation approach. When done correctly, developmental evaluation can be quite an involved process, Bigelow explains. However, this method allows for continuous adaptation and real-time feedback.

Below is a graphic by CEDER summarizing the focus areas of SEISMIC’s evaluation over time:

When reflecting on SEISMIC’s responses to the feedback it has received, Bigelow clearly recalls her work surrounding the 2021 Summer Meeting. During the event, which was held virtually, participants began giving feedback that made it clear they didn’t feel involved. As a result, SEISMIC changed the meeting agenda to include a discussion about how SEISMIC was helping and/or hurting efforts to reduce collective harm and how setbacks could be addressed. Participants gave feedback anonymously, and CEDER analyzed their responses. Common themes included calls to clarify and organize around a common set of goals and objectives and to place diversity, equity, inclusion, and justice (DEI-J) at the center of SEISMIC’s efforts.

In response to this discussion and data analysis, the SEISMIC Collaboration Council released a letter addressing the feedback members had given and detailing the steps the Collaboration Council would be taking to improve the collaboration as a whole. This included the creation of a SEISMIC Task Force dedicated to reexamining SEISMIC’s structures, procedures, and practices. “Sometimes data analysis is a plan… Other times, it’s in real-time,” Bigelow says. At the 2022 Summer Meeting, members reported that they had noticed positive changes related to DEI-J, but that efforts needed to continue.

Another evaluation effort Bigelow remembers working on is the Weeks of SEISMIC, which gave local members opportunities to become more involved in SEISMIC work and to share their own work with their communities. The Weeks of SEISMIC began in 2022, prompted both by calls for more local campus activity and by institutions reopening after the onset of COVID-19. “It’s something that was a big shift as a result of what they were hearing from [our] analysis,” Bigelow says. From both surveys and interviews, CEDER determined that these events were valuable to participants because they offered relevant sessions, in-person work time, and opportunities to socialize.

Bigelow attributes much of the success of the CEDER and SEISMIC partnership to Project Manager Nita Tarchinski. Having someone visit each of the SEISMIC institutions and build “that level of trust at the site level” was immensely helpful for qualitative data collection. Bigelow also credits Tarchinski with seeing the bigger picture and avoiding pitfalls such as scope creep, in which an evaluation plan gradually takes on more and more focus areas.

Overall, CEDER has analyzed 45 interviews, 679 survey responses, 12 focus groups, and 261 sticky notes. This has led to the creation of 15 reports, 9 institution posters, 7 non-traditional reports (such as snapshots, thematic analyses, and other deliverables designed to provide rapid feedback or retain participants’ voices), 6 slide decks, and 1 packet. The feedback generated from these analyses drove strategic changes within SEISMIC, allowing the collaboration to more effectively promote equity and inclusion in STEM education.

As SEISMIC 1.0 draws to a close and CEDER’s work with the collaboration ends, Bigelow says she will miss working with the project. Still, she’s proud of the work that the partnership accomplished: “This project has been especially good at using their feedback and data to make decisions about what needs to be revised or what they should continue.” The feedback and recommendations CEDER has provided will remain integral as we think about what a future multi-institutional SEISMIC 2.0 collaboration should look like.

 

Ashley Atkinson

Ashley Atkinson is a Program Assistant for SEISMIC Central, ensuring that SEISMIC initiatives have the support they need to run smoothly. Her primary responsibilities include maintaining the SEISMIC website, managing the Newsletter, and supporting projects. An alumna of Michigan State University, Ashley is passionate about equity and inclusion in STEM as well as science communication. She is currently pursuing an MA in Science Writing at Johns Hopkins University.