centerforassessment / ncme_2022_training_session

Data Validation and Analysis in the Era of COVID-19

Home Page: https://centerforassessment.github.io/NCME_2022_Training_Session/

License: Other

academic-impact covid-19 learning-loss rstats sgp sgp-analyses


NCME 2022 Training Session

Data Validation and Analysis in the Era of COVID-19

In this two half-day NCME training session, participants will be introduced to a suite of R-based analyses that can be used to address the numerous educational assessment data analysis and validation issues that arose due to the COVID-19 pandemic. One consequence of the pandemic's disruption of education has been the cancellation, interruption, and modification of the educational assessment of students. For example, in spring 2020, just after the pandemic began in the United States, all state summative testing was cancelled after the United States Department of Education issued assessment waivers to states. Similarly, as student education took place remotely, interim assessment providers altered their products to allow students to take tests at home. These and other alterations to standard testing protocols present unique challenges to psychometricians and data analysts who validate and use these data.

Several practical issues emerged due to the pandemic that will be focal topics of instruction during this training session:

Academic Impact Analysis

A common use of student assessment data is to determine the academic impact associated with the pandemic. Two complementary ways of investigating academic impact are to look at changes in academic attainment (i.e., status) and changes in academic growth. Because state assessment was disrupted in spring 2020, many states were tasked with analyzing their state testing data across a span of two years, making two-year status and growth comparisons common. Using approaches developed in our work with states, we show how skip-year status and growth comparisons can be conducted in order to investigate the academic impact of the pandemic on students.
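A skip-year status comparison of the kind described above can be sketched in a few lines of base R. This is a minimal illustration with invented data (school names, sample sizes, and proficiency rates are all hypothetical), not the session's actual dataset or code:

```r
# Sketch: skip-year status comparison (hypothetical data). Percent
# proficient by school in 2019 vs. 2021; the difference is a simple
# status-based estimate of academic impact.
set.seed(314)
scores <- data.frame(
  school     = rep(c("A", "B"), each = 200),
  year       = rep(c(2019, 2021), times = 200),
  proficient = rbinom(400, 1, prob = rep(c(0.65, 0.55), times = 200))
)

# Percent proficient by school and year
status <- aggregate(proficient ~ school + year, data = scores, FUN = mean)

# Reshape to one row per school and take the skip-year difference
impact <- reshape(status, idvar = "school", timevar = "year", direction = "wide")
impact$change <- impact$proficient.2021 - impact$proficient.2019
impact
```

The same aggregation could be run within demographic subgroups by adding a grouping variable to the `aggregate()` formula.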

Missing data and changing enrollment

Due to the pandemic, numerous states experienced significant declines in student participation in state assessments. Additionally, due to high student mobility during the pandemic, several states experienced high rates of change in student enrollment. Questions immediately emerged about what impact missing data and changing enrollment would have on the comparability of 2021 results to 2019 results. Comparisons between 2021 and 2019 are an essential part of any investigation into academic impact; if missing data or enrollment changes are substantial, those comparisons are threatened.
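Two descriptive quantities behind this concern are the participation rate (tested out of enrolled) and enrollment stability (share of current students also enrolled in the comparison year). A minimal base-R sketch with invented student IDs and counts:

```r
# Sketch: participation and enrollment-stability checks (hypothetical data).
# "S" IDs are stayers from 2019; "N" IDs are new 2021 enrollees.
enrolled_2019 <- paste0("S", 1:1000)
enrolled_2021 <- c(paste0("S", 201:1000), paste0("N", 1:150))  # 200 left, 150 new
tested_2021   <- sample(enrolled_2021, size = 760)             # ~80% tested

# Share of enrolled 2021 students with a test score
participation <- length(tested_2021) / length(enrolled_2021)

# Share of 2021 students who were also enrolled in 2019
stability <- mean(enrolled_2021 %in% enrolled_2019)

round(c(participation = participation, stability = stability), 3)
```

Low values on either measure signal that 2021-to-2019 comparisons rest on a changed population.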

Non-standardized testing situations

The pandemic forced states and assessment vendors to relax rigid test administration standards in order to collect achievement data. Students receiving instruction from home, for example, took interim assessments from home. And in some states hardest hit by the pandemic, students were administered the state summative assessment at home. Non-standardized testing conditions call into question the comparability and validity of the resulting scores.

Schedule

Overview & Background: April 10, 1:00 to 2:00 pm @dbetebenner

The COVID-19 pandemic caused numerous disruptions to student education and alterations to student testing in the United States. In the first hour we provide attendees with an overview of the training session, followed by an introduction to the analytic approaches that will be investigated, and familiarize attendees with some of the major disruptions (e.g., cancelled testing).

Overview & Background Presentation

Software Preparations: April 10, 2:00 to 2:30 @dbetebenner & @adamvi

Training session participants will need to have R installed, along with several R packages, in order to follow along with the analyses conducted as part of the training session.
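A minimal setup sketch is below. The package names (`SGP`, `SGPdata`, `data.table`) are assumed from the repo's "sgp"/"rstats" topics; check the Software Preparations slides for the definitive list:

```r
# One-time setup: install any of the assumed session packages that are
# not already present (confirm names against the session materials).
pkgs    <- c("SGP", "SGPdata", "data.table")
missing <- pkgs[!pkgs %in% rownames(installed.packages())]
if (length(missing) > 0) install.packages(missing)
```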

Software Preparations Presentation

Break: April 10, 2:30 to 2:45

Academic Impact (Part 1): April 10, 2:45 to 3:45 pm @dbetebenner

Investigating pandemic-related academic impact on student learning. During the third and fourth hours, participants will be introduced to several ways of investigating the academic impact of the pandemic, including skip-year, baseline-referenced growth analyses. Using a toy data set that mimics the 2020 test cancellations, participants will learn to calculate academic impact and use those results to investigate impact by demographic subgroup.

Academic Impact Part 1 Presentation

Descriptive Examination of Missing Data Patterns: April 10, 3:45 to 4:45 pm @ndadey

Descriptive examination of missing data patterns.
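One common descriptive starting point is a cross-tabulation of which score combinations are observed versus missing. A base-R sketch with invented scores and missingness rates (not the session's data):

```r
# Sketch: tabulating missing-data patterns (hypothetical data). Each row is
# a student; NA marks a missing test score.
set.seed(2022)
dat <- data.frame(
  score_2019 = ifelse(runif(500) < 0.10, NA, rnorm(500, 500, 50)),
  score_2021 = ifelse(runif(500) < 0.25, NA, rnorm(500, 490, 50))
)

# Which observed/missing combinations occur, and how often
pattern <- ifelse(is.na(dat), "missing", "observed")
table(`2019` = pattern[, 1], `2021` = pattern[, 2])

colMeans(is.na(dat))  # per-year missingness rates
```

Comparing such tables across schools or subgroups shows whether missingness is concentrated in particular populations.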

Missing Data (Part 1) Presentation

Summary and next steps: April 10, 4:45 to 5:00 pm @dbetebenner

Wrap-up question and answer of day 1 and an overview of what will be discussed during day 2.

Missing Data (Part 2): April 11, 1:00 to 3:00 pm @adamvi

Multiple Imputation with Missing Data: A substantial issue associated with assessment data from 2021 was whether aggregate results (e.g., school-level results) could be compared to previous years, due to missing data (non-tested students) and changing enrollment. As part of our work with states we developed numerous multiple imputation procedures to help understand missing data, as well as propensity score matching procedures to accommodate changing enrollment. Participants will learn about these procedures and use example data to see how missing data can interfere with the inferences one makes from assessment data.
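The core logic of multiple imputation (impute several times, analyze each completed data set, pool with Rubin's rules) can be sketched in base R. This is a deliberately simplified illustration with invented data and naive random-draw imputation; the session's procedures are model-based and considerably more sophisticated:

```r
# Sketch: multiple imputation via random draws from observed scores,
# pooled with Rubin's rules. Illustrative only.
set.seed(42)
scores <- rnorm(1000, mean = 500, sd = 50)
scores[sample(1000, 250)] <- NA                   # 25% non-tested

M <- 20                                           # number of imputations
ests <- vars <- numeric(M)
for (m in seq_len(M)) {
  imp <- scores
  n_miss <- sum(is.na(imp))
  imp[is.na(imp)] <- sample(scores[!is.na(scores)], n_miss, replace = TRUE)
  ests[m] <- mean(imp)                            # estimate from this imputation
  vars[m] <- var(imp) / length(imp)               # within-imputation variance
}

pooled_mean <- mean(ests)
B <- var(ests)                                    # between-imputation variance
total_var <- mean(vars) + (1 + 1 / M) * B         # Rubin's rules
c(pooled_mean = pooled_mean, pooled_se = sqrt(total_var))
```

The between-imputation component `B` is what captures the extra uncertainty that missing data add to aggregate comparisons.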

Multiple Imputation with Missing Data Presentation

Break: April 11, 3:00 to 3:15

Academic Impact (Part 2): April 11, 3:15 to 4:45 pm @dbetebenner

Investigating pandemic-related academic impact on student learning. During the third and fourth hours of day 2, participants will conduct status- and growth-based academic impact analyses, including methods based upon propensity score matching and Andrew Ho's Fair Trend method. Using a toy data set that mimics the 2020 test cancellations, participants will learn to calculate academic impact and use those results to investigate impact by demographic subgroup.
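The propensity-score idea above can be sketched in base R: model the probability of being tested from prior achievement, then match on the estimated score. This is a simplified stand-in with invented data, one-to-one nearest-neighbor matching, and none of the refinements used in the session's actual procedures:

```r
# Sketch: propensity-score matching on prior achievement (hypothetical
# data). Higher-scoring students are more likely to have tested in 2021.
set.seed(7)
n      <- 600
prior  <- rnorm(n, 500, 50)
tested <- rbinom(n, 1, plogis((prior - 500) / 50))

# Estimated propensity to test, given prior achievement
ps <- fitted(glm(tested ~ prior, family = binomial))

treated <- which(tested == 1)
control <- which(tested == 0)

# Match each non-tested student to the nearest tested student on ps
matches <- sapply(control, function(i) treated[which.min(abs(ps[treated] - ps[i]))])

# Balance check: prior-score gap between matched tested and non-tested
mean(prior[matches]) - mean(prior[control])
</```>
```

A small post-matching gap on prior scores suggests the matched tested sample is a reasonable proxy for the non-tested population.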

Academic Impact Part 2 Presentation

Wrap-up/Q&A: April 11, 4:45 to 5:00 pm

Wrap-up question and answer for the training session.

Contributors

adamvi, dbetebenner

