Innovation Abstracts

Volume XXXVIII, No. 22 | October 7, 2016

Creating a Culture of Evidence With Course-Level Dashboards

At many community colleges, faculty and administrators have a long list of unmet data needs. To submit a data request to a typically small, overworked Institutional Research (IR) department, they often need to know exactly what they are looking for, and then wait weeks or even months for IR to respond.

Yet we are increasingly asked to help improve student outcomes, especially retention, completion, and closing achievement gaps, while working from data that are scarce or stale. And while IR departments can often provide institution-level retention and completion data, current data at the departmental and classroom levels are often unavailable. How do we increase student success at the department and course levels without knowing how students are currently doing at those levels?

At Pierce College, we found ourselves facing this data deficit. We wanted faculty and administrators to have timely, accessible data on their desktops, and we determined we needed two things to start: a dashboarding tool that was both simple and powerful, and the human capacity to use it. Fortunately, we had access to Tableau's dashboarding tools through the Washington State Board for Community and Technical Colleges, and we had just hired a third person in our IR department with the skill set to build the dashboards. The resulting dashboards allow users to dig into the data and sort them using a wide array of variables, leveraging the collective knowledge of faculty and administrators to create a more data-transparent, data-driven decision-making culture.

In 2010, we began earnest work to increase student retention and completion while closing achievement gaps. In 2012, deciding we weren't moving quickly enough, we joined Achieving the Dream (ATD). As we began working with ATD, we started sharing disaggregated institutional data at summits and department meetings on a regular basis. It quickly became apparent that even though we were sharing our students' success data at an unprecedented level of detail, the data were still at a scale difficult for individuals to act upon. We could see our overall retention rates and the differences among student populations, but we could not see how students were doing in specific fields, such as chemistry or psychology.

An additional insight came when we began a complete redesign of our precollege math sequence. As at most colleges, the majority of our students started in precollege math, and the majority of those students never completed a college-level math class. As we began placing students in higher-level math classes with the necessary support strategies, shortening the precollege sequence, and creating STEM and non-STEM pathways, we determined that we needed to hold a series of focus groups to better understand the student experience.

We asked students who were successful in our existing precollege sequence to tell us what mattered most to their success. The response was surprising and a bit troubling: students felt that the biggest factor in whether they would be successful was which instructor taught the course. With certain instructors, they told us, you would almost certainly succeed; with others, you would not.

Based on this finding, we disaggregated our precollege math data by instructor. What we found confirmed what the students had told us. For some instructors, the share of students successfully completing the course was consistently around 30 percent; for others, it was consistently 95 percent. For students, success in precollege math was a roll of the dice. As we dug deeper into the data, it became clear that this range in success rates by instructor was ubiquitous across the college. Furthermore, neither faculty nor administrators were aware of the range in student success by instructor. Faculty realized that they needed a way to see their own course-level data term to term, as well as the data of their colleagues, and they asked whether that was possible.
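For readers who want to replicate this disaggregation outside a dashboarding tool, the computation itself is simple. Below is a minimal sketch in Python, assuming a hypothetical CSV extract of enrollment records; the file and column names are illustrative, not Pierce College's actual schema, and the 2.0 threshold matches the successful-completion definition used in the dashboards described below.

```python
# Minimal sketch: disaggregating course success rates by instructor.
# Assumes a hypothetical extract "precollege_math_enrollments.csv" with
# one row per student enrollment and columns "instructor" and "grade"
# (Washington CTCs use decimal grades); names are illustrative only.
import pandas as pd

records = pd.read_csv("precollege_math_enrollments.csv")

# Treat a decimal grade of 2.0 or higher as successful completion.
records["successful"] = records["grade"] >= 2.0

# Success rate and enrollment count per instructor, lowest rates first.
by_instructor = (
    records.groupby("instructor")["successful"]
    .agg(success_rate="mean", enrollments="size")
    .sort_values("success_rate")
)
print(by_instructor)
```

A range like the one we observed would show up immediately in such a table: instructors clustered near 0.30 at one end and near 0.95 at the other.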

In responding to this request, the question we faced was not whether everyone at the college should see course-level data, but how to share the data in a way that would spark curiosity and a desire to address student success, especially on the part of faculty. How could we avoid creating a punitive culture of blame, or letting a handful of resistant faculty thwart our efforts? We believed that, faced with the facts of how student success varied across departments, faculty would take on this issue and the responsibility of norming how they graded and interpreted course outcomes. Our focus became how best to get this level of data into the hands of the individuals who would ultimately need to do the work of norming outcomes and grading.

To roll out the dashboards to faculty, staff, and administrators, we decided to borrow a page from E.M. Rogers's 1962 diffusion of innovations theory. The theory describes the adoption of an innovation as heavily dependent on human capital; there must be wide adoption for the innovation to be sustained. Rogers divided those adopting innovations into five groups: innovators, early adopters, early majority, late majority, and laggards. Our idea was not to mandate that faculty, staff, and administrators use the new dashboards, but to first approach the most innovative individuals to be early adopters, or beta testers. As these earliest adopters shared how useful and insightful the dashboards were, more and more faculty, staff, and administrators began asking for access. In this voluntary manner, we quickly reached a tipping point at which a majority of faculty and administrators were using the dashboards. Today, users have live access from any device; nearly all full-time and an increasing number of part-time faculty use the dashboards, as do all administrators and many staff.

Today, we continue to develop new dashboards; there are more than 250 dashboard users, with faculty representing 67 percent of them. The dashboards have been a game changer here at Pierce College. Decisions are made based on evidence, whereas in the past they might have rested on anecdotal information. We no longer wait so long for answers to our data questions that we forget what we asked in the first place! Just as important, our IR staff has been relieved of its duties as data “wait staff” and can engage in more meaningful, in-depth institutional research.

Most important, we are seeing real changes in student outcomes. With norming work completed or underway in nearly every academic department, student retention, course completion, and degree completion are all rising. Faculty now know how students are doing in their classes and in those of their peers, and that, we are finding, is the most important precondition for improvement.

Thomas Broxson, District Dean, Natural Sciences and Mathematics

Join Tom in NISOD’s October 12 webinar, “Empowering Faculty With Course-Level Data to Drive Institutional Change,” as he continues the discussion about providing faculty and administrators with data that promote student success.

For further information, contact the author at Pierce College, 1601 39th Avenue SE, Puyallup, Washington 98374. Email: tbroxson@pierce.ctc.edu

Pierce College’s dashboard library includes the following:

CCSSE Benchmarks: CCSSE standardized benchmark scores (2011 and 2014) for Academic Challenge, Active and Collaborative Learning, Student Effort, Student/Faculty Interactions, and Support for Learners.

Selection criteria include campus, enrollment status, race/ethnicity, gender, first generation, developmental, credits completed, and age.

Capacity and Fill Rate: Class capacity and fill rates by campus and division.

Selection criteria include program/discipline, time of day, and instructor status.

Course Enrollment and Grade Distribution: Enrollment and grade ranges, annual and quarterly averages, and decimal and letter grade (e.g., incomplete, withdrawal) distributions.

Selection criteria include year, quarter, modality, department, course number, placement test, campus/location, instructor status, and instructor name.

Custom output filters include age, gender, race/ethnicity, enrollment status, family status, first generation, Running Start status, veteran benefits status, and placement test status.

FTE and Enrollment Report: FTE targets, comparisons, and transactions with headcount and demographics. Comparisons and transactions by academic year and quarter.

FTE comparison selection criteria include academic year, quarter, funding type (state, contract, self-funded), FTE type (reportable, non-reportable), and site/location.

Custom output filters include administration unit, department, age, gender, race/ethnicity, enrollment status, family status, first generation, Running Start status, veteran benefits status, Pell grant recipient, international, work first, and worker retraining.

Headcount and Demographics: Unduplicated student headcounts with breakouts for total headcount, race/ethnicity, kind of student (e.g., workforce, academic, basic skills), highest-enrolled programs, age, first generation, gender, family status, enrollment status, and veteran benefits status, as well as a mapping/location function.

Selection criteria include year, quarter, modality, department, course number, placement test, census race, campus/location, instructor status, and instructor name.

Quarterly Waitlist: Current-quarter waitlist by campus and division.

Selection criteria include waitlist status, campus, division, program/discipline, course number, and course start time. Running Start status and enrollment status are also available.

Student Achievement Initiative (SAI): SAI point funding (2012-2015) as a percentage share of the system, point comparisons, and trends with benchmark colleges and Washington State CTCs.

Point comparison and trend selection criteria include student achievement point measures, year, and comparison colleges.

Student Degree and Certificate Completion: Academic awards based on five years (2010-present) of ATD (new, degree-seeking students) fall cohort data. Also includes the distribution of all degree and certificate completions by program, broken out by age, race/ethnicity, family status, gender, first generation, enrollment status, Running Start, and Pell status.

Cohort selection criteria include cohort year and completion type (no degree/certificate, associate’s degree, certificate, high school completion, workforce, and general studies).

Degree by program selection criteria include academic year, cohort year, award type, education program code, and degree title.

Student Retention: Five years (2010-present) of ATD (new, degree-seeking students) fall cohort data, broken out by annual and quarterly retention rates, age, race/ethnicity, family status, gender, first generation, Running Start, and Pell status.

Cohort selection criteria include cohort year.

Demographic selection criteria include cohort year, selected demographic, and academic quarter.

Subsequent Course Completion: Tracks how students who complete an initial course perform in subsequent courses, addressing questions about the balance between success rates and maintaining course rigor (see the sketch following this list).

Initial course selection criteria include year, quarter, course, instructor, grade (decimal) quartile, time of day, and modality.

Subsequent course selection criteria include year, quarter, course, campus, time of day, and modality.

Custom output filters include age, gender, race/ethnicity, enrollment status, family status, Pell recipient status, first generation, Running Start status, veteran benefits status, and placement test status.

Successful Course Completion: Number and percentage of students successfully completing their classes (decimal grade of 2.0 or higher).

Selection criteria include year and quarter, modality, time of day, department, course number, item number, placement test status, campus/location, instructor status, and instructor name.
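As referenced in the Subsequent Course Completion entry above, the underlying linkage can be sketched in a few lines. The sketch below assumes hypothetical extracts of initial- and subsequent-course enrollment records; the file and column names are illustrative, not the dashboard's actual data source.

```python
# Minimal sketch: linking initial-course records to subsequent-course
# outcomes, to ask whether high pass rates upstream predict success
# downstream. File and column names are hypothetical.
import pandas as pd

initial = pd.read_csv("initial_course.csv")        # student_id, instructor, grade
subsequent = pd.read_csv("subsequent_course.csv")  # student_id, grade

# Join on student; students who never took the subsequent course are
# excluded by the inner merge.
linked = initial.merge(
    subsequent, on="student_id", suffixes=("_initial", "_subsequent")
)

# Subsequent-course success rate (2.0 or higher) by initial instructor.
linked["passed_subsequent"] = linked["grade_subsequent"] >= 2.0
print(linked.groupby("instructor")["passed_subsequent"].mean().sort_values())
```

If an instructor's high pass rate were achieved by relaxing rigor, it would tend to surface here as a low subsequent-course success rate for that instructor's former students.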
