
Dispatch from the Lone Star State

I just got off the road from a couple of meetings in Texas — one of several states doing some truly innovative things in the college completion movement.  Here’s a summary of what I learned:

Texas Higher Education Coordinating Board. My colleague Elizabeth Gonzalez and I spent an hour meeting with folks from the Coordinating Board. We were joined by two of our partners on the Completion by Design project – Melissa Henderson (Educate Texas) and Amy Welch (Lone Star College). We covered a lot of ground, and I found the conversation really valuable. Specific areas of interest included:

  • Diagnostic assessment. THECB just announced an RFP for a statewide diagnostic assessment that will be more closely aligned with high school curricular standards (known as STAAR). This follows a trend we’ve seen in other states (VA, FL, NC), but – given the size of Texas – it represents the biggest opportunity in terms of impact on large numbers of students. The plan is to select an instrument this summer, pilot it this fall, and move to statewide implementation in fall 2013. THECB staff are very knowledgeable on this topic and demonstrated incredible thoughtfulness in their RFP design. In addition to being diagnostic and aligned with high school learning standards, they have built in plans to recalibrate the assessment on a regular basis, because they know this should be a continuous improvement endeavor rather than a “one-time fix.” Their north star is improving alignment around definitions of “college readiness,” which should help smooth the transition from high school to college. That seems to be the right goal, and – if implemented well – it could create real opportunity for large-scale change.
  • Performance funding. The Coordinating Board is currently considering various models of performance funding (including the WA “momentum funding” model). Though they are unsure what this might look like in TX, it was clear that there is an appetite to rethink how institutions are funded. This is being driven in part by budgetary realities and in part by a desire – on the part of the Coordinating Board and some college-level leaders – to better recognize colleges that do a good job of helping students persist and complete. They are facing a couple of key design questions at the moment (Should the model use base funding or incentive funding? How can the model avoid unintended consequences like “creaming” and grade inflation?), which they will continue to troubleshoot.
  • Vertical alignment teams. I was amazed to learn that THECB and TEA have been collaborating to improve instructional and programmatic coherence across K-20 through faculty-led “vertical alignment teams.” These teams (which are connected in some ways to the Pathways project) are working to make sure that expectations for student success are clearly spelled out at each step of the way. This has been a tricky enough dance at the K-12 level, and I tip my hat to them for bringing the conversation to the higher education sector. At a local level, our Communities Learning in Partnership (CLIP) and Partners for Postsecondary Success (PPS) district and community college partners have begun similar pilot efforts through faculty-led professional learning communities; almost all sites are addressing curriculum alignment for math and English. Furthermore, as our work in Completion by Design has demonstrated, many students do not know what courses they need to take to get the credential they want. Projects like this one are a great starting point for answering some essential questions: (a) what courses are required to complete which programs, and (b) what competencies are inside those courses? (A toy sketch of both questions follows below.)
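
To make those two questions concrete, here is a minimal sketch of the kind of program-to-course-to-competency map a vertical alignment team might build. Every program, course, and competency name below is hypothetical; nothing here comes from THECB, TEA, or our partner colleges.

```python
# Hypothetical program-to-course-to-competency map. All names are
# invented for illustration; none come from the Texas teams.

# (a) Which courses are required to complete which programs?
program_requirements = {
    "AA General Studies": ["ENGL 1301", "MATH 1314", "HIST 1301"],
    "AAS Welding Technology": ["WLDG 1428", "MATH 1314"],
}

# (b) Which competencies sit inside those courses?
course_competencies = {
    "MATH 1314": ["linear equations", "quadratic models", "exponential growth"],
    "ENGL 1301": ["thesis construction", "source evaluation", "revision"],
}

def competencies_for_program(program):
    """Roll course-level competencies up to the program level."""
    return sorted({
        competency
        for course in program_requirements.get(program, [])
        for competency in course_competencies.get(course, [])
    })

print(competencies_for_program("AA General Studies"))
```

Even a toy map like this makes the alignment conversation concrete: once a competency is tied to a program’s required courses, someone can check whether the feeder high school standards actually cover it.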

Elizabeth and Melissa before our meeting with the Coordinating Board

UT Austin – Center for Community College Student Engagement (CCCSE). I also had the good fortune of spending a day with Kay McClenney and her team at CCCSE as part of a Technical Advisory Board. This team is sitting on an unreal amount of data collected through their four survey instruments (two targeted at students, one at faculty, and one at college leaders). The task at hand for the advisory board was to help their team think through the best applications and topics for future inquiry. The conversation was productive, but not without its challenges. The CCCSE team is on the cusp of unearthing what I think are three important pieces of knowledge:

  • A better taxonomy of institutional practice. With their new institutional survey instrument, CCCSE has been able to decode the component parts of some common community college interventions (e.g. learning communities, student success courses, etc.). Practitioners have known for a long time that there is tremendous variation across colleges in the composition of these interventions, but they have never been able to say precisely how these practices vary. Thanks to CCCSE’s work, we’re close to uncovering a more precise taxonomy, which includes things like study skills, learning style assessments, and tutoring. This can help cut across the “fads” and nicknames of community college practices to get to the heart of the matter. Colleges can call these things what they will – CCCSE is interested in learning about the component parts and services that students are receiving. Their team is still struggling with how best to empirically validate the specific components they are asking colleges to identify, which will be a critical step (the first sketch after this list illustrates the idea). To quote one participant: “the hardest part of any factor analysis is naming the factors.”
  • The relationship between institutional practice and student engagement. The institutional survey will give CCCSE additional explanatory power when it comes to understanding what colleges can do to better engage their students. By matching institutional survey responses with student responses about their relative levels of engagement, they will be able to identify the specific institutional practices most closely related to student engagement (and, importantly, those that are not).
  • The relationship between student engagement and student outcomes. Once these two pieces of the puzzle have been solved (no easy task!), CCCSE will be able to take on what their team sees as the biggest opportunity of all: uncovering the relationship between student engagement and actual student outcomes (e.g. course completion rates, term-to-term persistence, etc.). I was surprised to learn that – beyond logical assumptions – we don’t really have empirical evidence of a relationship between engagement and outcomes. It’s not clear yet when this will happen or at which colleges, but their goal is clear: match student survey data to transcript-level student unit record data to better connect the dots between engagement and outcomes (the second sketch after this list shows the basic matching step). This could add huge value for the research community and feed into a tool that gives college leaders feedback.
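
Two quick sketches follow; both use simulated data. The first illustrates the factor-analysis step behind the taxonomy bullet: given college-level yes/no answers about the component parts of an intervention, recover a small number of underlying practice “factors.” The component names are hypothetical stand-ins, not CCCSE’s actual instrument, and the responses are random.

```python
# A toy factor analysis over simulated institutional survey responses.
# Component names are hypothetical stand-ins for the practices CCCSE
# asks colleges about; the 0/1 data is random, not real survey data.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
components = ["study_skills", "learning_style_assessment", "tutoring",
              "advising_contact", "early_alerts"]

# 200 simulated colleges reporting which components their
# "student success course" actually includes.
X = pd.DataFrame(rng.integers(0, 2, size=(200, len(components))),
                 columns=components)

fa = FactorAnalysis(n_components=2, random_state=0).fit(X)
loadings = pd.DataFrame(fa.components_.T, index=components,
                        columns=["factor_1", "factor_2"])
print(loadings)  # naming these factors is, as the participant said, the hard part
```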
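
The second sketch shows the matching step behind the engagement-to-outcomes bullet: join student survey responses to transcript-level unit records on a shared student ID, then take a first look at whether engagement tracks completion and persistence. The same join pattern applies to the institution-level match in the second bullet. All values here are random; real student unit record data sits behind data-sharing agreements, and getting access is the hard part, not the merge.

```python
# Simulated example of matching student survey data to transcript-level
# records. All values are random, not CCCSE or college data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 500
survey = pd.DataFrame({
    "student_id": np.arange(n),
    "engagement_score": rng.normal(50, 10, size=n),
})
transcripts = pd.DataFrame({
    "student_id": np.arange(n),
    "credits_attempted": rng.integers(6, 16, size=n),
    "persisted_next_term": rng.integers(0, 2, size=n),
})
transcripts["credits_completed"] = np.floor(
    transcripts["credits_attempted"] * rng.uniform(0.5, 1.0, size=n)
).astype(int)

matched = survey.merge(transcripts, on="student_id", how="inner")
matched["completion_rate"] = (matched["credits_completed"]
                              / matched["credits_attempted"])

# First-pass look: does engagement track completion or persistence?
# (Random inputs, so these hover near zero; the point is the join.)
print(matched[["engagement_score", "completion_rate"]].corr())
quartile = pd.qcut(matched["engagement_score"], 4, labels=False)
print(matched.groupby(quartile)["persisted_next_term"].mean())
```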