Implementing Outcomes-Based Quality Evaluation for Postsecondary Providers in Indiana

 

Findings & Recommendations

The core findings and recommendations from our research and analysis are included below, organized by section of the EQOS framework. In some cases, we combined sections of the framework (e.g., Learning and Completion; Placement and Earnings) where the issues identified and the corresponding recommendations are consistent or linked across those sections.

Our findings are organized by the following:

  1. Learning & Completion
  2. Return on Investment (ROI)
  3. Survey Data: Student/Alumni Satisfaction
  4. Survey Data: Job Placement, Earnings, and Employer Satisfaction
  5. Additional Issues and Recommendations
Themes of the findings within and across sections:

We have also identified three common underlying themes that run through the majority of these issues, and we have cross-referenced the findings and recommendations by theme within each component of the EQOS framework. The three themes are:

  1. Terminology and Definitions
    Use of common terminology and language is critically important to ensuring "apples to apples" comparisons across providers. Even with the common data definitions developed and provided in this pilot, interpretations varied by provider when applying them to their own data systems, and even by staff within a single provider.

  2. Organizational Capacity
    Organizational capacity differences shape providers' ability to carry out various aspects of this work and the types of challenges they encounter; we will need to design for these capacity differences on the front end to ensure the consistent data gathering needed for high-quality analysis and output.

  3. Culture/Mindset
    Providers vary significantly in how they see themselves and their roles in the student lifecycle, from recruitment through program completion and into employment. Some providers take an active role across this entire continuum, while others hold a much narrower view of their role as ending at certification, with no further engagement with alumni or employers around employment or other downstream outcomes.
 

1. Learning and Completion

 
Terminology and Definitions
  1. Assessment. Providers had varying operating definitions of "assessment." Programs built around a single targeted certification exam treated that exam as the "assessment" for the program, whereas other providers used a more expansive definition that included any summative or formative tests of learning progress during the course or program.
  2. Cohort vs. Class. In educational design, a cohort is a specific structure that emphasizes peer learning, shared experience, and intensive instruction to improve student outcomes. However, many of the providers we spoke with used the term synonymously with "class," without the components that would traditionally define a cohort, and including classes with rolling admissions.
Organizational Capacity
  1. Assessment Literacy. The capacity to design, deliver, and interpret quality assessments varies by provider, resulting in disparities in how well learning outcomes can be measured and impeding comparison of those outcomes across programs and providers.
Culture and Mindset
  1. Open Enrollment & Completion. A concern was raised that providers with open enrollment or similarly inclusive access policies might be disadvantaged when their completion and other outcome measures are compared with those of providers using more restrictive entrance requirements.
  2. Assessment Mindset. The varying definitions of "assessment" noted above are rooted in part in a deeper mindset difference. Providers focused singularly on certification have built their organizations and cultures around optimizing for that one exam per program; it permeates nearly every aspect of the organization. Programs structured as traditional postsecondary courses or as part of a larger degree track were more likely to have a culture of ongoing assessment and iterative learning.

2. Return on Investment

 
Terminology and Definitions
  • The program cost items included in this calculation varied by provider (some used list price), which will affect the quality of the analysis and comparisons.
Organizational Capacity
  • Smaller organizations can generally pull this information together from a single source, whereas larger organizations need to operationalize and streamline gathering this data across internal business units and data sources (e.g., Financial Aid, Career Services), giving smaller organizations an advantage here.
Culture and Mindset
  • Related to the Terminology issue for ROI above, in some organizations the default mindset is to equate program cost with “price charged” rather than cost of delivery. This was relatively straightforward to address through reference to the definitions for Program Cost, but it is important to recognize that measuring program cost is not the default mindset for all organizations, or for all staff within an organization.

3. Survey Data: Student/Alumni Satisfaction

 
Terminology and Definitions
  • Providers appreciated having the EQOS template questions to build their surveys from, and some used the questions verbatim. Other providers, however, preferred personalizing and adapting the questions to their organizations and alumni, which could negatively impact comparability across providers.  
  • For accredited programs, there are required questions that need to be included, and the providers want to send a single survey that blends the EQOS questions with program-specific questions related to accreditation.
Organizational Capacity
  • Ability to gather this data. For smaller organizations, gathering alumni (and employer) data is both new and a huge lift; it is not something they currently have internal capacity for or experience in. 
  • Quality of the data gathered. As with any self-reported data, there is strong potential for inconsistent and/or poor quality in the responses received, and little capacity to address those quality issues within most providers today.
  • The amount of time that passes between program completion and the survey is a huge factor in gathering this information, with the ability to contact alumni decreasing significantly after six months. Many/most providers are not set up to track and stay in contact with alumni beyond a very short window after completion. According to several providers, this is made more challenging by the number of students and alumni who are in transitory housing situations and who lack any consistent online presence or access.
  • Follow-up outreach (emails, texts, calls) to improve response rates is time-consuming, best practices are not clear, and many organizations do not currently have staff allocated for this work.
  • In addition to the challenge of low response rates, many/most of the surveys returned are incomplete. So while average survey response rates may be 20%-30%, response rates for any particular question might be significantly lower.
Culture and Mindset
  • In many organizations, the mindset is that "student satisfaction" is something captured in a brief survey on the last day of the program, related specifically to the experience within the program rather than satisfaction with the downstream employment outcomes the program supported or enabled. Extending the definition of satisfaction to downstream, longitudinal outcomes will therefore be a culture shift for many organizations.

4. Survey Data: Job Placement, Earnings, and Employer Satisfaction

 
Terminology and Definitions
  1. Employers do not define their industry sectors in direct alignment with programs or certifications. For example, large manufacturers may employ HVAC technicians and many other certified roles not explicitly tied to "manufacturing"; tying their feedback to a specific program area can therefore be problematic.
Organizational Capacity
  1. Related to the culture/mindset finding below, providers who have not previously seen their role as including employer outreach rely on students and alumni to identify their employers. As a result, with alumni survey response rates of ~20%-30%, the employer information gathered through these surveys captures only a very limited share of the total employment picture.
    1. Furthermore, in many of those survey responses the employer details sections were left blank (one provider had zero students answer the employer-related questions). Providers in this situation will therefore capture only a fraction of a fraction of the key employment and earnings information through their own survey efforts.
  2. For some providers (e.g., Ivy Tech), many program completers go on to finish four-year degrees at another college or university and move from there into employment. Ivy Tech loses connection with these students during that transition and is therefore cut off from the downstream employment information.
Culture and Mindset
  1. Some providers do not see their role as extending to engagement with alumni and/or employers; their focus runs from recruiting through completion/certification, but not beyond that point.
  2. Notable Exceptions: providers with a strong focus on externships and/or built-in employer partners had this information readily available.

5. Additional Issues Not Specific to a Framework Section or Theme

 

Data Privacy and Usage
  1. Several providers mentioned concerns about data privacy and how student and employer information was being tracked and used. 
  2. Students frequently did not respond to the questions about employers, placement, or wage data. It’s not clear why this was the case, but it may be indicative of concerns about data privacy and how/where this information is to be used. 
  3. One provider specifically mentioned having a large percentage of recent immigrants in their programs, and that there may be trust issues within that population related to sharing employer and other information.
Equity
  1. Although no provider mentioned this and it is not immediately evident in the current data, there is a potential issue with varying response rates across student subgroups: lower response rates risk underrepresenting certain subgroups in the findings and corresponding recommendations/solutions, potentially skewing this work in ways that have equity implications.