Implementing Outcomes-Based Quality Evaluation for Postsecondary Providers in Indiana
Findings & Recommendations
The core findings and recommendations from our research and analysis are included below, organized by section of the EQOS framework. In some cases it made sense to combine sections of the framework (e.g., Learning and Completion, Placement and Earnings) because the issues identified and corresponding recommendations are consistent or linked across those sections.
Our findings are organized by the following:
- Learning & Completion
- Return on Investment (ROI)
- Survey Data: Student/Alumni Satisfaction
- Survey Data: Job Placement, Earnings, and Employer Satisfaction
- Additional Issues and Recommendations
Themes of the findings within and across sections:
We have also identified three underlying themes common to the majority of the issues, and have cross-referenced the findings and recommendations by these themes within each component of the EQOS framework. The three consistent themes of our findings are:
- Terminology and Definitions. Use of common terminology and language is critically important to ensuring “apples to apples” comparisons across providers. Even with the common data definitions developed and provided in this pilot, interpretations varied by provider when applying them to their own data systems, and even by staff within a provider.
- Organizational Capacity. Differences in organizational capacity shape providers’ ability to do various aspects of this work and the types of challenges they encounter; we will need to design for these capacity differences on the front end to ensure the consistent data gathering needed for high-quality analysis and output.
- Culture/Mindset. Providers vary significantly in how they see themselves and their roles in the student lifecycle from recruitment through program completion and into employment. Some providers take an active role through this entire continuum, while others view their role more narrowly as ending at certification, with little further engagement with alumni or employers downstream.
1. Learning and Completion
Terminology and Definitions
- Assessment. Providers had varying operating definitions of “assessment.” For programs with a single targeted certification exam, that exam was treated as the “assessment” for the program, whereas other providers used a more expansive definition that included any summative or formative tests or assessments of learning progress during the course or program.
- Cohort vs. Class. In educational design, a cohort is a specific structure that emphasizes peer learning, shared experience, and intensive instruction to improve student outcomes. However, many of the providers we spoke with used the term synonymously with “class,” without the components that would traditionally define a cohort, including in classes with rolling admissions.
Organizational Capacity
- Assessment Literacy. The capacity to design, deliver, and interpret quality assessments varies by provider, resulting in disparities in how learning outcomes are measured and impeding comparison of those outcomes across programs and providers.
Culture and Mindset
- Open enrollment & completion. A concern was raised that providers with open enrollment or similarly inclusive access might be negatively impacted when their completion and other outcome measures are compared with those of providers using more restrictive entrance requirements.
- Assessment Mindset. The issue above of varying definitions for “assessment” has its roots in part in a deeper mindset difference. Providers focused singularly on certification have built their organizations and cultures around optimizing for that one exam per program; it permeates nearly every aspect of the organization. Programs structured as traditional postsecondary courses or as part of a larger degree track were more likely to have a culture of ongoing assessment and iterative learning.
Recommendations
- Implement an “accreditation-like” system for assessments, or build on pre-existing industry-driven program accreditation systems, to establish a clear baseline for measuring learning for the purposes of the ETPL. This would clearly define what is meant by “assessment” in this context and establish the threshold for quality implementation, so that providers clearly understand the goal and requirements for measuring learning outcomes as part of the ETPL.
- Provide professional learning and capacity development to support providers in meeting these assessment definitions and expectations.
- Within that system, rather than measuring learning across the diversity of programs with a single assessment model, consider distinguishing between acceptable approaches for measuring learning outcomes based on program or course type. One example would be having differing approaches for tracking assessment in non-degree certification programs vs. degree-track programs or courses. (This could be particularly relevant if the plan is to eventually extend outcomes measurement across all degree and non-degree programs.)
- Provide specific recommendations and guidelines for how to structure “open enrollment” to include pre-screening that helps ensure:
- A target program is aligned with student interests
- Students have the required baseline skills/knowledge for likely completion and success in the program, and/or
- Prerequisite learning needs are identified and students are connected to programs/resources for building those prerequisite skills.
2. Return on Investment
Terminology and Definitions
- The program cost items included in this calculation varied by provider (including some who used list price), which will impact the quality of the analysis and comparisons.
Organizational Capacity
- In general, smaller organizations can pull this information together from a single source, whereas larger organizations need to operationalize and streamline gathering this data across internal business units and data sources (e.g., Financial Aid, Career Services), giving smaller organizations an advantage here.
Culture and Mindset
- Related to the Terminology issue for ROI above, in some organizations the default mindset is to equate program cost with “price charged” rather than cost of delivery. This was relatively straightforward to address through reference to the definitions for Program Cost, but it is important to recognize that measuring program cost is not the default mindset for all organizations, or for all staff within an organization.
Recommendations
- Provide an ROI calculation template (or interactive form) that clearly defines the elements to be included in this calculation and facilitates standardized data entry and output, and provide related technical assistance for implementing this tool (an illustrative sketch of such a template follows this list).
- Develop a Societal Impact calculation that relates the amount of public money a program takes in to the benefit generated in employment, earnings growth, tax contributions, and other measurable societal outcomes.
- Build time into the rollout plan to allow organizations (particularly larger, more complex providers) to define and implement the necessary process and data flows, as well as the systemization across departments, for gathering this information.
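As one purely illustrative sketch of what such a template might standardize (not a prescribed formula), the snippet below assumes a small set of hypothetical cost elements (tuition, fees, books and supplies) and a simple earnings-gain formulation; the actual elements and calculation would be defined by the state as part of the template.

```python
from dataclasses import dataclass

@dataclass
class ProgramROIInputs:
    """Hypothetical cost and earnings elements a standardized template might collect."""
    tuition: float              # tuition actually charged, not list price
    fees: float                 # mandatory program fees
    books_and_supplies: float   # required materials
    median_pre_earnings: float  # median annual earnings before enrollment
    median_post_earnings: float # median annual earnings roughly one year after completion

def total_program_cost(i: ProgramROIInputs) -> float:
    return i.tuition + i.fees + i.books_and_supplies

def simple_roi(i: ProgramROIInputs) -> float:
    """One illustrative formulation: annual earnings gain divided by total program cost."""
    return (i.median_post_earnings - i.median_pre_earnings) / total_program_cost(i)

# Example: a $6,000 program associated with a $9,000 annual earnings gain -> ROI of 1.5
example = ProgramROIInputs(
    tuition=5000, fees=500, books_and_supplies=500,
    median_pre_earnings=31000, median_post_earnings=40000,
)
print(round(simple_roi(example), 2))  # 1.5
```

Whatever formula is ultimately chosen, standardizing the inputs in this way is what makes provider-to-provider ROI comparisons meaningful.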
3. Survey Data: Student/Alumni Satisfaction
Terminology and Definitions
- Providers appreciated having the EQOS template questions to build their surveys from, and some used the questions verbatim. Other providers, however, preferred personalizing and adapting the questions to their organizations and alumni, which could negatively impact comparability across providers.
- For accredited programs, there are required questions that need to be included, and the providers want to send a single survey that blends the EQOS questions with program-specific questions related to accreditation.
Organizational Capacity
- Ability to gather this data. For smaller organizations, gathering alumni (and employer) data is both new and a huge lift; it is not something they currently have internal capacity for or experience in.
- Quality of the data gathered. As with any self-reported data, there is strong potential for inconsistent and/or poor quality in the responses received, and little capacity to address those quality issues within most providers today.
- The amount of time passing between program completion and survey is a huge factor in gathering this information, with the ability to contact alumni decreasing significantly after six months. Many/most providers are not set up to track and stay in contact with alumni beyond a very short window after completion. According to several providers, this is made more challenging by the number of students and alumni in transitory housing situations and who lack any consistent online presence/access.
- Follow-up outreach (emails, texts, calls) to improve response rates is time consuming, best practices aren’t clear, and many organizations do not currently have staff allocated for this.
- In addition to the challenges of low response rates, many/most of the surveys returned are incomplete. So while average survey response rates may be 20%-30%, response rates on any particular question might be significantly lower.
Culture and Mindset
- In many organizations, the mindset is that “student satisfaction” is something captured during a brief survey on the last day of the program related specifically to their experience within the program, as opposed to their satisfaction with the downstream employment outcomes that the program supported/enabled. Extending the definition of satisfaction to downstream longitudinal outcomes will therefore be a culture shift for many organizations.
Recommendations
Our recommendations for gathering student satisfaction information via surveys are included here; there are related challenges with gathering employment and wage data via surveys, discussed in Section 4 below.
- Standardize core survey questions statewide for essential information to be tracked across all providers and incentivize/mandate use of those questions.
- Enable providers to add questions of their own to support customization, accreditation, and provider-specific feedback opportunities.
- Encourage/require providers to include satisfaction surveys at:
- The end of each program, where response rates were significantly higher (75%-85%, vs. 20%-30%).
- The 30-day, 90-day, and 6-month marks.
- Develop a recommended “playbook” for satisfaction surveys that includes best practices for maximizing response rates (for example, using sampling or other methods that research shows to be most effective).
- Develop and facilitate an ongoing method for providers to share “what works” and other lessons learned, and to iterate and improve the above playbook over time.
4. Survey Data: Job Placement, Earnings, and Employer Satisfaction
Terminology and Definitions
- Employers do not define their industry sectors in ways that align neatly with programs or certifications. For example, large manufacturers may employ HVAC technicians and many other certified roles not explicitly tied to “manufacturing”; tying their feedback to a specific program area can therefore be problematic.
Organizational Capacity
- Related to the culture/mindset finding below, providers who have not previously seen their role as including employer outreach are relying on students and alumni to identify their employers. As a result, with alumni survey response rates of roughly 20%-30%, the employer information gathered through these surveys covers a very limited percentage of the total employment picture.
- Furthermore, in many of those survey responses the employer details sections were left blank (one provider had zero students answer the employer-related questions). Providers in this situation will therefore capture only a fraction of a fraction of the key employment and earnings information through their own survey efforts (see the illustrative calculation after this list).
- For some providers (e.g., Ivy Tech), many program completers go on to finish 4-year degrees at another college/university, and from there into employment. Ivy loses connection to the students during this transition and is therefore cut off from the downstream employment information.
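To make the “fraction of a fraction” concrete with illustrative numbers: if roughly 25% of alumni respond to a survey (the midpoint of the 20%-30% range reported above) and, as an assumed figure, only half of those respondents complete the employer fields, the provider ends up with employer and earnings details for only about 12%-13% of its alumni.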
Culture and Mindset
- Some providers do not see their role extending to engagement with alumni and/or employers; their focus extends from recruiting through completion/certification, but not beyond that point.
- Notable Exceptions: providers with a strong focus on externships and/or built-in employer partners had this information readily available.
Recommendations
- Do not rely on student responses as the source of contact information for subsequent employer surveys. Instead, develop or access objective, non-student-sourced placement and salary data.
- Build or connect data warehouses that can track student progress through high school, postsecondary programs (and across multiple postsecondary providers), and into the workforce, rather than trying to piece together data from disparate systems that can’t follow an individual through these critical transitions.
- As part of this, develop/refine the state’s capacity to track and provide longitudinal employment and wage data that can be indexed back to providers and specific programs.
- Access private sector institutions or sources for outcomes data through business or industry partners (e.g., the National Association of Manufacturers’ work with the National Student Clearinghouse).
- Access the Post-secondary Employment Outcomes (PSEO) data or similar federal sources.
- Engage employers and industry associations in any subsequent planning or working groups for this project; get them at the table to help identify and solve issues related to gathering information and program feedback from employers.
- Use survey data to supplement and enhance these objective findings, but do not rely on survey data as the primary source for placement and earnings.
- Build the capacity of organizations to do student and employer outreach effectively, including the use of non-email-based platforms/approaches, the development of alumni and employer services, and helping students understand the importance of providing follow-up information to the provider.
- Encourage and incentivize providers to develop internships, externships, and/or apprenticeships, where the linkages from provider to employer are more explicit.
5. Additional Issues Not Specific to a Framework Section or Theme
Data Privacy and Usage
- Several providers mentioned concerns about data privacy and how student and employer information was being tracked and used.
- Students frequently did not respond to the questions about employers, placement, or wage data. It’s not clear why this was the case, but it may be indicative of concerns about data privacy and how/where this information is to be used.
- One provider specifically mentioned having a large percentage of recent immigrants in their programs, and that there may be trust issues within that population related to sharing employer and other information.
Equity
- Although no provider mentioned this and it is not immediately clear from the current data, there is a potential issue with varying response rates across student subgroups: certain subgroups risk being underrepresented in the findings and corresponding recommendations/solutions, potentially skewing this work in ways that have equity implications.
Recommendations
- Conduct focus groups with students/alumni regarding their responses to the surveys to better understand the obstacles they perceive in a) responding in the first place, and b) answering or skipping specific sections or questions. Is there more to it than simply not wanting to take the time to complete the surveys? Identify the true sources of friction in generating survey responses so any solutions are aligned with those underlying causes.
- Clearly establish data privacy and use policies and procedures (if not already in place for this work) and be exceedingly transparent and forthcoming about these policies.
- Compare the demographics of the students who responded to the survey requests against the student/alumni population as a whole to see whether there are variances in response rates by subgroup that should be considered and solved for as the program expands (a minimal analysis sketch follows).
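As a minimal sketch of this comparison, assuming hypothetical files enrolled.csv and responses.csv keyed on a shared student_id, with a subgroup column in the enrollment data (actual sources would be provider records and survey-platform exports):

```python
import pandas as pd

# Hypothetical file and column names; actual sources would be provider
# records and survey-platform exports keyed to a shared student identifier.
enrolled = pd.read_csv("enrolled.csv")    # columns: student_id, subgroup
responses = pd.read_csv("responses.csv")  # column: student_id

# Flag which enrolled students/alumni responded to the survey
enrolled["responded"] = enrolled["student_id"].isin(responses["student_id"])

# Response rate and population share by subgroup
summary = enrolled.groupby("subgroup").agg(
    enrolled_count=("student_id", "count"),
    response_rate=("responded", "mean"),
)
summary["share_of_enrolled"] = summary["enrolled_count"] / summary["enrolled_count"].sum()

respondents = enrolled[enrolled["responded"]]
summary["share_of_respondents"] = (
    respondents.groupby("subgroup")["student_id"].count() / len(respondents)
)

# Large gaps between the two share columns flag subgroups that are under- or
# over-represented among survey respondents relative to the enrolled population.
print(summary.sort_values("response_rate"))
```

Large gaps between a subgroup’s share of enrollment and its share of respondents would flag where targeted outreach, or weighting of results, is needed before drawing conclusions from the survey data.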