Enrollment Management and Graduate Employment Outcomes
Although our institution does consider the impact of admissions on career services, there is still quite a gap. Many times, admissions goals seem to be more of a priority to the institution. This results in enrolling students who drop out quickly or become 'problem students' for student services and, eventually, career services.
It is very important that we enroll students who genuinely want a career and are motivated to attain that goal; however, this is not always the case.
Based on speaking to colleagues in career services at other institutions, this seems to be an ongoing dilemma.
I do voice my concerns to the admissions department, but I feel that the department's priorities are so stringently aligned with metric goals that this overshadows other considerations.
I recommend this gap be closed through stricter admissions standards for enrollment and less emphasis on enrollment metrics as rigid targets.
Hello Paulette,
While "over-enrollment" may not be an issue at your campus, are there any practices in place at your institution that enable prospective students to assess their career interests during the admissions process and align those interests with the programs your institution offers? In my experience, this seems to be a rare practice, and I am curious whether your institution does this, has considered it, or will consider it. Thanks for sharing!
Robert Starks Jr.
The goal of both admissions and Career Services is to see the student progress through the program, graduate, and become successful in the field. We are not able to 'oversell' our seats; we have a small campus with limited space, so that is not an issue. In our industry, the number of people who leave the field each year is about equal to the number who enter it.
Does your institution currently consider how enrollment management plans impact employment rates, or is there often a discrepancy between the goals of admissions and the goals of career services? How would you recommend this gap be closed, and/or what strategies has your organization successfully implemented?
Thankfully, our University Administration and Admissions teams do not over-enroll students as a matter of course. It is understood that a policy of over-enrollment would saturate the local employment market, increasing unemployment for new graduates and outpacing the capacity of our clinical and externship sites, thus frustrating local employers as well.
There are discrepancies at times; however, if I see an issue, I address it with our Admissions Director.
Hi Robert!
Well, it was measured; however, that data was usually collected within a few weeks before the quarterly report was due. This report describes everything that happens within the academics department during a quarter. The advisors, while writing this report, would call graduates to collect the information. In my region, I know that this type of data collection produced a limited response. Now we are given an entire quarter to collect the information and help students (more time, of course, if we are helping place a student).
Myhisha,
If the institution makes decisions based on both employment projections from research and actual employment outcomes, do you know how decisions are made on eliminating programs or creating new ones if there hasn't been any tracking of data historically? I'm confused because I don't see how an institution can make data-driven decisions without data. As I understood from your previous statements, your institution has just recently formed a Career Services department in which you are the only one serving in the Career Services function. I imagine that very little data has historically been collected if this is the case. How have such decisions been made without the data you will now be collecting?
Thanks.
Robert Starks Jr.
The institution that I work for is a regionally accredited, for-profit institution. Career placement and employment rates are considered on every level before a program is introduced, and then again considered when taking those programs away. The whole point of offering the courses we do is to help students become employable, and therefore it is always a factor. The only gap that we have had is having a department in place to primarily track and place those newly skilled workers.
Kathy,
I definitely knew what you meant and wanted to comment on it so that others don't misinterpret as I have had similar conversations in which the other party assumed "elitism" was being suggested when this is far from the truth. Perhaps you can start with suggesting that other leaders within the institution take this course and perhaps it will help get them on the same page to re-think "career services" :) It is challenging but continue being the champion of the cause and do the best you can do to advocate for perpetual improvement. This is part of serving the students/graduates.
You may enjoy this article and it might give you something to share with others as a "primer" to your future conversations to gain buy-in:
http://www.careercollegelounge.com/pg/blog/rstarks/read/49623/the-evolution-of-career-services-transforming-the-way-career-colleges-deliver-career-services.
Your observation is spot on. I agree it is very challenging to assess which prospective students truly want to pursue a career and which may be making a knee-jerk decision. In the end, however, this is why admissions processes should be about helping students make the most informed decision they can, and self-knowledge is part of that process. Therefore, I am a strong believer in self-assessments during the admissions process and believe they can provide insight to the prospective student as well as the institution. This is mutually beneficial. It also leads to a sustainable approach to enrollment management while increasing the likelihood of improved graduate employment outcomes.
Robert Starks Jr.
Thank you for your comment about semantics. I of course did not mean to "select" people based on any criteria that suggests discrimination but rather, as you say, "identify" candidates that are better suited to the objectives of the program.
In this day and age, it seems more and more that the reason for enrolling in our program is not a genuine interest in the subject but rather "I just need a job" or "it sounded like a good way to make money." The student is then either pleasantly surprised by how much they enjoy it, or they quit because it was more work than they wanted to handle. Either way, it is hard to predict what the final outcome will be.
Our company does collect a lot of data, but I do not know who is evaluating it or how. I have recently raised the need to discuss our current policies and consider ways to make the process more effective by reviewing any patterns in the data we have. We'll see!
Kathy,
When you say you'd love it if your school could be "more selective" in the admissions process, this statement can be misconstrued. The goal isn't to be "more selective" per se but rather to identify opportunities for more appropriate, evidence-based admissions criteria that correlate with student success (defined as becoming employed upon graduation rather than simply graduating). I know this is likely what you mean, but I wanted to restate it because, although it may seem like semantics, I think it makes a big difference when we communicate it this way; it more accurately reflects the best practice discussed in the course. To be "more selective" isn't really the goal, as if to say, "only admit the more elite students." The real goal is to analyze existing data to discover opportunities to implement evidence-based interventions that can improve graduate employment outcomes. This is simply an objective process of data analysis and an important part of any institutional effectiveness plan.
If an institution has strong data that clearly demonstrates a need for more appropriate admissions criteria, it should be considered. In your particular institution, do you have such data? Has there been an analysis of internal data from which to make any suggestions for either new admissions criteria or assessments during the admissions process to target individuals who may demonstrate a higher likelihood of risk and thus, may need targeted, intense intervention?
You raise good points that certain academic evaluations may prove beneficial. If the suggestion is to implement assessments that improve service to students, rather than new admissions criteria that might prevent certain students from enrolling, I wonder whether you would get buy-in for an enhancement that could improve graduate employment outcomes.
I think this is a good example of where buy-in from institutional leaders who have the authority to make such decisions is absolutely necessary to implement institution-wide best practices. This is also an example of the role of internal consultant that Career professionals play using their expertise to make suggestions for institutional improvements. If career professionals can arm themselves with the proper data, suggestions to improve institutional effectiveness would be more persuasive as they would be derived from data vs. mere opinion. I wish you luck in your perpetual goal of maximizing graduate outcomes.
Robert Starks Jr.
I would echo Carolyn's comments almost verbatim. Our challenge is also that the company places high emphasis on quantity rather than quality (meaning students who are most likely to complete the program and pursue a career in the related field). More and more students seem to enroll only to find that it is not what they thought or imagined, that it was too hard, or that they simply decided to do something else.
Personally, I would love it if we could be more selective in our admissions process to ensure that the student is well informed about the expectations, the commitment needed, and whether the program is a good fit for their personal goals. Even a minimum academic evaluation (basic reading, writing, and comprehension skills) would help us know whether we are setting students up for success or for unnecessary stress due to the level of performance needed.
I have voiced concern in the past but have been informed that we are not allowed to deselect anyone who meets the financial and minimal educational requirements.
Carolyn,
Is it possible for you to suggest or influence any decisions to include this as part of the Institutional Effectiveness Plan? Are you part of any institutional effectiveness team? This may be a vehicle to have a cross-functional team discuss these issues, initiate research and data collection and then draw conclusions as a group as to what interventions could be implemented to improve institutional outcomes. Has this approach already been pursued?
Robert Starks Jr.
Robert,
Thank you for your comments. You have articulated my own feelings very well. The reaction you illustrated in your response is exactly the type I would love to see Admissions adopt. Without such data, it is very hard to demonstrate statistically the negative impact of enrollment practices on the goals and outcomes of other departments.
Many times, the goals of my department, Academics/Retention, and those of the Admissions department seem to be almost in direct conflict, and this is a great example. As a Career Management Director, I would love to see the Admissions department incorporate measurable criteria so that we could statistically analyze the student population we are enrolling and their outcomes.
I do communicate regularly with the Admissions department to make sure they are aware of (my feelings on) the impact of enrollment not only on graduate placement but also on retention. Unfortunately, without empirical evidence to back up those feelings, and without being part of that department, it is difficult to make additional progress.
Carolyn
Carolyn,
I hear you. I thought perhaps there was data that would support ideas for more appropriate admissions criteria. For example, imagine a program with an abnormally high drop rate where a correlation is discovered: the majority of drops occur in classes involving advanced mathematics. This might lead one to believe that a math assessment should be implemented prior to enrollment, since it has been determined that an appropriate admissions criterion would be a specific (measurable) mastery of math before admission to the program. This is just an example, and I thought you might have such data and, thus, recommendations. In my own past experience, the information necessary to have these conversations among departments that monitor different data sets usually surfaced at Institutional Effectiveness meetings. Data like that in this example would keep recommendations factual rather than speculative. If anyone else has real-world examples of recommending new admissions criteria based on collected data, I would love to hear about them, as they would add value to this discussion.
Thanks for your generous sharing and participation, Carolyn! It is very helpful.
Robert Starks Jr.
Robert,
Because I am not directly part of the admissions department, I feel it would be difficult to comment much further on the issues you addressed without being entirely speculative.
I can tell you that this information has been communicated to upper-level management within our organization. I do feel that improved, direct feedback loops would improve progress.
Carolyn,
Do you have any thoughts on what types of new admissions standards might be appropriate, given the data you have suggesting that new standards are needed? Additionally, do you have any thoughts on what data you would use to determine these new standards? Beyond discussions with Admissions, does this discussion occur at the top leadership level, such as with the President or, if you are part of a larger corporation, with top executives? Perhaps you can evaluate some of the other potential issues that hinder progress in this area, such as organizational (reporting) structure and/or a need for improved, direct feedback loops. Can you comment on these issues?
Robert Starks Jr.