Auditing Your Communication Flow – Enrollment Management 101
Estimated read time – 6 minutes
One of the largest expenses an admissions office carries is its communication flow. Mailings, emails, texting services, and telecounselors can consume over 50% of the total budget for the year, and a great deal rests on that flow being effective. It’s surprising, then, that so few offices audit their flows with a critical eye to evaluate whether they are working as intended, or are even necessary. Let’s dive into the methods and tests you can use to audit and optimize your communication flow.
Auditing your existing comm flow
If you didn’t run any tests last year, there is still a lot to be learned from your results; how much depends on what data you have in your CRM and how accurate that data is. Start by creating a flow document showing which communications each audience received and when. Your audiences might include:
- Potential enrollments
- High school students
- Transfer students
- International students
- Part-time students
- Non-degree-seeking students
- Potential influencers
- Parents
- Counselors
- CBOs
- Alumni
Lay out the potential funnel stages for each audience and what they receive in each stage, relative to when they enter that stage. It’s okay if you don’t have something at each stage; that’s an immediate opportunity to improve. With your influencers, think about where they fit in. Do you have messaging to encourage awareness? What about supporting them as they encourage students to apply? Was there anything to support yield efforts? If you have a highly segmented approach, be sure to record those differences as well. Here is a small sample of what this could look like:

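A flow document like this can also be kept as structured data so that gaps surface automatically. A minimal sketch, with audiences, stages, and touchpoints all hypothetical:

```python
# Hypothetical flow document: audience -> funnel stage -> touchpoints,
# with timing noted relative to entering the stage. All names are examples.
comm_flow = {
    "High school students": {
        "Inquiry": ["Welcome email (day 0)", "Visit invite postcard (day 7)"],
        "Applicant": ["Application-received email (day 0)"],
        "Admit": [],  # nothing currently sent -> an opportunity to improve
    },
    "Parents": {
        "Inquiry": ["Financial aid overview email (day 3)"],
    },
}

# Flag every audience/stage pair with no touchpoints.
gaps = [
    (audience, stage)
    for audience, stages in comm_flow.items()
    for stage, touches in stages.items()
    if not touches
]
print(gaps)  # the audience/stage pairs that currently receive nothing
```

Even a simple structure like this makes the “empty stage” opportunities impossible to miss when you review the flow with colleagues.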
It might be helpful to turn to a third party or a project manager on campus who is skilled at teasing out things you might overlook when you deal with them day-to-day. If you don’t know already, document what messages are going out from other offices as well, such as athletics, alumni and advancement, student life, or fine and performing arts. You need to think about the experience from the recipient’s perspective.
Performance and outcomes
The next step, after you have documented where you currently are, is to evaluate your comm flow’s quantitative and qualitative performance. On the quantitative front, you should be able to evaluate the open, click-to-open, and conversion rates of every email, as well as the reach, engagement rate, and conversion rate of every digital campaign. An overview of these KPIs is included as part of this digital housekeeping guide. If you planned ahead, you will also be able to measure the direct impact of every link you shared thanks to UTM parameters: you can analyze the source, medium, and campaign of the users on your site who complete key conversions and assign value to those tactics. This will never show the full impact because of post-impression conversions, but it provides a baseline for comparison and for deciding whether to try a tactic again.
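The UTM-based attribution above boils down to grouping conversions by source, medium, and campaign. A minimal sketch, assuming a hypothetical export of site sessions with their landing-page UTM tags (field names are assumptions):

```python
from collections import defaultdict

# Hypothetical analytics export: each row carries the UTM tags captured at
# landing plus whether the visitor completed a key conversion
# (e.g. started an application).
sessions = [
    {"source": "facebook", "medium": "paid-social", "campaign": "fall-visit", "converted": True},
    {"source": "facebook", "medium": "paid-social", "campaign": "fall-visit", "converted": False},
    {"source": "newsletter", "medium": "email", "campaign": "app-deadline", "converted": True},
    {"source": "newsletter", "medium": "email", "campaign": "app-deadline", "converted": True},
]

def conversion_by_campaign(rows):
    """Group sessions by (source, medium, campaign) and compute conversion rate."""
    totals = defaultdict(lambda: [0, 0])  # key -> [conversions, sessions]
    for row in rows:
        key = (row["source"], row["medium"], row["campaign"])
        totals[key][0] += row["converted"]  # True counts as 1
        totals[key][1] += 1
    return {key: conv / n for key, (conv, n) in totals.items()}

print(conversion_by_campaign(sessions))
```

In practice your analytics platform will do this grouping for you; the point is that consistent UTM tagging is what makes the grouping possible at all.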
The quantitative impact of a billboard or postcard is far fuzzier, but lift can be used to measure the increase in a conversion or intent behavior from an area that may be attributed to a campaign. For example, if you placed billboards around Pittsburgh, you could measure the increase in site traffic from Pittsburgh users, as well as the year-over-year change in inquiries, visits, and applications from Pittsburgh. That lift is the impact of your awareness campaign. Setting up a randomized A/B test with mail can show the impact based on differences in funnel behavior between those who did and did not receive a piece. Another way to test mail is to set up four audiences: one that receives no communication, one that receives your full comm flow, a third that receives only digital communications, and a fourth that receives only print communications. This is a very thorough test that can be repeated year over year to determine the effectiveness of various aspects of your comm flow.
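The lift arithmetic and the four-audience split above can both be sketched in a few lines. The inquiry counts, cell names, and seed below are hypothetical:

```python
import random
from collections import Counter

def lift(baseline, observed):
    """Percent lift of an observed metric over a prior-period baseline."""
    return (observed - baseline) / baseline

# Hypothetical Pittsburgh figures: inquiries last cycle vs. during the billboard run.
print(f"Inquiry lift: {lift(400, 460):.1%}")  # prints "Inquiry lift: 15.0%"

# The four-audience mail test: shuffle the prospect pool with a fixed seed
# (so the split is reproducible), then deal IDs round-robin into the cells.
CELLS = ["no-comms", "full-flow", "digital-only", "print-only"]

def assign_cells(student_ids, seed=42):
    rng = random.Random(seed)
    pool = list(student_ids)
    rng.shuffle(pool)
    return {sid: CELLS[i % len(CELLS)] for i, sid in enumerate(pool)}

assignment = assign_cells(range(1000))
print(Counter(assignment.values()))  # each of the four cells gets exactly 250 of the 1,000 IDs
```

Recording the seed and the full assignment in your CRM matters more than the split logic itself: without it you cannot tie funnel outcomes back to cells the following year.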
The qualitative measure of your tactics involves user testing and focus groups with students, parents, and community members to determine how the tactics are being perceived and whether they are changing sentiment. It’s very important to include diverse voices and individuals who feel comfortable speaking up when they have concerns. Leaving out first-generation, multicultural, LGBTQIA, or parent perspectives may lead to holes in your content or fail to paint a welcoming picture of your institution.
All of this hinges on knowing the goal of each piece as well. If you are sending an email, what should the recipient get from it? What is their next step, their call to action (CTA)? Don’t send something if you don’t understand its purpose. For example, if you send an email intended to drive visit sign-ups and it leads to applications instead, it failed at your original goal but spoke to the audience in a different way. Consider how the copy, assets, and landing page may have worked better for this other conversion, and how you can replicate that success when the goal is to generate applications.
Planning: Keep what works and stop the rest
We are so good at trying new things, adding pieces, providing new experiences and information — and yet so bad at ending things. Part of this surely comes from a fear of losing something that might be critical to enrolling students. If you haven’t measured the value of every event, marketing piece, or tactic, it can be hard to make the case for contracting your offerings. After you have done the testing and analysis proposed above, you will have a better understanding of whether the time and financial resources put into each effort are worth it.
When you stop something, be sure you understand the full impact. How many enrollments came uniquely from that source or tactic? Is the potential revenue loss covered by the savings? For programs that have been in place for years and involve other offices, discuss why they are being dropped so that there aren’t hurt feelings or distrust. For tactics run through partnerships, make sure your partners provide regular updates, and check their results against your own data to ensure accuracy.
