Every business knows the value of data and the ability to draw meaningful conclusions from it. Just ask Google!
Over the years, Australian universities have become pretty good at collecting strong data for recruitment purposes - AUIDF benchmarking conducted by Alan Olsen of SPRE Pty Ltd has provided world-class, industry- and institution-level trends. Other professionals, like Keri Ramirez and Dimity Huckel of Study Move Consultants, have done incredible analyses of both data and trends for many institutions, including groundbreaking work in the social media space (on a side note, we got Keri in to do a team training for us on social media that was brilliant - highly recommended).
But what about the use of analytics and capturing of data in outbound mobility programs?
As many practitioners around the country would attest, just finding the time to manage all the student applications and Government funding programs can be challenging enough, let alone branching out into sophisticated data collection and analysis. It's a bit of the classic 'chicken and egg' situation: you need better data in order to justify more resources, but you need more resources in order to collect better data.
The principle: measure and track everything you possibly can and use that data to streamline your systems and processes...
The introduction in recent years of more powerful mobility database solutions, such as Studio Abroad and Move On, has helped some institutions to access better reports and information about their participants. Such systems (commercial or proprietary) are crucial to the better management of programs and participants. However, as mobility teams know, implementing a big database is a complex, time-consuming and ongoing process, and certainly no panacea for the challenge of collecting better intelligence.
So where do student mobility teams start looking to answer the data collection/analysis question?
The starting point should be the following principle: measure and track everything you possibly can and use that data to streamline your systems and processes.
In my time authoring the outbound mobility best practice guide, I was fortunate enough to work with many universities across the country and learn about their mobility programs. During this project, the University of New South Wales' Global Education/Mobility program really stood out as best practice.
...know your customer and how they like to engage, then provide them appropriate resources in order for them to do so efficiently...
UNSW have implemented a sophisticated array of data collection processes and have embraced a philosophy of collecting data for program improvement purposes. Not only do they track web statistics, but they also collect detailed information about student enquiries and responses, including the channels through which those enquiries come. This includes in-person consultations, emails sent/received, phone calls made/received, student attendance at sessions and events (by scanning uni card barcodes) and much more. This data is consolidated to give an overview of trends and of what is and isn't working as intended.
As a result, UNSW have managed to streamline their processes and systems, and improve their information in intelligent, trackable ways. This includes setting up an online 'module' in their learning management system, which once again provides trackable data to identify which sources of information are most effective. Their approach is truly a business-oriented one - know your customer and how they like to engage, then provide them with the right resources so they can do so efficiently.
...it is critical to set aside dedicated time to implement good data collection processes and to review data once collected...
So where to start? Here are some ideas - like everything, start with a few basic things and work your way up:
Survey your applicants before they depart and after they return from their programs. Ask them why they participated, how they found out about the program and what they thought of your services to them.
Survey students enquiring about programs - make this a standard process using SurveyMonkey or a similar tool.
Capture as much information about enquiries and applicants as possible (use a CRM, Google Analytics, surveys, usage data from your learning management system etc).
Work out your enquiry to application conversion rate.
Work out your application to participant conversion rate for different program types (e.g. exchanges, summer/winter schools, study tours etc).
Monitor your conversion rates and web stats year on year to get a broad perspective of program health.
Track web stats before and after major marketing efforts to determine which efforts are most effective.
Do an analysis of your systems and processes - how manual are they for participants? What might be automated to make things easier for them and you?
Identify cheap/free online tools (like email and web browser plug-ins) that can give you more information about your workload. Xobni and Tout App are just a couple of these.
Use an online tool like Gliffy or LucidChart to flowchart your systems and processes.
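The conversion-rate ideas above can be sketched in a few lines of Python. The program types and counts below are purely hypothetical placeholders; in practice the figures would come from your CRM, survey tool or mobility database exports:

```python
# A minimal sketch of the enquiry->application and application->participant
# conversion-rate calculations suggested above. All counts are hypothetical.

def conversion_rate(numerator, denominator):
    """Return a conversion rate as a percentage, or None if undefined."""
    if denominator == 0:
        return None
    return round(100 * numerator / denominator, 1)

# Hypothetical yearly figures per program type
programs = {
    "exchange":      {"enquiries": 900, "applications": 300, "participants": 210},
    "summer_school": {"enquiries": 400, "applications": 180, "participants": 150},
    "study_tour":    {"enquiries": 250, "applications": 90,  "participants": 80},
}

for name, counts in programs.items():
    enq_to_app = conversion_rate(counts["applications"], counts["enquiries"])
    app_to_part = conversion_rate(counts["participants"], counts["applications"])
    print(f"{name}: enquiry->application {enq_to_app}%, "
          f"application->participant {app_to_part}%")
```

Even a spreadsheet version of this calculation, repeated year on year and per program type, gives you the broad perspective of program health the list describes.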
The list could go on and on. But most of all, it is critical to set aside dedicated time to implement good data collection processes and to review data once collected.
So back to our chicken and egg: resources before data or data before resources?
The reality of tightening budgets is that good data will be increasingly important in underpinning key decisions in student mobility, and distraction-free time must be set aside to get it right. Setting aside this time may create some pain in the short term, but it has significant benefits down the track: better systems, better processes and a clearer way to make the case for more resources when they are required.
Rob Malicki
Director, AIM Overseas
"There was so much to do in 3 weeks and honestly I would love to experience it all over again. There was never a dull moment. Also, splitting the theory into the first 2 weeks and practical in the third week worked really well."