The inside line on CaRMS’ post-activity surveys

By: Ryan Kelly, Manager of Client Services | November 4, 2014

Let’s be honest: nobody likes receiving a survey about something they aren’t interested or involved in, personally or professionally. When Rogers sends me surveys to see if I am enjoying HGTV, I rarely respond, and when Mazda asks my thoughts on the 2015 CX-5 while I’m driving a 2004 Tribute, my temperature starts to rise. Every so often, though, a survey crosses my path that I am more than happy to complete. Those surveys have a common theme: they are relevant and they affect me directly. CaRMS’ latest feedback-collecting initiative – the post-activity survey (PAS) – does just that.

A post-activity survey is distributed to our primary stakeholders after a major activity in the match year passes. For example, when the application period ends, we send a survey to participating applicants to collect feedback on their online experience during the process.

This is our second year distributing this type of survey. Our first foray into the PAS was to determine whether there was an appetite to complete these surveys and whether the results would be valid and constructive. The answer to both was yes, across all of our stakeholder groups. Our plan is to distribute the PAS for the next three years, then re-evaluate its utility for the organization.

The first survey this year was sent to programs and focused on their experiences with the program description updates process. The survey drew almost triple last year’s response rate and provided valuable feedback for the organization, including guidance for our product development team. Top-level results are below and include feedback from all 17 Canadian faculties of medicine.

[Graphs: top-level survey results on the program description updates process]

The majority of respondents found the program description process fairly easy or easy, which is extremely positive. We have worked hard over the last two years to consistently improve the system for our end users and will continue to do so. Feedback from respondents who found the process more difficult is of equal or greater value, as it points us to areas we can focus on for improvement. Based on the feedback received, we will continue to address basic navigation and usability – the two biggest problem areas for our users.

It is important to point out that these surveys are just one portion of the information we use to plan our product development and improvement initiatives. CaRMS offers match services to many stakeholders and receives thousands of pieces of feedback annually.

With this in mind, CaRMS has recently added two new staff members who will act as our product development team. Part of their job is to wade through the feedback received, consult with our stakeholders and improve our services to offer the greatest value to our overall user community.

Our primary goal is to offer a product that satisfies the needs of all our users: programs, whether they have four applicants or over 1,000; referees who write one letter or 15; PGME offices that actively update program descriptions or leave that task to their individual programs; and applicants who apply to one discipline or many. It is no easy task, but it is our job to provide an online application service that allows you to do your job as effectively and efficiently as possible.

You can view the full results of our first post-activity surveys here.