Announcing CRMUG UK Superstar Awards 2016


CRMUG is an organisation that is fundamentally run by users, for users. We wanted to recognise the efforts of some of our many volunteers who provide such great content at our meetings, sharing their experiences and expertise for the benefit of other members. And it seems to be “awards season” at the moment, so what better time to announce our inaugural CRMUG UK Superstar Awards!

We have been asking our audiences to provide feedback at every meeting, and for the last couple of years we have been collecting feedback data for each individual session too. Based on these scores we have drawn up a shortlist of finalists in various categories, and now we need YOUR help to vote for the overall winners. The 2016 awards will be based on feedback “since records began”, which basically boils down to presentations from 2014 and 2015.

Please vote for your favourite session of the last couple of years from the finalists in the different categories, as well as for the best overall speaker at our survey page.

Vote NOW!

If you want to know how we chose the finalists then scroll down for the boring bit, but if you just want to know who they are before you vote, read on…

Category Finalists

The finalists in each category are listed below, with profile links to help you remember who these speakers are if you are at all unsure. All listings are alphabetical by surname (the survey responses for voting are presented in a random order). The first five categories are based on the feedback score for overall satisfaction for each session.

Best Educational Session

Best Partner Session

  • Rowland Dexter (QGate) – Better Data > Better CRM
  • Martin Doyle (DQ Global) – Mantra for High Quality CRM Data
  • Tim Fowler (Cincom) – Enhancing Sales and Marketing by Using a Sales Configurator

Best Customer Showcase

  • Jamie Barker (Town and Country Housing) – To Infinity and Beyond!
  • Martin O’Neill (Skills Development Scotland) – Supporting a Changing Business at SDS
  • Chris Roberts (Amicus Horizon) – Becoming the Pied Piper

Best Session for Developers

Best Round Table Discussion

Best Speaker

We also have one group of finalists drawn from all session types, based on the feedback scores for how well the speaker or session chairperson performed. The finalists are listed below, along with the session that qualified them for this category, although you may have seen some of these speakers in more than one session over the last few years. This award is intended to go to the person who has performed best overall:

Vote NOW!

Finalist Qualification Criteria

We have lots of different track sessions that address the needs of different groups of members, so we wanted to make sure these were fairly represented. No short documentary or animation is ever likely to win the Oscar for Best Picture, so the Academy has special categories for those films so they can be compared fairly with similar movies. We have used different categories to make sure things are being compared on a like-for-like basis, and to counter any bias caused by sheer weight of numbers. If we did not do this, then even the most exceptional developer-oriented session would be likely to attract fewer votes than a session with a much more general, "middle of the road" appeal.

We also wanted to make sure that our volunteers had a fair chance of being recognised for their contributions, while acknowledging that our Premium Partner members also provide very valuable information for audiences at every meeting.

So we have used 5 categories for sessions, as follows:

  • Best Educational Session (not including partner-led sessions, and not “pure developer” sessions or Customer Showcases, all of which have their own category)
  • Best Partner Session (self explanatory)
  • Best Customer Showcase
  • Best Developer Session (includes round table, interactive, or presentation-led sessions)
  • Best Round Table (including MS Conduit sessions, but not dev content, covered above)

The finalists for the above categories were selected and filtered on the following basis:

  • In each category, all sessions were ranked by the “overall” feedback score (not an average of all scores, partly because we sometimes asked different questions)
  • Anyone appearing in the list more than once was excluded for all but the first entry
  • The top 3 sessions were then selected, or the top 4 for Best Educational, simply because of the number of entries in that category and how closely the scores were grouped

We also have a completely open category for “Best Speaker” to include volunteers from all areas of our membership, including customers, partners and MVPs (but not Microsoft). This was worked out in a similar fashion to the main categories above:

  • All sessions were ranked by the “speaker” feedback score only
  • Anyone appearing in the list more than once was excluded for all but the first entry
  • The top 5 sessions were then selected to give the best chance of including speakers from a wide range of categories