DHFA is big on feedback - after every event we send out a feedback form, and every three months we hold a club meeting to gather ideas on what we can improve. MCHO is no different, and we wanted to share the results of the feedback survey we put out for 2019. We also believe in transparency, which is why we’re sharing it with you.
I do want to clarify that only 37 people filled out the feedback form - a healthy percentage, but a small enough sample that it could skew the results a tad.
What people liked the LEAST about MCHO 2019
In order of weight:
Judging (27%)
Awards Dinner (16.2%)
Ruleset (16.2%)
Open Sparring on Sundays (13.5%)
Directors (13.5%)
Ring Size and number of rings (10.8%)
Technology (10.8%)
Finals format (all bronze matches first, gold matches second) (10.8%)
It’s interesting that Judging was the #1 complaint while Directors came in down the list at #5 - we ran a 1+1 format, with the director as the final arbiter on calls. To be fair, a large majority of comments in this section were along the lines of “Great event, but I had to pick things,” so take it with a grain of salt.
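To put those percentages in perspective against the 37 responses, here’s a quick back-of-the-envelope sketch (assuming everyone who filled out the form answered this question, which the survey tool doesn’t guarantee):

```python
# Back-of-the-envelope sketch: map each "least liked" percentage back to a raw
# headcount, assuming all 37 respondents answered this question (an assumption -
# not something the survey guarantees).
respondents = 37

least_liked = {
    "Judging": 27.0,
    "Awards Dinner": 16.2,
    "Ruleset": 16.2,
    "Open Sparring on Sundays": 13.5,
    "Directors": 13.5,
    "Ring Size and number of rings": 10.8,
    "Technology": 10.8,
    "Finals format": 10.8,
}

for item, pct in least_liked.items():
    count = round(respondents * pct / 100)  # e.g. 27% of 37 ≈ 10 people
    print(f"{item}: ~{count} of {respondents} respondents")
```

In other words, the gap between the #1 and #5 complaints is about five people.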
Let’s take a look at some of the feedback about these:
The judging/directing wasnt bad but probably the weakest part of the tournament. Calls on quality and elbow shots were inconsistent. Impressed that the judges DID NOT seem biased towards particular fencers.
But calls were inconsistent between rings, and deviated pretty significantly from the provided rules. More experience, training and rest time for judges/directors.
I loved the judging format, but I feel by the end of the tournament the quality starting going down. I definitely am interested in reviewing the video to see if I was crazy on what I felt or thought happened in several instances. I believe more rest and rotation of judges would alleviate this.
Takeaways
‘Quality’ is one of the most inconsistent metrics we use to judge strikes, and while this is a common complaint at tournaments, we’ve got some work to do to make it consistent. We’ll be publishing explainer videos this year that go over the more nuanced parts of the ruleset, such as quality, so that everyone has a better understanding of the intent and the rubric for scoring.
Deviation from the rules was brought up by only one person, and they didn’t provide any examples, but it’s still a consistency issue that we’ll be fixing.
Rest for directors and judges is an important topic, however. Our goal was to run 2019 with as few outside volunteers as possible, but with the sheer number of matches, that meant judges and directors were EXHAUSTED by the end. For 2020, we will be opening up judge and director registration - the goal is still to keep the number of outside volunteers low, while making sure we have as many fresh eyes as possible during pools and eliminations.
What people liked the MOST about MCHO 2019
In order of weight:
Fencing Talent (64.9%)
Venue (51.4%)
Technology (45.9%)
Ruleset (32.4%)
Video recording of all fights (29.7%)
Judging (21.9%)
Skills Course (21.9%)
Some comments to highlight:
The judging in this tournament had very few mistakes.
I enjoyed the skills course, there was a ton of talented fighters, the venue was very nice and the shirts looked great. I was impressed how closely the schedule was followed.
I like the two judge format. I also really like the open discussion and explanation from the judges. There were problems which I will detail below. I was really surprised at the talent that came out to participate. It made for a very challenging weekend. I thought the tech made it very easy to follow the matches. I had friends watch who knew nothing about fencing, and the scoring display really helped them enjoy it more. For schedule, everything ran smoothly, more so than any other tournament I've been to.
The skills course took a lot of people by surprise, and we had a number of people ask to sign up after they saw the format. For those of you who don’t know, we created an obstacle course that tested your technique and skill with a longsword.
We’re really happy with the venue as well, which is why we’re going back there for 2020! Clean, state of the art, showers…it’s got everything.
It makes us happy to see that people appreciated the fencing talent, and it led to some RIVETING fights. In fact, for 2020 we have a plan to bring in even more top-tier fencers; follow our Facebook page for future announcements.
Let’s take a look at judging quality in more detail:
Judges & Directors in the Pools
Accuracy: 5 out of 7 (13 responses)
Consistency: 6 out of 7 (15 responses)
Clarity and Explanation: 6 out of 7 (15 responses)
Not bad! Accuracy is the weakest of the three, but these are still scores we’re pleased with overall. As mentioned before, there’s still a lot of work to do.
Directors and judges were instructed to have open, deliberate dialogue in front of the fencers so that the fencers could understand the reasoning behind each call. While there were some instances of side discussions, they were kept to a minimum.
Judges & Directors in Eliminations
Accuracy: 5 out of 7 (10 responses)
Consistency: 6 out of 7 (13 responses)
Clarity and Explanation: 6 out of 7 (16 responses)
While the overall consensus was that judging stayed consistent, a couple of people did feel that the quality of judging went down in eliminations, and that’s reflected here. It’s important to us to figure out how to keep judging consistent between pools and eliminations at all of our events.
Tournament Ring Feedback
We went with the Slovakian-style ring format - a 4m x 8m (15ft x 30ft) rectangle with starting lines about 3m (12ft) apart. This allowed us to keep time between exchanges low and move through our eliminations quickly. We’ll be using it again next year.
Enough people felt the rings were a bit too narrow that next year we’re considering a layout change to give the rings even more space, if we can pull it off.
Skills Course Feedback
The feedback here is interesting - the Zwerch station (cut horizontally at two closely spaced targets, striking the intended target without hitting the other, then cut around for the second target) was the simplest station, so what went wrong?
In the feedback, we saw a number of people state that the targets were inconsistently re-attached to their posts - some were firmly planted, others only lightly attached. This meant that some people couldn’t complete the cut, even though cutting power was not something we were trying to score.
Our attack, parry, riposte station (provoke, parry the counterattack, then complete a final riposte against another target) was the favorite, and it’s going to see some cool expansions next year.
Slalom Cutting is going to need some improvements - it needs to better test the skills it’s meant to measure (fluidity of movement and efficiency with the blade).
The biggest thing we need to change about the skills course next year is that we weighted accuracy too heavily over speed. Some people could literally walk through the course and still get the highest score. We want there to be tension between precise technique and the need to move quickly.
We’ll be sharing some other choice quotes in the future as we talk about improvements we plan on making to the entire event. Everyone who gave us feedback will be receiving a $30 discount code for next year - feedback is the only way we can improve, and we have every intention of making MCHO a world-class event.