All-Star Idea For How To Score Better


So randomly including errors in scoring is an improvement? Should the "human element" of scoring include not being allowed to use calculators to tabulate results? That is just a piece of technology that allows more accuracy. Video is the same thing to me. I want it to be right more than I want to keep the status quo.

No, I totally agree with you on using modern technology. I still don't understand how scoring mistakes from simple addition happen in this day and age. But in other sports, what is one of the worst things you see? You watch a play in football, say, and it is clearly a mistake; the referee goes over, watches the video, comes back, stays with the same call, and the review has eaten up time. I am all for going back and changing results before awards if there is video evidence that a judge missed something with a direct impact on scoring and placement. I think a lot of the problem is the competition companies not admitting they made a mistake and fixing it, or knowing they made a mistake and still not fixing it. That, I feel, needs to change.

I am not sure what you are asking about randomly including errors in scoring. I also think that if you had two sets of judges scoring the same thing, like difficulty, and there were a huge discrepancy in their scores, going back and reviewing by video or by the deductions would help. That way you have two sets of eyes on the same routine, where one might catch something the other misses, and then you average the scores out. I agree with you that the system needs to be better.
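The two-judge idea above can be sketched in a few lines of code. This is a minimal illustration, not any competition company's actual process; the 1.0-point review threshold is an assumed value for the example.

```python
# Minimal sketch of the dual-judge idea above: two judges score the same
# category independently; a large gap flags the routine for video review,
# otherwise the two scores are simply averaged.
# The 1.0 threshold is an assumed value, not a real scoresheet rule.

REVIEW_THRESHOLD = 1.0  # assumed maximum acceptable gap between judges


def combine_scores(judge_a: float, judge_b: float) -> tuple[float, bool]:
    """Return (combined score, needs_video_review) for one category."""
    needs_review = abs(judge_a - judge_b) > REVIEW_THRESHOLD
    return (judge_a + judge_b) / 2, needs_review


# A 1.5-point gap between the two difficulty judges triggers a review.
score, review = combine_scores(8.0, 9.5)
print(score, review)  # 8.75 True
```

The point of the flag is that averaging alone can hide a big disagreement; flagging forces the review the posters are asking for before the scores are blended.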
 
I think there is difficulty in selecting the judges. You are faced with the dilemma of finding people experienced enough with modern cheerleading to know the relative difficulty of modern skills, but NOT associated with, or carrying a history with, any particular gym. That is a tough combination to find. I definitely think that no judge should have an association with any gym they are judging, but you can't realistically expect every judge to have no all-star experience at all.

I agree with you; I think every judge should have all-star experience. How about having a standard scoresheet, with all judges going to school to learn how to score it? Each competition company having separate rules is why you have these issues. Not many other sports have more than one set of rules when it comes to scoring.
 
I didn't make myself clear. When you say you want the "human element" in judging and don't want video used unless there is a dispute, I disagree. I say let the judges (particularly those tasked with determining difficulty) watch all the video they want to get their score in the first place. There currently isn't any way for judges to see every element, so there will always be random mistakes. A judge may simply not be looking at the side of the floor where there is a tumbling bust. They may happen to be watching the one girl, in a group of 10 throwing doubles, who only throws a full, and assume that they were all fulls. My question is: why does having these random, fixable mistakes in our scoring improve it?

I do agree that some concession has to be made to the timeliness and expense of the process. You can't simply have 10 camera angles and let the judges watch each routine 10 times to get it right. However, I think the accuracy could be greatly improved with some judicious use of technology.
 
  • Thread starter
  • Moderator
  • #64
I believe a coach/choreographer has to accept that one camera will be facing the front (in HD). The difficulty score will come from making sure they display the skills clearly enough for a difficulty judge to correctly count them. If you build a pyramid and have a standing double go on behind the pyramid, that's just poor planning. With just decent YouTube-quality video I can count a routine's skills fairly quickly (look at the people doing the breakdowns on here). Then you include the live-scored execution and performance parts (the ones that cannot change and are subjective) and you end up with a final result.

I think the problem is that for so long we've been judging skills the same way. Two completely separate parts (the objective and the subjective) had to be done at the same time, in the same way. It is kind of like having a stunt score but no separate execution or difficulty scores. But now it has become common knowledge that skills have an execution factor and a difficulty factor. So why not let the judges judge execution live, and then let someone who can pick it apart later give the difficulty score?
 
Couldn't that accuracy be achieved by having multiple judges score the same thing, then discuss and come to a conclusion about what they feel the score should be? If one catches more than the other judge, go back, see where the discrepancy is, and score accordingly. This would seem less costly and time-consuming. I think everyone on here brings up great points, both positive and negative. Great debate question by Kingston.
 
I do like this idea of splitting execution from difficulty, and I think it brings up great points.
 
BlueCat said:
I think there is difficulty in selecting the judges. You are faced with the dilemma of finding people experienced enough with modern cheerleading to know the relative difficulty of modern skills, but NOT associated with, or carrying a history with, any particular gym. That is a tough combination to find. I definitely think that no judge should have an association with any gym they are judging, but you can't realistically expect every judge to have no all-star experience at all.

God, I hope they have all-star experience. I want coaches like you and kingston and imrichhowboutu judging my teams. I know we'll be scored correctly if the people judging have been in the trenches and know their stuff.
 
The only issue I see is timing. This is a long process!
 
  • Thread starter
  • Moderator
  • #69
But that is kind of a catch-22. If you get people judging who have no close relation to a gym, then people feel the judging will be more objective. But let's say I judged your routine against a Rays team. I have practiced and trained a lot of the skills these teams are trying with my own athletes; would you believe I would be completely objective? (I would be, but I am sure many feel I wouldn't be.)
 
As a parent, I am probably not qualified to post in this thread, as judging appears to be a foreign language. However, after watching level 5 cheer progress over the last 5 years, I don't understand how you can judge without video review (difficulty level, number of skills, and speed of the routine). After years of discussion with coaches, owners, and parents, I think some of these suggestions are long overdue. Along with video-assisted judging, I believe All Star Cheer judges should be using a computer (not pen and paper). All judges' scores should be time-stamped and made public. This would eliminate the perception that EPs and judges "review" scores before awards. It would also make judges accountable by eliminating the perception that judges play favorites to programs (did choreography for this team, used to coach that team, had dinner with that owner, etc.). These may not be real issues, but when dealing with subjectively scored sports (cheer, gymnastics, diving, etc.), people look for any excuse when they lose. By eliminating as many of these perceptions as possible, you make the sport more legitimate.
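The time-stamped, computer-entered scoring being suggested is easy to picture in code. This is a hedged sketch, not any real EP's system; the field names are illustrative assumptions.

```python
# Sketch of the timestamped scoring idea above: each score is stamped with
# the UTC time it was entered and the record is frozen, so published results
# can show that nothing was altered between entry and awards.
# Field names are illustrative assumptions, not any real EP's system.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass(frozen=True)
class ScoreEntry:
    judge_id: str
    team: str
    category: str
    score: float
    entered_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


entry = ScoreEntry("judge-3", "Team A", "difficulty", 9.1)
# frozen=True means any later attempt to change entry.score raises an error,
# which is the "no quiet edits before awards" property the post asks for.
```

Publishing these records (judge ID, score, timestamp) is what would address the perception problem: anyone could verify that no score changed after the routine ended.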
 
  • Thread starter
  • Moderator
  • #71
Let's do a speed test. Find me a routine and I will see how fast I can come up with the skills and elements. Preferably a division Rays is not in.
 
Rays don't have a large coed team. Here is CA Cheetahs, in HD.
 
You are not the issue, though. Are all judges going to be as fast as you? Like an earlier poster said, I have faith in you. I think if it can be shown that it can be done in a timely manner, it is a great idea. Maybe put a time limit on it, like other sports do?
 

I want to play too. Are we doing a list of all elements in the routine, or would there still be a separate judge for building and for tumbling? Also, will dance be on the difficulty judge, or would that stay with the execution judge?

Once you give me the specifics, I'll join in and then post my list for accuracy. I'd say I would try to limit myself to 5 minutes.
 
Using that as an example, is there anyone who can accurately tell, in one viewing, how many of the baskets were kick kick doubles (the highest range on the NCA scoresheet) versus kick doubles (a lower range)? (It happens at around 1:50.) Whether or not the harder basket was in the majority would swing the difficulty contribution to your final score by about .8 - a bigger difference than between 1st and 2nd in most divisions.

And that assumes ALL you are watching for is kick kick doubles, that you know ahead of time what is coming, and that you are NOT also writing down the jump score for the previous section while trying to watch intently. I know I couldn't do it in one viewing, even though it is my team and I know which ones to look for.

Do you still think that video replay is a bad idea?
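To make that arithmetic concrete, here is a hedged sketch of the majority rule being described. Only the .8 swing comes from the post; the actual range values (5.0 and 4.2) are assumptions chosen for illustration, not the real NCA scoresheet numbers.

```python
# Sketch of the scoring swing described above: if the majority of baskets are
# kick kick doubles, the routine scores in the higher difficulty range.
# The range values are assumptions picked so the gap is the .8 mentioned
# above; they are not the real NCA scoresheet numbers.

HIGH_RANGE = 5.0  # assumed score when kick kick doubles are the majority
LOW_RANGE = 4.2   # assumed score otherwise (.8 lower)


def basket_score(kick_kick_doubles: int, kick_doubles: int) -> float:
    """Score the basket section based on which skill is in the majority."""
    return HIGH_RANGE if kick_kick_doubles > kick_doubles else LOW_RANGE


# Miscounting a single basket (seeing 2-of-5 instead of 3-of-5) flips the
# majority and moves the score by .8 - more than 1st vs 2nd place in most
# divisions, which is the whole argument for video review.
print(basket_score(3, 2), basket_score(2, 3))  # 5.0 4.2
```

The sketch shows why one misread skill matters so much under a majority rule: the score is a step function, so a one-basket counting error moves the result by the full gap between ranges.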
 