To me there are multiple levels of "public" when it comes to scoring:
1. No information released at all
2. Coaches see the information about their own team
3. Coaches see the information about all teams in their division
4. Coaches see the information about all teams at their teams' level
5. Coaches see the information about all teams at the event
6. Public sees information about all teams at the event
There are also levels of information typically given:
1. Rank
2. Final Score
3. Final scores with deductions
4. Summary information
5. Detailed information
"Summary" information would be averages of combinations of scores ("tumbling judge average").
"Detailed" information would be an average for every individual score on a scoresheet ("standing tumbling execution average").
There are shades of grey to this whole issue. I don't think anyone believes coaches shouldn't be able to find out the final scores of the teams in their division. There are probably very few people who want scanned copies of every scoresheet for every team made available to the public. Most fall somewhere in between.
I could be wrong, but my assumption is that many newer and smaller gyms fear their scores being released. While I really do understand this on the surface, I don't think they fully understand how lopsided the current system is. They may walk away with 5-10 scoresheets from an event. A CA (or CEA, Cali, etc.) coach will walk away from an event able to study not only their own teams' scores, but perhaps AN ADDITIONAL 200 scoresheets from other teams within their program. This gives the larger programs a MONSTER information advantage that can help them develop strategies and shape routines going into the next competition. Releasing more information would negate that advantage. Everyone would have the exact same amount of information.
The numbers on those scoresheets are slightly helpful for coaching purposes, but what REALLY matters is context. You got an 8.2 on jumps. Was that good? How does that compare to teams within your division? Which teams, if any, got higher scores than you on jumps? From watching the videos, can you determine why? The number by itself is barely helpful, but seeing the pattern of numbers over multiple teams is much more powerful data.
My ideal event reporting would list every team's average for EVERY number that went into their score. (team a: standing tumbling difficulty average x.xx, standing tumbling execution average .xx, deductions, safety, etc.) Summary information is better than nothing, but much less helpful. I would prefer it was purely public, but I could live with restricting this to coaches from the event or even coaches within a division.
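The kind of report described above could be sketched roughly like this. All team names, component names, and score values here are invented purely for illustration, and real scoresheet structures would vary by event producer:

```python
from statistics import mean

# Hypothetical judge scores: each team has several judges scoring each
# component of the routine. All data below is made up for illustration.
scoresheets = {
    "Team A": {
        "standing tumbling difficulty": [4.5, 4.6, 4.4],
        "standing tumbling execution": [8.2, 8.0, 8.3],
        "jumps": [8.2, 8.1, 8.4],
    },
    "Team B": {
        "standing tumbling difficulty": [4.8, 4.7, 4.9],
        "standing tumbling execution": [7.9, 8.1, 8.0],
        "jumps": [8.5, 8.6, 8.4],
    },
}

def event_report(sheets):
    """Return each team's average for every scored component."""
    return {
        team: {component: round(mean(scores), 2)
               for component, scores in components.items()}
        for team, components in sheets.items()
    }

# Print one line per team with all of its component averages.
for team, averages in event_report(scoresheets).items():
    print(team, averages)
```

With data like this in hand for every team at an event, the "pattern over multiple teams" comparison becomes trivial: sort all teams by any single component average to see where your 8.2 on jumps actually lands.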