Re: ERO's test of VeloVetta [milesthedog]
milesthedog wrote:
If you were to update the testing protocol, what would you go with?

I’ll start. Feel free to offer edits and additions:

With small sample sizes, and keeping this reasonable for a hobbyist reviewer rather than peer-review level:

- a protocol checklist, provided to the reader/viewer, that is used before each run.
- each rider tries every shoe in the test.
- each rider rotates through each shoe 4x.
- report the test-retest reliability of CdA for each rider across their 4 trials of each shoe.
- treat the L-R power difference as another variable examined with test-retest reliability.
- if not reliable, look for protocol issues and confounding variables (e.g. shoe touching a pedal power-meter sensor, disc wheel, W/kg out of range), remove the possibly spurious result, and have the rider(s) retest that specific shoe.
- if reliable, average the scores for each shoe across all testers and run a basic ANOVA on CdA between shoes, controlling for W/kg.
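The reliability-then-compare flow above can be sketched in a few lines. Everything here is illustrative: the CdA numbers are invented, the 2% CV threshold is a judgment call, and `scipy.stats.f_oneway` is a plain one-way ANOVA (controlling for W/kg would need ANCOVA, e.g. via statsmodels):

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical CdA results: {rider: {shoe: [4 repeated trials]}}.
trials = {
    "rider_a": {"shoe_x": [0.251, 0.253, 0.250, 0.252],
                "shoe_y": [0.256, 0.258, 0.257, 0.259]},
    "rider_b": {"shoe_x": [0.242, 0.244, 0.243, 0.241],
                "shoe_y": [0.247, 0.249, 0.248, 0.250]},
}

def cv_percent(vals):
    """Test-retest reliability as coefficient of variation (%)."""
    vals = np.asarray(vals, dtype=float)
    return 100.0 * vals.std(ddof=1) / vals.mean()

# Step 1: flag unreliable rider/shoe combinations before averaging.
for rider, shoes in trials.items():
    for shoe, vals in shoes.items():
        cv = cv_percent(vals)
        if cv > 2.0:  # threshold is a judgment call, not a standard
            print(f"{rider}/{shoe}: CV {cv:.2f}% -- retest before using")

# Step 2: average each rider's trials per shoe, then compare shoes.
shoe_means = {}
for rider, shoes in trials.items():
    for shoe, vals in shoes.items():
        shoe_means.setdefault(shoe, []).append(np.mean(vals))

f_stat, p_val = f_oneway(*shoe_means.values())
print(f"ANOVA across shoes: F={f_stat:.2f}, p={p_val:.4f}")
```

With only two riders per shoe the ANOVA is underpowered; the point is the order of operations, reliability check first, averaging and comparison second.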


I think there is a need to document things, and that need is much greater when you are going to publish results that could affect the sales of a product, especially a start-up's.

I recently tested some wheels from a large manufacturer. They were slower. I showed the data to the decision maker and explained why I believed the numbers were real even though the testing was limited. That let him make a decision for himself, and only himself, based on those results. But I would never publish those results here without a full description of how the wheels were tested and a discussion section on the results. I told him, "My confidence these wheels are slower is 9. My confidence it's 3w rather than 5w is 5." That was good enough for that specific decision process, and in aero testing it often is. Product reviews require more.

I don't think a simple YMMV statement is sufficient.

So if you are testing something like "does long hair impact my CdA", sure, put it on YouTube with nothing else. If it's a product review, do something more serious.

So for me, it comes down to documentation of the results: how the test was conducted, how many runs, whether it was ABABA or something else, road conditions, wind conditions, metrics on power, speed, wind, and yaw. In that video they show a rider with a disc. Yellow flags went up. An outdoor test with a disc on the back is something I avoid; variable wind during the day will skew results. I would have caught that from an equipment list, and an equipment list would provide other details I'd look at. Pictures of the specific test setup would be big. The pedal rubbing could possibly have been caught through pictures.
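The ABABA interleaving and per-run log mentioned above can be sketched simply. This is a hypothetical structure of my own, not any established format; the field names are illustrative:

```python
from itertools import cycle

# Interleave the baseline setup (A) with the test setup (B) so that
# drift in wind or road conditions affects both roughly equally.
def ababa_schedule(n_runs):
    labels = cycle(["A", "B"])
    return [next(labels) for _ in range(n_runs)]

# Minimal per-run documentation record -- fields are illustrative.
def run_record(run_no, setup, power_w, speed_kmh, wind_ms, yaw_deg):
    return {
        "run": run_no, "setup": setup,
        "power_w": power_w, "speed_kmh": speed_kmh,
        "wind_ms": wind_ms, "yaw_deg": yaw_deg,
    }

print(ababa_schedule(5))  # ['A', 'B', 'A', 'B', 'A']
```

Logging one such record per run, alongside the raw head-unit files, is the kind of documentation that lets a reader judge the test rather than take it on faith.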

BTW, L/R balance is not something I would expect people to look at normally. I look at it because it caused me a lot of grief, so much grief that we now monitor other metrics for this kind of thing. We capture and score many metrics in the field, if only so we know whether to proceed to the next test, and we require them for documentation anyway.
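A simple way to monitor L/R balance for this kind of thing is to flag runs that drift from the rider's typical split. The numbers and the 2-point threshold below are invented for illustration; a shoe rubbing a pedal-based power meter is one way such a drift can show up:

```python
import statistics

# Hypothetical per-run left-leg power share (%), keyed by run number.
lr_balance = {1: 50.4, 2: 50.1, 3: 46.8, 4: 50.3}

# Use the median as the rider's typical split, so one bad run
# doesn't drag the baseline with it.
baseline = statistics.median(lr_balance.values())

# Flag runs whose split drifts noticeably from baseline;
# the 2-point threshold is a judgment call, not a standard.
flagged = [run for run, left in lr_balance.items()
           if abs(left - baseline) > 2.0]
print(flagged)  # [3]
```

A flagged run is not proof of a problem, but it tells you whether to proceed or to stop and look for a cause before trusting the CdA number.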

Make the Garmins available. Make the raw data available, at least to the parties on the receiving end of your review.

I would run this documentation with a few trusted sources before publishing.

Too formal for a YouTube video? Maybe. Or maybe not all aero comparisons are YouTube material.

PS: at one point I thought it would be cool to have a thread, kind of like the Power vs mph thread, where people could share their aero tests along with some very basic info on how each test was done. I had it all ready to go, but put it on hold.
Last edited by: marcag: Sep 5, 23 3:18