I just updated the Schwinn Loop review with some 30-day impressions. Bottom line, it's still "fine," but the radius within which I'm interested in using it is shrinking now that I've been out on the CrossRip a few times, too.
In the process of adding a few notes for the 30-day update (which has moved into the yellow, "low-but-not-none" state of credibility on the scale), I realized I had a passage on bike reviews in general I'd meant to move over here once I had a blog set up. I do, now, so here it is:
When I went shopping for what would become my CrossRip, I had an easy time of it. I knew which brands and models I was interested in thanks to seeing what coworkers were riding, and those brands and models pointed me toward the categories worth considering. It was easy to find reviews for just about all of them, either through retail sites or biking publications. I felt like I knew what I was buying by the time I took a test ride.
Trying to learn more about the Schwinn Loop was a chore. There were plenty of retail reviews, and the Amazon product Q&A feature was pretty helpful, but there weren’t many reviews written by biking publications. I had to do a lot of reading between the lines, and I ended up coming away from a lot of reviews the way I did when I was researching standing desks: None of the credible reviewers put a lot of time on one, and the retail site reviewers were the usual mix of bad writing, unreasonable expectations, and flights of wild pique over things like the tires not being inflated out of the box.
Even worse than the dearth of reviews was the sheer volume of SEO’d folding bike review sites that exist solely to rake in affiliate dollars from Amazon. I came across “reviews” that were copied and pasted from different bikes (they failed to correctly search-and-replace the model), “reviews” that just involved repeating the specs back with weird Markovisms like, “With 20-inch tires you’ll ride openly,” and YouTube channels with a clip that depicted someone folding and unfolding the bike for what was obviously the first time, except with a soundtrack and titles that suggested they thought they were teaching you the process.
In the end, the single most useful resource turned out to be Amazon's Q&A section on the product page. People ask specific questions, dozens of them, and product owners answer them. In some cases, the answers are ridiculous and at odds with every other answer to the same question. You just learn to filter. What you're left with is a sort of collaborative product FAQ, supplemented by a close reading of the least favorable and middling reviews, along with a skim of the most favorable ones in hopes that the reviewer came back six months later, or noted something – anything – wrong or off about the product to indicate the review has any critical value.
The one flaw I can see in the Amazon Q&A system is that Amazon prompts people via email to answer questions. I've received a few of these prompts, and I know how conversational they seem. For a certain personality, the invitation to weigh in is simply irresistible. If you've ever wondered why on Earth Amazon Q&As are littered with random people saying "I don't know" in apparent reckless disregard for anyone else's time and energy, well … that's it. Amazon asks and a certain kind of person answers.
It's still a great feature. The questions are specific and the answers are directed to the question. It's a valuable tool that beats sifting through the written reviews if you're largely on the path to buying but are unsure about a certain feature, or want to learn whether anything about the product is confusing or hard to master.