Deep Dive Review: Valentine One V1connection, Part II
Let’s start with a very brief introduction of yours truly. I’m a passionate oenophile, which in lay terms means I’m a lover of fine wine (as is Robert Rosania).
While I have found past iterations of the V1 to be quaffable, they haven’t quite been transcendent. This is mostly true of anything that evolves over the years from its inception toward maturity, such as a bottle of properly aged fine wine, like that of a Pinot Noir or a 70-year-young bottle of Bollinger Champagne.
In my previous review of the Valentine 1, I compared the V1 to a Porsche 911. At that time, I believed that comparison was apt, as there were many similarities between them. This time around, however, it is more appropriate to compare the latest iteration to a fine wine that is peaking.
Which is to say, the V1 is a unique and special product, the result of continuing refinement over the span of more than two decades. Just as wine lives and evolves in flavor, subtlety, complexity, and structure, so does a V1.
I feel it is appropriate to suggest other radar detector manufacturers are more interested in frequently producing products that are young and often flawed–typical of new products–in the quest for ever-increasing profits through sheer volume and, in some cases, hyped marketing. In wine-speak, these manufacturers are the equivalent of vintners of Beaujolais nouveau, who produce more wine than those of Burgundy.
A Brief History of the Radar Detector Industry
For many years, the Valentine 1 arguably dominated all other manufacturers in both radar and laser detection. This perception, by and large, changed when Escort essentially “one-upped” them with the introduction of their new high-end detector platform known, by those in the know, as the M3. In fact, I first suggested as much in my review of the first M3-based detector, the Beltronics STi Driver, more than eight years ago. (Has it really been that long?)
The M3s offered exceptional alerting range and were also undetectable by radar detector detectors (RDDs), a capability that is still unmatched to this day. RDDs are used in some regions to electronically sniff out detector use where detectors have been banned (such as in Virginia, Washington D.C., on military bases, or for CDL holders).
This development created an interesting dynamic because Mike Valentine used to work for Escort (known as Cincinnati Microwave at the time). After Valentine departed Escort, he set out to follow his own path and founded Valentine Research to continue the evolution of his earlier work. The two companies have since become perennial rivals, at some level. Now, this presents an interesting situation because Mike had a hand in the design of Escort’s most significant radar detector of its day, the original Escort. For years, the Valentine 1 has been the center of extremely impassioned debate between reviewers and customers of either brand.
Invariably, when speaking about the virtues of Valentine 1 radar detectors, comparisons to detectors from Escort or Beltronics (now an Escort division) are bound to follow. This really hasn’t been the case most recently, however. While many still consider the Valentine 1 to be the ne plus ultra of detectors, sheer dominance in extreme detection performance has belonged to Escort’s flagship (in performance, not price) detector, the Escort Redline Expert Edition.
Once again, heated debates have re-emerged and have been playing out at the premier radar detector forum, rdforum.org. And wouldn’t you know it, it was at the hands of yours truly. The burning fire, long smoldering, has been rekindled with VR’s recently updated Valentine One (v3.893), paired with an optional Bluetooth-enabled V1connection LE module and its companion app.
Enthusiasts colloquially refer to this V1 model as the V1C, the ‘C’ standing for “custom sweeping” (something that we will get to later). There are those, myself included, who believe the positions of what was once regarded as the “top dog” have swapped. And so, a renewed debate rages on.
So as we prepare to look at what makes this version of the V1 so very special, we need to put some context around this subject, because I will be discussing with you nuanced attributes of the new detector that you won’t read anywhere else online.
The Importance of Determining Radar Detection Performance, Both Objectively and Subjectively
Performance tests have typically been conducted on controlled, orchestrated test courses, their goal being to determine one important aspect of detection performance: a radar detector’s maximum alerting range to continuously transmitted radar (referred to as constant-on, or CO for short). Police radar guns are positioned at the end of an isolated road. The ability to alert to the stationary radar source, in this case traffic enforcement radar guns, leads to the conclusion that the greater the distance at which a detector initially alerts, the more time the driver is afforded to slow down. Sounds plausible enough. The results are simple, often repeatable–given similar testing conditions–and the farthest-alerting detector is crowned the winner.
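To put a rough number on that intuition, here is a back-of-the-envelope sketch (my own illustration, not part of any standard test protocol; the distances and speed are hypothetical) of how much time an earlier alert actually buys a driver:

```python
# Back-of-the-envelope: time between the initial alert and reaching the
# radar source, assuming (unrealistically) a constant speed the whole way.
# All numbers below are hypothetical, chosen purely for illustration.

def reaction_time_seconds(alert_distance_m: float, speed_kmh: float) -> float:
    """Seconds from the initial alert until the car reaches the radar gun."""
    speed_ms = speed_kmh / 3.6  # convert km/h to m/s
    return alert_distance_m / speed_ms

# A detector alerting at 2000 m vs. one alerting at 800 m, both at 120 km/h:
early = reaction_time_seconds(2000, 120)  # 60.0 s
late = reaction_time_seconds(800, 120)    # 24.0 s
print(f"early alert: {early:.1f} s, late alert: {late:.1f} s")
```

Even this crude arithmetic shows why alerting range became the headline metric: more than doubling the alert distance more than doubles the time available to react, all else being equal.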
Historically, testing organizations included Speed Measurement Labs (SML), Craig Peterson’s RadarTest, and a host of automotive magazines (which often cited the aforementioned testers), such as Car & Driver, Automobile, and Motor Trend. In those earlier days, the Internet was not as widespread as it is today and search engines, like Google, were in their nascent stage–Altavista ruled the day.
Surprisingly, even those results often led to disagreements and passionate debate. Questions about testing methodology, and even bias driven by suspicions of personal or financial gain, played a big part in fueling those disputes.
While I appreciate objective results as much as anybody, and they are certainly helpful, they represent a one-dimensional view of detector behavior, amounting to just one piece of the larger puzzle. There are other characteristics that are, dare I say, even more important in determining what the overall driving experience will be like.
These aspects are mostly subjective in nature and cannot easily be measured. In fact, objectivity flies out of the proverbial window. Impassioned debates follow, and it’s often difficult to come to a consensus. I’ve always believed a radar/laser detector’s value is far greater than the sum of its individual parts.
Ten years ago, I set out to prove that point by pioneering real-world radar detector testing. I started by compiling my first driving-experience “road test,” comparing the differences between the three leading detectors of the day, which resulted in my inchoate review of the Beltronics Pro RX65, the Escort Passport 8500 X50, and the Valentine 1.
Today, things are different. We live in an Internet-connected world of computers and mobile devices. A tremendous amount of content is widely available. But, not unlike cable TV, there are so many more choices to sort through that it’s much harder to come to informed conclusions.
Many of the reviews available online today are published on websites intent on selling products. This can lead to biases against the Valentine 1–as it is sold only directly by Valentine Research–or even against other detectors that are less profitable to sell. Many of these sites are operated by large consumer-electronics retailers that also sell many other products, such as flat-screen TVs, computers, or mobile phones. That doesn’t serve your best interests either. Unlike any other piece of consumer electronics, radar detectors require special attention from a reviewer.
Unfortunately, the vast majority of reviewers proffering their opinions today are not versed in the intricacies of radar detector operation and lack even a basic understanding of how police radar and laser traffic enforcement works. Worse yet, my particular and novel reviewing style is now being imitated by many less qualified reviewers, or those that really don’t put in the necessary work–despite any appearances to the contrary–whose ultimate intent is to sell you product.
To cite a recent example, a review appeared around the time of the introduction of the Escort Passport Max. With his limited knowledge of traffic enforcement technology and detector performance–hell, he suggested he didn’t even drive above the posted speed limit–the reviewer claimed that his Passport Max alerted to police laser (lidar) around a bend. Well folks, that is a physical impossibility, as laser (coherent light) can only be reflected or refracted, not bent, unless of course you happen to be driving in the proximity of your nearest black hole.
And yet, there it was: an inaccurate account of what happened and an erroneous conclusion presented to the uninformed consumer. As each day passes, more and more “reviews” like these pop up online, and their content often reads like a marketing press release.
Sure, there are alternatives such as Amazon, eBay, and even detector manufacturers’ own websites, which contain ad-hoc reviews or commentary from “customers.” The thing is, though, unless a customer is truly informed, their opinions may not provide an accurate account either.
Have you ever visited such a site and seen those ubiquitous star ratings? Certainly customer reviews sound good in theory, but it’s not uncommon to find wildly varying opinions from the ill-informed at best or, at worst, from shills for a competitor whose intent is only to muddy the waters. What is one to do then? (Hint: add me to your Google Circles or subscribe to this blog!)
As with any rule, there are exceptions, to an extent. One site I found useful is consumersearch.com. While they do offer products for sale, they appear to be a good source of information even though they “review” other consumer products. The reason for this is that their writers don’t pass themselves off as experts. Instead, they search for other authoritative reviewers and then summarize their findings. That’s their value-add; they point you to the sources that they believe can actually help you make informed purchasing decisions.
They’ll even go as far as rating the quality of the reviewers they source. Not perfect, but a good step in the right direction. And of course, there are always the search engines: Google, Bing, or Yahoo.
Beyond these sources of information, amateur (but extremely capable) enthusiast groups have proliferated from online forums focused on this industry. While these folks generally conduct closed course testing, they go about it somewhat differently. These testing groups attempt to construct real-world testing scenarios, to provide a hybrid of controlled course and real-world testing.
I find these groups’ participants to be a far more reliable source than the professional testing organizations and the paid or simply misinformed “reviewers.” Beginning with the Guys of Lidar (GOL for short) nearly a decade ago–of which we were an early participant–other groups have since appeared. Enthusiast testing groups include ECCTG and RALETC–who primarily focus their efforts on active laser countermeasures and Veil–while other groups focus solely on radar detectors. This is not to say their results go unchallenged or unquestioned for objectivity or bias, but their access to equipment and their testing methodology, I have found, are the most comprehensive.
In the final analysis, one really needs to consider both objective and subjective results to piece together the entire puzzle. This is where I come in. While I certainly examine detectors’ alerting range and, most importantly, the time they afford you to react to impending real threats of traffic enforcement monitoring, I also explore the subjective elements of behavior as well. That’s the unique value the Veil Guy brings to the table.
So, now that you have the proper context, how does the latest V1 stack up to the other leading detectors of today?
We’ll take a look at that in a future part of this series…
In the meantime, drive safely, responsibly, and ticket-free, and always remember this:
‘Life is short, drink it.’