Welcome to Part 3 of the results of a massive 6.5 Creedmoor ammo field test, where I tested multiple boxes of every kind of 6.5 Creedmoor ammo that is marketed as “match” or “target” grade. That included 19 different brands and types of 6.5 Creedmoor ammo! This article is going to cover all the muzzle velocity data that I collected over almost 1,000 rounds fired.
Consistent muzzle velocity is key for long-range shooting: bullets that leave the muzzle faster than normal can miss high, and bullets that leave slower than normal can miss low. While the goal is for each shot to leave the muzzle at precisely the same velocity, no ammo is perfect. So it is very helpful for us as long-range shooters to understand the variation we can expect from our ammo shot-to-shot.
Many long-range shooters believe the most useful measuring stick for ammo quality is how consistent the muzzle velocity is. Often that’s how I can tell how experienced someone is in shooting long-range: Are they more concerned with getting maximum velocity or finding the most consistent velocity? I’m not saying faster bullets aren’t advantageous, but for precision long-range work that is secondary to consistent velocity.
How much does it take to miss?
Let’s start by putting muzzle velocity variation in context for long-range shooting. A change in velocity of just 30 fps would change your bullet drop by about 8.5 inches at 1,000 yards. A 50 fps variation in MV would put you off-center by 14 inches! That is based on the average ballistics for all of the 6.5 Creedmoor match ammo that I tested (avg. MV, avg. bullet weight, avg. BC), and assumes everything else was dead on.
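To make that arithmetic concrete, here is a tiny sketch that simply scales the sensitivity figure above (about 8.5 inches of vertical per 30 fps of MV change, for the average ballistics of the ammo tested). The linear scaling is only an approximation that holds near the nominal muzzle velocity:

```python
# Rough sensitivity of impact height to muzzle velocity error at 1,000 yards,
# using the article's figure of ~8.5 inches per 30 fps for the average
# 6.5 Creedmoor match load tested. Near the nominal MV the relationship
# is roughly linear, so we can scale it.
INCHES_PER_FPS = 8.5 / 30  # ~0.28 inches of vertical per 1 fps of MV change

def vertical_shift(delta_mv_fps: float) -> float:
    """Approximate vertical shift (inches) at 1,000 yards for a given MV error."""
    return delta_mv_fps * INCHES_PER_FPS

print(round(vertical_shift(30), 1))  # 8.5 in
print(round(vertical_shift(50), 1))  # ~14.2 in
print(round(vertical_shift(60), 1))  # 17.0 in
```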
The diagram below shows a shot simulation and hit probability on a 20” target at 1,000 yards for a few different SD’s to give you context for how that might play out in the real world:
A 20″ target is a pretty generous size, but you can use the visual above to estimate what would happen with a smaller target. We should also remember that the simulated shots and hit probabilities above assume that our firing solution is absolutely perfect and that we broke the shot exactly where we should have.
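Here is a minimal Monte Carlo sketch of that kind of simulation, looking only at the vertical error caused by MV variation and using the ~0.28 inches-per-fps sensitivity figure from above. It is not the author's exact simulation (which would also model other error sources and horizontal dispersion); it just shows how hit probability on a 20″ tall target falls off as SD grows:

```python
import random

# Vertical-only hit probability on a 20" tall target at 1,000 yards, driven
# purely by muzzle velocity variation. Assumes the firing solution, wind call,
# and trigger press are otherwise perfect -- an idealized sketch.
INCHES_PER_FPS = 8.5 / 30  # sensitivity figure from the article

def vertical_hit_rate(sd_fps, target_height_in=20.0, shots=100_000):
    half = target_height_in / 2
    sigma_in = sd_fps * INCHES_PER_FPS  # vertical dispersion from MV alone
    hits = sum(abs(random.gauss(0, sigma_in)) <= half for _ in range(shots))
    return hits / shots

random.seed(42)
for sd in (10, 15, 20):
    print(f"SD = {sd:2d} fps -> vertical hit rate ~ {vertical_hit_rate(sd):.1%}")
```

Even a 20 fps SD still hits a generous 20″ target most of the time under these idealized assumptions; real-world error sources stack on top of this, which is why the margin matters.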
To learn more about how much consistent muzzle velocity matters in terms of hit probability at long-range, read How Much Does SD Matter?
Quantifying Muzzle Velocity Variation: ES & SD
When shooters talk about muzzle velocity variation they refer to either Extreme Spread (ES) or Standard Deviation (SD).
- Extreme Spread (ES): The difference between the slowest and fastest velocities recorded.
- Standard Deviation (SD): A measure of how spread out a set of numbers is. A low SD indicates all our velocities are closer to the average, while a high SD indicates the velocities are spread out over a wider range.
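As a quick illustration of the two stats, here is how ES and SD would be computed from a single string (the velocities below are made up, not test data):

```python
import statistics

# Ten hypothetical muzzle velocities (fps) from one 10-shot string.
velocities = [2702, 2695, 2710, 2698, 2705, 2691, 2707, 2700, 2694, 2703]

es = max(velocities) - min(velocities)   # Extreme Spread: fastest minus slowest
sd = statistics.stdev(velocities)        # sample SD (n-1 in the denominator)
avg = statistics.mean(velocities)

print(f"Avg = {avg:.1f} fps, ES = {es} fps, SD = {sd:.1f} fps")
```

Note that ES only looks at the two most extreme shots, while SD uses every shot in the string, which is part of why SD is the stronger statistic.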
If you’re math-averse, please stick with me! This is important, and understanding the basics can seriously help you as a shooter. This topic in particular is exactly why I dedicated time to write the “Statistics for Shooters” 3-part series. One of those articles dives into this specific topic, has lots of visuals, and explains it in a way that doesn’t require you to be a math-nerd to understand. If you aren’t familiar with ES or SD, I’d highly recommend you go read “Muzzle Velocity Stats – Statistics for Shooters” so you’ll be able to get the full value from this article.
Here are some key points from that article related to using ES or SD to quantify muzzle velocity:
- SD is a more reliable and effective stat when it comes to quantifying muzzle velocity variations. ES is easier to measure but is a weaker statistical indicator in general because it is entirely based on the two most extreme events.
- It’s probably a bad idea to be completely dismissive of either ES or SD. Both provide some form of insight. An over-reliance on any descriptive statistic can lead to misleading conclusions.
- ES continues to grow as you fire more shots, but the average MV and SD will both begin to converge on the true value as your sample size gets larger.
- While it’s easy to get close to the average muzzle velocity with 10 shots or less, it is far more difficult to measure variation and SD with precision. There is a tendency for SD to be understated in small samples. To have much confidence that our SD is accurate, we need a larger sample size than many would think – likely 20-30 shots or more. The more the better!
- It is very difficult to determine minor differences in velocity variation between two loads without a very large sample size (e.g. 40+ rounds). Often we make decisions based on truly insufficient data because the measured performance difference between two loads is simply a result of the natural variation we can expect in small sample sizes.
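Both of those last two points are easy to demonstrate with a quick simulation. The sketch below draws thousands of 5-shot strings from a population whose true SD is exactly 12 fps, then looks at the sample SDs we would actually measure. The measured SDs average below the true value, and the spread is wide enough that an 8 fps string and a 15 fps string can easily both come from the very same ammo:

```python
import random
import statistics

# Monte Carlo sketch: many 5-shot strings from a population with a TRUE SD
# of 12 fps. Illustrative only -- not the article's data.
random.seed(1)
TRUE_MV, TRUE_SD = 2700.0, 12.0

sample_sds = sorted(
    statistics.stdev([random.gauss(TRUE_MV, TRUE_SD) for _ in range(5)])
    for _ in range(20_000)
)

print(f"True SD: {TRUE_SD} fps")
print(f"Average measured 5-shot SD: {statistics.mean(sample_sds):.1f} fps")
print(f"Middle 90% of measured SDs: {sample_sds[1_000]:.1f}"
      f" to {sample_sds[19_000]:.1f} fps")
```

The middle 90% of those 5-shot SDs spans roughly 5 to 18 fps, so a single small sample tells you very little about the true variation of the ammo.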
With all that in mind, I plan to provide both ES and SD for all the data I collected – but I’ll primarily use SD when making comparisons because it is the stronger statistical measure.
What is a good SD?
If you’re newer to the concept of SD, you might be wondering, “What is a ‘good’ SD when it comes to muzzle velocity?”
In my experience, most standard factory ammo that is NOT labeled as “match” or “target” has an SD in the 15-22 fps range. It’s relatively easy for a reloader to produce ammo with an SD of 15 fps, but we typically have to be meticulous and use good equipment and components to wrestle that down into single digits (i.e. under 10 fps). In fact, I’d bet good money that while most reloaders believe they’re producing quality ammo, if they shot a couple of 10-shot strings over an accurate chronograph (e.g. LabRadar, MagnetoSpeed) they’d see an average SD in the 10-15 fps range. I’m not saying all reloaders, but I am saying most. Often reloaders either don’t test their ammo over a chronograph or when they do they may only fire 3-5 rounds over it. Since there is a tendency for SD to be understated in small sample sizes, if they fired 15 more shots they’d likely see their SD fall in that 10-15 fps range – and often on the higher end of that.
The table below is from Modern Advancements in Long Range Shooting Volume 2 and it provides what Bryan Litz describes as a “summary of what kind of SD’s are required to achieve certain long-range shooting goals in general terms”:
In general, most long-range shooters have a goal to reload ammo with SD’s “in the single digits” (i.e. under 10 fps). However, I’d say in most PRS or NRL-type matches, getting below 10 really isn’t necessary for typical target sizes and distances out to 1,000 yards. I’ve used match-grade factory ammo with SD’s in the 12 fps range to place in the top 3 in local PRS matches multiple times. While a match might have one stage with 1 MOA targets out to 1,000 yards, that is rare and doesn’t represent the typical target sizes. It’s far more common to have 1.5-2.0 MOA targets in precision rifle matches. Now, if you’re competing at the highest levels, where one miss could be the difference between landing in the top 5 and finishing 15th, everything starts to matter more! Those guys want single digits, but us “normal guys” honestly have weaker links in our chain than our SD being a smidge over 10. I’d say an SD below 14 fps is an appropriate goal for the majority of long-range shooters. That would likely keep your ammo from being one of your biggest limiting factors.
Note: Competitive shooters engaging targets beyond 2,000 yards in Extreme Long Range (ELR), where first-round hits are critical, want to be closer to 5 fps. The further the distance, the more critical consistent muzzle velocity becomes, so the lower the better!
For more context, read How Much Does SD Matter?
How I Gathered Muzzle Velocity Data
Sample Size & Where The Ammo Came From
The stats article explains this in more detail, but just for context if you only fired 5 shots from any of these boxes of ammo and measured one to have an SD of 8 fps and another to have an SD of 15, we’d think we found something meaningful, right? Actually, there is less than a 75% chance that the difference is real, because of the natural variation you can expect in a sample that small. That means there is more than a 1 out of 4 chance that both could have come from the exact same case of ammo and one measured 8 fps and the other 15. It’s simply not a large enough sample size to have any certainty that the results were accurate and repeatable.
In fact, if you are trying to differentiate minor differences like 2-3 fps in SD between loads, you need a sample size of at least 40 rounds! So, that’s why I fired two boxes of each type of ammo – a sample size of 40. At first, I was just thinking of doing one box of each ammo, but if you know stats, you know that could potentially be misleading. I believed that so much that I personally dropped almost $1,000 more to buy a 2nd box of each type of ammo. Now we can have real confidence in the results!
While I could have reached out to companies like Hornady, Berger, PRIME, Federal, and others, and they would have happily sent me discounted or even free ammo for this test, I decided to buy it all at full retail price from popular online distributors, and I never told any of the manufacturers I was doing this test. That ensured none of the ammo was cherry-picked or loaded “special” for the test; it was all just random ammo off a shelf somewhere. I ordered one box of each in December 2019 from popular online distributors, and then I waited 6 months and ordered another box of each from a completely different list of distributors. PRIME and Copper Creek only sell direct to customers, so I had to order those from the manufacturer directly. Waiting 6 months between orders should also help ensure the ammo tested is representative of what you can expect.
The Data-Gathering Process
I realize this research could potentially have a significant impact on sales for some of these ammo manufacturers, so I tried to go above and beyond to ensure the data I collected was trustworthy. I’m certainly not “out to get” any manufacturers or to promote any of them either. None of them sponsor the website or have ever given me anything for free or even discounted. There aren’t hidden relationships or agendas here. I’m a 100% independent shooter who is simply in search of the truth to help my readers. So, as always, I tried to identify any significant factors that could skew the results, and then created testing methods that tried to eliminate those or at least minimize them to an acceptable level.
A big part of that for these muzzle velocity results was using 3 LabRadar Doppler Radars to record the shots. If you aren’t familiar with a LabRadar, you should come out from the rock you’ve been living under! 😉 It’s a device that tracks a bullet downrange using Doppler Radar and then analyzes that data to calculate an extremely accurate muzzle velocity. I looked at the underlying log data on the LabRadar for this test, and it looked like each device would typically record 55-85 data points for every shot. So with 3 of them running, that means I recorded around 200 velocity measurements per shot – or 8,000 data points for each type of ammo!
The chart below is a real example of the underlying log data for a single shot during this research. There are 84 black dots, which each represent a data point where the velocity was recorded. You can see the distances each was recorded along the horizontal axis, which included measurements out to 107 yards. I also added a trendline for that data, and the blue dot is what the device calculated the velocity at the muzzle must have been to result in those downrange measurements.
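The LabRadar’s real signal processing is more sophisticated (and proprietary), but the basic idea of extrapolating back to the muzzle can be sketched as an ordinary least-squares line fit through the downrange readings, evaluated at distance zero. The readings below are made-up numbers for illustration:

```python
# Simplified sketch of extrapolating a muzzle velocity from downrange
# Doppler readings: fit velocity vs. distance by least squares, then
# evaluate the line at distance 0. Readings are hypothetical.
distances = [20, 35, 50, 65, 80, 95]             # yards downrange
readings = [2668, 2644, 2620, 2597, 2573, 2550]  # measured velocity, fps

n = len(distances)
mean_x = sum(distances) / n
mean_y = sum(readings) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(distances, readings))
         / sum((x - mean_x) ** 2 for x in distances))
muzzle_velocity = mean_y - slope * mean_x  # line's intercept at distance = 0

print(f"Extrapolated MV: {muzzle_velocity:.0f} fps")  # 2699 fps
```

A straight line is a reasonable local approximation over the first 100 yards or so; over those 84 real data points per shot, small measurement noise on any single reading washes out.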
The company that makes the LabRadar is named Infinition, and they’ve been manufacturing high-end instrumentation radars for more than a decade. Infinition’s high-end radars are used daily by professionals at serious research centers, ballistic labs, and proving grounds around the world. So they are experts among experts in this field, and this consumer-grade product incorporates a lot of the technology and lessons learned from their high-end radars, bringing that technology into the hands of shooters and hunters. The LabRadar is a big leap ahead of most traditional light-based chronographs in terms of accuracy and reliability. Infinition says the LabRadar has an accuracy of 0.1%.
I went with 3 LabRadars for a couple of reasons. My first thought was to mitigate the risk of a bad measurement or device malfunction potentially skewing the results. While I can’t remember experiencing a moment when the device gave a “bad reading,” I felt like it was a good idea to alleviate that concern. I did that by taking the median of the 3 measurements, which means if one of the 3 devices had a reading that was lower or higher than the other 2 devices it had no impact on the results. Honestly, most of the time the measurements were within 1-2 fps on all 3 devices, so this was probably overkill – but at least we don’t have to worry about the recording device skewing the results.
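The median-of-three logic is trivial, but it is worth seeing why it shrugs off a single outlying device (readings below are hypothetical):

```python
import statistics

# Per-shot consensus from 3 chronographs: take the median, so one outlying
# device cannot move the recorded value at all.
shot_readings = [2701.8, 2702.3, 2689.0]   # third device reads oddly low
print(statistics.median(shot_readings))    # 2701.8 -- the outlier is ignored
```

With a mean, that low third reading would have dragged the recorded velocity down by over 4 fps; with the median it has zero influence.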
Also, while it’s rare, a single LabRadar can occasionally miss a shot. Having 3 devices ensured at least 2 of them caught every shot – and I can confirm that all 760 shots for record were measured by at least two LabRadars. I was curious how often 1 of the 3 devices didn’t catch a shot, so I did the analysis, and all 3 devices captured the velocity 95% of the time.
I was careful in how I set up the devices. I reread the manual to make sure I hadn’t misunderstood or forgotten something. You can explicitly set the transmission frequency of the radar so you can use multiple radars without them interfering with each other. The manufacturer recommends at least 2 channels of separation between devices; I used 4+ channels of separation. I also ensured the devices were all similar distances from the muzzle and aligned them relative to the muzzle as the manufacturer suggests. Finally, I configured the projectile offset on each device to match the distance from the bullet’s path to the side of that radar, as recommended to optimize precision.
When it came to the rifle and firing process, I used two different rifles. I do personally own a couple of high-end 6.5 Creedmoor rifles, but I know most of my readers aren’t using an $8,000 custom rifle setup. Also, it’s possible that a particular kind of ammo might perform better out of one rifle than another, because of differences in the chamber, barrel, and other mechanical nuances. Since more people use factory rifles than custom rifles, I decided to buy a stock Ruger Precision Rifle (RPR) to use in this test. Again, I bet if I’d have reached out to Ruger they would have gladly loaned me a rifle for this test, but I decided to simply buy a brand new one from GunBroker.com, just like my readers would, to try to ensure it was representative and not a rifle that potentially had been cherry-picked off the line. So that was another out-of-pocket expense for this test, and I hope all of this shows how serious I am about objective testing. I am whole-heartedly in search of the real, unbiased truth to help fellow shooters!
The custom Surgeon rifle featured a 22-inch barrel, and I decided to shoot the test without the suppressor attached (i.e. bare muzzle). I didn’t want to risk it affecting the results, whether by coming loose, causing heat mirage that skewed the group sizes, or making the shots too quiet to reliably trigger the LabRadars. The Ruger Precision Rifle came with its stock 24-inch barrel, and I didn’t change a thing about the whole rifle. Both rifle barrels had a 1:8 twist rate.
Another thing I thought about that could potentially skew the results was the barrel condition. So I did these things to mitigate some potential issues related to that:
- First, I “broke in” the new barrel on the RPR by firing 150 rounds down it. In my experience, the velocity has always stabilized by that point, and it had with the Ruger Precision Rifle, as well. The custom 6.5 Creedmoor had just over 2,600 rounds on it when the testing started (yes, I document every round). It was outfitted with a custom Bartlein barrel with a StraightJacket and was chambered by Surgeon Rifles. If you aren’t familiar with the StraightJacket, you should read my massive barrel test research published in Modern Advancements for Long Range Shooting, Vol 2. (If you are reading about this test, I guarantee you will LOVE that book. I find myself referencing it more than any other book – and I’ve read virtually all of them.)
- I ran a cleaning regimen during all testing where I would start with a clean barrel and fire 4 shots that weren’t for record to foul the barrel and ensure all the cleaning solvent was out. Then I would fire no more than 30 shots for record before I cleaned the barrel and repeated the process. That would always be 10 rounds from each of 3 different types of ammo. The cleaning process and method were consistent throughout testing. I randomized the order that I shot the ammo and changed it between the first and second boxes.
- The barrels were always allowed to cool to ambient temperature between each 10-shot string for record.
So I recorded a 10-shot string from each rifle and each box of ammo. The total sample size for each brand and type of ammo was 40 rounds, which was from two different lots of ammo purchased 6 months apart from different distributors. Then I was OCD and paranoid about every aspect that might skew the results, and I tried to control for those aspects in any way I could think of. Now let’s see the results!
I will be providing the detailed data, and not just the watered-down summary info or an A/B/C style rating like many magazines tend to do. That is primarily for transparency, but also because I know I’ve attracted a lot of detailed and critical readers – because that’s what I am! I know my subscribers include ballisticians, Olympic and world champion shooters, statisticians, and many professional researchers within the small arms community. You guys are smart people (honestly, it’s a bit intimidating knowing those people are reading), so my approach is always to present the facts and let you guys draw your own conclusions. This test is no different. Don’t worry – you’ll get exhaustive detail in the next post!
Having said that, I know some guys just want to see the head-to-head comparison, so before we dive into all the details for each type of ammo tested, we’ll cut to the chase and start by looking at the overall summary of performance when it comes to muzzle velocity.
As we already established, SD is a more reliable and effective stat than ES when it comes to quantifying muzzle velocity variations, so that is what I’m going to focus on. (But don’t worry, I’ll provide the measured ES’s too in the detailed data.) The chart below shows what the average SD was for each of the four 10-shot strings that I fired with each type of ammo. That is based on a total sample size of 40 rounds for each type of ammo. I also calculated the stats with a different approach where I normalized the velocity by rifle and lot of ammo, and then calculated the SD over all 40 shots in one sampling – and the results of both methods were virtually identical (average difference was 0.1 fps). I thought the average SD of the four 10-shot groups is a bit more straightforward and easy to understand, so I ranked them based on that method below.
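The “normalized” pooling approach mentioned above can be sketched in a few lines. Each string’s own average absorbs the rifle and lot offsets, and the residuals from all strings are pooled into one SD (dividing by n minus the number of strings, the appropriate degrees of freedom). The velocities below are made-up examples, with an intentional offset between the two strings:

```python
import statistics

# Pool velocity variation across strings after removing each string's own
# average (which absorbs rifle-to-rifle and lot-to-lot MV offsets).
strings = [
    [2700, 2708, 2695, 2703, 2699, 2711, 2697, 2705, 2701, 2694],  # rifle A
    [2742, 2736, 2749, 2740, 2745, 2733, 2747, 2739, 2744, 2738],  # rifle B
]

residuals = []
for string in strings:
    avg = statistics.mean(string)
    residuals.extend(v - avg for v in string)

# Divide by (total shots - number of strings), since each string's mean
# was estimated from the data.
pooled_sd = (sum(r * r for r in residuals)
             / (len(residuals) - len(strings))) ** 0.5
print(f"Pooled SD across strings: {pooled_sd:.1f} fps")  # 5.3 fps
```

Note the ~40 fps offset between the two strings never inflates the result, because each string is centered on its own average before pooling.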
The Best 5
The Federal Premium Gold Medal 140 MatchKing ammo had the lowest SD at just 8.6 fps! Considering that was a sample of 40 rounds in two different lots of ammo from two different rifles – that is an absolutely stunning performance. Federal, I tip my cap to you for that performance over multiple boxes of ammo. Wow!
Sig was not far behind in 2nd with their 6.5 Creedmoor Elite Performance Match 140 OTM ammo also posting an SD in the single digits – at just 9.1 fps! Considering that is mass-produced factory ammo, that is exceptional. I would suspect both are more consistent than the ammo that the majority of reloaders produce, even after investing a ton more time per round. I’m not saying super-OCD handloaders with good equipment can’t top it – but that is absolutely match-worthy ammo!
Behind those top 2 were the Berger triplets taking 3rd, 4th, and 5th with SD’s from 10.6 to 11.4 fps. I tested 3 different types of Berger Match ammo, each loaded with a different bullet: the 120 Scenar-L, 130 OTM Tactical (Hybrid), and the 140 Hybrid. Those 3 types of Berger match ammo landed neck-and-neck with just 0.8 fps of variance in their overall SD’s after 40 rounds – even over two lots of ammo. How is that for consistency?!
Remember what expert ballistician and long-range expert Bryan Litz said about ammo with an SD around 10 fps: “Exceptional factory ammunition … Acceptable for many long range shooting applications. 10 fps SD will give you approximately 1 MOA of vertical dispersion at 1000 yards.” I agree with Bryan. The factory ammo in this top 5 is exceptional!
The Middle 10
In the middle of the pack is a long list of other types of ammo with SD’s between 12.3 and 15.2 fps. In fact, 10 of the 19 types of ammo landed in that band that was just 3 fps wide. Before this research, I had shot literally thousands of rounds of match-grade factory ammo, and I would have guessed that 12-15 fps SD’s were pretty typical based on my experience – and it looks like that is true for the majority of the ammo tested. However, there were a few clear outliers both above AND below that range where the bulk fell.
Here are a couple of interesting notes from this middle group:
- All of the Hornady 6.5 Creedmoor Match ammo landed in this group with SD’s ranging from 12.7 to 15.0 fps. That is just a 2.3 fps spread among all 3 types of Hornady ammo tested (120, 140, and 147 gr. ELD-M’s).
- Both types of Copper Creek custom-loaded ammo also landed in this group, although they were both a little better at 12.3 and 13.6 fps.
Remember what Bryan Litz said about ammo with an SD around 15 fps: “Good factory ammo or poor hand-loads. Useable for long range shooting but not ideal.” So these should all be considered good ammo when it comes to consistent muzzle velocity. If some of the ammo from this middle pack prints tight groups and/or has a higher BC bullet and/or leaves the muzzle a little faster – it’s plausible that they could have a higher hit probability at long range than even those in the top 5 with the lower SD’s. We’ll just have to wait to see how it all shakes out when we put it all together.
The Bottom 4
After firing 760 rounds for record, there were 4 types of ammo with SD’s of 17 fps or more:
- Winchester Match 140 MatchKing: Average SD = 17.0 fps
- Nosler Match 140 Custom Competition: Average SD = 17.9 fps
- Nosler Match 140 RDF: Average SD = 19.9 fps
- Remington Match 140 OTM: Average SD = 21.2 fps
There is no way around it: Those SD’s are too big to be considered “match-grade.” The average Extreme Spread (ES) of those over 10-shot strings was 56-68 fps, which definitely makes it hard to hit anything at distance. For context, 60 fps of muzzle velocity difference equates to 17” of vertical drop at 1,000 yards for the average 6.5 Creedmoor ammo tested. Here is what Litz said about ammo with an SD of 20 fps or more: “Excessive muzzle velocity variation. Not suitable for long range shooting.” I know this isn’t going to make those companies happy – but I agree. Consistent muzzle velocity is critical to long-range shooting, and it looks like these brands simply failed the test.
Predicting The Future
Okay, before we move on, I did want to say one thing. Some might think this is academic, but I think it’s important. The chart below shows the range that statistics says has a 90% chance of containing the true SD of the population. What do I mean by “the true SD of the population”? Well, we know exactly what the SD was for the 40 shots I fired of each type of ammo – but we will never fire those same shots again, even from the same rifles! Since those shots are in the past, we can quantify their variance with absolute precision – even out to the 5th decimal place if we wanted. But if we ran this test from start to finish again with new boxes of ammo, would we get the same SD’s out to the 5th decimal place? No.
“Just because we can measure or calculate something to the 2nd decimal place doesn’t mean we have that level of accuracy or insight into the future! We can only speak in terms of absolute, precise values about shots fired in the past. When we’re trying to predict the future, we can only speak in terms of ranges and probabilities.” – from Statistics For Shooters Part 1 – Predicting The Future
When I refer to “the true SD of the population,” I’m talking about predicting what the overall SD would be if we fired 100,000 rounds of each type of ammo. See now we are talking about the future, not the 40 rounds that I shot in the past. This is precisely where statistics can help us get valuable insight!
With that in mind, the chart shows a blue dot which is the measured SD for each 40-shot sample, but then there is a range on both sides of that point representing where the SD of the population might fall. I chose a 90% confidence level for these calculations, which basically means if we bought 10 cases of ammo we could reasonably expect the ammo in 9 of those cases to have an SD inside that range and 1 might fall outside of that range. There is a 5% chance it would fall above the top arrow and a 5% chance it’d fall below the bottom arrow, leaving us 90% right in the middle.
You can see that there is considerable overlap between a lot of these types of ammo. Look at all of those where the blue dot is between 13 and 15 fps, which is the Copper Creek 140 to around the PRIME 130 ammo. The majority of those ranges overlap, which means you can’t have much confidence that there would be a real difference between those over the long haul. I realize that I’m kind of undermining my own test results by saying that, but I’m more interested in helping you guys have real insight than propping up my research as “definitive.”
With this in mind, I think of these as “classes of ammo.” The first two types (Federal 140 and Sig) seem to largely be in a class of their own, at least based on our 760-round sample size. Then the 3 types of Berger ammo fall somewhere between those and the big middle group that included 10 types of ammo. Finally, you have those on the tail end that start creeping up pretty high. Now, there is still considerable overlap between, say, the Hornady Match 140 ELD-M ammo and the PRIME Match ammo, so maybe those wouldn’t turn out to be so different over a larger sample size – or maybe the true SD of the population for the PRIME ammo would end up at the bottom of its range and the Hornady would be at the top of its range, in which case they’d reverse order. So those with significant overlap could be a toss-up, depending on the sample you happened to get. But, with all that said, the Sig ammo is almost certainly always going to beat the Nosler ammo (assuming neither of them changes their components or processes).
If you are wondering how to get more confidence and/or tighter ranges, I want those too! The problem is you have to shoot A LOT more ammo. As an example, the Hornady 147 gr. ELD-M ammo had a measured SD of 13.7 fps with a 90% confidence range of 11.6-16.9 fps based on our 40-shot sample. If we had fired 100 rounds with that same measured SD, that would only shrink the range to 12.3-15.5 fps. If we’d shot 200 rounds with the same SD, we could say with 90% confidence that the true SD of the population would be between 12.7 and 14.9 fps. So even if we fired a whole case of every type of ammo and burned out multiple barrels doing the research, we would still have a fairly significant range with overlap between some of these.
Now, let’s go the other direction: What if we’d only fired 10 rounds and got the same 13.7 SD? We would only be able to say the true SD of the population is somewhere between 10.0-22.5 fps! That is a massive range! See why I decided to shoot 40 rounds of each type of ammo? (Also, see why you should fire more than one 10-shot string to see what your SD really is?) I was trying to shrink the ranges as much as I could – without having to refinance my house to pay for the ammo! It’s always a balance of confidence and further investment and time in testing, but I hope this gives you guys context for how to interpret the results. 😉
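For the statistically curious: the confidence ranges quoted above can be reproduced with the standard chi-square interval for a standard deviation. This assumes the velocities are roughly normally distributed, and the snippet uses SciPy for the chi-square quantiles:

```python
from scipy.stats import chi2

def sd_confidence_interval(sample_sd, n, confidence=0.90):
    """Chi-square confidence interval for the population SD, assuming
    normally distributed velocities."""
    dof = n - 1
    alpha = 1 - confidence
    lower = sample_sd * (dof / chi2.ppf(1 - alpha / 2, dof)) ** 0.5
    upper = sample_sd * (dof / chi2.ppf(alpha / 2, dof)) ** 0.5
    return lower, upper

# Measured SD of 13.7 fps (the Hornady 147 gr. ELD-M example above):
for n in (10, 40, 100, 200):
    lo, hi = sd_confidence_interval(13.7, n)
    print(f"n = {n:3d}: 90% CI for true SD = {lo:.1f} to {hi:.1f} fps")
# n =  10: 10.0 to 22.5 fps
# n =  40: 11.6 to 16.9 fps
# n = 100: 12.3 to 15.5 fps
# n = 200: 12.7 to 14.9 fps
```

Notice how slowly the interval tightens as the sample size grows, which is exactly why even a 40-shot sample leaves overlapping ranges between similar loads.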
Average Muzzle Velocity
Now let’s do one more head-to-head comparison, and that is simply the average muzzle velocity. These were all 6.5 Creedmoor match-grade ammo but the bullet weights ranged from 120 to 147 grains – so the muzzle velocities had a pretty wide range. In fact, there was just over 350 fps difference in the average MV – even when shot from the same rifles!
Here are the overall average muzzle velocities that I recorded over the 40 rounds, and they’re grouped by bullet weight:
Here are a few interesting things that stuck out to me about the chart above:
- The biggest outlier is the Berger Match 120 gr. Lapua Scenar-L ammo, which was 112 fps faster than the Hornady Match 120 gr. ELD-M ammo.
- The Berger 140 Hybrid ammo has the same MV as the PRIME 130 ammo – even though the bullet weighs 8% more.
- The Black Hills 147 gr. ELD-M ammo had an average MV that was 77 fps faster than the Hornady ammo loaded with the same bullet.
- The Sig ammo that had crazy consistent muzzle velocity (9.1 fps SD) was one of the slowest compared to all the rest of the ammo using a 140 gr. bullet. In fact, the Berger 140 ammo was 124 fps faster! I wonder if that increase in speed more than makes up for the slight difference in SD in terms of hit probability at long range? We’ll definitely learn the answer to that in a subsequent post!
As I mentioned earlier in this article, veteran long-range shooters are more concerned with the consistency of their muzzle velocity than with maximizing their muzzle velocity. If they can find something more consistent that is 50 fps slower, that is what the majority would go with. There was a time when “flat-shooting cartridges” were all the rage, but with the ballistic engines we have now, a slower, more consistent velocity just means you dial a couple more clicks – and you get a higher hit probability because you have less vertical stringing. I say all that so we don’t put too much weight on the overall muzzle velocity here, although if you can get it fast and consistent, that is having your cake and eating it too.
For more context on this, check out How Much Does Muzzle Velocity Matter?
Finally, here is the overall summary showing the highlights for each type of ammo in tabular format. It includes the average 10-Shot SD and average muzzle velocity that I shared above, but it also includes the average 10-shot ES and the difference in average velocity between the two boxes.
Any Correlation to Consistency of Physical Measurements?
In the last article, I published data about physical measurements of the loaded rounds to see how consistent they were in terms of overall length and weight, and I also measured how concentric they were (i.e. how much bullet runout the loaded rounds had). I wanted to measure all of that stuff because as reloaders we can often obsess about those kinds of things, and we tend to believe those things are highly correlated to performance – meaning the more consistent the ammo measures the better it will perform. I thought it’d be interesting to see if some type of ammo was good or bad in one of those aspects, did that clearly translate to performance in the field when it came to consistent muzzle velocity or group sizes? So let’s take a quick look at that.
In the last article, we saw that both the Remington and Nosler 140 RDF rounds that I took apart had the most variance in powder charge weight – and those two also had the most inconsistent muzzle velocity of the 19 types of ammo tested. But, remember I didn’t buy an extra box of all 19 types of ammo to deconstruct and measure the individual components. I did weigh loaded rounds of all 19 and charted that variance, but I only deconstructed 7 types of ammo.
But, while we might want to celebrate a clear correlation there, it isn’t quite that clean. The measured standard deviation in powder charge weight for the Berger Match 140 Hybrid ammo was very similar to the Remington’s (0.17 and 0.20 grains, respectively), yet the Berger Match 140 Hybrid ammo ranked in the top 5 in terms of muzzle velocity consistency. If there were a strong correlation between powder weight variance and muzzle velocity consistency, those two should have performed similarly – but they didn’t.
Another thing that muddies the waters is that the Copper Creek 144 LR Hybrid ammo had the most consistent powder weight measurements. Its SD was just 0.09 grains, compared to the 0.17 grains of the Berger 140 Hybrid – so you’d think it should have had super-consistent velocity. The Copper Creek ammo did finish in the top half, but outside of the top 5 – and the Berger 140 Hybrid had a slightly lower velocity SD even though its powder charge SD was almost twice as high.
What if we look at both the brass weight variance and the powder weight variance – would that complete the picture? The brass weight variance of the Copper Creek ammo was the highest (loaded with Hornady brass), so maybe those things offset each other. In fact, when you look at the variance in total weight of the loaded rounds, we saw that the Sig match ammo and the Berger 130 Hybrid ammo were most consistent – and sure enough, those were some of the top performers here for muzzle velocity consistency! But the 3rd most consistent by measured weight was PRIME and it finished slightly below average in terms of muzzle velocity consistency. The Nosler 140 RDF ammo actually ranked 6th in terms of weight consistency of loaded rounds, but it was one of the very worst in terms of muzzle velocity consistency with an SD of 19.9 fps.
So at least from my perspective, there isn’t a strong correlation between those physical measurements and how consistent the muzzle velocity was. Now, if I’d taken apart all of the rounds and had a better breakdown between the powder weight and brass variance, maybe a combination of those would show some correlation, but at least based on the data I collected, there is too much noise to make any claims about correlation. That is very interesting to me, because don’t we all obsess over those things when we’re reloading? Maybe the relationship is just more complex than I’m able to untangle. If anyone else has any insight into this, please leave a comment and enlighten us all!
Alright! That’s it for the summary and head-to-head velocity comparisons. The very next post will share the exact details I collected for each type of ammo, along with some interesting nuances I discovered with many of them.
If you’d like to be the first to know when the next article is published, sign up to receive email notifications about new posts.
6.5 Creedmoor Match Ammo Field Test Series
Here is the outline of all the articles in this series covering my 6.5 Creedmoor Match-Grade Ammo Field Test:
- Part 1: Intro & Reader Poll – Cast Your Vote For Which Will Perform Best
- Part 2: Round-To-Round Consistency For Physical Measurements
- Part 3: Live-Fire Muzzle Velocity & Consistency Summary (this article)
- Part 4: Live-Fire Muzzle Velocity Details By Ammo Type
- Part 5: Live-Fire Group Sizes & Precision
- Part 6: Overall Performance & Long-Range Hit Probability
Also, if you want to get the most out of this series, I’d HIGHLY recommend that you read what I published right before this research, which was the “Statistics for Shooters” series. I actually wrote that 3-part series so my readers would better understand this ammo research that I’m presenting, and get more value from it. Here are those 3 articles:
- How To Predict The Future: Fundamentals of statistics for shooters
- Quantifying Muzzle Velocity Consistency: Gaining insight to minimize our shot-to-shot variation in velocity
- Quantifying Group Dispersion: Making better decisions when it comes to precision and how small our groups are