As long-range shooters, we tend to obsess over every little detail. After all, we’re trying to hit relatively small targets that are so far away you may not even be able to see them with the naked eye. While you might be able to get away with minor mistakes and still ring steel at short and medium ranges, as you extend the range those small mistakes or tiny inconsistencies are magnified. So, most things are important … but to differing degrees. This series of posts takes a data-driven approach, using Applied Ballistics’ Weapon Employment Zone (WEZ) analysis tool to gain insight into how different field variables in real-world shooting affect the probability of hitting long-range targets.
I’ve played around with the WEZ tool a lot, and it was very enlightening! It challenged a lot of my long-held assumptions about how important different aspects were. As Bryan Litz said in his Accuracy & Precision for Long-Range Shooting book, “Looking at each variable separately teaches us how to assess the uncertainties of any shot and determine how critical each variable is to hitting the target.”
Previous posts looked at what impact we could expect from tightening our groups, lowering our muzzle velocity SD, picking the ideal cartridge, or increasing muzzle velocity. In this post we’ll look at another element that plays into our ability to get rounds on target:
How Much Does Accurate Ranging Matter?
A few readers have asked me to include a post on how much accurate ranging affects long-range hit probability. As Bryan Litz explains, “Accurately determining the range to a target is fundamentally important for successful long-range shooting. Due to the bullet’s arcing trajectory, if the range is not known accurately, the shooter will over or under-shoot the intended point of impact. … Laser rangefinders are the most accurate (practical) way of measuring a range, typically capable of +/-1 yard. But there are problems with laser rangefinders related to beam divergence and target reflectivity.” I wrote a post that summarizes how rangefinders work and highlights some of the issues Bryan is referring to. This illustration from Vectronix is a great visualization with many of those factors:
Bryan goes on to explain, “Sometimes the intended target isn’t reflective enough to register in its environment, so the shooter ends up ranging something that’s more reflective in the target’s surroundings. Sometimes it could be a rock 5 yards in front of the target, or sometimes it could be a tree line 100 yards behind the target. Most tools and methods available for ranging targets become less accurate as the range increases. This is unfortunate since the most distant shots are the ones for which you need the most accurate range information.”
Some rangefinders are capable of that +/- 1 yard accuracy, and others don’t even claim that high of accuracy. I did an exhaustive test on the actual ranging capabilities of 8 popular rangefinders in the field. Not all rangefinders are created equal! You can check out the ranging performance results from that test at Field Test Ranging Performance Results.
Now, let’s dive into the WEZ analysis on ranging uncertainty. Here are the results:
Note: The chart axis increases by +/- 1 yard up to 5 yards, then it transitions to +/- 2.5 yards. I wanted to show the effects of fine tuning the range, as well as grossly over/under estimating.
Hit percentage drops off quick! On that 10” target at 700 yards, if you’re off by just 10 yards your odds of hitting the target drop by almost 10%. If you’re off by 20 yards, you’ve got a 50/50 shot of hitting it.
These are the shot simulations for a few of those scenarios, to give you a better understanding of what’s happening.
“Training to improve the accuracy of one’s ranging abilities can result in direct and dramatic improvements in hit percentage,” Bryan explains in his book. You can say that again! If you accidentally ranged a tree 10 yards in front of the target, or the berm 20 yards behind it … that significantly changes your odds of the bullet finding its intended target.
You can see in the shot simulations that we start to miss because of vertical dispersion. That is how poor ranging shows up in your misses. If you’re missing because of vertical dispersion, it’s likely due to ammo velocity variance or a bad range (as long as you know your dope is correct and you didn’t pull the shot). Wind already gives us a lot of horizontal dispersion, so when you add vertical dispersion to that … the picture gets ugly quickly.
One point to keep in mind is that all of this analysis assumes you have centered groups. That means they represent the best-case scenario for hit percentage, since your odds only decrease if groups come off center. If your scope isn’t zeroed, or your rifle is canted slightly to one side, or your scope’s clicks aren’t calibrated correctly, or you pull the shot slightly … then your hit probability can decrease dramatically. But these simulations assume we have all that stuff squared away.
Uncertainty from Milling Targets for Range Estimation
After seeing those results, and realizing how important an accurate range was … I started wondering how much error was involved when milling targets for range estimation. For those that are newer, milling a target means that you use a reticle like a ruler to measure the target, then you do a calculation to estimate the range to the target based on that (more on mildot range estimation). Here is an example of what that looks like through a scope. Some advanced reticles have finer subtensions, but the smallest on this reticle is 0.2 mils.
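For those who like to see the arithmetic, the standard mil-relation can be sketched in a few lines of Python. This is just a generic illustration of the formula described above, not code from the WEZ tool, and the function names are my own:

```python
def target_size_mils(target_in, range_yd):
    """Apparent height in mils of a target of known size (inches)
    at a known range (yards). One mil subtends 1/1000 of the range,
    and there are 36 inches in a yard."""
    return (target_in / 36.0) / range_yd * 1000.0

def range_from_mils(target_in, measured_mils):
    """The classic mil-ranging formula: estimate range (yards) from a
    known target size (inches) and its measured size in the reticle."""
    return target_in * (1000.0 / 36.0) / measured_mils

# The 10" target at 700 yards subtends about 0.397 mils -- roughly
# twice the 0.2 mil resolution of the reticle shown above.
print(round(target_size_mils(10, 700), 3))   # 0.397
print(round(range_from_mils(10, 0.397)))     # 700
```

Because the measured mils end up in the denominator, small reading errors get amplified more and more as the target subtends fewer mils, which is exactly the problem at long range.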
The primary potential for error seems to come from how precisely you’re able to measure the target size in mils through the scope. I wasn’t sure how good guys could get at that, but Marine Scout Sniper John McQuay of 8541 Tactical explains: “To be accurate you need to train your eye to measure to the nearest tenth of a mil or 0.1 mil. Most experienced snipers can measure down to five hundredths of a mil or 0.05. All it takes is practice.” Bryan Litz tells us in his Accuracy & Precision book “The [Horus] TreMor 2 reticle is capable of MILing targets with 0.02 MIL resolution, which enables far more accurate range determination and directly improves hit percentage.” That goes to show how much a specialized reticle can help when measuring targets.
So, we know with practice, a shooter can mil a target within 0.05 mils. We’ll work within that. We’ll start with +/- 0.005 mils of error, which is 10 times better than our scout sniper tells us is realistic. Our 10” target at 700 yards would actually measure very close to 0.397 mils, so +/- 0.005 mils of error might mean we mistake that for 0.39 mils or 0.40 mils through our reticle. That’s a minuscule error, and more than plausible.
Then on the other end of our spectrum, we look at what would happen to your range estimation if you had a measurement error of +/- 0.025 mils, which is closer to the 0.05 mils of precision our scout sniper tells us the most experienced snipers are able to do. That means a target might actually measure 0.375 mils, but we mistook it for either 0.35 or 0.40 through our scope. That is a completely plausible margin of error, even for experienced shooters.
To help illustrate what kind of margin of error we’re talking about, look at the photo below. How many mils tall would you say the IPSC silhouette target is?
What did you get? 0.8 mils? Maybe you said a little more than that … 0.81 mils, or even 0.85 mils. The distance to the target is known to be exactly 1,000 yards (measured with a $24,000 Vectronix Vector 23 Rangefinder), and it is a standard IPSC target with a height of 29.5 inches. So we can calculate the exact size to be 0.819 mils. How’d you do?
Let’s assume we’re experienced enough and have good enough scope clarity and sharp eyes that we wouldn’t call that example 0.9 mils. But would it be too far-fetched to think we could mistake it for 0.81 or 0.825 mils … or maybe even just 0.80 or 0.85 mils? 0.80 or 0.85 mils represent the worst case scenarios we’ll be analyzing, roughly +/- 0.025 from 0.819 mils. The best case scenario of +/- 0.005 mils would mean we guessed around 0.815 to 0.825 … which would be amazing. Remember that is 10x better than our Marine Scout Sniper thought was plausible for the most experienced snipers.
Keep in mind that on that example I tried to help by zooming into the photo, boosting the contrast, sharpening the image in Photoshop, and even gave the reticle a slight glow around the edges to make it easier to measure. Of course, you probably don’t have those luxuries in the field.
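To put numbers on that IPSC example, here’s a quick sketch using the same mil-relation formula. The specific readings are just the plausible misreads discussed above:

```python
# 29.5" IPSC target at a laser-verified 1000 yards
TRUE_MILS = (29.5 / 36.0) / 1000.0 * 1000.0   # ~0.819 mils

def est_range(measured_mils, target_in=29.5):
    # Mil-ranging formula: range (yd) = size (in) * (1000/36) / mils
    return target_in * (1000.0 / 36.0) / measured_mils

for reading in (0.80, TRUE_MILS, 0.85):
    print(f"read {reading:.3f} mils -> estimate {est_range(reading):.0f} yd")
```

Reading the target as 0.85 mils puts the estimate around 964 yards, and 0.80 mils around 1024 yards … roughly a 60-yard swing from a spread of only 0.05 mils on the reticle.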
So let’s look at how being off by a tiny amount when milling a target can impact our range estimation. This is based on our target at 1000 yards, but the amount of error at 700 yards is virtually identical.
You can see on the chart, in the best-case scenario our range would be off by +/- 9 yards. That would result in around a 7-10% decrease in hit percentage in our 700 and 1000 yard examples. An error of just 0.01 mils results in 18 yards of ranging error, which would drop that hit percentage by 20-25%. Then in the middle of our range, +/- 0.015 mils of error would result in our range being off by 27 yards. So by the middle of the chart, we’ve already fallen below a 50% probability of hitting both the 10” circle at 700 yards and the 20” circle at 1000 yards. If you keep going up to the top of the chart with an error of +/- 45 yards, our hit percentage drops to close to 25%.
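Those chart values track the first-order sensitivity of the mil-ranging formula for the 20” target at 1000 yards (an assumption on my part about how the chart was built): a range estimate changes by roughly the true range divided by the true mil size, per mil of reading error. A minimal sketch:

```python
TRUE_RANGE = 1000.0            # yards
TRUE_MILS = 20.0 / 36.0        # 20" target at 1000 yd -> ~0.556 mils

def range_error(mil_error):
    # First-order approximation: dR ~= R * dm / m
    # (1000 yd / 0.556 mils = 1800 yd of range error per mil of misread)
    return TRUE_RANGE * mil_error / TRUE_MILS

for e in (0.005, 0.010, 0.015, 0.025):
    print(f"+/-{e:.3f} mils -> +/-{range_error(e):.0f} yd")
```

That reproduces the 9 / 18 / 27 / 45 yard figures above, and makes it obvious why the error grows so fast: the smaller the target appears in mils, the more yards each hundredth of a mil is worth.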
This aligns with a similar analysis from Bryan’s book. Litz said “Consider a novice shooter with a basic MIL dot reticle, and little practice or experience. Given these tools and skills, a shooter may be able to estimate range to within +/- 50 yards at 1000 yards.” Of course, the impact of this on hit percentage is strongly correlated with the target distance. Bryan tells us “The farther the targets are away, typically the less resolution they can be ranged with.”
My takeaway from this … if you’re looking for first round hits, a laser rangefinder seems to be a pretty handy tool! The ability to estimate range using a mildot reticle can be a great back-up plan, because electronics can and do fail … but after seeing how much an accurate range plays into hit percentage on long-range targets, a laser rangefinder will always be my Plan A. This also reminds me to train on ranging, and not just shooting. And I shouldn’t rush the ranging process, but instead should get a steady rest and possibly range the target a couple times to double-check the reading. In hunting or competitions with unknown distance targets, it’s easy for me to get in a hurry and not take the care I should in ranging the target. I’ve learned not to rush my shot, but rushing my range acquisition could have as much of an impact on whether the bullet finds its target. Now that’s new insight!
Other Posts In This Series
This post is one of a series of posts that takes a data-driven look at what impact different elements have on getting hits at long range. Here are the other posts in this series:
- How Much Does Group Size Matter?
- How Much Does SD Matter?
- How Much Does Cartridge Matter?
- How Much Does Muzzle Velocity Matter?
- How Much Does Accurate Ranging Matter?
- How Much Does Wind Reading Matter?
- Overall Summary
If you want to dig more into this subject or explore some of these elements for your specific rifle, ammo, and ballistics, I’d encourage you to buy the Applied Ballistics Analytics Package to run these kinds of analyses yourself. You could also pick up Bryan’s Accuracy and Precision for Long-Range Shooting book, which has a ton of great info on these topics and other aspects of shooting.
In the miss% line graph you have, what causes that switch at the very end? Since the bullet is falling faster at farther targets, being off by 5 yards should affect the outcome more than at a closer target. Looking at the extremes, the difference between a 700 and a 750 yard dope is smaller than the difference between a 1000 and a 1050 yard dope.
But at the end of that chart, at some point the 1000 yard shot becomes easier, which seems very counterintuitive to the above. Is that the point where the target size is a greater benefit than the closer distance, even though the closer distance decreases the variable we are measuring?
So I went ahead and did some rough math to see if I could make sense of this. I wanted to isolate the distance measurement as the major factor here. My methodology was to calculate the minimum mils you would need to be off by to miss, then see what difference in distance that would require. I used a .338 Lapua Sierra MatchKing BTHP 300 grain bullet in this calculator: https://www.federalpremium.com/ballistics-calculator
20” gong at 1000 yds
Target is 0.556 mils high, so a miss would need to be .278 mils off.
Drop at 1000yds is about 9.1 mils so a missed shot would be 9.38 or 8.82
If you doped for those then you would be aiming at about 1022 or 977
So you can be +22 or -23 yards off
10” gong at 700 yds
Target is .397 mils high, so a miss would need to be .199 mils off.
Drop at 700yds is about 5.3 mils so a missed shot would be 5.5 or 5.1
If you doped for those then you would be aiming at about 718 or 683
So you can be +18 or -19 yards off
Looks like the math supports that the extra size of the 1000yd target is more beneficial than the lower drop of the 700 yd target.
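Adith’s arithmetic can be checked with a few lines of Python. Note the drop values and gradient are his numbers from the Federal calculator, quoted above, not something I’ve verified independently:

```python
def miss_margin_mils(target_in, range_yd):
    # Aiming at center, the shot stays on target as long as the elevation
    # error is under half the target's apparent height in mils
    return (target_in / 36.0) / range_yd * 1000.0 / 2.0

print(miss_margin_mils(20, 1000))   # ~0.278 mils for the 20" gong
print(miss_margin_mils(10, 700))    # ~0.198 mils for the 10" gong

# Near 1000 yd the drop gradient for this load works out to roughly
# (9.38 - 8.82) mils over (1022 - 977) yd, per the comment's numbers
gradient = (9.38 - 8.82) / (1022 - 977)          # ~0.0124 mils/yd
print(miss_margin_mils(20, 1000) / gradient)     # ~22 yd of range tolerance
```

The 1000-yard target’s larger mil margin (0.278 vs 0.199 mils) outweighs its steeper drop curve, which is the counterintuitive crossover the comment above was asking about.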
Wow, thanks for the effort you put in there Adith. I agree that it seems a little confusing, and I wasn’t sure what was causing it. As always, your input is very insightful.
You should do one on how much charge weight accuracy matters, e.g. +/- 0.2 grains at 1000 yards.
Don’t you think that might end up being essentially the same as the “How much does SD matter?” post? Consistent powder charge seems like it might be the biggest factor affecting muzzle velocity SD, so it is probably directly correlated. I guess neck tension and concentricity might also play into that.
I went back and looked at the load development I did on my 6XC, and it looks like increasing charge weight by 0.2 grains changed the muzzle velocity by 15 fps on average. So if you’re saying that you’d throw within 0.2 grains of your target charge 95% of the time, based on a normal distribution that would be a standard deviation of +/- 0.1 grains. Then if 0.2 grains equates to 15 fps, a +/- 0.1gr SD should correlate to a muzzle velocity SD of 7.5 fps. At least I think that should be right! We might be overthinking this at this point, but it’s a fun exercise! The equivalent SD of 7.5 fps is actually surprisingly low. With all the other uncertainties going on, that would likely have less than a 1% impact on those long-range hits. You can go check it out on the “How much does SD matter?” post to see what it comes down to exactly and how that varies as you get better or worse.
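Here’s that back-of-envelope conversion as a few lines of Python. The 15 fps per 0.2 grain sensitivity is from my 6XC load notes above, and the normal-distribution assumption is the same one as in the comment, just using the exact 1.96 SDs for 95% rather than rounding to 2:

```python
# Convert a charge-throwing tolerance into an equivalent velocity SD
SENSITIVITY_FPS_PER_GR = 15.0 / 0.2   # ~75 fps per grain (6XC load data)

charge_sd_gr = 0.2 / 1.96             # 95% within +/-0.2 gr -> SD ~0.102 gr
mv_sd_fps = charge_sd_gr * SENSITIVITY_FPS_PER_GR

print(f"charge SD ~{charge_sd_gr:.3f} gr -> velocity SD ~{mv_sd_fps:.1f} fps")
```

This lands at about 7.7 fps, a touch above the 7.5 fps figure above, because rounding 1.96 down to 2 SDs gives exactly 0.1 grains. Either way, the conclusion is the same: that’s a very low equivalent muzzle velocity SD.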
I know I have a few PhD-level Statisticians reading this … someone please correct me if I’m wrong.
Great report as usual Cal. I agree 100% and you could add other factors like how much the target is angled in relation to the shooter. In mountain terrain you’re always looking up or down at objects. This has a huge effect. Keep up the good work.
I very much agree with your new insight remark re. ranging precision. In LRH we don’t get a target card that says it is a 10″ gong…we get a deer, elk, antelope that VARIES in size and his image varies in the mirage.
Milling a tactical target (e.g. torso height, standing height, or for that matter vehicle tire height) has more variance than it appears on the surface.
Yes sir! I do have the height to the shoulder of a mule deer, whitetail, and coyote memorized. That’s a helpful unit of measure, since that is normally what is in my scope. But even that can vary by age, so there is still a judgement call and a lot of room for error. I bought rangefinding binoculars, which means I virtually always have them on me when I’m hunting. I never just grab my rifle. I usually have binos around my neck, rifle slung, and shooting sticks in my hand like a walking stick. Every other piece of equipment is optional. Oh, and a couple extra rounds in my pocket. I didn’t have those one time … that wasn’t cool.
But, I can’t argue with them still teaching that at sniper school. Equipment can fail. I just hope I’m never in a position to have to use it.
“8.0 or 8.5 mils represent the worst case scenarios we’ll be analyzing, +/- 0.025 mils.”
Am I misunderstanding the error calculation? Wouldn’t this be +/- 0.25 mils?
Really enjoying this series. I’m pretty new to long range and have been holding off on buying a laser rangefinder thinking that I could use some practice mil ranging first, but this makes me wonder if I should just get that laser.
Sorry … misspoke in the comments there. Move that decimal over one place. It should be 0.80 to 0.85 mils, which represents +/- 0.025 mils. Obviously we wouldn’t be off by 0.25 mils … that’s enormous.
Just to be clear, all the numbers shown and conclusions drawn are correct … I just made a typo there. It’s now corrected in the post. Thanks for catching that!