A Data-Driven Approach To Precision Rifles, Optics & Gear

Public Review of Scope Field Tests

Update: The field test is now complete, and all the results are published. Click here to view the overall results and ratings for each scope!

Around April 14th, I’ll begin a massive field test focused on high-end tactical rifle scopes. A lot of companies have entered this space in the past 2-3 years, and I’ve assembled 18 of the most popular models in the $1500 and up price range. Here is the line-up:

Why didn’t you include …

  • Premier – Sad news, you can no longer buy Premier scopes. PremierReticles.com says “The assets of Premier Reticles Ltd. were purchased by Tangent Theta Incorporated in early 2013. Tangent Theta’s products will replace a majority of the Premier line with a new series of professional rifle scopes.” Andrew Webber, President of Tangent Theta, told me they have no plans to continue making any of the Premier scopes (but they are working on a new line that has my attention).
  • Tangent Theta – I hoped to include the Tangent Theta 5-25×56 scope, but unfortunately they are still in pre-production. They’ve been working on a new line of scopes for 2 years, and are committed to ensuring it is perfect before releasing it. This scope includes fresh designs, and because they bought the intellectual property from Premier and brought on some of Premier’s brightest engineers … I’m very interested to get my hands on one of these. They hope to have some available for evaluation in about 3 months, and I plan to run it through these same exact tests at that point, so it can be compared to this original set of scopes.
  • Vortex Razor HD Gen II 4.5-27×56 – Similar story here … Vortex is still putting some last-minute touches on this design, and hopes to start production this summer. They just weren’t comfortable sending a model for testing yet, and I can respect that. I hope to get my hands on one as soon as possible, and run it through these same exact tests so it can be compared to this original set of scopes.
  • SWFA Super Sniper – I know a lot of people who swear by these scopes, and I’m sure they’re great. But most models are below the $1500 price limit. In fact, there is only one model that is exactly $1500. I plan to do another round of field tests in the fall specifically focused on scopes in the $500 to $1500 price range, and I hope to include a couple SWFA Super Sniper scopes then. I do plan to run the same exact tests on that second round of scopes, so you should be able to directly compare results between this original set and that later set.
  • Huskemaw Optics – Similar story to the SWFA Super Sniper, in that virtually all of their scopes are under $1500, and I’ll likely include one in the $500 to $1500 scope tests. There is a Huskemaw Tactical Blue Diamond 5-30×56 scope that is well above $1500, but I’ve never heard of anyone using it and neither had anyone on the forums I asked. It simply isn’t as popular as the others included here.
  • Swarovski – No tactical reticles (i.e. evenly spaced hash marks on the vertical and horizontal axes)
  • Counter Sniper – You’re kidding, right? 😉
  • My favorite scope – It either isn’t available with a tactical reticle (i.e. evenly spaced hash marks on the vertical and horizontal axes), isn’t in the $1500+ price range, doesn’t have at least 6x zoom on the low end and at least 18x on the high end, isn’t one of the most popular models in this market segment, or is represented well by one of the 18 scopes already in the test.

I Need Your Input

Although I’ve spent an absurd amount of time thinking through the tests I plan to run on these scopes, I don’t claim to be an optics expert. I’m an OCD engineer and a passionate shooter, and while I may know more about optics than the average Joe … there are some of you who know way more than I do. So I’ve typed up all the tests I plan to run and everything I plan to do to mitigate bias, and I’m asking for your feedback on what could be improved BEFORE I START TESTING. If you know of a tweak to one of the tests that would make it more reliable or meaningful, please tell me. If you know of a different approach that would be more repeatable or less error-prone, please tell me. The way I see it, this is a massive undertaking and I’m going to spend a ridiculous amount of time in the field testing all these anyway. So if anyone has a way to get more value out of that time … I’m all ears.

Guidelines for Feedback:

  • Can’t require buying more than $100 in equipment (unless you’re also sending me a check). I’m doing this all on my dollar, and I’ve already spent hundreds on this test … so I’m on a budget.
  • Can’t require a ridiculous amount of additional time to set up or conduct
  • Can’t require an elaborate test environment
  • Can’t require more than 5 people to conduct


About Cal

Cal Zant is the shooter/author behind PrecisionRifleBlog.com. Cal is a life-long learner, and loves to help others get into this sport he's so passionate about. His engineering background, unique data-driven approach, and ability to present technical and complex information in an unbiased and straightforward fashion have quickly caught the attention of the industry. For more info on Cal, check out PrecisionRifleBlog.com/About.



  1. This may be too basic but I take weight into serious consideration. It is the reason I passed on the US Optics and went with S&B. I don’t know if this is important to other shooters.

    • Absolutely. I plan to do a lot of direct measurements of the physical attributes of each scope as well, and combine them in a format that makes side-by-side comparisons of specs easy. Weight is one of the big ones, but the full list is actually pretty long. I should’ve posted all that stuff up here as well. I’ll try to add that to the end of the PDF shortly. Thanks for the tip, and let me know if you see anything else.

  2. Eric J. Koechling

    I have no experience with these high-end scopes yet. If any of them have a B.D.C., is there any way you can compare how the ballistic chart for a given bullet weight actually lines up with the B.D.C. at different ranges?
    Reason for asking: I just bought a Nikon 223 along with a friend. I haven’t been able to get to a longer range than 100 yds., but my friend has. He said it didn’t work worth a damn.

    • None of the scopes I’m testing have a BDC reticle. BDC reticles are typically a rough estimation for trajectory (i.e. don’t exactly match bullet path for the specific load & barrel you’re using), which is why I don’t use them and likely why your friend is frustrated. Custom ballistic turrets are a different story, and I do use those for hunting occasionally. I prefer to stick with more standard milling reticles with evenly spaced marks (either mil or MOA). You can still use those like a BDC reticle, but you just have to know your dope card.

  3. Hi, What you are planning will be a fantastic selection of tests of both the price ranges. It will be much admired and commented on and, you are right, some will make carping criticisms.

    My thoughts:

    1. if you are using new scopes, they will need a ‘run in’. Any mechanical gadget needs a period of use to settle down from the factory assembly. Can I suggest that you move all of the adjustments to their full range a set number of times. I usually do this about 75 times. That number is a bit random because I don’t know how much better things will be if it was 100 or 200 times but I have certainly noticed a change in a scope from new to that ‘settled in’ phase. There was quite a good article published in a New Zealand hunting magazine last year about physically wearing in a scope before use. That should cost you nothing in materials but something in time.
    2. It would be very interesting to see the change from new to ‘settled in’, to quantify it for the various scopes. The best-made scopes should require the least amount of manipulation to settle in (like good shoes).
    3. I think the mechanical wear in process should precede the ‘box test’.
    4. The box test is more of a test of the rifle/shooter/bag setup than the scope. I see them in magazines and really wonder what they prove, if anything. The process, which is supposed to measure tracking accuracy, can be done much more economically and with fewer variables by using the grid on a collimator – and a collimator is pretty cheap (Chinese-made ones sell in NZ for about $80, so they should be much cheaper in the USA).
    5. Parallax test should be done after ‘run in’ because it will change.
    6. Zoom POA change test should be done after ‘run in’ because it will change.
    7. Both parallax test and zoom POA change can be measured using the scope collimator. These will change through the range of the zoom and parallax adjustment. I check for these with a new scope and make up a chart for the scope so that I don’t have those annoying misses, but it is very irritating anyway, so how close the scopes run to no change in the cross-hairs would be a very interesting point to make.
    8. When the scopes are being considered by disinterested test subjects, the scopes should be adjusted for the eyesight of each test subject.
    9. It is my experience that the numbers on the parallax ring/knob have only a passing connection to the actual distances at which parallax is neutral. How far out the numbers are on a scope is an interesting feature that I’d like to see in your report.

    Cheers, John.

    • Great tips! Thanks John. I just bought a “Professional Collimator” like you suggested, and it will be here on Monday! It had outstanding reviews on Amazon and on Brownells as well (although Brownells was out of stock). I totally get what you’re saying. That is one reason why I plan to run the box test two ways, and only one of them involves a rifle. I will do a “no-fire” box test with the scope in a vise, against gridded paper (likely with an assistant downrange with a two-way radio marking the exact spot where the crosshairs are pointed for every spot). I thought if I could do it without adding a rifle, ammo and shooter … it might produce more accurate results, because there are just fewer variables involved (margin of error drastically reduced). The only reason I wanted to also do a “live-fire” box test was to see if each scope could retain that same performance under magnum recoil. I bet none of these scopes have a problem with that, but I don’t want to assume that. I will experiment with the collimator now as well, and hopefully that will make this even more accurate.

      I’ll also do the “run in” as you suggested. 16 of the 18 scopes are brand new out of the box, but 2 of them have been on a rifle before and used at least to the extent you described. Doing the “run in” will ensure they are all on a level playing field.

      Thanks again for the tips!

    • I forgot one thing … if you allow the users to adjust the parallax on the scopes, that means you can’t do a blind test. Really I see two options:

      1. Blind Tests: Preset the image focus to reduce parallax and get the image as sharp as possible for the target distance they’ll be viewing. Also preset the reticle/eyepiece focus so the reticle is as sharp as possible (according to ILya Koshkin this “should never be used to focus the image, only the reticle”). I’d then cover the scopes (as best as possible) so testers couldn’t see the brand or model, but that also removes their ability to manipulate the side focus.
      2. Allow Individual Adjustment: Preset the adjustments as described above, but leave the controls uncovered and encourage testers to adjust the scopes to their individual eyesight (as best they can). If you get completely uninterested parties who have never heard of any of the brands … this could work. But it may be hard to find 5-10 people who are willing to spend an hour driving out and looking through scopes they are in no way interested in.

      I’ve been on both sides of this fence, but today I am leaning a little more toward option 1 and the completely blind tests. My reasoning is I think there is little reason to adjust the side focus on a per-person basis if the target distance is always 100 yards.

      What do you guys think?

    • Oh, absolutely. It amazes me how often a manufacturer will fail to disclose that, or simply give bad information. I’ll even go beyond that and note EXACTLY what each click is. For example, it may be an average of 1.05 MOA or 0.955 MOA (what SMOA equates to). In fact, some scopes (mostly lower-end ones) may have clicks with different effective adjustment values as you get to the extremes of the scope’s erector. So from 0-30 MOA it may average 0.98 MOA per click, but then from 30-40 MOA it may average 1.13 MOA. I have tests that should identify all that kind of stuff, and I’ll definitely provide it in the results.

      I’m not sure these high-end scopes will be off by much (if any), but I’m still going to run them through the tests. You shouldn’t assume anything, especially this part, because it’s the bridge that links your dope card to your bullet flight. I think the 2nd round of tests I plan on conducting in the fall on scopes in the $500-1500 price range will yield some interesting results.

      Thanks for the comment! Let me know if you think of anything else.
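      To make that concrete, here is a rough sketch of how I can compute an effective click value from a tall-target style measurement: dial a known number of clicks, measure the actual point-of-aim shift on a target at a known distance, convert that shift to true MOA, and divide by the clicks dialed. The numbers below are hypothetical, purely for illustration:

```python
def measured_click_value_moa(shift_inches, distance_yards, clicks_dialed):
    """Average MOA per click over the dialed span.

    Converts the measured point-of-aim shift on the target into true
    minutes of angle (1 MOA subtends ~1.047" at 100 yards), then
    divides by the number of clicks dialed.
    """
    inches_per_moa = distance_yards / 100.0 * 1.047
    shift_moa = shift_inches / inches_per_moa
    return shift_moa / clicks_dialed

# Hypothetical example: 120 clicks of a nominal 0.25 MOA turret at 100 yards
# should move the point of aim 30 MOA = 31.41". If we only measure 30.8":
avg = measured_click_value_moa(30.8, 100, 120)
print(f"{avg:.4f} MOA per click")  # slightly under the nominal 0.25
```

      Running separate spans (e.g. 0-30 MOA, then 30-40 MOA) through the same arithmetic is what exposes clicks that change effective value near the extremes of the erector.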

    • I haven’t, but I’ll see if I can get some of those. Thanks for the tip! Those will be really helpful. I started making something similar in Photoshop, but it may be easier to just pick up some of those.

  4. Also, for the test to measure the force necessary to make adjustments, you may be able to measure torque more quickly by using something like a trigger pull gauge instead of weights.


    • Great idea. I already have a trigger pull gauge, so I will try it out. I’m not sure which approach would be more precise, but I’ll experiment with both. I really appreciate all the tips! If you see anything else, please let me know.
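      For anyone curious about the math, converting a trigger-pull-gauge reading into turret torque is straightforward if the gauge hooks the turret rim and pulls tangentially: torque = force × radius. A quick sketch (the gauge reading and turret diameter below are made up for illustration):

```python
def click_torque_inch_ounces(gauge_reading_lb, turret_diameter_inches):
    """Torque needed to turn a turret, from a trigger-pull-gauge reading.

    Assumes the gauge hooks the turret rim and pulls tangentially, so
    torque = force x radius. Reading is in pounds; result in inch-ounces.
    """
    radius = turret_diameter_inches / 2.0
    force_oz = gauge_reading_lb * 16.0
    return force_oz * radius

# Hypothetical: the gauge reads 1.5 lb at the moment a 1.25"-diameter
# turret clicks over.
print(click_torque_inch_ounces(1.5, 1.25))  # 15.0 inch-ounces
```

      The main caveat is keeping the pull tangent to the rim; pulling at an angle under-reads the true torque.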

  5. It would be interesting to know the range at which each optic can see bullet holes in different kinds of paper. It would be nice to test “shoot and see” targets, white card stock, and those crappy brownish-grayish targets on which you can barely see bullet holes even up close.

    • Interesting idea. I certainly like the practicality of it. It might be difficult because bullet hole size could vary from .17″ to .50″ (or at least .22″ to .34″). I guess I could pick 6.5mm as the average, since a lot of people shoot that round anyway. Overlapping shots could also make that vary. It seems like you could infer that with some measure of precision based on the scores from the optical tests. I may try to calculate that, then actually go out in the field with a couple scopes and validate whether the prediction was accurate (i.e. If the optical score of the S&B 5-25 indicated I should be able to differentiate 6.5mm bullet holes at 450 yards, can I really do that?)

      Very interesting thought for sure. I like how it might be able to translate an abstract score into a concrete thing that shooters could wrap their head around. It seems like it would add value.
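      As a back-of-the-envelope version of that prediction, you can compute the angle a bullet hole subtends at a given distance and compare it to an assumed limit for what the eye can resolve. The 60 arcsecond eye limit below is a common rule-of-thumb assumption, not something measured in this test:

```python
import math

# Rough visual acuity limit for a healthy eye, in arcseconds.
# This is an assumption for illustration, not a measured value.
EYE_LIMIT_ARCSEC = 60.0

def hole_subtense_arcsec(hole_inches, distance_yards):
    """Angle a bullet hole subtends at the shooter's eye, in arcseconds."""
    distance_inches = distance_yards * 36.0
    return math.degrees(math.atan(hole_inches / distance_inches)) * 3600.0

def min_magnification(hole_inches, distance_yards, eye_limit=EYE_LIMIT_ARCSEC):
    """Magnification needed to raise the hole's apparent size to the eye limit.

    Ignores contrast, mirage, and the scope's own resolution limit,
    so treat it as a best-case estimate.
    """
    return eye_limit / hole_subtense_arcsec(hole_inches, distance_yards)

# A 6.5mm (~0.264") hole at 450 yards:
print(min_magnification(0.264, 450))
```

      Under those assumptions, a 6.5mm hole at 450 yards works out to needing roughly 18x, and that is a best case because it ignores contrast, so it matches the intuition that the target color matters as much as the glass.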

  6. Suggestion #72: Evaluate adjustments, optical performance, etc. with temperature change. Use an electric blanket and a freezer to bring them to different temperatures and note changes in performance.

  7. I must say that this is going to be some interesting reading when you are done testing.
    I assume that you are going to do a low light comparison. As you do that, could you evaluate the reticle illumination as well? A lot of people who buy these scopes use them for long-range hunting. A well-known problem for hunters is that the illumination is too bright, and that too much of the reticle is illuminated. That makes it impossible to see past the reticle in low light hunting situations.
    I have never understood the need for a fully illuminated reticle and daytime visible illumination. If the optic is good enough, one doesn’t need it.

    That leads me to a question out of my own curiosity. Why didn’t you want to include a Swarovski Z6i 2nd gen for optical comparison? There is most likely no other riflescope manufacturer that makes glass as good as Swarovski, not even Zeiss. A lot of other riflescope manufacturers advertise their zoom range and field of view, but in real life they tunnel at low power and the FOV just isn’t that good. Some have a narrow eye box, or a tiny picture with a big black ring of scope around it.
    I know Swarovski doesn’t make tactical scopes, but they are the benchmark in optical performance.

    Anyway … looking forward to seeing your results. I admire your effort and all the work you put into it. This is something that a lot of people will be talking about and referring to for a long time.

    Thank you for the good work.

    • Thanks for the comments. I’m looking forward to the results too! I also believe this will become a landmark test, and definitely be something people talk about.

      Swarovski wasn’t included because they don’t offer a tactical reticle. I personally talked to them in-depth about this at SHOT this year, and they have no plans to offer any type of tactical reticle in the foreseeable future. That is a deal-breaker for serious long-range shooters, because it really limits how you can use the scope.

      While they do make outstanding, top-drawer glass … you are either unfamiliar with some of these other brands or badly underestimating them. Not only are some of these in the same class as Swarovski, I’d wager a few may be better. You can include some pretty outstanding glass for a $7000 price tag. But even some of the lower-priced models could go head-to-head with Swaro. Don’t get me wrong, Swaro glass is amazing … but these guys are too.

      I’m definitely measuring the exact FOV, so we’ll have some hard data and can compare these to what the manufacturer claims.

      I plan to test resolution and contrast, which are the two major factors that play into low light performance (and optical performance in general). Resolution matters more in day time, while contrast gets more important in low light. While it would be ideal for both to be the best possible, these are competing design characteristics. I don’t have plans at this point to do any testing specifically in low light, but the tests I am running could easily be applied to that scenario by simply weighting them differently.

      Thanks again for the comments.

  8. I would like to see the units grouped into a few brackets.

    50mm or less objective vs Larger as one category.
    30mm tube vs Larger as one category.
    Price category as sub-$3k vs over $3k

    • I can definitely see your point on two of those, but what’s the benefit of splitting up scopes based on tube size? Objective size I can see, and price too … But can you explain why 30mm tube size is important to you?

  9. Shahrouz Hagizadegan

    Do you still plan to test the Tangent Theta?

    • I probably won’t test any scopes for a few months. I may consider testing another batch of scopes in 2015, and if I do it will definitely include the Tangent Theta.


  10. First of all, and I’m sure you’ve heard this a million times by now, but I would like to sincerely thank you for conducting this evaluation.

    The results of your efforts are by far the best, most in-depth and technically exhaustive reviews of any single scope, let alone almost twenty, that I have personally ever seen. I have followed this blog since finding it last summer and have learned more from the high-end scope testing than I could have ever hoped for by putting my hands on them at my LGS or reading about them in the countless sites/magazines that would undoubtedly not deliver as quantitative and unbiased a review as you have here. Well done!

    I’ve also followed your “What the Pros Use” series to endless joy and interest to start ‘e-collecting’ components that I’ll never be able to afford for my precision build; so thank you for that work as well. Please, for the love of all things shooting, never stop publishing this blog.

    Secondly, (and coincidentally the reason I post this comment here as opposed to other threads) I wanted to check to see if you had an update or even just an idea of when you might conduct this same evaluation for scopes in the $500 – $1000 range per your comments above? I know you’re probably still recovering from the physical, emotional, and not least important monetary toll that the high-end scope test put on you, so it truly pains me to even ask. Unfortunately, my appetite has been whetted by poring over the excellent data that you’ve produced in your previous endeavour and now, much like Short Circuit’s Johnny 5, I need more input.

    All kidding aside, thanks for what you do; and whether it’s a week or a year before you’re ready to test some more scopes I’ll be waiting patiently and gratefully to see the results.

    • Thanks for the kind words, Perry. Glad you find the website helpful. I certainly put a lot of effort into it, and the encouragement goes a long way.

      I still don’t have plans to do any additional scope tests. I have had a couple guys contact me that sounded like they were going to try to replicate the same benchmark against other scopes. I hope that happens. It would be cool if that started to catch on and we as customers could compare results for lots of scopes. I’m not closed off to doing it, but just can’t commit to doing it at this point because of other stuff going on.

      Thanks again for the encouragement,

  11. Another thanks for the work you are doing. I am learning a lot, and will be very interested to see the final report!!