I thought this should have its own thread..
The question is:
---If I focus my binoculars at infinity
---How close can I then make a target before it gets noticeably blurry?
Results from a C program using the basic thin-lens equation...
For various binocular powers / apertures:
nearfoc 6 30 \ Nearpoint: 14.60 meters 47.89 feet
nearfoc 7 35 \ Nearpoint: 19.80 meters 64.94 feet
nearfoc 7 50 \ Nearpoint: 28.30 meters 92.82 feet
nearfoc 10 50 \ Nearpoint: 40.20 meters 131.86 feet
*Most binoculars have a prime focal length of ~4 times the objective diameter.
*The eyepiece focal length is Prime-focal / power
*Perfect focus at infinity is where the focal planes of the objective and eyepiece coincide
* The out-of-focus criterion: the prime focal plane shifting 1/20th of FLep away from a perfect meeting (objective-eyepiece spacing of Fo + Fep). (For sharper standards I would just tighten that fraction down.)
* This is checking out pretty well with a 'Yardage Pro' laser rangefinder for distances..
So, to confirm observations, the 'near-point' does indeed move out as power increases.
Why does the 7x50 have a farther near-point than the 7x35? Because the F4 rule makes the 7x50's objective FL longer...
...and the longer the objective FL, the more focus error as the target gets closer.
But why did I choose to set infinity and then look for the near out of focus line?
Why not pick the special camera mean distance to start? (the hyperfocal distance)
1) It doesn't shift the near point much
2) It's far easier to set your binocs for "far away" (practically infinity)
than for the hyperfocal distance...you would need a rangefinder on the side.
So it's practicality that makes me set my 6x30s to infinity and then troll the yard.
I found the secret of why they work so much closer than 7x50s: the 7x50s simply have an extra-long objective FL.
Now my 10-yr itch is scratched.
Edited by MartinPond, 19 May 2021 - 07:31 AM.