My main problem with autofocus is when it identifies noisy pixels as if they were tiny stars, which throws off the HFD calculation. I'm not sure exactly how nebulosity rejection works, but if instead we could enter the minimum HFD of a real star, in arc-seconds, then those tiny detections would be rejected from the star measurement statistics, and it would only use "real" stars above the minimum size.
Nebulosity rejection does work, but it requires tuning, possibly for each object. I know I'll never have a real star under about 1.4", so I don't want anything smaller in the average. With that filter in place, I don't think nebulosity rejection would be as critical.
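To make the request concrete, here is a minimal sketch of the proposed filter (not the application's actual code; the function name and parameters are hypothetical): every measured HFD below a user-entered minimum, given in arc-seconds and converted to pixels via the image scale, is dropped before the focus statistics are computed.

```python
# Hypothetical sketch of the requested minimum-HFD star filter.
# A user-entered minimum in arc-seconds is converted to pixels using
# the image scale, and any detection below it is rejected.

def filter_stars(hfd_pixels, pixel_scale_arcsec, min_hfd_arcsec=1.4):
    """Keep only detections at least min_hfd_arcsec wide.

    hfd_pixels: list of measured HFD values, in pixels.
    pixel_scale_arcsec: image scale in arc-seconds per pixel.
    min_hfd_arcsec: smallest HFD considered a real star.
    """
    min_hfd_pixels = min_hfd_arcsec / pixel_scale_arcsec
    return [h for h in hfd_pixels if h >= min_hfd_pixels]

# Example: at 0.5"/px, a 1.4" minimum is 2.8 px, so a hot-pixel
# "star" measuring 0.6 px is dropped while ~3 px stars survive.
kept = filter_stars([3.1, 2.9, 0.6], pixel_scale_arcsec=0.5)
```

Because the threshold is expressed on the sky rather than on the sensor, the same 1.4" setting would apply regardless of the optics or camera in use.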
Frank
That’s a neat idea. I agree!
Just an update: this was killing me the other night. I was trying to calculate a filter offset for a narrowband filter, so I needed a field with a few bright-ish stars in it. I had nebulosity rejection at maximum, and it was finding two good stars in the field with HFD around 3 pixels, but it kept grabbing a single point with an HFD of about 0.6, which completely threw off the graph. There was no way to stop it from including small, bogus stars, which made the focus curve dive down suddenly, even though the real stars alone would have produced a good curve.
If you have enough stars in the field, a few small ones won't matter too much, but even then they are undesirable.
Frank
Same here… I always have problems with very low HFD values (0.4, etc.), and the curve looks like the Himalayas instead of a V.
I took darks, but I still have the same issue.
Thx
I think this is an excellent feature request!
I do have a question, though: would we need to define the minimum HFD separately for each binning option? I get remarkably repeatable autofocus results regardless of my camera binning, so I typically bin the narrowband autofocus frames while keeping the LRGB settings at my imaging scale.
Because the HFD is measured in pixels, I believe I'd need separate settings for each binning option.
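One way to see how binning interacts with this: the effective image scale grows with the binning factor, so a single minimum expressed in arc-seconds maps to a different pixel-space threshold per binning, with no separate setting needed. A small sketch under that assumption (function name and parameters are hypothetical, not from any particular application):

```python
# Sketch: one arc-second threshold, different pixel thresholds per binning.
# Binning NxN multiplies the unbinned image scale by N, so the pixel-space
# cutoff shrinks by the same factor while the sky-space cutoff stays fixed.

def min_hfd_pixels(min_hfd_arcsec, base_scale_arcsec, binning):
    """Pixel-space HFD cutoff for a given binning factor.

    base_scale_arcsec: unbinned image scale ("/px).
    binning: 1 for 1x1, 2 for 2x2, and so on.
    """
    return min_hfd_arcsec / (base_scale_arcsec * binning)

# With a 0.5"/px unbinned scale and a 1.4" minimum:
# 1x1 -> 2.8 px, 2x2 -> 1.4 px
```

So if the minimum were entered in arc-seconds, as proposed earlier in the thread, the software could derive the per-binning pixel cutoff itself; only a pixel-based setting would force one value per binning mode.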