Autofocus trigger based on FWHM (or HFR) of previous image

Although I use temperature as a trigger for autofocus, I often find that temperature does not accurately track focus changes - probably because the motor and remote temperature probes do not accurately reflect the thermally induced changes in the OTA. This is particularly noticeable when ambient temperature is falling quickly early in the session. I find myself noticing a visible deterioration in image sharpness and having to manually pause the sequence to run an autofocus and then restart. This could be avoided if I could manually set an acceptable change in FWHM or HFR, calculated from the previous image, that would then trigger an autofocus run.

ChrisH

Hi,

I have 2 systems, one with a resolution of 4.3 arcsec per pixel and the other with 0.68 arcsec per pixel.

The HFR of the 0.68 system is limited by seeing; in other words, on a good night the HFR value will go down to about 0.8 and on a bad night it’ll go as high as 1.4, so triggering refocus based on a star profile value will not work with this system. OTOH, on my low-resolution rig (4.3) the HFR curve is fairly immune to seeing.

Cheers,

Jose

Well, on my system (either 1.14 or 1.46 arcsec/pixel) I see clear star bloating starting to occur, which correlates with a measurable shift in FWHM, even though temperature (set to refocus @ 1 degC change) has yet to trigger a refocus run. Then again, I get a sharp V-curve with a well-defined minimum using my refractor; this may not happen with an SCT (or other obstructed OTA). Setting the temperature sensitivity lower to, say, 0.5 degC is not the answer, as this would result in excessive refocus runs once the system had thermally stabilised - in the early hours temperature tracks focus reasonably well. Note that I need to set up my scope outside each session and I allow at least an hour for the scope to ‘cool’, but early in the session the temperature is usually changing fairly quickly (1-2 deg/hr).
ChrisH

Couldn’t the change in star size be used rather than some absolute limit?

Chris

Chris,
That’s what I’m saying - measure the FWHM/HFR of the frame just captured, and if it has worsened by a user-settable amount (which will vary according to the optics), trigger a refocus run. Ambient temperature is just a surrogate, and not a very good one if the probe does not accurately reflect the temperature of the optics; the change in star size is a real and measurable indicator of focus accuracy.
ChrisH

This has been requested before. This thread highlights our current thoughts on it (and why we have chosen not to implement this trigger):

We’re thinking the same thing. I got the impression that one of the reasons for not implementing this was that it would use an absolute star size reference.

If the star size increased by a certain amount, maybe a percentage, it would trigger a new autofocus routine and that would provide a new reference value. So if conditions had changed to give larger stars a larger reference would be used.

Yes Chris, that’s what I was thinking. The problem surfaced again last night and I had to interrupt the sequence between the 2nd and 3rd 1200 s subs to refocus. It was quite obvious to me, just looking at the 2nd image, that the stars were larger. The problem is that the probe has a very low mass and so it tracks temperature changes very closely, whereas the OTA has a much larger mass and is much slower to equilibrate.

Having this extra trigger would not replace temperature as the principal trigger; it would just add an additional trigger while temperature was changing rapidly. Once the first autofocus run is made, the HFR at the calculated focus point is recorded (say, for example, 0.6 for my refractor). If by the end of the 2nd frame the HFR had changed to (example again) 0.75 - a 25% increase - then autofocus would be triggered regardless of any ambient temperature change as read by the probe. I don’t see why this would produce extra ‘support issues’ if the feature can simply be disabled when it is irrelevant to the user.
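
To be clear about the logic I have in mind, here is a rough sketch in Python (illustrative only, not SGPro code - measure_hfr() and run_autofocus() are hypothetical stand-ins for whatever SGPro already does internally):

```python
# Rough sketch of the proposed relative-HFR trigger - illustrative only.
# measure_hfr() and run_autofocus() are hypothetical stand-ins.

HFR_INCREASE_THRESHOLD = 0.25   # user-settable: refocus if HFR worsens by 25%

def measure_hfr(frame):
    """Stand-in for the existing star measurement on a saved frame."""
    raise NotImplementedError

def run_autofocus():
    """Stand-in for a full autofocus run; returns the HFR at the new focus point."""
    raise NotImplementedError

reference_hfr = None   # HFR recorded at the end of the last autofocus run

def on_frame_complete(frame):
    """Called after each sub; triggers a refocus on a relative HFR increase."""
    global reference_hfr
    current = measure_hfr(frame)
    if reference_hfr is None:
        reference_hfr = current                     # no reference yet: record one
        return
    # Relative comparison, so the trigger adapts to the optics and to
    # whatever the seeing allowed at the last focus run.
    if current > reference_hfr * (1.0 + HFR_INCREASE_THRESHOLD):
        reference_hfr = run_autofocus()             # new reference after refocusing
```

The point is that the comparison is always relative to the last autofocus result, never an absolute star size.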

Once thermal equilibrium is reached in the early hours and ambient temperature is no longer changing quickly, HFR would be expected to change in parallel with probe temperature. I should add that I’m very conscious of achieving and maintaining accurate focus, perhaps more so than most. The high-resolution images I strive for demand nothing less than perfect focus at all times.
ChrisH

Oh, I should explain… My scope has been outside for an hour or so before I start, and by the time of the first autofocus run the temperature probe has already cooled (thus accurately indicating ambient temperature) - but I’m guessing the scope has NOT cooled to ambient. So by the end of the second sub in the sequence the ambient temperature may have dropped only a little more (or not at all), certainly not enough to trigger an autofocus run. However, the scope is still cooling because of its large mass (an NP127is has a lot of metal, with Petzval lenses deep inside!). I sometimes wonder if this beast ever thermally equilibrates. So I guess I have a system which is particularly prone to this condition.
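
To put some illustrative numbers on that lag, here is a toy first-order cooling model (my own made-up time constants, purely to show the shape of the effect, not measured data):

```python
# Toy model: a body with thermal time constant tau, sitting in ambient air
# that falls linearly. Made-up numbers, purely to illustrate probe vs OTA lag.
import math

def body_temp(t_hours, start_temp, cooling_rate, tau_hours):
    """Temperature of a body obeying dT/dt = -(T - ambient(t)) / tau,
    where ambient(t) = start_temp - cooling_rate * t and T(0) = start_temp."""
    ambient = start_temp - cooling_rate * t_hours
    # For a linear ambient ramp the body settles to a constant lag of
    # cooling_rate * tau degrees behind ambient.
    lag = cooling_rate * tau_hours * (1.0 - math.exp(-t_hours / tau_hours))
    return ambient + lag

for t in (0.5, 1.0, 2.0, 3.0):
    ambient = 15.0 - 1.5 * t                          # ambient falling 1.5 degC/hr
    probe = body_temp(t, 15.0, 1.5, tau_hours=0.1)    # low-mass probe
    ota = body_temp(t, 15.0, 1.5, tau_hours=1.5)      # lots of metal and glass
    print(f"{t:.1f} h: ambient {ambient:5.2f}, probe {probe:5.2f}, OTA {ota:5.2f}")
```

With numbers like these the probe reads essentially ambient within minutes, while the OTA settles to running a couple of degrees warm and is still shifting focus - which is exactly the mismatch I see.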

A variation on this…

Why can’t autofocus first take an image at the current focus point, measure it, and skip the run if the HFR is within some range of the last good autofocus?

Only a few seconds to check if it’s worth the longer autofocus process.
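
Something like this, just pseudo-code for the idea (take_test_exposure(), measure_hfr() and run_full_autofocus() are stand-ins, not SGPro’s actual routines):

```python
# Sketch of a "check before you refocus" step - illustrative only.

SKIP_TOLERANCE = 0.10   # skip the full run if HFR is within 10% of the last good value

def take_test_exposure():
    """Stand-in for a short (few-second) exposure at the current focus position."""
    raise NotImplementedError

def measure_hfr(frame):
    """Stand-in for the existing star measurement."""
    raise NotImplementedError

def run_full_autofocus():
    """Stand-in for the full (slow) autofocus routine; returns the new HFR."""
    raise NotImplementedError

def maybe_autofocus(last_good_hfr):
    """Run the full autofocus only if a quick test frame shows real drift."""
    test_hfr = measure_hfr(take_test_exposure())
    if test_hfr <= last_good_hfr * (1.0 + SKIP_TOLERANCE):
        return last_good_hfr          # focus still fine: skip the full run
    return run_full_autofocus()       # focus has drifted: do the full run
```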

Jaime

I understand the desire for this… unfortunately I think it’s ultimately a dead-end approach (or at least an approach that I don’t think I would be fond of supporting), and it does not address the pitfalls outlined here:

Thanks for taking the time to reply, Ken.

I think the setting would work for me, but I also 100% understand you; SGPro already has a scary number of options, and you no doubt have to be very careful before adding more.

I don’t think I showed this image history graph before - it’s real data showing changes in focus criteria (HFR, number of stars recorded, temperature) for a 3-hour run where the target was rising in the East (already above 30 deg) up towards the meridian. Note that the scope was already reasonably well thermally equilibrated before this run began.

The temperature trigger is set to 1 deg, so no refocus was performed until just before the last frame (which should be obvious from the Focus Position!), which occurred just before the onset of dawn twilight. So you can see the HFR gradually reduces over time (improved seeing?) and the number of stars recorded increases. Although it’s fairly predictable in this example, there being a fairly linear drift, those changes would (likely) run in the opposite direction if the target started out at the meridian and slowly sank to the West. This sequence of events does not even take into account the possibility of changing atmospheric conditions, which can also drastically affect the measured parameters.
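
Just to put a rough number on how much of that drift could be down to altitude alone, here is a back-of-the-envelope calculation (standard Kolmogorov seeing scaling, with the FWHM growing roughly as airmass^0.6, and a guessed 2 arcsec zenith seeing - not numbers taken from the graph):

```python
# Back-of-the-envelope altitude effect on star size. Standard seeing scaling
# (FWHM ~ airmass**0.6); the 2 arcsec zenith value is just a guess.
import math

def seeing_fwhm(zenith_fwhm_arcsec, altitude_deg):
    """Approximate seeing FWHM at a given altitude (plane-parallel airmass)."""
    airmass = 1.0 / math.sin(math.radians(altitude_deg))
    return zenith_fwhm_arcsec * airmass ** 0.6

for alt in (30, 45, 60, 75, 90):
    print(f"altitude {alt:2d} deg: FWHM ~ {seeing_fwhm(2.0, alt):.2f} arcsec")
```

A target climbing from 30 deg to the zenith would sharpen by roughly a third on seeing alone, with focus perfect throughout - so a single-point star-size comparison can easily be fooled by altitude.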

In some other threads I note comments expressing a desire to short-cut the autofocus mechanism, using single-point comparators as a test of focus drift, for example. Personally I think that’s a bad idea - as you can see from the above graph, you are going to get changes anyway based purely on the altitude of the target, and these changes will depend on whether the object is rising or setting. No, I personally think the autofocus already does a fantastic job of maintaining focus, and I for one do not begrudge the time taken for each and every full autofocus run, but I would like to set my own triggers for additional focus runs where I think they would be beneficial.

ChrisH