Hey everyone - this might be a silly question, so please bear with me - it’s NOT a bug report, more a feature/usability question.
I have started the process of setting up automated RoboTargets each night - such an improvement. But with all that flexibility, when I wake up in the morning I feel as though I don’t know what data I collected without going through each directory by hand. I have noticed this is a particular challenge for me because, with the rising moon, my previously started LRGB targets have now been skipped in favor of my NB targets (as designed - great feature!), and I am not 100% certain what I have to process!
I suspect I am missing something, if anyone has any tips/tricks I would value hearing about it.
I was thinking a “post capture action” might send me a little email report or something (a bit like what iTelescope does) and point me to my new data.
I think I can relate. I would like it if, for example, the “Progress” column color-coded the percentage that was shot last night. I would also like finished targets not to simply disappear from the RoboTarget list, but instead be color-coded to indicate they are done, so I can go and grab all the data to start processing. It would make the RoboTarget list long, but I don’t think we really need to browse/scroll there.
I tried that and (unless I am missing something) there are often folders with new but empty dates (for reasons I can’t quite work out) - so there’s a lot of clicking around hunting for what’s there.
It’s not a life or death situation, just a nice convenience - as I am fond of saying, computers are good at automating repetitive stuff, so wherever possible I like to use them to do that - much more fun when the machine works for you instead of the other way around!
Have a look at this Python script that I created. It emails a list of files that were captured the previous night. It also interrogates the file name to determine which filters were used so that you can use it to automate taking the flat frames that you need. The forum post below explains in more detail.
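The filter-parsing part of a script like that can be quite short. Here is a minimal sketch of the idea, not the actual script from the post above: it assumes filenames embed the filter name between underscores (e.g. `M31_LIGHT_Ha_300s_001.fit`) and that the `FILTERS` set matches your filter wheel - both are assumptions you would adjust for your own naming convention.

```python
# Hypothetical sketch: summarize last night's captures by filter,
# so you know which flats you need. Filename convention and the
# FILTERS set are assumptions -- adapt them to your own setup.
from collections import Counter
from pathlib import Path

FILTERS = {"L", "R", "G", "B", "Ha", "OIII", "SII"}  # assumed filter names

def filter_of(name: str) -> str:
    """Return the first underscore-separated token that matches a known filter."""
    for token in Path(name).stem.split("_"):
        if token in FILTERS:
            return token
    return "unknown"

def summarize(filenames):
    """Count frames per filter for a list of capture filenames."""
    return Counter(filter_of(name) for name in filenames)

if __name__ == "__main__":
    last_night = [
        "M31_LIGHT_Ha_300s_001.fit",
        "M31_LIGHT_Ha_300s_002.fit",
        "M31_LIGHT_OIII_300s_001.fit",
    ]
    for filt, n in summarize(last_night).items():
        print(f"{filt}: {n} frame(s)")
```

From there it is a small step to gather the actual filenames with `Path.glob` filtered by modification time and drop the summary into an email body.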
My approach is a bit different - I run GoodSync all night to pull files from my scope PCs to my desktop processing PC. They go into a new directory (emptied before the night) so there’s no question what is new. I use a combination of Siril and gess (an automation for Siril) to calibrate all the new frames from three OTAs with one click. I then blink the calibrated frames in PI and move the bad ones to a rejected folder. I then run a GoodSync job that moves the calibrated files to the “permanent” directory on the processing PC. When I’m ready to process, I have calibrated data and I can just align, stack, and post-process.
This works well for me with many nights contributing to each target and three OTAs running. It’s not a typical use case but maybe someone will get a useful idea from it!
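The “emptied-before-the-night staging directory” trick above doesn’t depend on GoodSync in particular; it can be sketched in a few lines of Python. The paths and function names here are placeholders for illustration, not anything from the actual workflow:

```python
# Sketch of a fresh staging directory: empty it before the night so
# anything in it the next morning is unambiguously new, then move the
# kept frames into a permanent archive. Paths are hypothetical.
import shutil
from pathlib import Path

def reset_staging(staging: Path) -> None:
    """Remove and recreate the staging directory so it starts empty."""
    if staging.exists():
        shutil.rmtree(staging)
    staging.mkdir(parents=True)

def archive_new(staging: Path, archive: Path) -> int:
    """Move every file from staging into the archive, keeping subfolders."""
    moved = 0
    for src in sorted(staging.rglob("*")):  # materialized order before moving
        if src.is_file():
            dest = archive / src.relative_to(staging)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(src), str(dest))
            moved += 1
    return moved
```

Running `reset_staging` in the evening and `archive_new` after blinking reproduces the “new directory, no question what is new” part of the workflow; the sync and calibration steps still belong to GoodSync and Siril.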
Statistics and reports, as I wrote in another similar thread, are already saved in the database.
There will be a section in RoboTarget Manager where you can recall them, and you will also be able to receive an email at the end of each night about the activities. You will not need an external script.
That looks very interesting Rowland. With three scopes I guess that saves a lot of time. I might look into it as my next automation step. I also use Goodsync to transfer files from the observatory PC to a NAS drive, but only do a scheduled transfer each morning rather than a continuous sync. Haven’t been brave enough to automatically delete from the observatory PC yet though!
I was getting swamped with just calibrating and blinking my data because each scope typically images 3 to 8 targets per night. That’s a lot of pointing and clicking if you do all the calibration manually. I know most people focus on just one or two targets per night so my approach would be overkill.
I don’t delete the files from the observatory PCs. I use Seagate external USB drives on the observatory PCs - usually around $90 for 4TB or 5TB, and they are USB powered so super easy to just plug in and go. Those drives are my “off-site backup” - they are a permanent archive of my data, and my processing PC (now a Synology NAS with 50TB) is my working data set.