Dual Scope Imaging with Voyager (my attempt)

Hello!

After getting to grips with Voyager over the last two weeks, I now really want to see if I can manage to control my entire dual scope setup. In case it’s useful to someone, or if anyone has any contributions, I will post my progress here. I’m sure I will also have one or two questions for Leo… :smile:

Gear Setup:

My astro setup is two scopes side by side on one mount. For this purpose I divide it into a ‘Master’ and a ‘Minion’ part:

  • Master: Avalon Linear Mount, Lakeside focuser controlled by Pegasus Hub, Moravian G2-8300 camera with internal filter wheel, SX Lodestar2 Guide Camera, ASCOM Observing Conditions from Pegasus Hub
  • Minion: Lakeside focuser controlled by Lakeside controller, Moravian G2-8300 camera with internal filter wheel

Goal

  • Control both Master and Minion with Voyager
  • Achieve some kind of synchronisation between the two instances

PC Setup

At the moment Voyager does not allow more than one instance to run at the same time, so it’s not possible to control both parts from the same Windows installation. To get around this limitation (without having to buy another PC) I have installed a Windows 10 virtual machine (via VMware Player) on my astro PC (an Intel Compute Stick). On this VM I installed Voyager, the ASCOM platform and the drivers required for my focuser and camera.

Now I effectively have two Windows machines connected by a network, both running Voyager. When I connect my astro gear I choose which USB devices connect to the VM and which remain on the host, and I’m pretty much ready to go. In indoor testing everything seems to work very well and I am able to connect the dashboard to both instances from another computer on the same network. Sorry that I get excited about this, but that’s just fantastic!

So hopefully I am now in a position to fully control both parts of my setup independently. Indoor testing has shown that it can work, but I need to test in real conditions to be sure. I’ve had bad experiences with USB connections and virtual machines before, so it remains to be seen whether everything is stable enough over a long period.

Voyager Setup

The main instance of Voyager running the master part of the rig remains pretty much unchanged; at this point it doesn’t need to know anything about the Minion. The Minion Voyager setup is very basic: it only has the camera, filter wheel and focuser configured in the SetupForm.

One thing to note straight away is that it doesn’t seem to be possible to run a sequence when no mount (or virtual mount) is selected (maybe Leo @Voyager could confirm this?). In any case, although it would be nice, it does not really matter, as it seems perfectly fine to run a dragscript without the mount connected, which is really what I need.

Now (once I’ve written an appropriate dragscript) I think I am at a point where I can image with both parts of my dual setup at the same time. Hopefully I will be able to try this soon!

Synchronisation

For me the main aim of achieving some synchronisation is to be able to dither. I’m honestly not sure if dithering will improve my images but I want to try (I think I am slightly under-sampled and would like to give drizzle integration a go). For all other events that would ‘spoil’ a Minion acquisition (moving to focus star, meridian flip etc.) I am not too bothered really. My usual setup for LRGB is to shoot RGB in the Master and L in the Minion. So if I get some spoilt L subs then no problem. I must also add that for most of my acquisitions all subs will have a similar length (in master and minion) so my approach will be geared towards that.

Now this is where it gets interesting. In an ideal world, of course, both Master and Minion are aware of each other and play along nicely. Although I believe this is in the long-term plans for Voyager, I really want to get something going now, even if it’s not perfect.

Approach

Before I can do anything I will need to create an app that ‘listens’ to what the master is doing, a bit like the dashboard, so it knows when exposures are ongoing, how much time is left and so on. I’m planning to write this in C# using the Voyager Application Server (which has excellent documentation by the way!). Once this is in place I think I will have a go at this in stages:

Stage 1

At this stage I would like to use a dragscript to control the acquisition, filter changes and focussing on the Minion Voyager instance. To achieve some form of sync I will add a function to my listener app which decides if it is a good time to start a sub in the Minion. Initially I think I will base this on detecting the acquisition state of the master, i.e. if the master is exposing it must be a good time to expose in the minion. There is a ‘time remaining’ property transmitted from the server which can be used for this, i.e. if the sub I want to take on the minion is shorter than the remaining acquisition time in the master then all is well, go ahead. This obviously assumes that the minion sub will be shorter than the master sub, so I am envisaging maybe 300s subs in the master and 295s in the Minion, which should hopefully work.
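The decision rule itself is simple; something along these lines (all the names here are hypothetical and the 2s margin is just my guess at covering start-up delays):

```csharp
using System;

// Hypothetical Stage 1 decision rule: allow a Minion exposure only if the
// Master is currently exposing and has at least as much time left as the
// Minion exposure needs, plus a small safety margin.
public class SyncDecision
{
    // These two values get updated from the events received from the Master.
    public bool MasterIsExposing { get; set; }
    public double MasterSecondsRemaining { get; set; }

    // Margin to absorb start-up delays on the Minion side (a guess).
    private const double MarginSeconds = 2.0;

    public bool CanStartExposure(double minionExposureSeconds)
    {
        return MasterIsExposing &&
               MasterSecondsRemaining >= minionExposureSeconds + MarginSeconds;
    }
}

// Quick check of the logic: with 180s left on the Master, a 55s Minion sub
// is allowed but a 300s one is not.
public static class Demo
{
    public static void Main()
    {
        var d = new SyncDecision { MasterIsExposing = true, MasterSecondsRemaining = 180 };
        Console.WriteLine(d.CanStartExposure(55));   // True
        Console.WriteLine(d.CanStartExposure(300));  // False
    }
}
```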

The way this should hopefully work with the dragscript is that the dragscript can launch an external script and read a returned value. So I will create another very small ‘check’ app that the dragscript can launch; this ‘check’ app will contact my listener app and ask if it’s a good time to start a sub. After being told, the check app then returns either true or false to the dragscript. The dragscript (in a loop) will go round in circles until it’s a good time and then take the next sub. Easy…

Stage 2

It should be possible to improve on this approach in two ways:

  • Abort a sub in the Minion if it has been spoilt and restart as soon as things are good again. Thinking about it, the best single indicator for the minion is to check (or get notified) whether the guide software has stopped guiding or is not guiding correctly. So in this step I think I will try to get this info either (ideally) from Voyager or alternatively from the PHD2 API (see the sketch after this list).
  • This could potentially give more imaging time (compared to Stage 1) but would need the ability to cancel an acquisition in the minion (and the cooperation of the camera - from experience some cameras can be a bit temperamental when aborting an acquisition). This will probably need direct interaction from my app with the Voyager Minion via the TCP protocol.
  • Additionally it would probably also be beneficial to move away from a dragscript and fully control the minion Voyager instance over TCP.
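For the guiding check, PHD2’s event monitoring interface looks like one option: as far as I can tell it publishes one JSON event per line on TCP port 4400 by default. A rough sketch of watching it for trouble might look like this (the event names come from the PHD2 event monitoring documentation; treat this as an illustration, not a tested client):

```csharp
using System;
using System.IO;
using System.Net.Sockets;

// Sketch of monitoring the PHD2 event server to spot when guiding is no
// longer healthy. PHD2 publishes one JSON event per line on TCP port 4400
// by default.
class Phd2Monitor
{
    static void Main()
    {
        using (var client = new TcpClient("localhost", 4400))
        using (var reader = new StreamReader(client.GetStream()))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                // Very crude event detection; a real client would parse the
                // JSON properly instead of matching strings.
                if (line.Contains("\"StarLost\"") || line.Contains("\"GuidingStopped\""))
                    Console.WriteLine("Guiding trouble - current Minion sub is probably spoilt");
                else if (line.Contains("\"GuideStep\""))
                    Console.WriteLine("Guiding looks OK");
            }
        }
    }
}
```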

Stage 3

By this point we are probably in 2021 and Leo has already implemented a much better internal solution… :smile:!

If not, then the next best thing would be to investigate whether it’s possible to have some interaction with the master and to ask it to wait before carrying out certain actions (e.g. stop tracking, mount move). If this were possible then much better sync could be achieved.

Finally

Wow, that post turned out a lot longer than intended… :smile:. I would welcome feedback if anyone has anything to add or if you think that the approach will not work for some reason or other!!!

Thanks,

Mike


Great post Mike, you are welcome … and I’ll give you my best support. Just ask and I will add what you need to the Application Server to help you.

Some answers to your questions:

This is true: Voyager needs mount data to save in the FIT file. For this reason you either connect a simulator mount or the VirtualMount. I can modify the VirtualMount, because at the moment it works with the Array System in Voyager Custom.

Like I said above, just ask and I can add some events to the Application Server from the Master.

I’ve spent many months developing the Array System that works over the LAN, and also developing the Application Server. So … thanks for your project. Really happy about it.

Leo

I am making good progress and the basic mechanism of Stage 1 is now in place!

I’ve created a synchronisation app which can connect to the (master) Voyager Application Server and receive information on what it is doing. It also acts as a TCP server which allows other apps to connect and ‘ask’ if it’s a good time to start an exposure (based on the information received from the Voyager master). At the moment this really only looks at the ‘remaining exposure time’ but it will hopefully get ‘cleverer’ soon.
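Stripped right down, the listening side is essentially just a TCP client reading the newline-terminated JSON events from the Application Server. This sketch assumes the default port of 5950; the exact event names and fields, and the keep-alive/polling a real client has to honour, need to come from the Application Server documentation rather than from here:

```csharp
using System;
using System.IO;
using System.Net.Sockets;

// Minimal listener that connects to the Voyager Application Server and
// prints the raw JSON events it receives. Port 5950 is assumed to be the
// default Application Server port; adjust to match your server settings.
class VoyagerListener
{
    static void Main()
    {
        using (var client = new TcpClient("localhost", 5950))
        using (var reader = new StreamReader(client.GetStream()))
        {
            string line;
            // Each message from the Application Server arrives as a JSON
            // object on its own line, so reading line by line works here.
            while ((line = reader.ReadLine()) != null)
            {
                // A real implementation would parse the JSON, track the
                // exposure state / remaining time it reports, and also send
                // the periodic keep-alive the documentation asks for.
                Console.WriteLine(line);
            }
        }
    }
}
```

The TCP-server side of the sync app (answering the “can I start?” questions) is just a plain TcpListener loop on top of the decision rule from Stage 1, so I won’t repeat it here.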

I have also created a very small second app which acts as a ‘communicator’ between the dragscript in the Voyager ‘minion’ instance and my sync app (because there is no way for the dragscript to speak to the sync app directly). The way this works is that I use an ‘External Script’ element in the dragscript which can start an external process and read the returned value. So the element starts my little communicator app and tells it how long it would like to expose for. In turn the communicator app connects to the sync app and relays the question. The sync app says yes or no, which is returned to the dragscript. If it’s a no, then the script jumps back to the beginning of the block and does the same thing again. If it’s a yes then the dragscript starts exposing. In the dragscript this is just a small loop block repeated for each exposure (ten in my current test).
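Conceptually the communicator is no more than this (the port number and the one-line text protocol between it and the sync app are arbitrary placeholders here, and I’m assuming the External Script element reads the process exit code):

```csharp
using System;
using System.IO;
using System.Net.Sockets;

// The 'communicator' launched by the dragscript's External Script element.
// Usage: Communicator.exe <exposureSeconds>
// Exit code 0 means "go ahead", anything else means "not yet"; the
// dragscript reads this value and loops accordingly.
class Communicator
{
    static int Main(string[] args)
    {
        using (var client = new TcpClient("localhost", 6000))
        using (var reader = new StreamReader(client.GetStream()))
        using (var writer = new StreamWriter(client.GetStream()) { AutoFlush = true })
        {
            writer.WriteLine(args[0]);          // requested exposure length in seconds
            string answer = reader.ReadLine();  // "YES" or "NO" from the sync app
            return answer == "YES" ? 0 : 1;
        }
    }
}
```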

So in this current state the flow is as follows:

  • The Voyager master instance is running a sequence, blissfully unaware, taking 60s exposures of different filters, dithering, doing the focus etc etc.
  • The sync app receives data from the master which tells it when an exposure starts and how many seconds are left in the current exposure (this is updated throughout the exposure).
  • Now the dragscript in the minion Voyager instance is started.
  • In a loop it repeatedly launches the comms app to find out if a (let’s say 55s) exposure can be started.
  • The very next time the master starts an exposure this will return true (because there will be more than 55s left in the exposure) and the minion starts its own exposure.
  • Once the minion has finished the exposure it again goes into a monitoring loop to see if the next exposure can be started.
  • In another example, if I were to take shorter exposures in the minion, let’s say 10s, then the minion would be able to take 4 or 5 images (depending on download times) during the time the master does one exposure.
  • This continues until either the dragscript or the sequence are done.

If this all sounds complicated then it’s probably because it is :rofl:

Of course there are potential pitfalls in this and a lot of wasted imaging time but it’s a start. Leo has indicated that he would be happy to make some changes to Voyager to allow a delay to be injected for certain operations from an external source. This is great news because with this in place the sync app can ask the master to wait until the Voyager minion instance has finished an exposure etc… This will open up a lot more possibilities.

If anyone is interested to try any of this, I am more than happy to share once it’s a bit more refined.

Mike


Thank you so much for sharing Mike … you have all my support.

All the best
Leo


I’m very interested if you can get this going. I’m tempted to go dual-scope/camera again (did it in the past with other software), but don’t want to give up Voyager. Dithering’s a must for me (as it gets rid of pattern noise in my CMOSes). Fingers crossed…

MZ

We try to do it all … but like I said, we are working on Voyager Advanced. This is our priority … when it is finished we will switch to the dual/quad scope system and, as with our other solutions, we will present a top solution for it (we hope).
We always have the Array System ready to use for custom users.

All the best
Leonardo Orazi
