Optocouplers!

One of the issues we observed with our project's controls was quite frequent, unpredictable and restless movement. It appears when no (or a constant) signal is being sent to the devices, yet they still seem to pick up noise and behave quite erratically.

After a consultation with Richard, he introduced us to the world of opto-isolators, also known as optocouplers.

So what is an optocoupler? As many times before, I'll leave it to Wikipedia to answer:

An opto-isolator (also called an optocoupler, photocoupler, or optical isolator) is an electronic component that transfers electrical signals between two isolated circuits by using light. Opto-isolators prevent high voltages from affecting the system receiving the signal. Commercially available opto-isolators withstand input-to-output voltages up to 10 kV and voltage transients with speeds up to 25 kV/μs.

Wikipedia

Jameco Electronics also provides a very nice picture explaining how things work.

After checking this further with Richard, he approved buying several of the following DST-1R4P-P optocouplers. While the picture shows 5V -> 24V, we've also ordered 3x 5V -> 5V and 1x 5V -> 3.3V.

I took a screenshot of the connection diagram for reference:

All arrived in good shape and in under two weeks.

Soldering it all together as per the instructions took a while and was honestly quite tedious. However, the result was very satisfying!

Installed, finally! Sebi says that it feels much better. More precise and less shaky. On the other hand, he has also kept working on some signal cleaning on the receiver side, claiming that the improvement is his change – not the optocoupler's. Well, time will tell.

This is our first test of it, just to check that it is wired up correctly. (The optocoupler is the blinking device inside that cabling maze, at the end of the video.)

Back in the air – part II

Having passed our hydrogen generation test, and with our new hydrogen detector in place, it was time to move forward. The first stage was to review the leftovers from the previous day and check whether there had been any impact on our Teflon coating.

It turned out that there was still some material left in there, while the coating came out without a scratch or any signs of decay.

To test that everything works, and also to purge any excess air from the container, we filled another blimpy first and took some cool pictures of our dramatic sky.

Seb’s been having fun with his new friend.

And we started filling the Windreiter envelope.

As you can see from the gallery below, it actually took a few hours to get there. Luckily the chemical reaction stabilised nicely, so we got pretty steady hydrogen production.

Finally, we ended up with a super-cool blimp!

The last bit was measuring the lifting power to confirm the hydrogen purity achieved.

As you can see, our kitchen scale measures 171 g of lift, where the advertised value is 195.60 g of lift with hydrogen at sea level.

That's roughly 25 g missing, which we attributed to the metallic thread-attachment clamp, the sealing clamp and potential hydrogen impurities, as our experiment is a long way from a clean laboratory environment. Whatever it was, I think we did pretty well and the whole team was very happy with the outcome! 🙂
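For a quick sanity check of those numbers – a sketch only, where the densities are textbook values at ~20 °C and the envelope volume is backed out from the advertised lift rather than measured:

```python
RHO_AIR = 1.204   # g/L, dry air at 20 degC and 1 atm (textbook value)
RHO_H2 = 0.0837   # g/L, hydrogen at 20 degC and 1 atm

def lift_grams(volume_litres):
    """Net buoyant lift in grams for an envelope of the given volume."""
    return volume_litres * (RHO_AIR - RHO_H2)

# Back out the envelope volume implied by the advertised 195.6 g of lift:
volume = 195.6 / (RHO_AIR - RHO_H2)
print(f"implied envelope volume: {volume:.0f} L")
# Our measured 171 g then corresponds to this shortfall:
print(f"shortfall: {195.6 - 171:.1f} g")
```

About 175 litres of envelope, and a ~24.6 g shortfall to explain with clamps and impurities.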

Hydrogen sensor

The morning after our initial hydrogen generation test, I realized that a couple of months back we had bought several MQ8 hydrogen gas sensors for our Arduino, and this would be an excellent opportunity to check if and how they work.

So I asked Seb and Christopher to put it together and about an hour later – voila – they had it working!

The Arduino code looks pretty straightforward.

long sensorValue;
void setup() {
  Serial.begin(115200);
}
void loop() {
  // Average 100 readings from the MQ8 on A0 to smooth out ADC noise.
  sensorValue = 0;
  for (int i = 0; i < 100; i++) {
    sensorValue += analogRead(A0);
  }
  // Rescale the averaged 10-bit reading (0-1023) to a nominal 100-10000 ppm range.
  sensorValue = map(sensorValue / 100, 0, 1023, 100, 10000);
  Serial.println("ppm: " + String(sensorValue));
  // Above 300 ppm, drive the buzzer on pin 3 with a pitch that rises with concentration.
  if (sensorValue > 300)
    tone(3, sensorValue / 5);
  else
    noTone(3);
  delay(100);
}
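The only non-obvious call here is map(). It is a plain linear remap of the averaged 10-bit ADC reading onto 100–10000 – a nominal scaling rather than a calibrated ppm conversion. A Python equivalent, for reference:

```python
def arduino_map(x, in_min, in_max, out_min, out_max):
    """Integer linear remap, matching Arduino's map() for non-negative inputs."""
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

print(arduino_map(0, 0, 1023, 100, 10000))     # ADC floor  -> 100
print(arduino_map(1023, 0, 1023, 100, 10000))  # ADC ceiling -> 10000
```

For real concentration readings the MQ8 would need calibration against its load-resistor curve, but for a leak alarm this nominal scale is enough.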

As you can see in the video below, they also added a "buzzer" to signal the concentration detected and connected it to a plotter to show the values on a graph. They are using hydrogen from the previous day to test that it really works.

Having that buzzer there later came in pretty handy, as it served us well in detecting leaks while generating hydrogen – but that will have to wait for another post. 🙂

Back in the air – part I

Following up on our latest adventure in Envelope from Windreiter, we've made some amazing progress this weekend. Our plan was to get heaps of hydrogen from the aluminium and sodium hydroxide reaction (Al + NaOH).

Christopher dug out how it should work for us – one sodium atom, one oxygen atom and one hydrogen atom make up this compound. Adding the molar masses of sodium hydroxide's constituents gives NaOH's molar mass. Thus, 22.989 g/mol + 15.999 g/mol + 1.008 g/mol = 39.996 g/mol.

So approx 40 g of caustic soda is one mole of NaOH. (By the usual overall reaction 2Al + 2NaOH + 2H2O → 2NaAlO2 + 3H2, every two moles of NaOH should then liberate three moles – about 6 g – of hydrogen.)
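Christopher's arithmetic can be checked in a couple of lines, assuming the commonly cited overall reaction 2Al + 2NaOH + 2H2O → 2NaAlO2 + 3H2:

```python
M_NA, M_O, M_H = 22.989, 15.999, 1.008  # g/mol, as above
M_NAOH = M_NA + M_O + M_H               # sodium hydroxide molar mass
M_H2 = 2 * M_H                          # dihydrogen molar mass

# In the assumed reaction, 2 mol of NaOH liberate 3 mol of H2:
h2_per_kg_naoh = 1000 * (3 * M_H2) / (2 * M_NAOH)
print(f"NaOH molar mass: {M_NAOH:.3f} g/mol")
print(f"H2 yield: {h2_per_kg_naoh:.0f} g per kg of caustic soda")
```

Roughly 76 g of hydrogen per kilogram of caustic soda, aluminium and water permitting.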

Molar volume, the volume of one mole of gas, depends on pressure and temperature. It is 22.4 litres at 0 °C (273.15 K) and 1 atm (101325 Pa) – STP (Standard Temperature and Pressure) – for every gas that behaves similarly to an ideal gas. The ideal-gas molar volume increases to 24.0 litres as the temperature rises to 20 °C (at 1 atm).

For an ideal gas, the attractive and repulsive interactions between the gas molecules can be neglected, which is why we can treat the gas as "ideal". (Side note: interaction forces between specific gases create conditions for non-ideal behaviour.)

The actual molar volume of hydrogen can be calculated exactly from the experimental density of the gas – 0.0899 g/L at 0 °C (1 atm) and 0.0837 g/L at 20 °C (1 atm) – knowing that one mole of dihydrogen (H2) amounts to 2.0159 g/mol. Thus, if 0.0899 grams occupy 1 litre, a mole occupies 2.0159/0.0899 = 22.42 litres at STP (0 °C, 1 atm), and 2.0159/0.0837 = 24.1 litres at 20 °C.

These true molar volumes of hydrogen are very close to the ideal-gas values of 22.41 L/mol and 24.0 L/mol at 0 °C and 20 °C respectively, confirming that hydrogen gas behaves almost ideally.
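The same division, spelled out:

```python
M_H2 = 2.0159            # g/mol, dihydrogen
density_0C = 0.0899      # g/L at 0 degC and 1 atm (experimental)
density_20C = 0.0837     # g/L at 20 degC and 1 atm

molar_volume_0C = M_H2 / density_0C    # actual molar volume at STP
molar_volume_20C = M_H2 / density_20C  # and at room temperature
print(f"{molar_volume_0C:.2f} L/mol at 0 degC")    # vs 22.41 L/mol ideal
print(f"{molar_volume_20C:.2f} L/mol at 20 degC")  # vs 24.0 L/mol ideal
```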

As always, some shopping needed to be done first to get our hydrogen production back on the rails. We started by getting 0.5 kg of aluminium powder from Barnes.

It looks so tiny when held in hand.

Getting caustic soda from Santo was much more entertaining, as they only sell it in 25 kg bags.

Of course, handling such material can be mildly problematic, so we got proper protection ready.

The next phase started with putting together the "reaction chamber" by scavenging our previous hydrogen generator. Our original investment in the Teflon coating paid off a thousand times over, as the main basin needed to be corrosion resistant to withstand the caustic soda solution while also having high enough thermal conductivity to disperse heat from the planned (exothermic) reaction.

As a lid we used an old perspex sheet and drilled a couple of holes for the solution feed and the hydrogen outlet. The whole contraption was placed in an esky filled with water to cool things down and also to capture any potential leaks.

Moments later we started generating our first hydrogen!

And a few moments later, with assistance from Ondra, we had our blimpy back again.

At that stage we had to stop our test as it was getting late and we had some other activities.

Envelope from Windreiter

While working on multiple projects in parallel here, an important one has seemed quite neglected lately – our airship's envelope. However, that's not quite true. I've been searching every corner for a material that should do the job – 100 micron PU sheets – but with no luck so far.

At some stage I got in touch with Mr. Martin Hill, who suggested contacting a German company called Windreiter to see what they are doing. From there it was easy: I got in touch with Dr. Andreas Burkart, Windreiter's co-founder, and we briefly discussed our project.

To my minor disappointment, Dr. Burkart suggested not building anything too complex before taking baby steps, and pointed me to their e-shop with a few off-the-shelf envelopes. Being busy with work, I couldn't make myself decide where to go next with this topic until the "Father's Day" event came around and Sebi asked what I wanted. While my first idea was SignMyRocket, I settled on a more plausible solution and asked him to pick one of the Windreiter envelopes for us instead.

Sebi picked the Silver Blimp 181-200 for 40,00 EUR.

It came from Europe in about two weeks in a well padded envelope.

The data sheet confirmed that we got what we ordered! Based on that, we should be getting ~195 g of lifting power (1.91 N on Earth) when using hydrogen at sea level.

Seb got instructions to use our air pump to inflate it and test if it holds pressure well.

As you can see above, it worked out very well! I can't wait till we attempt to fill it with hydrogen from our aluminium foil and caustic soda experiment! 🙂

Training star identification AI using PyTorch

We finished our previous article, TRAINING SET WITH STELLARIUM II, with the training set ready to go. It was up to Sebi to take the next step and do the actual star identification AI training. Sebi ended up being somewhat reluctant to write it up as a post, so we explored a form we haven't used up to this point – an interview.

Jan: Where do we start Sebi?

Seb: I was interested in which would be better for our job – PyTorch or TensorFlow – and after some initial research I picked PyTorch as the more popular one.

So I started watching the following video tutorial about PyTorch.

After watching it I wrote my first code based on that video. Here it goes:

import torch
import torch.optim as optim
from torch.optim import lr_scheduler
from torchvision import models, transforms
from torch.utils.data import DataLoader
from torchvision.datasets import ImageFolder
import os
import torch.nn as nn
import time
import copy
import signal
projectDir = os.path.dirname(os.path.realpath(__file__))+"/"
dataDir = projectDir+'../data/train_224_224_monochrome_big'
deviceName = "cuda:0" if torch.cuda.is_available() else "cpu"
device = torch.device(deviceName)
# Load the images as single-channel greyscale tensors.
trainData = ImageFolder(dataDir, transform=transforms.Compose([transforms.Grayscale(num_output_channels=1), transforms.ToTensor()]))
testData = ImageFolder(projectDir+'../data/train_224_224_monochrome', transform=transforms.Compose([transforms.Grayscale(num_output_channels=1), transforms.ToTensor()]))
modelName = "resnet18"
batchSize = 1000
logName = "resnet18.log"
model = models.resnet18(pretrained=False)
# Swap the first convolution for a single-channel (greyscale) input...
model.conv1 = nn.Conv2d(1, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False)
# ...and size the final layer to the number of star classes.
model.fc = nn.Linear(model.fc.in_features, len(trainData.classes))
# Resume from previously saved weights if they exist.
if os.path.exists(projectDir+modelName+".pth"):
  model.load_state_dict(torch.load(projectDir+modelName+".pth"))
def main():
  print(trainData.classes)
  save_model(model)
  trainDataLoader = DataLoader(trainData, batch_size=batchSize, shuffle=True, num_workers=3)
  testDataLoader = DataLoader(testData, batch_size=batchSize, shuffle=True, num_workers=3)
  model.to(device)
  criterion = nn.CrossEntropyLoss()
  optimizer = optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
  scheduler = lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.8) # every step_size epochs multiply learning rate by gamma
  train_model(model, optimizer, criterion, scheduler, trainDataLoader, testDataLoader, 1, test=True)
def train_model(model, optimizer, criterion, scheduler, trainDataLoader, testDataLoader, numEpochs, test=False):
  start = time.time()
  best_model_wts = copy.deepcopy(model.state_dict())
  best_acc = 0.0
  best_train_acc = 0.0
  running_incorrects = {star:{star:0 for star in trainData.classes} for star in trainData.classes}
  if test:
    log("the next epochs are for testing:")
  for epoch in range(numEpochs):
    log('Epoch {}/{}'.format(epoch+1, numEpochs))
    log('-' * 10)
    for phase in ["train", "test"] if test == False else ["test"]:
      running_loss = 0.0
      running_corrects = 0
      if phase == "train":
        model.train()
      else:
        model.eval()
      loader = trainDataLoader if phase == "train" else testDataLoader
      datasetSize = len(trainData if phase == "train" else testData)
      for index, (inputs, labels) in enumerate(loader):
        # Progress line: samples processed so far and running accuracy for this phase.
        print('Batch {}/{}'.format((index+1)*len(labels), len(loader)*len(labels))
              + ' Accuracy: {}/{} Percent: %{:.4f}'.format(
                  running_corrects, datasetSize,
                  (running_corrects / ((len(labels)*index) + 0.00001)) * 100), end="\r")
        inputs = inputs.to(device)
        labels = labels.to(device)
        with torch.set_grad_enabled(phase == "train"):
          outputs = model(inputs)
          _, preds = torch.max(outputs, 1)
          loss = criterion(outputs, labels)
          if phase == "train":
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
        running_loss += loss.item() * inputs.size(0)
        running_corrects += torch.sum(preds == labels.data)
        # Record the confusion: which star each misclassified sample was mistaken for.
        for prediction, label in zip(preds, labels.data):
          if prediction != label:
            running_incorrects[trainData.classes[label.item()]][trainData.classes[prediction.item()]] += 1
      epoch_loss = running_loss / len(trainData if phase == "train" else testData)
      epoch_acc = running_corrects.double() / len(trainData if phase == "train" else testData)
      if phase == 'train':
        scheduler.step()
        if epoch_acc > best_train_acc:
          best_train_acc = epoch_acc
      # log("failed classes: "+str({star:sorted(running_incorrects[star].items(), key=lambda item: item[1], reverse=True) for star in running_incorrects}))
      log("failed classes: "+str(running_incorrects))
      log('{} Loss: {:.4f} Acc: {:.4f}'.format(
        phase, epoch_loss, epoch_acc))
      log('Accuracy: {}/{}'.format(running_corrects, len(trainData if phase == "train" else testData)))
      # deep copy the model
      if phase == "test" and epoch_acc > best_acc and test == False:
        best_acc = epoch_acc
        best_model_wts = copy.deepcopy(model.state_dict())
        save_model(model)
        log("updated model")
      time_elapsed = time.time() - start
      log('Epoch time stamp: {:.0f}m {:.0f}s'.format(time_elapsed // 60, time_elapsed % 60))
    log("")
  log('Training complete in {:.0f}m {:.0f}s'.format(time_elapsed // 60, time_elapsed % 60))
  log('Best val Acc: {:4f}'.format(best_acc))
  # load best model weights
  model.load_state_dict(best_model_wts)
  return model
def save_model(model, name=modelName):
  torch.save(model.state_dict(), projectDir+name+".pth")
  log("saved model: "+name)
def handler(signum, frame):
  quit()
def log(message, name=logName):
  print(message)
  with open(projectDir+name, 'a') as file:
    file.write(message+"\n")
signal.signal(signal.SIGINT, handler)
if __name__ == "__main__":  # required for DataLoader workers on spawn-based platforms
  main()

Jan: What’s that doing? Looks messy.

Sebi: No. It is beautiful! OK. It takes the pre-built ResNet18 network from PyTorch. (ResNet networks are in general famous for their image processing abilities.) I've changed it to have the correct number of inputs and outputs.

Jan: Like what?

Sebi: Inputs – greyscale at 224 x 224 pixel resolution – and outputs – the number of training classes (20 stars).

Jan: What next?

Sebi: Then it starts training the model through back-propagation and shows how it progresses. It stops after several epochs and saves the trained weights.

Jan: What was a first result?

Sebi: On the first epoch of the first go, I ended up with an accuracy of about 0.1. It kept improving over 33 epochs and came up with 60% accuracy against the testing dataset.

Jan: That’s pretty cool for a first run. Would you share that log with us please?

Sebi:

# learning rate: 0.01, momentum: 0.0, step_size: 7, gamma: 0.1
saved model: resnet18
Epoch 1/50
----------
train Loss: 2.8978 Acc: 0.0972
Epoch time stamp: 22m 34s
test Loss: 2.7925 Acc: 0.1225
saved model: resnet18
updated model
Epoch time stamp: 23m 24s
Epoch 2/50
----------
train Loss: 2.6922 Acc: 0.1929
Epoch time stamp: 45m 48s
test Loss: 2.6295 Acc: 0.2250
saved model: resnet18
updated model
Epoch time stamp: 46m 32s

… many more lines.

Epoch 32/50
----------
train Loss: 2.0006 Acc: 0.6081
Epoch time stamp: 762m 41s
test Loss: 2.0135 Acc: 0.5955
Epoch time stamp: 763m 35s
Epoch 33/50
----------
train Loss: 2.0006 Acc: 0.6106
Epoch time stamp: 786m 42s
test Loss: 2.0042 Acc: 0.6065
saved model: resnet18
updated model
Epoch time stamp: 787m 36s

Jan: Ok, what happened next?

Sebi: I've been playing with a few training constants, as follows, on a smaller dataset.

saved model: resnet18
ended session
# learning rate: 0.1, momentum: 0.9, step_size: 5, gamma: 0.5
saved model: resnet18
Epoch 1/10
----------
train Loss: 0.4430 Acc: 0.8650
Epoch time stamp: 41m 11s
test Loss: 0.0202 Acc: 0.9960
saved model: resnet18
updated model
Epoch time stamp: 42m 34s

… making it clear that the new training constants have quite a significant impact on how it all operates (much better now). As you can see below, it finished by achieving 100% accuracy.

saved model: resnet18
Epoch 1/3
----------
train Loss: 0.0003 Acc: 1.0000
Accuracy: 18000/18000
Epoch time stamp: 22m 11s
test Loss: 0.0001 Acc: 1.0000
Accuracy: 2000/2000
Epoch time stamp: 22m 52s

Jan: Amazing – almost unbelievable. It means we can identify the 20 brightest stars with 100% accuracy now?

Sebi: Not exactly. When we started training on a much bigger dataset, it achieved just 99% accuracy on the training set and 95% on the testing set.

Jan: Why do you think it didn’t reach 100%?

Sebi: Because that dataset is corrupted … definitely.

Jan: I don’t believe you.

Sebi: Check this out.

Epoch 1/1
----------
failed classes: [('Arcturus', 877), ('Canopus', 157), ('Achernar', 0), ('Acrux', 0), ('Aldebaran', 0), ('Altair', 0), ('Antares', 0), ('Betelgeuse', 0), ('Capella', 0), ('Deneb', 0), ('Fomalhaut', 0), ('Hadar', 0), ('Mimosa', 0), ('Pollux', 0), ('Procyon', 0), ('Rigel', 0), ('Rigel Kentaurus', 0), ('Sirius', 0), ('Spica', 0), ('Vega', 0)]
test Loss: 0.1494 Acc: 0.9483
Accuracy: 18966/20000
Epoch time stamp: 9m 28s

Why do you think it would fail on just two stars?

Jan: That original data set was generated programmatically and I am pretty sure there is no problem.

Sebi: Fine, look at this:


Training complete in 0m 25s
Best val Acc: 0.000000
saved model: resnet18
the next epochs are for testing:
Epoch 1/1
----------
failed classes: {'Achernar': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Acrux': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Aldebaran': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Altair': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Antares': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Arcturus': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 79, 'Capella': 798, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Betelgeuse': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 
'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Canopus': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 157, 'Spica': 0, 'Vega': 0}, 'Capella': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Deneb': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Fomalhaut': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Hadar': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Mimosa': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Pollux': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 
'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Procyon': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Rigel': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Rigel Kentaurus': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Sirius': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Spica': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}, 'Vega': {'Achernar': 0, 'Acrux': 0, 'Aldebaran': 0, 'Altair': 0, 'Antares': 0, 'Arcturus': 0, 'Betelgeuse': 0, 'Canopus': 0, 'Capella': 0, 'Deneb': 0, 'Fomalhaut': 0, 'Hadar': 0, 'Mimosa': 0, 'Pollux': 0, 'Procyon': 0, 'Rigel': 0, 'Rigel Kentaurus': 0, 'Sirius': 0, 'Spica': 0, 'Vega': 0}}
test Loss: 0.1494 Acc: 0.9483
Accuracy: 18966/20000
Epoch time stamp: 9m 55s

… just four stars' identification results are strangely incorrect. See the table below.

Jan: Well – it is a neural network, it is ok to have few wrong. I actually think this is an awesome result! Thank you Sebi!

Sebi: FYI I also tried running it with Resnet30 and …

saved model: resnet30
Epoch 1/1
----------
....
train Loss: 0.0008 Acc: 1.0000
Accuracy: 199995/200000
Epoch time stamp: 549m 7s
failed classes: {"Achernar": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Acrux": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Aldebaran": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 1, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Altair": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Antares": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Arcturus": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 271, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Betelgeuse": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 
0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Canopus": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 2, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 840, "Hadar": 0, "Mimosa": 0, "Pollux": 1, "Procyon": 0, "Rigel": 8, "Rigel Kentaurus": 0, "Sirius": 92, "Spica": 0, "Vega": 57}, "Capella": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Deneb": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Fomalhaut": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Hadar": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 1, "Sirius": 0, "Spica": 0, "Vega": 0}, "Mimosa": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Pollux": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 1, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 
0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Procyon": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 2, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Rigel": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Rigel Kentaurus": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Sirius": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Spica": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}, "Vega": {"Achernar": 0, "Acrux": 0, "Aldebaran": 0, "Altair": 0, "Antares": 0, "Arcturus": 0, "Betelgeuse": 0, "Canopus": 0, "Capella": 0, "Deneb": 0, "Fomalhaut": 0, "Hadar": 0, "Mimosa": 0, "Pollux": 0, "Procyon": 0, "Rigel": 0, "Rigel Kentaurus": 0, "Sirius": 0, "Spica": 0, "Vega": 0}}
test Loss: 0.7076 Acc: 0.9365
Accuracy: 18729/20000
saved model: resnet30
updated model
Epoch time stamp: 566m 14s
Training complete in 566m 14s
Best val Acc: 0.936450
saved model: resnet30
Epoch 1/1
----------

… here the accuracy got even worse – just 94%. I find it interesting that the invalid results from Resnet18 do not correlate with Resnet30's. It just doesn't make sense …

Jan: Well for me both results look impressive. Thank you for answering all my questions!

Having this stage of our project covered, what lies ahead of us next? We need to finish the whole loop, which I currently imagine like this:

  1. Stellarium randomly picks a place on Earth at a certain date and time.
  2. Stellarium takes pictures of the 4 brightest stars, saves them locally and records their inclinations.
  3. A pre-trained AI attempts to identify all 4 stars and potentially runs some basic checks on whether they form a plausible scenario (can be seen from Earth at that time / are above the horizon).
  4. In the next stage the application uses the inclinations of the identified stars and looks up star charts for the specified date to identify the celestial position.
  5. Finally it translates the celestial position to GPS coordinates.
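Sketching the loop as code makes the interfaces clearer. Everything below is a placeholder – none of these stages exists yet, so this driver just takes each stage as a callable (all names are mine, not an actual implementation):

```python
import random

def run_navigation_loop(capture, classify, plausible, lookup, to_gps,
                        mjd_min=50000.0, mjd_max=60000.0):
    """One pass of the planned loop; every stage is passed in as a callable
    because none of them is implemented yet (all names are placeholders)."""
    when = random.uniform(mjd_min, mjd_max)          # 1. random date-time
    images, inclinations = capture(when, count=4)    # 2. Stellarium pictures
    names = [classify(img) for img in images]        # 3. AI identification
    if not plausible(names, when):                   # 3b. basic sanity check
        return None                                  # implausible sky -> retry
    celestial = lookup(names, inclinations, when)    # 4. star charts
    return to_gps(celestial, when)                   # 5. GPS coordinates
```

Wiring in trivial stubs for each stage is enough to exercise the control flow end to end before any real component exists.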

However this needs to wait for another day. 🙂

BBBlimp on Hydrogen Connect Summit Brisbane 2022

On 8th September we had an opportunity to present our project at the Hydrogen Connect Summit Brisbane 2022, organised by H2Q.

It was a long day and we met a number of awesome people interested in our project. I took a few pictures and will keep adding to the gallery below as I get more from other sources.

The whole event culminated in presenting our project, including our technological demonstrator, in front of ~400 summit attendees.

Sebi & Jan presenting Blimpy

The PowerPoint presentation itself is here:

The whole team took a picture together with the organisers.

Huge thanks to H2Q for inviting us to this event. We’ve learned a lot and got some awesome contacts to follow up on!

UPDATE 2022-09-12: Received this video from Phil – Seb demonstrating our Blimp’s controls, ramping the motors up to full power to show off. 😉

UPDATE 2022-10-07: Received a few more pictures from the Hydrogen Connect Summit team.

Troubles with The Arm

After working on our new super-cool gimbal arm for a while, we’ve hit a bummer. The most stressed part of our design – the joint between the main arm and the attachment jig – apparently is not strong enough to withstand the required forces; what’s more, it is pretty susceptible to so-called hangar rash.

It took us a couple of hours to make a minor design change – running the upper connection bolts all the way through the attachment jig to reinforce the whole thing.

Slicing in Cura – that’s the easy part.

However running out of filament was a major problem 😀

Luckily one of our generous sponsors (Vilem) came back with an instant supply of 4 big rolls of it! Thank you Vilem!!! 🙂

Printing was then easy; it just took the whole weekend to get 4 of them, as each print took 7hrs.

A bit of sanding + drilling and mounting, and voila – it feels much stronger now.

Let’s see where it breaks next time. 🙂 Still, it keeps me wondering how all that 3D printing (Additive Manufacturing) has made things possible for us. You mess up or break something, and 24 hrs later you have a better or replacement part in your hands without much stress and hassle.

Checking some articles on this, I found this one – Intro to Additive Manufacturing: Prototyping vs. Production – which very nicely describes the challenges of redesigning legacy products and how prototyping with 3D printing provides tangible benefits.

They mention 5 key points where 3D printing helps:

  • Accelerate Development Time – Exceeding the typical design cycle schedule by 3D printing your ideas overnight and having your parts available the next morning.
  • Fail Fast, Fail Often – 3D printing enables the engineering team to identify mistakes early in the process. Products are hardly ever right the first time; rapid prototyping mitigates the time lost.
  • Cost Effective – The traditional methods of developing a prototype can be time consuming and expensive. Multiple fabrication methods, reserving time on production equipment or not having access to the right technology can be costly.
  • Enhanced Creativity – Rapid prototyping is an efficient tool for engineers to quickly evaluate and improve on their ideas.
  • Product Testing – Form, fit and function. Feel the ergonomics of a new prototype, squeeze pieces to pressure fit an assembly or drop test the part to evaluate functional strength. Easy with 3D printing.

Sounds pretty handy, doesn’t it? 🙂

Training set with Stellarium II

Checking where to go next with our “super-awesome” 42GB training set, I’ve realized that for the next step we’ll need a much lower resolution. We are now at 1920 x 1080, while many traditional neural networks like ResNet happily cope with a much more modest 224 x 224.

So, on to some tiny Python scripting to get our set converted to 224 x 224 – grayscale! The following script does that magic in a matter of 90 minutes.

#!/usr/bin/env python3

from argparse import ArgumentParser, BooleanOptionalAction  # BooleanOptionalAction needs Python 3.9+
from glob import glob
import os

from PIL import Image, ImageOps

def transform(im, args):
  # Centred crop box: args.width x args.height, scaled up by args.scale
  left = im.width/2 - args.width/2*args.scale
  top = im.height/2 - args.height/2*args.scale
  right = im.width/2 + args.width/2*args.scale
  bottom = im.height/2 + args.height/2*args.scale
  return left, top, right, bottom

def crop(args):
  # Recursively collect all PNGs under the input folder
  result = [y for x in os.walk(args.input) for y in glob(os.path.join(x[0], '*.png'))]
  counter = 0

  for filename in result:
    with Image.open(filename) as im:
      im2 = im.crop(transform(im, args))
      im2 = im2.resize((args.width, args.height))
      if args.grayscale:
        im2 = ImageOps.grayscale(im2)
      filename_out = filename.replace(args.input, args.output)
      os.makedirs(os.path.dirname(filename_out), exist_ok=True)
      im2.save(filename_out)
    counter += 1
    pct = round(counter/len(result)*100, 4)
    print("Finished processing for", filename_out, "\t[", pct, "%]")


def main():
  parser = ArgumentParser(description='Recursive crop & transformation for image files')

  parser.add_argument('--input', default='./train', help='input folder')
  parser.add_argument('--output', default='./train_224_224_monochrome', help='output folder')
  parser.add_argument('--height', type=int, default=224, help='image height')
  parser.add_argument('--width', type=int, default=224, help='image width')
  parser.add_argument('--grayscale', action=BooleanOptionalAction, default=True, help='convert output to grayscale')
  parser.add_argument('--scale', type=int, default=2, help='scale down coefficient')

  args = parser.parse_args()

  print('input  folder: ', args.input)
  print('output folder: ', args.output)

  crop(args)
  print("Done")

if __name__ == '__main__':
  main()

This whole operation reduced our initial 48.5GB monster train set to a much more convenient 1.4GB.
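For the curious, here is the raw-pixel arithmetic behind that reduction (my back-of-the-envelope, not measured values):

```python
# Uncompressed per-image sizes, assuming 8 bits per channel
before = 1920 * 1080 * 3   # full-HD RGB
after = 224 * 224          # 224 x 224 grayscale

print(round(before / after))  # ~124x fewer raw bytes per image
```

The on-disk reduction is "only" about 35x because the mostly-black originals already compressed very well as PNG.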

The result seems a bit radical, but it is what it is.

Sirius 1920 x 1080 colour (before)
Sirius 224 x 224 grayscale (after)

Meanwhile Sebi kept reading and experimenting with machine learning and got some fantastic results there, but that’s for another post. 🙂

Training set with Stellarium

Thinking about training our AI to locate stars, Sebi ran a test demonstrating a simplified training set. We’ll need images of the 20 brightest stars – 10,000 per star – having:

  • reasonable focus / zoom in & out
  • any possible rotation
  • good quality
  • no atmospheric effects
  • no horizon
  • no planets / sun / moon / satellites

The obvious question came instantly – where do we get such a training set for our project? The idea came up to use Stellarium.

Stellarium is already a pretty old application (started in 2001) which emulates a planetarium on your computer. It shows a realistic sky in 3D, just like what you see with the naked eye, binoculars or a telescope.

Getting and running Stellarium on Linux is a piece of cake:

$ git clone https://github.com/Stellarium/stellarium.git
$ cd stellarium
$ mkdir -p build/unix
$ cd build/unix
$ cmake -DCMAKE_INSTALL_PREFIX=/opt/stellarium ../.. 
$ make -j4
<wait here>
$ ./src/stellarium

Application throws you straight to a nice screen – showing very late morning in North Brisbane. 🙂

Well, this is the moment when things got a little bit more complicated. I had to get familiar with the Stellarium scripting engine, its API, the Julian calendar and also the PyTorch training set layout, and ended up submitting an actual patch to the Stellarium guys with all that wrapped up.

Let’s start with the patch, which is quite trivial – it just allows creating a subfolder when needed:

From 6d2fcc079705385f4803c78902099a38ddc4e89f Mon Sep 17 00:00:00 2001
From: Jan Bilek <jan.bilek@eftlab.com.au>
Date: Sat, 13 Aug 2022 22:22:59 +1000
Subject: [PATCH] Screenshot to attempt to create folder when missing

---
 src/StelMainView.cpp | 19 ++++++++++---------
 1 file changed, 10 insertions(+), 9 deletions(-)

diff --git a/src/StelMainView.cpp b/src/StelMainView.cpp
index 288832dc75..d72b1c14f6 100644
--- a/src/StelMainView.cpp
+++ b/src/StelMainView.cpp
@@ -1718,16 +1718,17 @@ void StelMainView::doScreenshot(void)
 		}
 		else
 			screenshotDir = StelFileMgr::getUserDir().append(screenshotDirSuffix);
+		StelApp::getInstance().getSettings()->setValue("main/screenshot_dir", screenshotDir);
+	}
 
-		try
-		{
-			StelFileMgr::setScreenshotDir(screenshotDir);
-			StelApp::getInstance().getSettings()->setValue("main/screenshot_dir", screenshotDir);
-		}
-		catch (std::runtime_error &e)
-		{
-			qDebug("Error: cannot create screenshot directory: %s", e.what());
-		}
+	//Always check if destination folder exists and attempt to recreate it if not
+	try
+	{
+		StelFileMgr::setScreenshotDir(screenShotDir.isEmpty() ? StelFileMgr::getScreenshotDir() : screenShotDir);
+	}
+	catch (std::runtime_error &e)
+	{
+		qDebug("Error: cannot create screenshot directory: %s", e.what());
 	}
 
 	if (screenShotDir == "")
-- 
2.25.1

Don’t get distracted by that – it looks much worse than it is. It is just a couple of lines; all the rest is computer-generated fluff to make it look cool (and needed for it to work with the original code).

The interesting part comes now – the following script generates a PyTorch star training set for our celestial navigation project – 20 x 10000 pictures of the brightest stars. 🙂

var stars = ["Sirius", "Canopus", "Arcturus", "Rigel", "Vega", "Capella", "Rigel Kentaurus", "Procyon", "Betelgeuse", "Achernar", "Hadar", "Altair", "Acrux", "Aldebaran", "Spica", "Antares", "Pollux", "Fomalhaut", "Deneb", "Mimosa"]; //20 brightest stars

var DIR = "~/Pictures/Stellarium/";

var pictsInSet = 10000;     //Size of a training set per star
var minMjDay = 50000.0;     //Min Modified Julian Day to consider for randomization
var maxMjDay = 60000.0;     //Max Modified Julian Day to consider for randomization
var minZoom = 10;           //Min zoom to consider for randomization
var maxZoom = 30;           //Max zoom to consider for randomization

//Helper functions
function randomFloat(min, max) {
    return min + (max - min) * Math.random();
}
function showAStar(cName) {
  core.setMJDay(randomFloat(minMjDay, maxMjDay));
  core.selectObjectByName(cName, true);
  StelMovementMgr.autoZoomIn(0);
  StelMovementMgr.zoomTo(randomFloat(minZoom, maxZoom),0);
  StelMovementMgr.deselection();
  core.wait(0.1); //Making sure that UI has a moment to catch up
}

//Disabling all
ConstellationMgr.setFlagArt(false);
ConstellationMgr.setFlagBoundaries(false);
ConstellationMgr.setFlagLines(false);
ConstellationMgr.setFlagLabels(false);
GridLinesMgr.setFlagEquatorGrid(false);
GridLinesMgr.setFlagAllLines(false);
LandscapeMgr.setFlagLandscape(false);
MilkyWay.setFlagShow(false);
NebulaMgr.setFlagHints(false);
SolarSystem.setFlagPlanets(false);
StarMgr.setFlagLabels(false);

core.setGuiVisible(false);
core.setTimeRate(0);
core.wait(1); //Making sure that UI has a moment to catch up

for (var i = 0; i < stars.length; ++i) {
  for (var ii = 0; ii < pictsInSet; ++ii) {
    showAStar(stars[i]);
    var fileName =  ii;
    core.debug(DIR + stars[i] + "/" + fileName +".png");
    core.screenshot(fileName, false, DIR + stars[i], true);
  }
}

core.setGuiVisible(true);
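The minMjDay/maxMjDay randomization range is easier to picture in calendar terms. Modified Julian Day 0 is 1858-11-17, so a quick stdlib check shows the script samples dates roughly between 1995 and 2023:

```python
from datetime import date, timedelta

MJD_EPOCH = date(1858, 11, 17)  # Modified Julian Day 0

def mjd_to_date(mjd):
    """Convert a Modified Julian Day number (as used by core.setMJDay) to a calendar date."""
    return MJD_EPOCH + timedelta(days=int(mjd))

print(mjd_to_date(50000))  # 1995-10-10
print(mjd_to_date(60000))  # 2023-02-25
```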

Well, it then took almost 5 days (I had some problems with stability), but it worked out well. We’ve ended up with 42.3 GB of images of the 20 brightest stars – 10,000 each!

Even that directory listing looks impressive!

Adding the first three images of Sirius here for reference.

Huge thanks here belongs to the Stellarium team – this wouldn’t be possible without you!

Next stage – onto some serious machine learning! 🙂

How many candies are in the jar

Sebi came up with a new project of his own, which I thought was perfect – let’s use machine learning to guess how many candies are in the jar!

I love Craiyon!

Somewhat coincidentally, a few days later we ended up at the Cyber Wargaming in a Lego City MeetUp arranged by Chris Boden.

Sebi and I enjoyed the MeetUp a lot and had a good chat with the guys from TLR afterwards (namely Billy, who is just an awesome white-hat hacker).

We also caught up with Chris B., and while Sebi briefly introduced him to his new project idea, Chris instantly proposed using today’s state-of-the-art machine learning computer for it – the Jetson Nano!

Never heard of the Jetson Nano? The Jetson Nano is a small, powerful computer that lets you run multiple neural networks in parallel for applications like image classification, object detection, segmentation, and speech processing. All in an easy-to-use platform that runs on as little as 5 watts – all that magic in the size of a cigarette pack.

Well, Chris didn’t just tell us about it, but instantly gave Ben a call and asked on our behalf if we could borrow one … and Ben agreed – apparently having a spare one on his desk!

Jetson Nano not idling on Ben’s desk anymore!

We had to visit Jaycar the next day as we didn’t have a power supply strong enough to feed this hungry beast, but Sebi ended up having it working in a couple of days. Now just to teach it to count lollies! 😀

Getting to the end of our post – allow me to say a huge thank you here to Chris Boden and Ben Duncan – your help allowed us to instantly jump a mile forward! It is awesome to have such great support from you Noosa guys, and we’ve felt welcome there for more than a decade!

Thank you all again!

16MP ArduCam with Autofocus

Reading through my usual news-feed a couple of months ago, I stumbled on a new 16MP ArduCam with Autofocus.

Even without having any actual use for it, we decided to get one for $29 AUD.

It took us a couple of months to find a moment to start experimenting with it, but finally Sebi mounted it on our RPi.

I left the rest to Sebi, and simply by following the How-To from here he had it working in ~30 minutes. And here comes our first picture!

Our first picture with ArduCam!

As it wasn’t that impressive, we captured one more with Eddie. 🙂

Second ArduCam picture with Eddie

When I asked Sebi how he did it, I was provided with the link above and the following line from the terminal:

sudo libcamera-still -o images/test.jpg --autofocus

Gimbal arm Mk II

As per our previous article Gimbal arm, we got seriously carried away with this project over the past three months. There were too many things happening, so let’s just go through several highlights.

Design update

Trying to meet the criteria of having our arm light but still very firm, we progressed from a multiple-channel arm:

To a single-channel arm with clamps and 20% infill:

Current complete 3D model of our arm then looks like this.

TBA

While it sounds like a trivial thingy, it took quite an effort to get it right. What you are seeing in the picture below is practically just garbage due to poor design, where each of these prints is ~200g of wasted filament + 24-36hrs of printing.

Printing challenge

While printing has become somewhat routine for us, some of those latest 36hr prints brought something new. A particular problem was our garage soundproofing – it is far from perfect, and while it can be happily ignored through the day, it resonates through the whole house at night, causing some unwelcome family discussions.

Let’s have a look at how such a print looks in a time-lapse.

Here you may ask why it is being printed in this seemingly most inefficient orientation – but trust me, this is the best way we identified. Bending forces induced during printing by temperature differences can be incredibly powerful over these lengths and kept ripping the print to pieces without any problem.

Well, in the end we finished with a pretty nice collection. 🙂

Cabling mayhem

The blueprint below depicts our cabling plan. Whatever is in there – in short, it means that we needed to get 7 different cables through the arm’s central shaft to have it all working.

In some cases it took up to several hours to make it happen! Chickening out on our first go, I cut out a bit of material to make it easier, but it really wasn’t needed later on.

This also included changes to all the connectors around and new power bridge development (I’m keeping that one out of this post as there is more to it). Anyway, the gimbals ended up needing rewiring as well.

With assembly and cabling completed, all four arms looked pretty satisfying. It took us two whole weekends just to do this.

Trusses replacement

In comparison to all the other tasks this was back to basics. The old steel stabiliser rods connecting our thrusters needed to be replaced with a light-weight wooden variant before mounting in our new arms. This could be considered cosmetic, but it still took us some effort to do it right.

The last bit missing was to drill an oval hole in each support leg to be able to lead our cabling through it.

New arms

Finally we can present you – new arms! Enjoy!

Aren’t they beautiful?

As always, if you read all the way here, please drop us a note on what you think – your feedback is our best motivation. 🙂

Next time – new 6V and 25V power bridges and let’s get back to the ArduPilot again!

Fun with anemometer

With all those EDFs (Electric Ducted Fans) around our project (six of them), we’ve been wondering whether their design is good or bad, and the only way to work it out seemed to be to measure their power. Some ideas were pretty wild, till … you ask Google how to do this. The answer was clear – just get an anemometer!

We kept searching for something less steampunkish and ended up with the BENETECH GT8907 Anemometer Sensor, with the ability to measure wind speeds up to 45m/s (162km/h).

It arrived a couple of weeks later, in excellent condition.

Well, long story short. Sebi will show you how it went.

So we did three tests – and what did we find out?

|                    | Test #1  | Test #2  | Test #3  |
| Tube diameter (mm) | 55       | 64       | 80       |
| ft/min (LFM)       | 3,740.16 | 3,346.46 | 2,559.06 |
| m/s                | 19       | 17       | 13       |
| km/h               | 68       | 61       | 47       |
| m3/hr              | 156.65   | 196.88   | 235.25   |
| L/s                | 43.51    | 54.69    | 65.35    |

940.99 m3/hr

261.39 L/s

Well, what did we find out? Unfortunately, no idea. I suppose we’ll need to keep these values for reference and see how our model performs first. 🙂
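One thing worth noting: the flow rows in the table are (within rounding) just the duct cross-section times the air speed, Q = A·v. A quick sanity check – my own arithmetic, using Test #2’s numbers:

```python
import math

def duct_flow(diameter_mm, speed_ms):
    """Volumetric flow through a round duct: area (m^2) x air speed (m/s).
    Returns (m^3/hr, L/s)."""
    area = math.pi * (diameter_mm / 1000 / 2) ** 2
    q = area * speed_ms                  # m^3/s
    return q * 3600, q * 1000

m3hr, ls = duct_flow(64, 17)             # Test #2: 64 mm tube at 17 m/s
print(round(m3hr, 2), round(ls, 2))      # 196.88 54.69 -> matches the table
```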

To wrap this up I am adding one more video showing our gimbal test.

Tooli G3 laser cutter – part I

Months ago, when planning to put together our new gas bags design, I realised that we’ll need something more reliable to do all the cutting. After spreading the word, we suddenly ended up with a Tooli G3 laser cutter from Toolbotics!

It looks even better on their Ad – CNC, Laser Toolbotics Tooli 3G Launch!

While it may sound like fantastic news, it ended up being a big story – it actually might be a good time to get yourself a good cup of tea before you start reading.

So it begins. We did the obvious first – assembled the tool and ran it!

Even Sebi & Oli had a pretty promising go at it.

However, we hit the first obstacle here: to be able to do something sensible, we needed to produce Gcode which would tell it what to do. Checking the vendor’s page, the solution seemed simple – just get their Plotter! Well, not that fast, as it needs the Art2Gcode program, which does not open because it needs Adobe Flash, which hasn’t existed for the past X years. Bummer.

So, trying to get somewhere, we wrote an email to tooliworlds@gmail.com to get some support. It turned out that Toolbotics practically doesn’t exist anymore and their support is limited. Anyway, someone finally responded, we got our hands on the Art2Gcode app, and Sebi was somehow able to run it on his Windows PC.

We generated the resulting Gcode, loaded it on an SD card, plugged it into the device and … it couldn’t find any files. So we got back to Toolbotics support again, asking for any advice, and they came back with:

… and then, many iterations later, it became clear that it could all be an SD card reader problem. And that was a proper opportunity to check what’s inside that box! 🙂

Very interesting. The information on the boards provided us with a few ideas. The main thing here – you can see that the board is a modified Mega controller from Makerlab. Its description says that it is a single-board solution, a remix of the Arduino Mega and RAMPS, and it is actually pretty cool!

However, we are interested mainly in the SD card support (you can see the connections in the blue frame saying LCD/SD support, with Mini Panel above). Thinking the obvious – the SD card reader is broken – we did a quick run to grab a new one for $5.95.

Well, swapping in the new one changed nothing – still no files seemed to be loaded or detected by the thing! Getting a bit frustrated, I asked for more advice and was told that it is very likely a main board problem and the board needs to be replaced. So I asked if we could be provided with the source code or a firmware image to attempt some debugging ourselves, and ended up in a strange Catch-22 situation: the firmware cannot be provided as it is their IP, so even though they recommend board replacement, there is no way to reload it.

This argument seemed good enough that Toolbotics support actually ended up providing the whole FW source code! Andrew checked it and discovered that it is based on an open-source 3D printer firmware called Marlin – just version 1.x, while the Marlin project has already moved forward a bit.

Still, Andrew was able to revive it and get it to compile again. The next step was obvious – load it! And this is where things get really interesting again: after loading the provided firmware over the one present there, everything stopped working, even the things which seemed to be half-working before. Darn.

I’ll leave you hanging here as this is getting pretty long, but rest assured that this story is not over!

Airships with Craiyon

Andrew came back having some fun with CrAIyon – “Craiyon, formerly DALL·E mini, is an AI model that can draw images from any text prompt!”. Some cool pics came out of it, so I put them in a small gallery.

It is fascinating what this technology can do!

UPDATE 2022-09-06 from Viktor K:

father with two sons and dog constructing blimp in the garden next to the house in australia

Let us know if you get some cool ones; we’ll get them posted here as well!

10 things I like about Airship Design by Charles P. Burgess

While reading Fatal Flight by Bill Hammack, I noticed numerous references to another book – Airship Design by Charles P. Burgess. I couldn’t resist the opportunity and bought a copy. It arrived shortly after, and I’ve been reading it for the past 2 months! There was so much interesting information related to our project that I lost track of it all after a while, but determined to come back and do at least a minimal review of some highlights, I came up with a plan to pick my top 10.

With the plan laid out, I would still like to start with the synopsis from the book’s jacket itself: Originally published in 1927, this volume was intended to fill the dual role of textbook for the student of airship design and handbook for the practical engineer. The design of airships, particularly of the rigid type, is mainly a structural problem; and theoretical aerodynamics has nothing like the relative importance which it bears in airplane design. This is to be expected when we consider that the gross lift of an airship depends solely on the specific gravity of the gas and the bulk of the gas container, and not at all on shape or other aerodynamic characteristics which determine the lift of airplanes. … and it is all there!

Now let’s start with our list itself.

1/ Beautiful historical pictures and schematics

This book is full of them. I’m picking two of them here, but I’ll keep adding more throughout the list.

2/ Hydrogen vs. Helium lifting performance.

It is well known that the difference in lifting power between Hydrogen and Helium is only about 8%. However, it is not that apparent how this translates to overall airship performance. One of the paragraphs in the Size and Performance section covers this topic in detail, clearly stating that the use of Hydrogen increases the overall performance of the airship by an incredible 54.5% – this roughly translates into larger payload / reach radius / operational ceiling in general.

Hydrogen increases overall performance of the airship by incredible 54.5%
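A quick back-of-the-envelope on the gross-lift side, using approximate gas densities at 0 °C and 1 atm (my numbers, not the book’s): although helium weighs roughly twice as much as hydrogen, each displaces the same air, so the gross lift differs by only a few per cent – the book’s 54.5% figure describes overall performance, presumably because the gain compounds once the airship’s fixed structural deadweight is subtracted.

```python
# Approximate densities in kg/m^3 at 0 C and 1 atm (my assumption)
AIR, H2, HE = 1.293, 0.090, 0.179

lift_h2 = AIR - H2   # gross lift per m^3 of hydrogen
lift_he = AIR - HE   # gross lift per m^3 of helium

print(round(lift_h2 / lift_he - 1, 2))  # 0.08 -> ~8% more gross lift with H2
```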

3/ Testing with models

Imagine the 1920s – no computers, no 3D visualisation, well … no calculators, no supercomputers. The pinnacle of modern technology was the electro-mechanical Enigma machine. What do you do? You use wind tunnels to test your aerodynamics, and underwater models to test all sorts of shearing moments and stress forces. Then you come up with equations which describe how all those observations scale up. Then you build it, learn from your mistakes and repeat. Purely amazing!

4/ Venting & Exhaust trunks

Rapid pressure changes caused by the airship’s steep ascent, descent or just gas temperature changes are clearly one of the prime dangers every airship faces. The practical way to tackle that is to have some good-sized vents which can keep the envelope from popping. Yep, even this is in, including a practical calculation example relating to the overall volume.

There is also an equation giving the required vent area on the next page, together with a description of exhaust trunks for safe Hydrogen venting from the gas bags.