
How to Migrate from Darktable to Lightroom Classic

  1. Copy your Darktable sidecar files from <basename>.<extension>.xmp to <basename>.xmp
    • So for example DSC_05183.nef.xmp should be copied to DSC_05183.xmp
  2. Import your photos into Lightroom
  3. Ratings, tags, color codes, and other EXIF information will be imported into Lightroom, but none of your development settings (exposure, tone curve, …) will be.
  4. You can now edit a photo in either Lightroom or Darktable without affecting the other application. However, further changes to ratings, tags, color codes, and other EXIF information will not be synchronized between the two applications; this data is only synchronized on the initial import.
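Step 1 can be scripted. Here is a minimal Python sketch, assuming all of your sidecars live under a single library directory (the path below is a placeholder, and the helper name is my own):

```python
import shutil
from pathlib import Path

def copy_sidecars(library: Path) -> int:
    """Copy <basename>.<ext>.xmp to <basename>.xmp; return number of copies."""
    copied = 0
    for sidecar in library.rglob("*.xmp"):
        stem = Path(sidecar.stem)           # e.g. "DSC_05183.nef"
        if stem.suffix:                     # only act on <basename>.<ext>.xmp
            target = sidecar.with_name(stem.stem + ".xmp")
            if not target.exists():         # never clobber an existing sidecar
                shutil.copy2(sidecar, target)
                copied += 1
    return copied

# Placeholder path: point this at your own library root before running.
copy_sidecars(Path("/path/to/photo/library"))
```

Because the copy skips existing targets, it is safe to re-run on a library that has already been partially converted.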

The Full Story


I have been a Darktable user for 10 years. I have ~35,000 raw photos totaling ~430 GB in my Darktable library. I developed ~5,000 of those photos and around 2,500 were good enough to export for print or to share with other people.

Why I left Darktable

I recently decided to switch to Lightroom for the following reasons:

  1. I upgraded to a Nikon Z8, which has an optional High Efficiency ★ Raw setting that uses the TicoRAW codec. It produces images comparable in quality to lossless compression at about 50% of the size; in practice my files come out around 60% of the size (~55MB -> ~32MB). That means a deeper buffer, more images per card, and less space used on my hard drive. Sadly, this format is not compatible with any open source project, at least not yet: TicoRAW is covered by patents, and it isn’t clear whether the owners would share the decoding SDK with an open source project.
  2. The AI features in Lightroom (Denoise, Deblur, Masking, …) are fantastic, as is its integration with external programs like Topaz, and I suspect these are features Darktable will not be able to reproduce anytime soon.
  3. The Darktable updates have made it too complicated for me to enjoy using the application. There are too many knobs to turn, the knobs change dramatically between major releases, and too much photo theory is required to understand how to use them effectively.

Of course, Lightroom is subscription based, which is the one drawback, and sadly it is a big one.

What can you import from Darktable? As noted above, only the metadata: ratings, keywords, and other EXIF information. You cannot import any of your development work.

This is acceptable to me because after I finish developing a photo, I export it and rarely go back to edit it again. Of the ~3,000 photos I have exported, I would guess I have gone back to re-edit only ~100. And often when I have gone back, I have started over from scratch, generally because Darktable or my developing skills had improved since the last edit. In the future, I can either re-export a photo from Darktable or start over in Lightroom: basically the same choices I always had.

Python Argparse: Group Sub-Parsers

Python’s argparse has become a staple module, in part due to its ease of use.

However, I recently came across an issue while using it on InsteonMQTT which makes extensive use of sub-parsers. Other than sorting, there is no mechanism to organize sub-parser objects to make them more readable.

This seems to be a known issue going back to at least 2009, with no indication that it will be solved. Luckily, Steven Bethard was nice enough to propose a patch for argparse that I was able to convert into a module extension very easily.

In short, the following is the module extension:

# Extend Argparse to Enable Sub-Parser Groups
# Based on this very old issue:
# Adds the method `add_parser_group()` to the sub-parser class.
# This adds a group heading to the sub-parser list, just like the
# `add_argument_group()` method.
# NOTE: As noted on the issue page, this probably won't work with [parents].
# see
# Pylint doesn't like us accessing protected items like this:
# pylint: disable=protected-access
import argparse

class _SubParsersAction(argparse._SubParsersAction):

    class _PseudoGroup(argparse.Action):

        def __init__(self, container, title):
            sup = super(_SubParsersAction._PseudoGroup, self)
            sup.__init__(option_strings=[], dest=title)
            self.container = container
            self._choices_actions = []

        def add_parser(self, name, **kwargs):
            # add the parser to the main Action, but move the pseudo action
            # into the group's own list
            parser = self.container.add_parser(name, **kwargs)
            choice_action = self.container._choices_actions.pop()
            self._choices_actions.append(choice_action)
            return parser

        def _get_subactions(self):
            return self._choices_actions

        def add_parser_group(self, title):
            # the formatter can handle recursive subgroups
            grp = _SubParsersAction._PseudoGroup(self, title)
            self._choices_actions.append(grp)
            return grp

    def add_parser_group(self, title):
        grp = _SubParsersAction._PseudoGroup(self, title)
        self._choices_actions.append(grp)
        return grp

class ArgumentParser(argparse.ArgumentParser):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.register('action', 'parsers', _SubParsersAction)

And the following is a simple test file:

import argparse_ext

parser = argparse_ext.ArgumentParser(prog='PROG')
cmd = parser.add_subparsers(dest='cmd')
grp1 = cmd.add_parser_group('group1:')
grp1.add_parser('a', help='a subcommand help', aliases=['a1','a2'])
grp1.add_parser('b', help='b subcommand help')
grp1.add_parser('c', help='c subcommand help')
grp2 = cmd.add_parser_group('group2:')
grp2.add_parser('d', help='d subcommand help')
grp2.add_parser('e', help='e subcommand help', aliases=['e1'])


Which produces this nice command line output:

...$ python
usage: PROG [-h] {a,a1,a2,b,c,d,e,e1} ...

positional arguments:
      a (a1, a2)        a subcommand help
      b                 b subcommand help
      c                 c subcommand help
      d                 d subcommand help
      e (e1)            e subcommand help

optional arguments:
  -h, --help            show this help message and exit

Note: There is a warning that this code may not work with the parents argument of ArgumentParser, but I can live with that.

Marlin Unified Bed Leveling Tips

I am by no means an expert on 3D Printing. If you are looking for someone who is, I highly recommend Michael at Teaching Tech.

However, I did learn a few things while trying to level the bed of my Creality CR10 V2. I chose to use the Unified Bed Leveling system in Marlin. You should read up on it on the Marlin site.

Here are the few things I learned that I didn’t see mentioned anywhere else.

Choose Measurement Points to Maximize Sensor Reach

For most setups, the bed leveling sensor cannot reach the entire bed because the sensor is offset from the print head. For the CR10 with a BLTouch, the sensor is offset about 46mm on the X axis. Since the print head’s minimum position on the X axis is 0, the sensor cannot reach any point less than 46mm on the X axis.

I want to maximize the portion of my bed that is measured, so I chose Marlin settings that generate a measurement grid with an X column as close to 46mm as possible without going below that limit.

The formula for determining the locations of the measurement points is:

  X = MESH_INSET + Xpt × ((X_BED_SIZE − 2 × MESH_INSET) / (GRID_MAX_POINTS_X − 1))
  Y = MESH_INSET + Ypt × ((Y_BED_SIZE − 2 × MESH_INSET) / (GRID_MAX_POINTS_Y − 1))

Where Xpt and Ypt are the indexes of the X and Y points, from 0 to (GRID_MAX_POINTS_[XY] − 1).


So in my case, the X position of the second column of measurement points is: 10 + 1 × ((310 − 2 × 10) / (9 − 1)) = 46.25. This is conveniently just slightly higher than my limit of 46mm, meaning I am measuring the bed as far left as I can and getting as much from my leveling sensor as possible.
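The arithmetic above can be sanity-checked with a few lines of Python. This is a sketch using the example values from this section (MESH_INSET = 10, X_BED_SIZE = 310, GRID_MAX_POINTS_X = 9); the function name is my own:

```python
def grid_position(pt, bed_size, inset, points):
    """Coordinate of measurement point index `pt` along one axis."""
    spacing = (bed_size - 2 * inset) / (points - 1)
    return inset + pt * spacing

# Second column (index 1) on the X axis:
print(grid_position(1, bed_size=310, inset=10, points=9))  # -> 46.25
```

Plugging in different grid counts lets you quickly hunt for a combination whose second column lands just past your sensor’s reach limit.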

Hope this helps.

* If you have altered less commonly used settings such as [XYZ]_MIN_POS, [XYZ]_MAX_POS, or MANUAL_[XYZ]_HOME_POS, you may need to adjust this formula.

Define Your Calibration Points to Match Measured Points

This one seems like a no-brainer, and I am a little surprised that Marlin doesn’t do this by default.

The UBL system contains an option that can transform a mesh based on a 3 point measurement using command G29 J. You can read about how this all works on the Marlin site.

By default, Marlin defines the 3 calibration points as (X Min, Y Min), (X Max, Y Min), and (X Midpoint, Y Max). However, this can lead to larger errors if one or more of the calibration points does not correspond to an existing measured point.

This error happens because the bed mesh outside of the measured points is an extrapolation, an educated guess. This extrapolation is not perfect, and the error in an extrapolated point will always be equal to or greater than the error at a measured point.

So, if any of your calibration points is an extrapolated point, then your error is greater than it needs to be.

This is an easy problem to solve: simply determine the three points on your measurement grid that create the largest triangle possible. Generally these are (XMin, YMin), (XMax, YMin), and (XMidpoint, YMax). You can calculate these points using the formula in the section above.

In my case these points are (10, 10), (290, 10), and (191.25, 290).

These can be defined in Configuration_adv.h as follows:

  #define PROBE_PT_1_X 10
  #define PROBE_PT_1_Y 10
  #define PROBE_PT_2_X 290
  #define PROBE_PT_2_Y 10
  #define PROBE_PT_3_X 191.25
  #define PROBE_PT_3_Y 290
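The point selection above can be sketched in Python. The bed size, inset, and grid counts below are illustrative assumptions (so the resulting coordinates differ slightly from my own values above); substitute your own Marlin settings:

```python
# Sketch: pick 3-point calibration coordinates that coincide with measured
# mesh points. All numeric settings here are illustrative assumptions.

def grid_positions(bed_size, inset, points):
    """Coordinates of the UBL measurement grid along one axis."""
    spacing = (bed_size - 2 * inset) / (points - 1)
    return [inset + i * spacing for i in range(points)]

xs = grid_positions(bed_size=310, inset=10, points=9)
ys = grid_positions(bed_size=300, inset=10, points=9)

# Largest triangle on the grid: the two bottom corners, plus the
# top-row point closest to the X midpoint.
mid_x = min(xs, key=lambda x: abs(x - (xs[0] + xs[-1]) / 2))
probe_pts = [(xs[0], ys[0]), (xs[-1], ys[0]), (mid_x, ys[-1])]
```

Every coordinate in `probe_pts` is guaranteed to sit on a measured grid point, so none of the three calibration probes lands in extrapolated territory.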

Do Not Edit the Calibration Points

UBL allows users to edit the measured points on their mesh, whether to enter values that cannot be measured because they are outside the reach of the leveling sensor, or to correct for errors in the measurement.

However, it is important not to alter the values of the 3 calibration points.

This is because, if you change these values, the next time you run a 3 point calibration the measured values will be close to the original but will no longer match the mesh. Marlin will attempt to tilt or translate the bed mesh to account for this discrepancy, which will make the mesh wrong.

So instead, check the bed at all 3 calibration points. If adjustments need to be made, change NOZZLE_TO_PROBE_OFFSET, or adjust it from the Marlin UI under “Configuration” -> “Probe Z Offset”. If the discrepancy is not identical across the three calibration points, you will have to select the best compromise value.

Again, hope this helps. Contact me if you have questions.

Mosquitto SSL/TLS Error SSL routines:ssl3_get_record:wrong version number

Up front, I will admit that I ran into this error because I did not read the documentation fully. However, in my defense, I feel like the error reporting could be clearer and the imprecise error message caused me to waste a bunch of time looking in the wrong place. Hopefully, this will prevent someone else from wasting their time as well.

Using an SSL/TLS Connection with Mosquitto MQTT

This is not a post about how to set up SSL/TLS on a Mosquitto broker; that has been well covered elsewhere. Personally, I followed the Mosquitto docs for instructions on generating the necessary certificates and keys. Since I am using the Home Assistant Mosquitto add-on, I followed its instructions for configuring the broker.

However, when I tried to connect using the mosquitto_sub command line tool, all I got was this:

 Client mosq-WzCVS53wMuaPbU8oNT sending CONNECT
 Client mosq-WzCVS53wMuaPbU8oNT sending CONNECT
 Client mosq-WzCVS53wMuaPbU8oNT sending CONNECT

When I checked the logs of the Mosquitto broker, all I saw was this error

Client connection from XXX.XXX.XXX.XXX failed: error:1408F10B:SSL routines:ssl3_get_record:wrong version number.

So I spent an hour trying different tls_versions and ciphers with no luck.

You Must Specify a cafile or capath to Enable Encryption

It is that easy: if you specify the correct --cafile or --capath in your mosquitto_sub command, things should work.
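For example, a working TLS subscription looks something like this (the host, port, topic, and CA path are all placeholders; substitute your own):

```shell
# --cafile is the piece that was missing; without it, TLS never starts.
mosquitto_sub -h broker.example.com -p 8883 \
  --cafile /path/to/ca.crt \
  -t 'test/topic' -v
```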

I would have expected a better error message from the broker or the client. I was also under the impression that using the --insecure flag would have allowed testing without the --cafile. I was wrong.

Of course, in hindsight the documentation clearly notes this requirement.

[mosquitto_sub man page excerpt]
Yup, that is pretty clear.

Insteon is Great; The User Experience is Awful

I own more than 75 Insteon products. I also own numerous other “smart” devices from other manufacturers.  In all I have video doorbells, smart smoke detectors, controlled door locks, weather stations, electronic blinds, smart light switches in every room, smart outlets, and a sophisticated automation hub to link all of this together.

Beyond that, I have been involved in open source programming for home automation for a number of years.  I have contributed extensively to Misterhouse, written some of my own Insteon software, and most recently started contributing to an Insteon-MQTT project for Home Assistant.  Over the years I have spent hours digging into and deciphering the Insteon messaging protocol and solved some very complex problems.

In short, I should be the target audience for anything related to home automation / the Internet of Things (“IoT”).  Given my work with and usage of the products, I also speak with some authority about issues related to Insteon.

With that out of the way, it is my opinion that historically Insteon products are good (not great) and that the Insteon user experience is awful.

This opinion is formed from years of irritation with Insteon and Smart Labs, but my recent experience with the Insteon Hub is the impetus for this entry.

Insteon Hardware

Insteon hardware has had its issues; many devices fail prematurely, largely due to electronics problems.

  • 2008 – Cheap tactile switches were used in Insteon products, causing what seemed like all of the devices to stop working.  As the sad owner of many of these devices, I found this very irritating.
  • Constant – Bad capacitors in PLMs: this has been a problem at least as far back as 2014 and appears to continue today.  I recently had to have a PLM replaced due to this issue.
  • Currently – Possibly more bad capacitors in Hubs.  I honestly don’t know why power supply capacitors continue to be an issue; again, I have suffered through this failure too.

That said, in general I find the hardware to be largely acceptable, as long as it doesn’t break.  The prices are reasonable, the appearance is generally acceptable, and the mechanical components stand up to a fair amount of use.

If the electronics were more robust I would fully endorse the hardware.

Insteon Software

Sadly things go dramatically downhill on the software side.  I won’t get into the poor choices buried deep inside the Insteon protocol (ahem, not a stateless protocol), but needless to say there are many.  I also won’t go into detail about the poor and inaccurate documentation.  While these things are bad and should have never happened, these are only problems that developers see.

However, Smart Labs and Insteon have continually released poor consumer-facing software.  For example, HouseLinc, a Windows automation app offered by Insteon, looks like something written for Windows 95 (indeed, it supports XP through Win 7).  Even ignoring that, it has extremely limited functionality and offers no web-based or mobile phone control.

The current Insteon Hub is no better.  In the 5 years since its release, Insteon has released two separate mobile applications for it: App One and App Two.  Neither app is very user friendly.  As I noted above, some of the Hubs died prematurely.  When this happened, there was no way for a user to migrate to a new hub, or even to sign into their account to add a new Hub!  The only options were to use a different email address to create a new account, or to call Insteon and ask them to do some magic.

Insteon is Hostile to Developers, or at least Apathetic

Years ago, Insteon had a paywall for developers: pay a one-time fee of something like $200 and they would give you access to the developer documents for their devices.  The sad thing was that these documents were poorly written, often failed to disclose all of the features, were rarely updated, and contained numerous errors.

Luckily, by using various tricks, many of us have been able to “sniff” the Insteon traffic and have reverse engineered support for features through a lot of hard work.

Today, Insteon claims to accept developer applications, but I have never heard of anyone hearing back from them.

I suspect that the makers of the ISY products have developer access and possibly a contact at Insteon that will answer their questions. Of course, ISY sells its products through Insteon and SmartHome. I suppose it is possible that Insteon is reluctant to help any other developers as a favor to ISY.

I am Still an Insteon Advocate

Given all of my complaints, you should rightly expect that I would have abandoned Insteon years ago.

But I still believe that fundamentally, their products are some of the best home automation products you can buy.


  1. The products do not require or use any cloud based service.*
    • This means there are no outages.
    • Insteon devices will continue to work even if the company goes out of business. This isn’t true of many competing, cloud-dependent products.
  2. The products use both radio frequency and power line communications.
    • Given a decent sized network, they tend to suffer from fewer communication issues than competing products.
  3. They do not use wifi!
    • The joke has long been that the “S” in IoT stands for security.
    • Keeping my devices off of the network makes them a much less tempting target for hacking.
  4. The products generally look nice and work well.
    • Honestly, this should be your main concern when buying any home automation products.
  5. The protocol has remained consistent for 20+ years.
    • Devices purchased 20 years ago are still compatible with devices purchased today.

* The Insteon Hub does have a cloud component, and can suffer at least partial outages in the app. But my understanding is that even when the cloud is down, local control using the hub still continues to work.

What Insteon Needs to do Going Forward

  1. Embrace Developers!
    • The Home Automation community is dominated by Makers and Hobbyists.
    • Sure, some users only want their Google Home Device to be able to turn on their lights. But in my experience, once a user does this, the majority of them start looking for more complex home automation systems.
  2. Stop trying to be Google.
    • By this I mean, embrace your local control. Forget or at least minimize the cloud.
    • You can sell this as a privacy and security feature, because it is!
  3. Fix your software!
    • The user experience is critical.
    • You have to get the basics to work smoothly first.
    • If that means you have to temporarily abandon some cloud features, then do it.
    • If you do step 1, the community will make a great user experience for you.

These are my initial thoughts. I will try to revisit this post as more thoughts come to me.