Centralized XMP database?
I have my photo library backed up to several external hard drives (independent copies). When I edit a file in darktable, an XMP sidecar file is written. However, since I use different hard drives at different times, the drives end up with different edits. I would like some way to sync edits between them, so that if I edit a file on hard drive A and later load the identical file from hard drive C, I see those edits. One option would be to write the edits to XMP sidecars and then copy them to each hard drive; the main shortcoming of that approach is that it doesn't play very well with local copies.
What I would like to figure out is a way to coordinate the edits across the different hard drives. One idea that came to mind is a master XMP repository: a sidecar for each edited raw would live in a directory, named with a hash of the raw file it belongs to. When loading a file in darktable, hash the raw file, look up the matching sidecar, and apply it (if it exists) to the raw file imported into darktable.
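For illustration, here is a rough sketch of what I mean by the hash-keyed lookup. It runs outside darktable (Python rather than a darktable Lua script), and the repository path, raw extension, and sidecar naming convention are just assumptions for the example:

```python
#!/usr/bin/env python3
"""Rough sketch of a hash-keyed master XMP repository.

Assumptions (not part of darktable itself): the repository is a flat
directory of sidecars named <sha256-of-raw>.xmp, and darktable is set
to read an existing <raw>.xmp sidecar when it imports the raw.
"""
import hashlib
import shutil
from pathlib import Path

REPO = Path("~/xmp-repo").expanduser()  # hypothetical master sidecar repository


def raw_hash(raw: Path) -> str:
    """Content hash of the raw file; identical copies on different drives hash the same."""
    h = hashlib.sha256()
    with raw.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def pull_sidecar(raw: Path) -> bool:
    """Copy the repository sidecar (if any) next to the raw as <raw>.xmp for darktable to pick up."""
    src = REPO / (raw_hash(raw) + ".xmp")
    if not src.exists():
        return False
    shutil.copy2(src, raw.parent / (raw.name + ".xmp"))
    return True


def push_sidecar(raw: Path) -> bool:
    """After editing on one drive, store the local sidecar back into the repository."""
    local = raw.parent / (raw.name + ".xmp")
    if not local.exists():
        return False
    REPO.mkdir(parents=True, exist_ok=True)
    shutil.copy2(local, REPO / (raw_hash(raw) + ".xmp"))
    return True


if __name__ == "__main__":
    # Example: pull sidecars for every raw under one drive's photo root (hypothetical path/extension).
    for raw in Path("/media/driveA/photos").rglob("*.NEF"):
        pull_sidecar(raw)
```

Because the key is a hash of the raw file's contents, identical copies on drives A and C resolve to the same sidecar. The sketch ignores darktable's duplicate edits (the `_01` style sidecars), which would need a richer naming scheme.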
Looking into how to load XMP sidecars into darktable, I see that darktable lets me load an XMP sidecar file and apply it to selected images. That takes a single XMP file and applies it to every selected image, so doing it for more than a few raw files would be quite tedious. I thought I might solve the problem with Lua scripting, but it seems the Lua API has no mechanism to load an XMP and apply it to a file in darktable.
Does the Lua API have the ability to load and apply an XMP to a raw file? If so, I could write a Lua script that maps each file to its sidecar and loads the appropriate one.