• HTTPS in DaCHS

    Browser windows with and without HTTPS.

    Another little aspect of HTTPS support in DaCHS: In the web interface, the webSAMP button must disappear in pages served through HTTPS: it simply wouldn't work.

    (Warning: No astronomy-relevant content at all this time).

    I can't say I'm a big fan of the mighty push towards HTTPS that's going on right now – as I'm arguing in the updated operator's guide, it doesn't do people's privacy a lot of good (compared to, say, pushing for browsers not to execute Javascript by default, or having DNSSEC widely deployed), but it is a fairly substantial operational liability. With HTTPS, operators have to deal with cryptographic material, regularly update their certificates, restart their services in time, and assemble the whole thing correctly (don't get me started about proxying, SNI, and all those horrors). Users, on the other hand, have to keep their CA certificates in order, in particular when they do programmatic VO access, where the browser vendors, their employers, and who knows who else don't do it for them. Pop quiz: How would you install a new CA certificate on your box? And will your default browser see it?

    But on the other hand, there are some scenarios in which HTTPS makes sense, and I can remotely fantasise that some of those may even be relevant to the VO. And people have been asking for HTTPS in DaCHS a number of times, at times even because their administrations urged them to switch. So, here it is, hopefully. Turning it on is reasonably easy when you use Letsencrypt (which in particular entails having ports 80 and 443 available); the section on Letsencrypt in the operator's guide tells you what to do. In particular, don't forget the cron job, because without it, things would break after three months (when the initial certificate expires).

    Things get difficult after that. For one, if your box is known under several names (our data center, for instance, can be reached as any of dc.g-vo.org, vo.uni-hd.de, and dc.zah.uni-heidelberg.de; this of course also includes things like www.example.org and example.org), you'll now have to tell DaCHS about it in the new [web]alternateHostnames configuration item; for instance, we have:

    [web]
    serverURL: http://dc.zah.uni-heidelberg.de
    alternateHostnames: dc.g-vo.org, vo.uni-hd.de
    

    in our /etc/gavo.rc.

    And then the Registry has to know you have HTTPS. There's actually no convention for that in the VO yet. But since I'd really like to have at least fallback interfaces with plain HTTP, we'll have to come up with something. For now, my plan is to declare the alternative protocol (i.e., HTTPS for sites that have an HTTP serverURL and vice versa) using the brand-new VOResource 1.1 mirrorURLs (in RegTAP 1.1, they are in the mirror_url column of rr.interface). To make DaCHS declare the alternate URLs, set [web]registerAlternative to True.
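
    Hypothetical usage sketch: once a RegTAP service carries that 1.1 mirror_url column (the one at dc.g-vo.org mentioned below already does), a few lines of pyvo will show you who declares alternate-protocol endpoints; this assumes nothing beyond what the paragraph above says about rr.interface:

    import pyvo

    # list interfaces that declare an HTTPS mirror in RegTAP 1.1
    regtap = pyvo.dal.TAPService("http://dc.g-vo.org/tap")
    for row in regtap.run_sync("""
            SELECT ivoid, access_url, mirror_url
            FROM rr.interface
            WHERE mirror_url LIKE 'https:%'
            """):
        print(row["access_url"], "->", row["mirror_url"])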

    Another change I've introduced for HTTPS is that the default HTML template for the form renderer (i.e., the one people use who come with a browser) now suppresses the SAMP button if the request came in through HTTPS; that's because WebSAMP doesn't work with HTTPS and probably never will – at least I can't see a way to make it happen without totally wrecking what security guarantees HTTPS gives.

    All this doesn't yet cater for the case where you use a reverse proxy to terminate HTTPS. If you are in that situation, please talk to me so we can figure out a sane way for you to explain to DaCHS what to tell the Registry.

    Anyway, if you want to try things out, just switch to the beta repository and upgrade. Feedback is highly welcome.

    Oh, and if you're a client developer: Our data center is now reachable through HTTPS (at https://dc.g-vo.org), and we have already pushed the records with mirrorURLs declaring HTTPS support to the RegTAP service at dc.g-vo.org (the others will have to wait a bit longer, as we haven't re-published our registry records yet; it's all experimental, after all).

  • Deredden using TAP

    An animated color-magnitude diagram

    Raw and dereddened CMD for a region in Cygnus.

    Today I published a nice new set of tables on our TAP service: The Bayestar17 3D dust map derived from Pan-STARRS 1 by Greg Green et al. I mention in passing that this was made particularly enjoyable because Greg and friends put an explicit license on their data (in this case, CC-BY-SA).

    This dust map is probably a fascinating resource by itself, but the really nifty thing is that you can use it to correct all kinds of photometric data for extinction – at least to some extent. On the Bayestar web page, the authors give some examples for usage – and with our new service, you can use TAP as well to correct photometry for extinction.

    To see how, first have a look at the table metadata for the prdust.map_union table; this is what casual users probably should look at – more specifically, at the coverage, best_fit, and grdiagnostic columns.

    coverage here is an interval of 10-healpixes. It has to be an interval because the original data comes at wildly different healpix levels; depending on the density of stars, sometimes it takes the area of a 6-healpix (about a square degree) to get enough signal, whereas in the galactic plane a 10-healpix (a thousandth of a square degree) already has enough stars. To make the whole thing conveniently queriable without exploding a 6-healpix row into 256 identical rows, larger healpixes translate into intervals of 10-healpixes. Don't panic, though: I'll show how to conveniently query this below.
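
    If you want to convince yourself of that interval logic: in the NESTED healpix scheme used here, each additional order splits a pixel into four, so going from order 6 to order 10 is a factor of 4^4 = 256. A quick Python sketch (the function is made up for illustration):

    # an order-6 healpix h covers exactly this interval of order-10 pixels
    def healpix_interval(h, from_order=6, to_order=10):
        scale = 4**(to_order - from_order)   # 256 for orders 6 -> 10
        return h*scale, (h + 1)*scale - 1

    print(healpix_interval(1000))   # -> (256000, 256255)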

    best_fit and grdiagnostic are arrays (remember the light curves in Gaia DR2?), with bins of 0.5 in distance modulus (which is, in case you feel a bit uncertain as to the algebraic signs, 5 log10(dist)-5 for a distance in parsec), starting with a distance modulus of 4 and ending with 19. This means that for a distance modulus of 4.2 you should check the array index 0, whereas 4.3 already would be covered by array index 1. With this, best_fit[ind] gives E(B-V) = (B-V) - (B-V)₀ in the direction of coverage in a distance modulus bin of ind/2+4. For each best_fit[ind], grdiagnostic[ind] contains a quality measure for that value. You probably shouldn't touch the E(B-V) if that measure is larger than 1.2.
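
    Spelled out as (hypothetical) Python, the mapping from distance modulus to array index is just this; note that ADQL arrays are 1-based, which is why the query further down adds 1 to the rounded value:

    # bins of 0.5 in distance modulus, the first one centred on 4
    def dm_to_index(dist_mod):
        return int(round((dist_mod - 4)*2))

    for dm in (4.2, 4.3, 19.0):
        print(dm, dm_to_index(dm))   # 0, 1, and 30, respectively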

    So, how does one use this?

    To try things, let's pull some Gaia data with distances; in order to have interesting extinctions, I'm using a patch in Cygnus (RA 288.5, Dec 2.3). If you live in the northern hemisphere and step out tonight, you could see dust clouds there with the naked eye (provided electricity fails all around, that is). Full disclosure: I tried the Coal Sack first, but after checking the coverage of the dataset – which essentially is the sky north of -30 degrees – I noticed that wouldn't fly. But stories like these are one reason why I'm making such a fuss about having standard STC coverage representations.

    We want distances, and to dodge all the intricacies involved when naively turning parallaxes into distances, discussed at length in a paper by Xavier Luri et al (and elsewhere), I'm using precomputed distances from Bailer-Jones et al. (2018AJ....156...58B); you'll find them on the “ARI Gaia” service. In TOPCAT's TAP dialog, simply search for “Gaia” – that'll turn up the GAVO DC TAP service, too, which we'll need in a second.

    The pre-computed distances are in the gaiadr2_complements.geometric_distance table, which can be joined to the main Gaia object catalog using the source_id column. So, here's a query to produce a little photometric catalog around our spot in Cygnus (we're discarding objects with excessive parallax errors while we're at it):

    SELECT
      r_est, 5*log10(r_est)-5 AS dist_mod,
      phot_g_mean_mag, phot_bp_mean_mag, phot_rp_mean_mag,
      ra, dec
    FROM gaiadr2.gaia_source
    JOIN gaiadr2_complements.geometric_distance
      USING (source_id)
    WHERE
      parallax_over_error > 1
      AND 1=CONTAINS(POINT('ICRS', ra, dec), CIRCLE('ICRS', 288.5, 2.3, 0.5))
    

    The color-magnitude diagram resulting from this is the red point cloud in the animated GIF at the top. To reproduce it, just plot phot_bp_mean_mag-phot_rp_mean_mag against phot_g_mean_mag-dist_mod (and invert the y axis).
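
    If you'd rather script that plot than use TOPCAT, here is a minimal matplotlib sketch; the file name cyg.vot is made up and stands for the result of the query above saved as a VOTable:

    from astropy.table import Table
    from matplotlib import pyplot as plt

    stars = Table.read("cyg.vot", format="votable")
    plt.scatter(stars["phot_bp_mean_mag"] - stars["phot_rp_mean_mag"],
        stars["phot_g_mean_mag"] - stars["dist_mod"], s=2, color="red")
    plt.gca().invert_yaxis()   # bright end up, as is customary for CMDs
    plt.xlabel("BP - RP")
    plt.ylabel("abs. G mag")
    plt.show()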

    De-reddening this involves a few minor technicalities. The most important one is how to match against the odd intervals of healpixes in the prdust.map_union table. A secondary one is that we have only pulled equatorial coordinates, and the healpixes in prdust are in galactic coordinates.

    Computing the healpix requires the ivo_healpix_index ADQL user defined function (UDF), which you may have met before. And since we have to go from ICRS to Galactic, it also requires a fairly new UDF I've recently defined to finally get the discussion on having a “standard library” of astrometric functions in ADQL going: gavo_transform. Here's how to get a 10-healpix as required for map_union from ra and dec:

    CAST(ivo_healpix_index(10,
      gavo_transform('ICRS', 'GALACTIC', POINT(ra, dec))) AS INTEGER)
    

    The CAST call is a pure technicality – ivo_healpix_index returns a 64-bit integer, which I can't use in my interval logic.

    The comparison against the intervals you could do yourself, but as argued in the Registry-STC article, this is one of those trivial things that are easy to get wrong. So, let's use the ivo_interval_has UDF; it goes into the join condition to properly match prdust healpixes to catalog positions. Then our total query – which, I hope, should be reasonably easy to adapt to similar problems – is:

    WITH sources AS (
      SELECT phot_g_mean_mag,
        phot_bp_mean_mag,
        phot_rp_mean_mag,
        dist_mod,
        CAST(ivo_healpix_index(10,
          gavo_transform('ICRS', 'GALACTIC', POINT(ra, dec))) AS INTEGER) AS hpx,
        ROUND((dist_mod-4)*2)+1 AS dist_mod_bin
      FROM TAP_UPLOAD.T1)
    
    SELECT
      phot_bp_mean_mag-phot_rp_mean_mag-dust.best_fit[dist_mod_bin] AS color,
      phot_g_mean_mag-dist_mod
        -dust.best_fit[dist_mod_bin]*3.384 AS abs_mag,
      dust.grdiagnostic[dist_mod_bin] as qual
    FROM sources
    JOIN prdust.map_union AS dust
    ON (1=ivo_interval_has(hpx, coverage))
    

    (If you're following along: you have to switch to the GAVO DC TAP to run this, and you will probably have to change the index after TAP_UPLOAD – TOPCAT numbers its uploaded tables T1, T2, and so forth.)
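
    Incidentally, you don't need TOPCAT for that upload dance. In pyvo, the flow looks about like this – a sketch in which the ARI Gaia TAP URL is an assumption, and CYGNUS_QUERY and DUST_QUERY stand for the two ADQL queries given above:

    import pyvo

    # photometry from ARI Gaia, dereddening at the GAVO DC
    ari = pyvo.dal.TAPService("http://gaia.ari.uni-heidelberg.de/tap")
    gavodc = pyvo.dal.TAPService("http://dc.g-vo.org/tap")

    photometry = ari.run_sync(CYGNUS_QUERY)
    # upload under the name t1; ADQL is case-insensitive, so the query's
    # TAP_UPLOAD.T1 resolves to it
    dereddened = gavodc.run_sync(DUST_QUERY,
        uploads={"t1": photometry.to_table()})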

    Ok, in the photometry department there's a bit of cheating going on here – I'm correcting Gaia BP-RP with E(B-V), and I'm using the factor for Johnson V to estimate the extinction in Gaia G (if you're curious where that comes from: the footnote on best_fit and the MC extinction service docs should get you started), so this is far from physically correct. But, as you can see from the green cloud in the plot above, it already helps a bit. And if you find out better factors, by all means let me know so I can add an update... right here:

    Update (2018-09-11): The original data creator, Gregory Green points out that the thing with having a better factor for Gaia G isn't that simple, because, as he says “Gaia G is very broad, [and] the extinction coefficients are much more dependent on stellar type, and extinction is also more nonlinear with dust column (extinction is only linear with dust column and independent of stellar type for an infinitely narrow passband)”. So – when de-reddening, prefer narrow passbands. But whether narrow or wide: TAP helps you.

  • DaCHS 1.2 is out

    Today, I have released DaCHS 1.2 – somewhat belatedly perhaps, because I managed to break my collarbone, but here it is. If you've been following this blog, you already know about the headline news: the dachs start command, ADQL 2.1, and early support for STC in the registry.

    If you're not yet on DaCHS 1.1, please have a quick look at the corresponding release article. While the upgrade itself should work fine in one go even from older versions, the release notes of course apply cumulatively, and you may still have to do the dist-upgrade to 1.1.

    As usual, the generic upgrading instructions are available in the operator's guide (in short: do dachs val ALL; apt update; apt upgrade). Since I still encountered DaCHS installations with wrong sources.lists as late as last April: note again that our repository names changed in August 2016 – we now have release and beta rather than Debian release names. So, make sure you have something like:

    deb http://vo.ari.uni-heidelberg.de/debian release main
    

    in your /etc/apt/sources.list, not something containing “stable” or the like.

    That said, here are the commented changes for 1.2:

    • New dachs start command to produce structured templates for certain service types. See Horror Vacui Begone on this blog for the full story.
    • Support for ADQL 2.1 (actually, its current proposed recommendation), including almost all of the optional parts (see Speak out on ADQL 2.1 on this blog). While not strictly necessary, it's a good idea to run dachs imp //adql after the upgrade; this will give you some nice new UDFs, in particular gavo_histogram.
    • New coverage element (with updaters) to build and declare the space-time-spectral coverage of a resource. It would be great if you could add coverage elements to your resources where it makes sense and re-publish them. This blog post tells you how to do it (you'll have to scroll down a bit).
    • There is now odbcGrammar to feed an import from another database. Essentially, you put an ODBC connection string into a file, point your sources element there, and you'll get one rawdict per tuple in a foreign database table. This might be a nice way to publish moderate-size non-postgres tables via DaCHS.
    • You can now declare associated datalink services for tables using the _associatedDatalinkSvc meta item. In particular, if you had a datalink property on SSAP services, you should migrate at some point. One advantage: Users will get the datalinks even when querying the tables through TAP. See “Integrating Datalink Services” in the reference documentation for the full story.
    • We now force matplotlib to read its configuration from /var/gavo/etc/matplotlibrc; to get a default, just run dachs init again. This is mainly to avoid uncontrolled imports of matplotlibrcs when DaCHS is run under a uid that does other things now and then.
    • DaCHS now supports VOSI 1.1; in particular, DaCHS now understands the detail hints and has per-table endpoints, so clients like TOPCAT could avoid reading the full table metadata in one go. Realistically, at least TOPCAT doesn't yet, so this is perhaps less cool than it may sound.
    • The indices generated by the ssa mixins are now a bit more sensible considering typical query modes. You probably want to run dachs imp -I on the RDs for your ssap data collections when convenient. If you have larger spectral collections, chances are many queries will be a lot faster.
    • ssapCore no longer wantonly adds preview columns. If you have previews with spectra, you probably want to add <property name="previews">auto</property> to your ssapCores. If you don't, the preview column will not be added to SSA responses (right now, few clients evaluate it, but that will hopefully change in the future).
    • You can now add a statisticsTarget property to columns; you will want this on largish tables with non-uniformly distributed values to aid the query planner; something like <property key="statisticsTarget">10000</property> within the corresponding column element can go a long way to improve query planning (you need to run dachs imp -m on the RD after the change).
    • DaCHS's log now by default does not contain IP addresses, user agents, and referrers any more, which should mostly keep you from processing personal data and thus from having to muck around with the EU GDPR. To get back the previous behaviour, set [web]logFormat in /etc/gavo.rc to combined.
    • I fixed some utypes for obscore 1.1. These utypes are useless, so there's nothing you have to do. But then stilts taplint complains about them, and so you may want to run dachs imp -m //obscore.
    • As usual, there are many minor bug fixes and improvements (e.g., memmapping FITSes for cutout again, delimited table references in ADQL, new-style tutorial resource records, correct obscore standardId, much saner nD-arrays in VOTables).

    Well – enjoy the release, and if something goes wrong with it, be sure to let us know, preferably on the DaCHS-support mailing list.

  • Gaia DR2: A light version and light curves

    screenshot: topcat and matplotlib

    Topcat is doing datalink, and our little python script has plotted a two-color time series of RMC 18 (or so I think).

    If anyone ever writes a history of the VO, the second data release of Gaia on April 25, 2018 will probably mark its coming-of-age – at least if you, like me, consider the Registry the central element of the VO. It was spectacular to view the spike of tens of Registry queries per second right around 12:00 CEST, the moment the various TAP services handing out the data made it public (with great aplomb, of course).

    In GAVO's Data Center we also carry Gaia DR2 data. Our host institute, the Zentrum für Astronomie in Heidelberg, also has a dedicated Gaia server. This relieves us of having to be a true mirror of the upstream data release. And since the source catalog has lots and lots of columns that most users will not be using most of the time, we figured a “light” version of the source catalog might fill an interesting ecological niche: behold gaia.dr2light on the GAVO DC TAP service, containing essentially just the basic astrometric parameters and the diagonal of the covariance matrix.

    That has two advantages: result sets with SELECT * are a lot less unwieldy (but: just don't do this with Gaia DR2), and, more importantly, a lighter table puts less load on the server. You see, conventional databases read entire rows when processing data, and having just 30% of the columns means we will be about 3 times faster on I/O-bound tasks (assuming the same hardware, of course). Hence, and contrary to several other DR2-carrying sites, you can run full sequential scans of gaia.dr2light on our TAP service without timing out. If, on the other hand, you need to do debugging or full-covariance-matrix error calculations: the full DR2 gaia_source table is available in many places in the VO. Just use the Registry.
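
    In case you've not done such a Registry lookup lately: with pyvo, it is essentially a one-liner (a sketch; the keyword match is fuzzy, so eyeball what comes back):

    import pyvo

    # TAP-queriable resources matching "gaia dr2" in the Registry
    for res in pyvo.regsearch(keywords=["gaia dr2"], servicetype="tap"):
        print(res.res_title, res.ivoid)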

    Photometry via TAP

    A piece of Gaia DR2 that's not available in this form anywhere else is the lightcurves; that's per-transit photometry in the G, BP, and RP band for about 0.5 million objects that the reduction system classified as variable. ESAC publishes these through datalink from within their gaia_source table, and what you get back is a VOTable that has the photometry in the three bands interleaved.

    I figured it might be useful to have that data in a TAP-queriable database table; and that's how gaia.dr2epochflux came into being. In there, you have three triples of arrays: the epochs (g_transit_time, bp_obs_time, and rp_obs_time), the fluxes (g_transit_flux, bp_flux, and rp_flux), and their errors (you can probably guess their names). So, to retrieve G lightcurves where available together with a gaia_source query of your liking, you could write something like:

    SELECT g.*, g_transit_time, g_transit_flux
    FROM gaia.dr2light AS g
    LEFT OUTER JOIN gaia.dr2epochflux
    USING (source_id)
    WHERE ...whatever...
    

    – the LEFT OUTER JOIN arranges things such that the g_transit_time and g_transit_flux columns simply are NULL when there are no lightcurves; with a normal (“inner”) join, rows without lightcurves would not be returned in such a query.

    To give you an idea of what you can do with this, suppose you would like to discover new variable blue supergiants in the Gaia data (who knows – you might discover the precursor of the next nearby supernova!). You could start with establishing color cuts and train your favourite machine learning device on light curves of variable blue supergiants. Here's how to get (and, for simplicity, plot) time series of stars classified as blue supergiants by Simbad for which Gaia DR2 lightcurves are available, using pyvo and a little async trick:

    from matplotlib import pyplot as plt
    import pyvo
    
    def main():
      simbad = pyvo.dal.TAPService(
        "http://simbad.u-strasbg.fr:80/simbad/sim-tap")
      gavodc = pyvo.dal.TAPService("http://dc.g-vo.org/tap")
    
      # Get blue supergiants from Simbad
      simjob = simbad.submit_job("""
        select main_id, ra, dec
        from basic
        where otype='BlueSG*'""")
      simjob.run()
    
      # Get lightcurves from Gaia
      try:
        simjob.wait()
        time_series = gavodc.run_sync("""
          SELECT b.*, bp_obs_time, bp_flux, rp_obs_time, rp_flux
          FROM (SELECT
              main_id, source_id, g.ra, g.dec
            FROM gaia.dr2light AS g
            JOIN TAP_UPLOAD.t1 AS tc
            ON (0.002>DISTANCE(tc.ra, tc.dec, g.ra, g.dec))
            OFFSET 0) AS b
          JOIN gaia.dr2epochflux
          USING (source_id)
          """,
          uploads={"t1": simjob.result_uri})
      finally:
        simjob.delete()
    
      # Now plot one after the other
      for row in time_series.table:
        plt.plot(row["bp_obs_time"], row["bp_flux"])
        plt.plot(row["rp_obs_time"], row["rp_flux"])
        plt.show(block=False)
        input("{}; press return for next...".format(row["main_id"]))
        plt.cla()
    
    if __name__=="__main__":
      main()
    

    If you bother to read the code, you'll notice that we transfer the Simbad result directly to the GAVO data center without first downloading it. That's fairly boring in this case, where the table is small. But if you have a narrow pipe for one reason or another and some 10⁵ rows, passing around async result URLs is a useful trick.

    In this particular case the whole thing returns just four stars, so perhaps that's not a terribly useful target for your learning machine. But this piece of code should get you started to where there's more data.

    You should read the column descriptions and footnotes in the query results (or from the reference URL) – this tells you how to interpret the times and how to make magnitudes from the fluxes if you must. You probably can't hear it any more, but just in case: If you can, process fluxes rather than magnitudes from Gaia, because the errors are painful to interpret in magnitudes when the fluxes are small (try it!).
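
    To see the problem in numbers, try something like the following sketch; the zero point is made up (the real one is in the table metadata), but the effect does not depend on it: a symmetric flux error becomes painfully asymmetric in magnitude space when the flux is small:

    import numpy as np

    ZP = 25.7                      # made-up zero point
    flux, flux_err = 20.0, 10.0    # a faint, noisy transit

    mag = ZP - 2.5*np.log10(flux)
    mag_faint = ZP - 2.5*np.log10(flux - flux_err)    # one sigma down
    mag_bright = ZP - 2.5*np.log10(flux + flux_err)   # one sigma up
    # the two "one sigma" magnitude offsets differ by almost a factor of 2:
    print(round(mag_faint - mag, 2), round(mag - mag_bright, 2))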

    Note how the photometry data is stored in arrays in the database, and that VOTables can just transport these. The bad news is that support for manipulating arrays in ADQL is pretty much zero at this point; this means that, when you have trained your ML device, you'll probably have to still download lots and lots of light curves rather than write some elegant ADQL to do the filtering server-side. However, I'd be highly interested to work out how some tastefully chosen user defined functions might enable offloading at least a good deal of that analysis to the database. So – if you know what you'd like to do, by all means let me know. Perhaps there's something I can do for you.

    Incidentally, I'll talk a bit more about ADQL arrays in a blog post coming up in a few weeks (I think). Don't miss it – subscribe to our feed.

    SSAP and Obscore

    If you're fed up with bleeding-edge tech, the light curves are also available through good old SSAP and Obscore. To use that, just get Splat (or another SSA client, preferably with a bit of time series support). Look for a Gaia DR2 time series service (you may have to update the service list before you find it), enter (in keeping with our LBV theme) S Dor as position and hit “Lookup” followed by “Send Query”. Just click on any result to view the time series – and then apply Splat's rich tool set to it.

    Update (2018-05-08): Clusters

    Here's another quick application – how about looking for variable stars in clusters? This piece of ADQL should get you started:

    SELECT TOP 100
      source_id, ra, dec, parallax, g.pmra, g.pmdec,
      m.name, m.pmra AS c_pmra, m.pmde AS c_pmde,
      m.e_pm AS c_e_pm,
      1/dist AS cluster_parallax
    FROM
      gaia.dr2epochflux
      JOIN gaia.dr2light AS g USING (source_id)
      JOIN mwsc.main AS m
      ON (1=CONTAINS(
        POINT(g.ra, g.dec),
        CIRCLE(m.raj2000, m.dej2000, rcluster)))
    WHERE IN_UNIT(pmdec, 'deg/yr') BETWEEN m.pmde-m.e_pm*3 AND m.pmde+m.e_pm*3
    

    – yes, you'll want to constrain pmra, too, and the distance, and properly deal with the errors and all that. But you get simple lightcurves for free. Just add them in the SELECT clause!

  • Horror vacui begone

    browser and editor

    Mikhail's qrdcreator in a browser and an editor with a dachs start-produced template.

    One of the major usability issues our publishing suite DaCHS has for operators (i.e., people who want to publish data) is the “horror vacui”: How do I start a Resource Descriptor (RD – the file DaCHS interprets to build services)?

    I used to recommend starting by having a look at the RDs of our existing services and picking whatever best matches your publication project. But finding a matching service and figuring out what is generic, what's a special property of the concrete data collection, and what's a hack that should not be reproduced isn't straightforward at all, not to mention the fact that some of those RDs have been in maintenance mode for almost 10 years and hence may show deprecated practices.

    Then came the VESPA implementation workshop last year, during which Mikhail Minin showed me a piece of javascript and HTML (source on github) he has written to overcome the empty editor window. Essentially, Mikhail has built a fairly comprehensive form interface in a web browser that asks people the right questions to eventually write an RD for EPN-TAP (i.e., solar system) resources.

    I had planned to generalise Mikhail's approach to several types of resources supported by DaCHS, ideally inferring the questions to ask from the built-in documentation of mixins and applys. But during the last year, whenever I felt it would be a good time to tackle that generalisation, I quickly gave up again. It was mostly rather trivial stuff such as how to tell apart repeatable metadata (waveband, say) and non-repeatable metadata (instrument, say). But it was bad enough that I quickly found something else to do each time I got started.

    Eventually, I gave up on a menu interface altogether – making it flexible and generatable at the same time seemed a fairly complex problem. But that doesn't mean I forgot about overcoming the horror vacui thing. So, when forms aren't flexible enough for data entry, where do you turn? Right! A text editor.

    Enter dachs start. That's a new DaCHS subcommand that gets you started with your RD. For one, you can list the templates available:

    $ dachs start list
    siap -- Image collections via SIAP1 and TAP
    ssap+datalink -- Spectra via SSAP and TAP, going through datalink
    epntap -- Solar system data via EPN-TAP 2.0
    scs -- Catalogs via SCS and TAP
    

    More templates are planned; siap+datalink, for instance, would cover some frequent use cases. Feel free to mail in requests.

    Once you find a suitable template, create your future resource directory, enter it and run dachs start again, this time passing the name of the template you want:

    $ mkdir ex_data
    $ cd ex_data
    $ dachs start scs
    $ head -16 q.rd | tail -9
    <resource schema="ex_data">
      <meta name="creationDate">2018-04-13T12:34:31Z</meta>
    
      <meta name="title">%title -- not more than a line%</meta>
      <meta name="description">
        %this should be a paragraph or two (take care to mention salient terms)%
      </meta>
      <!-- Take keywords from
        http://astrothesaurus.org/thesaurus/hierarchical-browse/
    

    dachs start uses the directory name as the new schema name and then writes a file q.rd (which is the canonical name for the “main” RD in a resource). Within this file, you'll see things to fill out between pairs of percent signs, with short explanations. Where longer explanations are necessary, embedded comments should help.

    To give you an idea of the intended use: As a vim user, I've put

    augroup rd
      au!
      au BufRead,BufNewFile *.rd imap <F8> <ESC>/%[^%]*%<CR>a
      au BufRead,BufNewFile *.rd imap <F9> <ESC>cf%
    augroup END
    

    into my ~/.vimrc. That way, while editing the template into an actual RD, hitting F8 takes me to the next thing to be edited; I can then read the instructions, and when I have made up my mind, I can either delete the template element or hit F9 and replace the explanation text with whatever belongs there.

    The command is available starting with the 1.1.3 beta (available now by switching to the beta repo) and will be part of the 1.2 release, planned for early June after the Victoria interop.

    If you have a publication project: just try it out and give feedback. Note that the templates haven't actually been tested yet, and the comments were written by a DaCHS and VO nerd, so they might not always be great either. Thus, when you get stuck: complain early, complain often!
