Tangible Astronomy and Movies with TOPCAT

This March, I’ve put up two new VO resources (that’s jargon for “table or service or whatever”) that, I think, fit quite well what I like to call tangible astronomy: things you can readily relate to what you see when you step out at night. And, since I’m a professing astronomy nerd, that’s always nicely gratifying.

The two resources are the Constellations as Polygons and the Gaia eDR3 catalogue of nearby stars (GCNS).

Constellations

On the constellations, you might rightfully say that’s really far from science. But then they do help you get an idea of where something is, and when and from where you might see it. I’ve hence wanted for a long time to re-publish the Davenhall Constellation Boundary Data as proper, ADQL-queriable polygons, and figuring out where the loneliest star in the sky (and Voyager 1) were finally made me do it.

[Image: GCNS density around Taurus]
Taurus in the GCNS density plot: with constellations!

So, since early March there’s the cstl.geo table on the TAP service at https://dc.g-vo.org/tap with the constellation polygons in its p column. Which, for starters, means it’s trivial to overplot constellation boundaries in your favourite VO clients now, as in the plot above. To make it, I’ve just done a boring SELECT * FROM cstl.geo, did the background (a plain HEALPix density plot of GCNS) and then clicked Layers → Add Area Control and selected the cstl.geo table.
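If you would rather script this than click through TOPCAT, a minimal pyvo sketch along these lines should fetch the polygons (a sketch, not gospel: how the polygon values come out on your end depends on your client’s VOTable handling):

import pyvo

svc = pyvo.dal.TAPService("https://dc.g-vo.org/tap")
# the p column holds each constellation's boundary polygon
constellations = svc.run_sync("SELECT * FROM cstl.geo").to_table()
for row in constellations:
    print(row["name"], row["p"])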

If you want to identify constellations by clicking, while in the area control, choose “add central” from the Forms menu in the Form tab; that’s what I did in the figure above to ensure that what we’re looking at here is the Hyades and hence Taurus. Admittedly: these “centres” are – as in the catalogue – just the means of the vertices rather than the centres of mass of the polygons (which are hard to compute). Oh, and: there is also the AreaLabel in the Forms menu, for when you need the identification more than the table highlighting (be sure to use a center anchor here).

Note that TOPCAT’s polygon plot is not really geared towards large polygons (which the constellations are) at this point. At the time of writing, the documentation has: “Areas specified in this way are generally intended for displaying relatively small shapes such as instrument footprints. Larger areas may also be specified, but there may be issues with use.” You will see those issues at the edges of the sky plots – but keeping that in mind, I’d say this is a fun and potentially very useful feature.

What’s a bit worse: You cannot turn the constellation polygons into MOCs yet, because the MOC library currently running within our database will not touch non-convex polygons. We’re working on getting that fixed.

Nearby Stars

Similarly tangible in my book is the GCNS: nearby stars I always find romantic.

Let’s look at the 100 nearest stars, and let’s add spectral types from Henry Draper (cf. my post on Annie Cannon’s catalogue) as well as the constellation name:

WITH nearest AS (
SELECT TOP 100 
  a.source_id, 
  a.ra, a.dec,
  phot_g_mean_mag,
  dist_50,
  spectral
FROM gcns.main AS a
LEFT OUTER JOIN hdgaia.main AS b 
  ON (b.source_id_dr3=a.source_id)
ORDER BY dist_50 ASC)
SELECT nearest.*, name
FROM nearest
JOIN cstl.geo AS g
  ON (1=CONTAINS(
    POINT(nearest.ra, nearest.dec), 
    p))

Note how I’m using CONTAINS with the polygon in the constellations table here; that’s the usage I’ve had in mind for this table (and it’s particularly handy with table uploads).
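For instance, to find the constellations for a handful of positions of your own, you can upload a small table and join it against cstl.geo. Here is a pyvo sketch of that pattern (the uploaded table and its values are of course made up for the example):

import pyvo
from astropy.table import Table

svc = pyvo.dal.TAPService("https://dc.g-vo.org/tap")

# a made-up table of positions we would like constellation names for
mypos = Table(
    rows=[(66.8, 15.9), (201.3, -43.0)],
    names=("ra", "dec"))

res = svc.run_sync("""
    SELECT t.ra, t.dec, g.name
    FROM TAP_UPLOAD.pos AS t
    JOIN cstl.geo AS g
      ON 1=CONTAINS(POINT(t.ra, t.dec), g.p)
    """, uploads={"pos": mypos})
print(res.to_table())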

That I have a Common Table Expression (“WITH”) in the nearest-stars query above is due to SQL planner confusion (I’ll post something about that real soon now): with the WITH, the machine first selects the nearest 100 rows and then does the (relatively costly) spatial match; without it, the machine (somewhat surprisingly) did the geometric match first. This particular confusion looks fixable, but for now I’d ask your forgiveness for the hack – and the technique is often useful anyway.

If you inspect the result, you will notice that Proxima Cen is right there, but α Cen is missing; without having properly investigated matters, I’d say it’s just too bright for the current Gaia data reduction (and quite possibly even for future Gaia analysis).

Most of the objects on that list that have made it into the HD (i.e., have a spectral type here) are K dwarfs – which is an interesting conspiracy between the limits of the HD (the late red and old white dwarfs are too faint for it) and the limits of Gaia (the few earlier stars within 6 parsec – which include such luminaries as Sirius at a bit more than 2.5 pc – are just too bright for where Gaia data reduction is now).

Animation

Another fairly tangible thing in the GCNS is the space velocity, given in km/s in the three dimensions U, V, and W. That is, of course, an invitation to look for stellar streams, as, within the relatively small portion of the Milky Way the GCNS looks at, stars on similar orbits will exhibit similar space motions.

Considering that the velocity dispersion within a stellar stream will be a few km/s, let’s have the database bin the data. Even though this data is small enough to conveniently handle locally, this kind of remote analysis is half of what TAP is really great at (the other half being the ability to just jump right into a new dataset). You can group by multiple things at the same time:

SELECT 
  COUNT(*) AS n,
  ROUND(uvel_50/5)*5 AS ubin,
  ROUND(vvel_50/5)*5 AS vbin,
  ROUND(wvel_50/5)*5 AS wbin
FROM gcns.main
GROUP BY ubin, vbin, wbin

Note that this (truly) 3D histogram only represents a small minority of the GCNS objects – you need radial velocities for space motion, and these are precious even in the Gaia age.

What really surprised me is how clumpy this distribution is – are we sure we already know all stellar streams in the solar neighbourhood? Watch for yourself (if your browser can’t play webm, complain to your vendor):

[Update (2021-04-01): Mark Taylor points out that the “flashes” you sometimes see when the grid is aligned with the viewing axes (and the general appearance) could be improved by just pulling all non-NULL UVW values out of the table and using a density plot (perhaps shading=density densemap=inferno densefunc=linear). That is quite certainly true, but it would of course defeat the purpose of having on-server aggregation. Which, again, isn’t all that critical for this dataset, so doing the prettier plot actually is a valuable exercise for the reader.]

How did I make this video? Well, I started with a Cube Plot in TOPCAT as usual, configuring weighted plotting with n as its weight and played around a bit with scaling out a few outliers. And then I saved the table (to zw.vot), hit “STILTS” in the plot window and saved the text from there to a text file, zw.sh. I had to change the “in” clause in the script to make it look like this:

  #!/bin/sh
  stilts plot2cube \
   xpix=887 ypix=431 \
   xlabel='ubin / km/s' ylabel='vbin / km/s' \
   zlabel='wbin / km/s' \
   xmin=-184.5 xmax=49.5 ymin=-77.6 ymax=57.6 \
   zmin=-119.1 zmax=94.1 phi=-84.27 theta=90.35 \
    psi=-62.21 \
   auxmin=1 auxmax=53.6 \
   auxvisible=true auxlabel=n \
   legend=true \
   layer=Mark \
      in=zw.vot \
      x=ubin y=vbin z=wbin weight=n \
      shading=weighted size=2 color=blue

– and presto, “sh zw.sh” would produce the plot I just had in TOPCAT. This makes a difference because now I can animate this.

In his documentation, Mark already has a few hints on how to build animations; here are a few more ideas on how to organise this. For instance, if, as I do here, you want to animate more than one variable, stilts tloop may become a bit unwieldy. Here’s how to produce the camera angles in Python:

import sys
from astropy import table
import numpy

# one row per frame: psi sweeps through a full turn in one-degree steps
angles = numpy.array(
  [float(a) for a in range(0, 360)])
# theta gently oscillates about 40 degrees while psi turns
table.Table([
    angles,
    40+30*numpy.cos((angles+57)*numpy.pi/180)],
  names=("psi", "theta")).write(
    sys.stdout, format="votable")

– the only thing to watch out for is that the names match the names of the arguments in stilts that you want to animate (and yes, the creation of angles will make numpy aficionados shudder – but I wasn’t sure if I might want to have somewhat more complex logic there).

[Update (2021-04-01): Mark Taylor points out that all that Python could simply be replaced with a straightforward piece of stilts using the new loop table scheme in stilts, where you would simply put

  animate=:loop:0,360,0.5
  acmd='addcol psi $1'
  acmd='addcol theta 40+30*cosDeg($1+57)'

into the plot2cube command line – and you wouldn’t even need the shell pipeline.]

What’s left to do is basically the shell script that TOPCAT wrote for me above. In the script below I’m using a little convenience hack to let me quickly switch between screen output and file output: I’m defining a shell variable OUTPUT, and when I un-comment the second OUTPUT, stilts renders to the screen. The other changes versus what TOPCAT gave me are de-dented (and I’ve deleted the theta and psi parameters from the command line, as I’m now filling them from the little Python script):

OUTPUT="omode=out out=pre-movie.png"
#OUTPUT=omode=swing

python3 camera.py |\
stilts plot2cube \
   xpix=500 ypix=500 \
   xlabel='ubin / km/s' ylabel='vbin / km/s' \
   zlabel='wbin / km/s' \
   xmin=-184.5 xmax=49.5 ymin=-77.6 ymax=57.6 \
   zmin=-119.1 zmax=94.1 \
   auxmin=1 auxmax=53.6 \
phi=8 \
animate=- \
afmt=votable \
$OUTPUT \
   layer=Mark \
      in=zw.vot \
      x=ubin y=vbin z=wbin weight=n \
      shading=weighted size=4 color=blue

# render to movie with something like
# ffmpeg -framerate 15 -i "pre-movie-%03d.png" -pix_fmt yuv420p stream-movie.webm
# (the yuv420p incantation is so that real-world
# web browsers will not go psychedelic
# with the colours)

The comment at the end says how to make a proper movie out of the PNGs this produces, using ffmpeg (packaged with every self-respecting distribution these days) and yielding a webm. Yes, going for mpeg x264 might be a lot faster for you as it’s a lot more likely to have hardware support, but everything around mpeg is so patent-infested that for the sake of your first-born’s soul you probably should steer clear of it.

Movies are fun in webm, too.

The Loneliest Star in the Sky

[Image: sky images and a distribution plot]
The loneliest star in the sky on the left, and on the right a somewhat lonelier one (it’s explained in the text). The inset shows the distribution of the 500 loneliest stars on the whole sky in Galactic coordinates.

In early December, the object catalogue of Gaia’s early data release 3 (“eDR3”) was published, and I’ve been busy in various ways with this data off and on since then – see, for instance, The Case of the disappearing bits on this blog.

One of the things I have missed when advising people on projects with previous Gaia data releases is a table that, for every object, gives the nearest neighbour. And so for this release I’ve created it and christened it, perhaps just a bit over-grandiosely, “Gaia eDR3 Autocorrelation”. Technically, it is just a long (1811709771 rows, to be precise) list of Gaia eDR3 source ids, the ids of their nearest neighbours, and the spherical distance between them.

This kind of data is useful for many applications, mostly when looking for objects that are close together or (more often) things that fail for such close pairs for a wide variety of reasons. I have taken some pains to include not only close neighbours, though, because sometimes you specifically want objects far away from others.

As in the case of this article’s featured image: The loneliest star in the sky (as seen by Gaia, that is) is eDR3 6049144983226879232, which is 4.3 arcminutes from its neighbour, 6049144021153793024, which in turn is the second-loneliest star in the sky. They are, perhaps a bit surprisingly, in Ophiuchus (and thus fairly close to the Milky Way plane), and (probably) only about 150 parsec from Earth. Doesn’t sound too lonely, hm? Turns out: these stars are lonely because dust clouds blot out all their neighbours.

Rank three is in another dust cloud, this time in Taurus, and so it continues at low Galactic latitude until rank 8 (4402975278134691456) at Galactic latitude 36.79 degrees; visualising the thing, it turns out it’s again in a dark cloud. What about rank 23 (3954600105683842048) at Galactic latitude 83.92 degrees? That’s probably bona fide, or at least it doesn’t look very dusty in either DSS or PanSTARRS. Coryn (see below) estimates it’s about 1100 parsec away. More than 1 kpc above the Galactic disk: that’s more what I had expected for lonely stars.

Looking at the whole distribution of the 500 loneliest stars (inset above), things return a bit more to what I had expected: Most of them are around the galactic poles, where the stellar density is low.

So: How did I find these objects? Here’s the ADQL query I’ve used:

SELECT TOP 500 
  ra, dec, source_id, phot_g_mean_mag, ruwe,
  r_med_photogeo,
  partner_id, dist, 
  COORD2(gavo_transform('ICRS', 'GALACTIC', 
    point(ra, dec))) AS glat
FROM
  gedr3dist.litewithdist
  NATURAL JOIN gedr3auto.main
ORDER BY dist DESC

– run this on the TAP server at http://dc.g-vo.org/tap (don’t be shy, it’s a cheap query).

Most of this should be familiar to you if you’ve worked through the first pages of the ADQL course. There are two ADQL things I’d like to advertise while I have your attention:

  1. NATURAL JOIN is like a JOIN USING, except that the database auto-selects what column(s) to join on by matching the columns that have the same name. This is a convenient way to join tables designed to be joined (as they are here). And it probably won’t work at all if the tables haven’t been designed for that.
  2. The messy stuff with GALACTIC in it. Coordinate transformations had a bad start in ADQL; the original designers hoped they could hide much of this, but it’s rarely a good idea in science tools to hide complexity essentially everyone has to deal with. To get back on track in this field, DaCHS servers since about version 1.4 have been offering a user defined function gavo_transform that can transform (within reason) between a number of popular reference frames. You will find more on it in the server’s capabilities (in TOPCAT: the “service” tab). What is happening in the query is: I’m making a Point out of the RA and Dec given in the catalogue, telling the transform function it’s in ICRS, asking it to make Galactic coordinates from it, and then taking the second element of the result: the latitude (see the sketch after this list).
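Here is the same pattern in isolation, wrapped in a little pyvo script; the position is an arbitrary test value, and selecting from tap_schema.tables is just a trick to have something to select from:

import pyvo

svc = pyvo.dal.TAPService("http://dc.g-vo.org/tap")
# transform one ICRS position to Galactic and pull out both coordinates
res = svc.run_sync("""
    SELECT TOP 1
      COORD1(gavo_transform('ICRS', 'GALACTIC', POINT(83.63, 22.01))) AS glon,
      COORD2(gavo_transform('ICRS', 'GALACTIC', POINT(83.63, 22.01))) AS glat
    FROM tap_schema.tables
    """)
print(res.to_table())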

And what about the gedr3dist.litewithdist table? That doesn’t look a lot like the gaiaedr3.gaia_source we’re supposed to query for eDR3?

Well, as for DR2, I’m again only carrying a “lite” version of the Gaia catalogue in GAVO’s Heidelberg data center, stripped down to the columns you absolutely cannot live without even for the most gung-ho science; it’s called gaia.edr3lite.

But then my impression is that almost everyone wants distances and then hacks something to make Gaia’s parallax work for them. That’s a bad idea once the parallax SNR goes down to levels very common in the Gaia result catalogue (see 2020arXiv201205220B if you don’t take my word for it). Hence, I’m offering a pre-joined view (a virtual table, if you will) with the carefully estimated distances from Coryn Bailer-Jones, and that’s this gedr3dist.litewithdist. Whenever you’re doing something with eDR3 and distances, this is where I’d point you first.

Oh, and I should be mentioning that, of course, I figured out what is in dust clouds and what is not with TOPCAT and Aladin as in our tutorial TOPCAT and Aladin working together (which needs a bit of an update, but you’ll figure it out).

There’s a lot more fun to be had with this (depending on what you find fun in). What about finding the 10 arcsec-pairs with the least different luminosities (which might actually be useful for testing some optics)? Try this:

SELECT TOP 300 
  a.source_id, partner_id, dist, 
  a.phot_g_mean_mag AS source_mag,
  b.phot_g_mean_mag AS partner_mag,
  abs(a.phot_g_mean_mag-b.phot_g_mean_mag) AS magdiff
FROM gedr3auto.main
  NATURAL JOIN gaia.edr3lite AS a
  JOIN gaia.edr3lite AS b
    ON (partner_id=b.source_id)
WHERE 
  dist BETWEEN 9.999/3600 AND 10.001/3600
  AND a.phot_g_mean_mag IS NOT NULL
  AND b.phot_g_mean_mag IS NOT NULL
ORDER BY magdiff ASC

– this one takes a bit longer, as there are many 10 arcsec pairs in eDR3; the query above looks at 84690 of them. Of course, this only returns really faint pairs, and given the errors stars that faint have, they’re probably not as equal-luminosity as they look. But fixing all that is left as an exercise to the reader. Given there’s the RP and BP magnitude columns, what about looking for the most colourful pair with a given separation?
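If you want to take up that last suggestion, here is an untested sketch to start from; I am assuming the lite table keeps the BP and RP mean magnitudes (check the column metadata if in doubt):

import pyvo

svc = pyvo.dal.TAPService("http://dc.g-vo.org/tap")
res = svc.run_async("""
    SELECT TOP 300
      a.source_id, partner_id, dist,
      ABS((a.phot_bp_mean_mag-a.phot_rp_mean_mag)
        -(b.phot_bp_mean_mag-b.phot_rp_mean_mag)) AS coldiff
    FROM gedr3auto.main
      NATURAL JOIN gaia.edr3lite AS a
      JOIN gaia.edr3lite AS b
        ON (partner_id=b.source_id)
    WHERE
      dist BETWEEN 9.999/3600 AND 10.001/3600
      AND a.phot_bp_mean_mag IS NOT NULL
      AND a.phot_rp_mean_mag IS NOT NULL
      AND b.phot_bp_mean_mag IS NOT NULL
      AND b.phot_rp_mean_mag IS NOT NULL
    ORDER BY coldiff DESC
    """)
print(res.to_table())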

Acknowledgement: I couldn’t have coolly mumbled about Ophiuchus or Taurus without the SCS service ivo://cds.vizier/vi/42 (“Identification of a Constellation From Position”, Roman 1982).

Update [2021-02-05]: I discovered an extra twist to this story: Voyager 1 is currently flying towards Ophiuchus (or so Wikipedia claims). With an industrial size package of artistic licence you could say: it’s coming to keep the loneliest star company. But of course: by the time Voyager is 150 pc from Earth, eDR3 6049144983226879232 will quite certainly have left Ophiuchus (and Voyager will be in a completely different part of our sky, one that wouldn’t look familiar to us at all) – so, I’m afraid that apart from a nice coincidence in this very moment (galactically speaking), this whole thing won’t be Hollywood material.

Crazy Shapes in TAP

[Image: OpenNGC shapes]
Fig. 1: A complex shape from OpenNGC: MOCs need not be convex, or simply connected, or anything.

So far, when you did spherical geometry in ADQL, you had points, circles, and polygons as data types, and you could test for intersection and containment as operations. This feature set is a bit unsatisfying because there are no (algebraic) groups in this picture: When you unite or intersect two circles, the result is only a circle if one contains the other. With non-intersecting polygons, you will again not have a (simply connected) spherical polygon in the end.

Enter MOCs (which I’ve mentioned a few times before on this blog): these are essentially arbitrary shapes on the sky, in practice represented through lists of pixels, cleverly done so they can be sufficiently precise and rather compact at the same time. While MOCs are powerful and surprisingly simple in practice, ADQL doesn’t know about them so far, which limits quite a bit what you can do with them. Well, DaCHS has been able to serve them since about version 1.3 if you managed to push them into the database, but there were no operations you could do on them.

Thanks to work on the pgsphere database extension done by credativ (who were really nice to work with), funded with some money we had left from our previous e-inf-astro project (BMBF FKZ 05A17VH2), this has now changed. At least on the GAVO data center, MOCs are now essentially first-class citizens that you can create, join, and intersect within ADQL, and you can retrieve the results. All operators of DaCHS services are just a few updates away from being able to offer the same.

So, what can you do? To follow what’s below, get a sufficiently new TOPCAT (4.7 will do) and open its TAP client on http://dc.g-vo.org/tap (a.k.a. GAVO DC TAP).

Basic MOC Operations in TAP

First, let’s make sure you can plot MOCs; run

SELECT name, deepest_shape
FROM openngc.shapes

Then do Graphics/Sky Plot, and in the window that pops up, Layers/Add Area Control. Then select your new table in the Position tab, and finally choose deepest_shape as the area (yeah, this could become a bit more automatic and probably will over time). You will then see the footprints of a few NGC objects (OpenNGC’s author Mattia Verga hasn’t done all of them yet; he certainly welcomes help on OpenNGC’s version control repo), and you can move around in the plot, yielding perhaps something like Fig. 1.

Now let’s color these shapes by object class. If you look, openngc.data has an obj_type column – let’s group on it:

SELECT
  obj_type,
  shape,
  AREA(shape) AS ar
FROM (
  SELECT obj_type, SUM(deepest_shape) AS shape
  FROM openngc.shapes
  NATURAL JOIN openngc.data
  GROUP BY obj_type) AS q

(the extra subquery is a workaround necessary because the area function wants a geometry or a column reference, and ADQL doesn’t allow aggregate functions – like sum – as either of these).

[Image: coloured shapes]
Fig. 2: OpenNGC shapes grouped and coloured by type.

In the result you will see that so far, contours for about 40 square degrees of star clusters with nebulae have been put in, but only 0.003 square degrees of stellar associations. And you can now plot by the areas covered by the various sorts of objects; in Fig. 2, I’ve used Subsets/Classify by Column in TOPCAT’s Row Subsets to have colours indicate the different object types – a great workaround when one deals with categorical variables in TOPCAT.

MOCs and JOINs

Another table that already has MOCs in it is rr.stc_spatial, which has the coverage of VO resources (and is the deeper reason I’ve been pushing improved MOC support in pgsphere – background); this isn’t available for all resources yet, but at least there are about 16000 in already. For instance, here’s how to get the coverage of resources talking about planetary nebulae:

SELECT ivoid, res_title, coverage
FROM rr.subject_uat
  NATURAL JOIN rr.stc_spatial
  NATURAL JOIN rr.resource
WHERE uat_concept='planetary-nebulae'
  AND AREA(coverage)<20

(the rr.subject_uat table is a local extension to RegTAP that will be the subject of some future blog post; you could also use rr.res_subject, but because people still use wildly different keyword schemes – if any – that wouldn’t be as much fun). When plotted, that’s the left side of Fig. 3. If you do that yourself, you will notice that the resolution here is about one degree, which is a special property of the sort of MOCs I am proposing for the Registry: they are of order 6. Resolution in MOCs goes up with order, doubling with every step. Thus, MOCs of order 7 have a resolution of about half a degree, MOCs of order 5 a resolution of about two degrees.
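In case you want those resolutions without memorising them: a MOC of order n is built on HEALPixes with nside = 2**n, and a reasonable linear scale is the square root of the pixel area. A quick back-of-the-envelope in Python:

import math

def moc_resolution_deg(order):
    # HEALPix has 12*nside**2 pixels covering the 4 pi sr of the sphere;
    # use the square root of the pixel area as a linear scale
    nside = 2**order
    pixel_area = 4*math.pi/(12*nside**2)
    return math.degrees(math.sqrt(pixel_area))

for order in (5, 6, 7):
    print(order, round(moc_resolution_deg(order), 2))
# prints roughly 1.83, 0.92, and 0.46 – about two degrees, one degree,
# and half a degree, as claimed above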

One possible next step is to fetch the intersection of each of these coverages with, say, the DFBS (cf. the post on Byurakan spectra). That would look like this:

SELECT 
  ivoid, 
  res_title,
  gavo_mocintersect(coverage, dfbscoverage) as ovrlp
FROM (
  SELECT ivoid, res_title, coverage
  FROM rr.subject_uat
  NATURAL JOIN rr.stc_spatial
  NATURAL JOIN rr.resource
  WHERE uat_concept='planetary-nebulae'
  AND AREA(coverage)<20) AS others
CROSS JOIN (
  SELECT coverage AS dfbscoverage 
  FROM rr.stc_spatial
  WHERE ivoid='ivo://org.gavo.dc/dfbsspec/q/spectra') AS dfbs

(the DFBS’ identifier I got with a quick query on WIRR). This uses the gavo_mocintersect user defined function (UDF), which takes two MOCs and returns a MOC of their common pixels. Which is another important part of why MOCs are so cool: together with union and intersection, they form groups. It should not come as a surprise that there is also a gavo_mocunion UDF. The sum aggregate function we’ve used in our grouping above is (conceptually) built on that.

[Image: planetary nebula footprint and plate matches]
Fig. 3: Left: The common footprint of VO resources declaring a subject of planetary-nebulae (and declaring a footprint). Bottom right: Heidelberg plates intersecting this, and, in blue, level-6 intersections. Above this, an enlarged detail from this plot.

You can also convert polygons and circles to MOCs using the (still DaCHS-only) MOC constructor. For instance, you could compute the coverage of all resources dealing with planetary nebulae, filtering out obviously over-eager ones by limiting the total area, and then match that against the coverages of images in, say, the Königstuhl plate archives HDAP. Watch this:

SELECT 
  im.*,
  gavo_mocintersect(MOC(6, im.coverage), pn_coverage) as ovrlp
FROM (
  SELECT SUM(coverage) AS pn_coverage
  FROM rr.subject_uat
  NATURAL JOIN rr.stc_spatial
  WHERE uat_concept='planetary-nebulae'
  AND AREA(coverage)<20) AS c
JOIN lsw.plates AS im
ON 1=INTERSECTS(pn_coverage, MOC(6, coverage))

– so, the MOC(order, geo) function should give you a MOC for other geometries. There are limits to this right now because of limitations in the underlying MOC library; in particular, non-convex polygons are not supported right now, and there are precision issues. We hope this will be rectified soon-ish when we base pgsphere’s MOC operations on the CDS HEALPix library. Anyway, the result of this is plotted on the right of Fig. 3.

Open Ends

In case you have MOCs from the outside, you can also construct MOCs from literals, which happen to be the ASCII MOCs from the standard. This could look like this:

SELECT TOP 1 
  MOC('4/30-33 38 52 7/324-934') AS ar 
FROM tap_schema.tables

For now, you cannot combine MOCs in CONTAINS and INTERSECTS expressions directly; this is mainly because in such an operation, the machine has to decide on the order of the MOC the other geometries are converted to (and computing the predicates between geometry and MOC directly is really painful). This means that if you have a local table with MOCs in a column cmoc that you want to compare against a polygon-valued column coverage in a remote table like this:

SELECT db.* FROM 
  lsw.plates AS db
  JOIN tap_upload.t6
ON 1=CONTAINS(coverage, cmoc) -- fails!

you will receive a rather scary message of the type “operator does not exist: spoly <@ smoc”. To fix it (until we’ve worked out how to reasonably let the computer do that), explicitly convert the polygon:

SELECT db.* FROM 
  lsw.plates AS db
  JOIN tap_upload.t6
ON 1=CONTAINS(MOC(7, coverage), cmoc)

(be stingy when choosing the order here – MOCs that already exist are fast, but making them at high order is expensive).

Having said all that: what I’ve written here is bleeding-edge, and it is not standardised yet. I’d wager, though, that we will see MOCs in ADQL relatively soon, and that what we will see will not be too far from this experiment. Well: Some rough edges, I’d hope, will still be smoothed out.

Getting This on Your Own DaCHS Installation

If you are running a DaCHS installation, you can contribute to takeup (and if not, you can stop reading here). To do that, you need to upgrade to DaCHS’s latest beta (anything newer than 2.1.4 will do) to have the ADQL extension, and, even more importantly, you need to install the postgresql-pgsphere package from our release repository (that’s version 1.1.4 or newer; in a few weeks, getting it from Debian testing would work as well).

You will probably not get that automatically, because if you followed our normal installation instructions, you will have a package called postgresql-11-pgsphere installed (apologies for this chaos; as usual, every single step made sense). The upshot is that with our release repo added, sudo apt install postgresql-pgsphere should give you the new code.

That’s not quite enough, though, because you also need to acquaint the database with the new functions. This can only be done with database administrator privileges, which DaCHS by design does not possess. What DaCHS can do is figure out the commands for that when it is called as dachs upgrade -e. Have a look at the output, and if you are satisfied it does about what you expect, just pipe it into psql as a superuser; in the default installation, dachsroot would be sufficiently privileged. That is:

dachs upgrade -e | psql gavo   # as dachsroot

If running

select top 1 gavo_mocunion(moc('1/3'), moc('2/9')) 
from tap_schema.tables

through your TAP endpoint returns ‘1/3 2/9’, then all is fine. For entertainment, you might also make sure that gavo_mocintersect(moc('1/3'), moc('2/13')) is 2/13 as expected, and that intersecting moc('1/3') with moc('2/3') gives back an empty string.
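If you would rather script these checks than type them into TOPCAT, a little pyvo loop along these lines should do (the endpoint URL is a placeholder for your own server):

import pyvo

svc = pyvo.dal.TAPService("http://your.server.example/tap")

checks = [
    ("gavo_mocunion(moc('1/3'), moc('2/9'))", "1/3 2/9"),
    ("gavo_mocintersect(moc('1/3'), moc('2/13'))", "2/13"),
    ("gavo_mocintersect(moc('1/3'), moc('2/3'))", ""),
]
for expr, expected in checks:
    res = svc.run_sync(
        "SELECT TOP 1 {} AS r FROM tap_schema.tables".format(expr))
    print("{} -> {!r} (expected {!r})".format(
        expr, res.to_table()["r"][0], expected))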

So – let’s bring MOCs to ADQL!

The Bochum Galactic Disk Survey

[Image: Patches of higher perceived variability on the Sky]
Fig 1: How our haphazard variability ratio varies over the sky (galactic coordinates). And yes, it’s clear that this isn’t dominated by physical variability.

About a year ago, I reported on a workshop on “Large Surveys with Small Telescopes” in Bamberg; at around the same time, I published an example of those, the Bochum Galactic Disk Survey BGDS, which used a twin 15 cm robotic telescope in some no longer forsaken place in the Andes mountains to monitor the brighter stars in the southern Milky Way. While some tables from an early phase of the survey have been on VizieR for a while, we now publish the source images (also in SIAP and Obscore), the mean photometry (via SCS and TAP) and, perhaps most fun of all, the lightcurves (via SSAP and TAP) – a whopping 35 million of the latter.

This means that in tools like Aladin, you can now find such light curves (and images in two bands from a lot of epochs) when you are in the survey’s coverage, and you can run TAP queries on GAVO’s http://dc.g-vo.org/tap server against the full photometry table and the time series.

Regular readers of this blog will not be surprised to see me use this as an excuse to show off a bit of ADQL trickery.

If you have a look at the bgds.phot_all table in your favourite TAP client, you’ll see that it has a column amp, giving the difference between the highest and lowest magnitude. The trouble is that for almost all objects, amp just reflects the measurement error rather than any intrinsic variability. To get an idea of what’s “normal” (based on the fact that essentially all stars have essentially constant luminosity on the range and resolution scales considered here), run a query like

SELECT ROUND(amp/err_mag*10)/10 AS bin, COUNT(*) AS n
FROM bgds.phot_all
WHERE nobs>10
GROUP BY bin

As this scans the entire 75 million rows of the table, you will probably have to use async mode to run this.
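If you are scripting this with pyvo rather than TOPCAT, run_async takes care of submitting the job and waiting for it; a minimal sketch:

import pyvo

svc = pyvo.dal.TAPService("http://dc.g-vo.org/tap")
# run_async submits an async job, waits, and fetches the result, so the
# query may run longer than the synchronous limit allows
res = svc.run_async("""
    SELECT ROUND(amp/err_mag*10)/10 AS bin, COUNT(*) AS n
    FROM bgds.phot_all
    WHERE nobs>10
    GROUP BY bin
    """)
print(res.to_table())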

[Image: distribution of amplitude over magnitude error]
Figure 2: The distribution of amplitude over magnitude error for all BGDS objects with nobs>10 (blue) and the subset with a mean magnitude brighter than 15.

When it comes back, you will have, for objects where any sort of statistics make sense at all (hence nobs>10), a histogram (of sorts) of the amplitude in units of upstream’s magnitude error estimation. If you log-log-plot this, you’ll see something like Figure 2. The curve at least tells you that the magnitude error estimate is not very far off – the peak at about 3 “sigma” is not unreasonable since about half of the objects have nobs of the order of a hundred and thus would likely contain outliers that far out assuming roughly Gaussian errors.

And if you’re doing a rough cutoff at amp/magerr>10, you will get perhaps not necessarily true variables, but, at least potentially interesting objects.

Let’s use this insight to see if we can spot any pattern in the distribution of these interesting objects. We’ll use the HEALPix technique I discussed three years ago in this blog, but with a little twist from ADQL 2.1: Common Table Expressions, or CTEs, which I have already mentioned in my blog post on ADQL 2.1 and then advertised in the piece on the Henry Draper catalogue. The brief idea, again, is that you can write queries and give them a name that you can use elsewhere in the query as if it were an actual table. It’s not much different from normal subqueries, but you can re-use CTEs in multiple places in the query (hence the “common”), and it’s usually more readable.

Here, we first create a version of the photometry table that contains HEALPixes and our variability measure, use that to compute two unsophisticated per-HEALPix statistics and eventually join these two to our observable, the ratio of suspected variables to all stars observed (the multiplication with 1.0 is a cheap way to make a float out of a value, which is necessary here because a/b does integer division in ADQL if a and b are both integers):

WITH photpoints AS (
  SELECT 
    amp/err_mag AS redamp,
    amp,
    ivo_healpix_index(5, ra, dec) AS hpx
  FROM bgds.phot_all
  WHERE 
    nobs>10
    AND band_name='SDSS i'
    AND mean_mag<16),
all_objs AS (
  SELECT COUNT(*) AS ct, hpx
  FROM photpoints
  GROUP BY hpx),
strong_var AS (
  SELECT COUNT(*) AS ct, hpx
  FROM photpoints
  WHERE redamp>4 AND amp>1
  GROUP BY hpx)
SELECT
  strong_var.ct/(1.0*all_objs.ct) AS obs,
  all_objs.ct AS n,
  hpx
FROM strong_var JOIN all_objs USING (hpx)
WHERE all_objs.ct>20

If you plot this using TOPCAT’s HEALPix thingy and ask it to use Galactic coordinates, you’ll end up with something like Figure 1.

There clearly is some structure, but given that the variables ratio reaches up to 0.2, this still reflects instrumental or pipeline effects and thus earthly rather than astrophysical causes. And that’s going beyond what I’d like to talk about on a VO blog, although I’ll take any bet that you will see significant structure in the spatial distribution of the variability ratio at about any magnitude cutoff, since there are a lot of different population mixtures in the survey’s footprint.

Be that as it may, let’s have a quick look at the time series. As with the short spectra from the Byurakan use case, we’ve stored the actual time series as arrays in the database (the mjd and mags columns in bgds.ssa_time_series). Unfortunately, since they are a lot less array-like than homogeneous spectra, it’s also a lot harder to do interesting things with them without downloading them (I’m grateful for ideas for ADQL functions that would let you do in-DB analysis for such things). Still, you can at least easily download them in bulk and then process them in, say, Python to your heart’s content. The Byurakan use case should give you a head start there.
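To make the bulk-download part concrete, here is a minimal sketch that pulls a few of the arrays and plots them locally (TOP 5 only to keep the demo snappy; drop it for the real thing):

import pyvo
from matplotlib import pyplot as plt

svc = pyvo.dal.TAPService("http://dc.g-vo.org/tap")
# mjd and mags are array-valued columns; VOTables transport them just fine
res = svc.run_sync("SELECT TOP 5 mjd, mags FROM bgds.ssa_time_series")

for row in res.to_table():
    plt.plot(row["mjd"], row["mags"], ".")
plt.xlabel("MJD")
plt.ylabel("mag")
plt.show()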

For a quick demo, I couldn’t resist checking out objects that Simbad classifies as possible long-period variables (you see, as I write this, the public hubbub over Betelgeuse’s brief waning is just dying down), and so I queried Simbad for:

SELECT ra, dec, main_id
FROM basic
WHERE
  otype='LP?'
  AND 1=CONTAINS(
     POINT('', ra, dec),
     POLYGON('', 127, -30, 112, -30, 272, -30, 258, -30))

(as of this writing, Simbad still needs the ADQL 2.0-compliant first arguments to POINT and POLYGON), where the POLYGON is intended to give the survey’s footprint. I obtained that by reading off the coordinates of the corners in my Figure 1 while it was still in TOPCAT. Oh, and I had to shrink it a bit because Simbad (well, the underlying Postgres server, and, more precisely, its pg_sphere extension) doesn’t want polygons with edges longer than π. This will soon become less pedestrian: MOCs in relational databases are coming; more on this soon.

[TOPCAT action shot with a light curve display]
Fig 3: V566 Pup’s BGDS lightcurve in a TOPCAT configured to auto-plot the light curves associated with a row from the bgds.ssa_time_series table on the GAVO DC TAP service.

If you now do the usual spiel with an upload crossmatch to the bgds.ssa_time_series table and check “Plot Table” in Views/Activation Action, you can quickly page through the light curves (TOPCAT will keep the plot style as you go from dataset to dataset, so it’s worth configuring the lines and the error bars). Which could bring you to something like Fig. 3; and that would suggest that V* V566 Pup isn’t really long-period unless the errors are grossly off.

Gaia DR2: A light version and light curves

[Image: screenshot of TOPCAT and matplotlib]
TOPCAT is doing datalink, and our little Python script has plotted a two-colour time series of RMC 18 (or so I think).

If anyone ever writes a history of the VO, the second data release of Gaia on April 25, 2018 will probably mark its coming-of-age – at least if you, like me, consider the Registry the central element of the VO. It was spectacular to view the spike of tens of Registry queries per second right around 12:00 CEST, the moment the various TAP services handing out the data made it public (with great aplomb, of course).

In GAVO’s Data Center we also carry Gaia DR2 data. Our host institute, the Zentrum für Astronomie in Heidelberg, also has a dedicated Gaia server. This relieves us from having to be a true mirror of the upstream data release. And since the source catalogue has lots and lots of columns that most users will not be using most of the time, we figured a “light” version of the source catalogue might fill an interesting ecological niche: Behold gaia.dr2light on the GAVO DC TAP service, containing essentially just the basic astrometric parameters and the diagonal of the covariance matrix.

That has two advantages: Result sets with SELECT * are a lot less unwieldy (but: just don’t do this with Gaia DR2), and, more importantly, a lighter table puts less load on the server. You see, conventional databases read entire rows when processing data, and having just 30% of the columns means we will be 3 times faster on I/O-bound tasks (assuming the same hardware, of course). Hence, and contrary to several other DR2-carrying sites, you can perform full sequential scans before timing out on our TAP service on gaia.dr2light. If, on the other hand, you need to do debugging or full-covariance-matrix error calculations: The full DR2 gaia_source table is available in many places in the VO. Just use the Registry.

Photometry via TAP

A piece of Gaia DR2 that’s not available in this form anywhere else is the lightcurves; that’s per-transit photometry in the G, BP, and RP bands for about 0.5 million objects that the reduction system classified as variable. ESAC publishes these through datalink from within their gaia_source table, and what you get back is a VOTable that has the photometry in the three bands interleaved.

I figured it might be useful if that data were available in a TAP-queriable table with lightcurves in the database. And that’s how gaia.dr2epochflux came into being. In there, you have three triples of arrays: the epochs (g_transit_time, bp_obs_time, and rp_obs_time), the fluxes (g_transit_flux, bp_flux, and rp_flux), and their errors (you can probably guess their names). So, to retrieve G lightcurves where available together with a gaia_source query of your liking, you could write something like

SELECT g.*, g_transit_time, g_transit_flux
FROM gaia.dr2light AS g
LEFT OUTER JOIN gaia.dr2epochflux
USING (source_id)
WHERE ...whatever...

– the LEFT OUTER JOIN arranges things such that the g_transit_time and g_transit_flux columns simply are NULL when there are no lightcurves; with a normal (“inner”) join, rows without lightcurves would not be returned in such a query.

To give you an idea of what you can do with this, suppose you would like to discover new variable blue supergiants in the Gaia data (who knows – you might discover the precursor of the next nearby supernova!). You could start with establishing color cuts and train your favourite machine learning device on light curves of variable blue supergiants. Here’s how to get (and, for simplicity, plot) time series of stars classified as blue supergiants by Simbad for which Gaia DR2 lightcurves are available, using pyvo and a little async trick:

from matplotlib import pyplot as plt
import pyvo

def main():
  simbad = pyvo.dal.TAPService(
    "http://simbad.u-strasbg.fr:80/simbad/sim-tap")
  gavodc = pyvo.dal.TAPService("http://dc.g-vo.org/tap")

  # Get blue supergiants from Simbad
  simjob = simbad.submit_job("""
    select main_id, ra, dec
    from basic
    where otype='BlueSG*'""")
  simjob.run()

  # Get lightcurves from Gaia
  try:
    simjob.wait()
    time_series = gavodc.run_sync("""
      SELECT b.*, bp_obs_time, bp_flux, rp_obs_time, rp_flux
      FROM (SELECT
          main_id, source_id, g.ra, g.dec
        FROM gaia.dr2light AS g
        JOIN TAP_UPLOAD.t1 AS tc
          ON (0.002>DISTANCE(tc.ra, tc.dec, g.ra, g.dec))
        OFFSET 0) AS b
      JOIN gaia.dr2epochflux
      USING (source_id)
      """, 
      uploads={"t1": simjob.result_uri})
  finally:
    simjob.delete()

  # Now plot one after the other
  for row in time_series.table:
    plt.plot(row["bp_obs_time"], row["bp_flux"])
    plt.plot(row["rp_obs_time"], row["rp_flux"])
    plt.show(block=False)
    raw_input("{}; press return for next...".format(row["main_id"]))
    plt.cla()

if __name__=="__main__":
  main()

If you bother to read the code, you’ll notice that we transfer the Simbad result directly to the GAVO data center without first downloading it. That’s fairly boring in this case, where the table is small. But if you have a narrow pipe for one reason or another and some 10⁵ rows, passing around async result URLs is a useful trick.

In this particular case the whole thing returns just four stars, so perhaps that’s not a terribly useful target for your learning machine. But this piece of code should get you started to where there’s more data.

You should read the column descriptions and footnotes in the query results (or from the reference URL) – this tells you how to interpret the times and how to make magnitudes from the fluxes if you must. You probably can’t hear it any more, but just in case: If you can, process fluxes rather than magnitudes from Gaia, because the errors are painful to interpret in magnitudes when the fluxes are small (try it!).
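To make that point concrete: a magnitude is a band-dependent zero point minus 2.5 times the decadic log of the flux, and simple error propagation shows the magnitude error blowing up as the flux SNR shrinks. A small illustration (the zero point below is a placeholder; take the real value from the Gaia documentation):

import numpy

def flux_to_mag(flux, flux_err, zeropoint):
    # mag = zp - 2.5 log10(flux); the error scales with flux_err/flux,
    # which is why low-SNR fluxes make for horrible magnitude errors
    mag = zeropoint - 2.5*numpy.log10(flux)
    mag_err = 2.5/numpy.log(10)*flux_err/flux
    return mag, mag_err

print(flux_to_mag(1000., 100., 25.0))  # SNR 10: a benign 0.11 mag error
print(flux_to_mag(100., 100., 25.0))   # SNR 1: a whopping 1.09 mag error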

Note how the photometry data is stored in arrays in the database, and that VOTables can just transport these. The bad news is that support for manipulating arrays in ADQL is pretty much zero at this point; this means that, when you have trained your ML device, you’ll probably have to still download lots and lots of light curves rather than write some elegant ADQL to do the filtering server-side. However, I’d be highly interested to work out how some tastefully chosen user defined functions might enable offloading at least a good deal of that analysis to the database. So – if you know what you’d like to do, by all means let me know. Perhaps there’s something I can do for you.

Incidentally, I’ll talk a bit more about ADQL arrays in a blog post coming up in a few weeks (I think). Don’t miss it; subscribe to our feed.

Datalink

In the results from queries involving gaia.dr2epochflux, we also provide datalinks. These let you retrieve lightcurves that already have mags and that are more easily plotted. Perhaps more importantly, they link back to the full ESAC lightcurves that, in addition, give you a lot more debug information and are required if you want to reliably identify photometry points with the identifiers of the transits that generated them.

Datalink support in clients still is not great, but it’s growing nicely. Your ideas for workflows that should be supported are (again) most welcome – and have a good chance of being adopted. So, try things out, for instance by getting the most recent TOPCAT (as of this writing) and do the following:

  1. Open the VO/TAP dialog from the menu bar and double click the GAVO DC TAP service.
  2. Enter
    SELECT source_id, ra, dec,
    phot_bp_mean_mag, phot_rp_mean_mag, phot_g_mean_mag,
    g_transit_time, g_transit_flux,
    rp_obs_time, rp_flux
    FROM gaia.dr2epochflux 
    JOIN gaia.dr2light
    USING (source_id)
    WHERE parallax>50
    

    into “ADQL” text to retrieve lightcurves for the more nearby variables (in reality, you’d have to be a bit more careful with the distances, but you already knew that).

  3. plot something like phot_bp_mean_mag-phot_rp_mean_mag vs. phot_g_mean_mag (and adapt the plot to fit your viewing habits).
  4. Open the dialog for Views/Activation Actions (from the menu bar or the tool bar – same thing), check “Invoke Service”, choose “View Datalink Table”.
  5. Whenever you click on a point in your CMD, a window will pop up in which you can choose between the time series in the various bands, and you can pull in the data from ESAC; to load a table, select “Load Table” from the actions near the foot of the datalink table and click “Invoke”.

Yeah. It’s clunky. Help us make it better with your fresh ideas for interfaces (and don’t be cross with us if we have to marry them with what’s technically feasible and readily generalised).
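For scripted access, pyvo can follow the datalinks, too. A rough sketch (iter_datalinks needs a reasonably recent pyvo; the semantics and access_url columns are from the datalink standard):

import pyvo

svc = pyvo.dal.TAPService("http://dc.g-vo.org/tap")
res = svc.run_sync("""
    SELECT TOP 3 source_id, ra, dec
    FROM gaia.dr2epochflux
    JOIN gaia.dr2light USING (source_id)
    WHERE parallax>50
    """)

# each result row can carry a datalink document; walk them all
for dl in res.iter_datalinks():
    for linkrow in dl:
        print(linkrow["semantics"], linkrow["access_url"])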

SSAP and Obscore

If you’re fed up with bleeding-edge tech, the light curves are also available through good old SSAP and Obscore. To use that, just get Splat (or another SSA client, preferably with a bit of time series support). Look for a Gaia DR2 time series service (you may have to update the service list before you find it), enter (in keeping with our LBV theme) S Dor as position and hit “Lookup” followed by “Send Query”. Just click on any result to just view the time series – and then apply Splat’s rich tool set to it.

Update (8.5.2018): Clusters

Here’s another quick application – how about looking for variable stars in clusters? This piece of ADQL should get you started:

SELECT TOP 100 
  source_id, ra, dec, parallax, g.pmra, g.pmdec,
  m.name, m.pmra AS c_pmra, m.pmde AS c_pmde, 
  m.e_pm AS c_e_pm,
  1/dist AS cluster_parallax
FROM 
  gaia.dr2epochflux
  JOIN gaia.dr2light AS g USING (source_id)
  JOIN mwsc.main AS m
  ON (1=CONTAINS(
    POINT(g.ra, g.dec),
    CIRCLE(m.raj2000, m.dej2000, rcluster)))
WHERE IN_UNIT(pmdec, 'deg/yr') BETWEEN m.pmde-m.e_pm*3 AND m.pmde+m.e_pm*3

– yes, you’ll want to constrain pmra, too, and the distance, and properly deal with errors and all. But you get simple lightcurves for free. Just add them in the SELECT clause!