Kernel-Based Moving Object Detection
[Figure: KBMOD query timing plot. We plot, as a function of the week of the eScience incubator project, the amount of time per-Trajectory to intersect the Trajectory and Image tables. The size of each point is proportional to the number of entries in the Trajectory table during the query.]
Project Lead: Andrew Becker, UW Astronomy
eScience Liaison: Daniel Halperin
With assistance from: Andrew Whitaker, Bill Howe
Kernel-Based Moving Object Detection (KBMOD) describes a new technique to discover faint moving objects in time-series imaging data. The essence of the technique is to filter each image with its own point-spread-function (PSF) and normalize by the image noise, yielding a likelihood image where the value of each pixel represents the likelihood that there is an underlying point source. We wish to search for objects that have low S/N in a single image (e.g. pixel value between 1 and 3), but that, when the signal is aggregated across the multiple images in which they appear, have a cumulative S/N significant enough to claim a detection (e.g. greater than 10). We consider the core functionality of KBMOD to be the process of running a detection kernel along putative moving-object trajectories and summing the likelihood values wherever a trajectory intersects a science image.
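The two steps above can be sketched in a few lines of NumPy. This is a minimal illustration, not the KBMOD implementation: it assumes uniform Gaussian noise per image (a real pipeline would use a per-pixel variance map) and linear trajectories, and all function names are our own.

```python
import numpy as np

def likelihood_image(image, psf, noise_sigma):
    """Correlate an image with its own PSF and normalize by the noise,
    giving a matched-filter likelihood (S/N) image.  Assumes uniform
    Gaussian noise of width `noise_sigma`."""
    ph, pw = psf.shape
    padded = np.pad(image, ((ph // 2, ph // 2), (pw // 2, pw // 2)))
    out = np.zeros_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.sum(padded[i:i + ph, j:j + pw] * psf)
    # Matched-filter noise is sigma * sqrt(sum(psf**2))
    return out / (noise_sigma * np.sqrt(np.sum(psf ** 2)))

def trajectory_snr(likelihoods, x0, y0, vx, vy, times):
    """Sum likelihood values along a linear trajectory; for a real
    source the combined S/N grows as sqrt(N_images)."""
    vals = []
    for like, t in zip(likelihoods, times):
        x = int(round(x0 + vx * t))
        y = int(round(y0 + vy * t))
        if 0 <= y < like.shape[0] and 0 <= x < like.shape[1]:
            vals.append(like[y, x])
    return np.sum(vals) / np.sqrt(len(vals))
```

For example, a source at S/N ≈ 2 in each of four images combines to S/N ≈ 4 along the correct trajectory, above a plausible detection threshold even though no single image shows a significant detection.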
The first step in this process, implemented during the Fall 2014 eScience Data Incubator project, was to examine a database-based solution for data access and the query implementation. PostgreSQL was chosen as the database, primarily because of its PostGIS spatial extension, which provides native spherical-geometry objects and queries. Since the package was originally designed to represent Earth-based geographic information, one minor detail is ensuring that geometric objects are represented on an ideal sphere (I believe this is the correct reference system: http://spatialreference.org/ref/epsg/3786/) rather than on Earth’s ellipsoid.
We started the project running a PostgreSQL database on an Amazon Relational Database Service (RDS) instance, but ran into two limitations: we could not log in to the machine (e.g. via ssh) to copy data locally for ingest, nor could we install C-language User Defined Functions (UDFs). The latter requirement arose from our desire to replicate, inside the database, the WCSLIB mapping of sky coordinates to image pixels, which is derived from metadata contained in the image headers. This necessitated installing the database on an Elastic Compute Cloud (EC2) instance where we had complete sysadmin control of the system.
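To illustrate the kind of mapping the C UDF needed to replicate, here is a simplified sky-to-pixel transform using the gnomonic (TAN) projection common in FITS headers. This is a sketch, not WCSLIB: it omits distortion terms, assumes a diagonal CD matrix with a single `scale` in degrees per pixel, and all argument names are illustrative.

```python
import math

def sky_to_pixel(ra, dec, crval1, crval2, crpix1, crpix2, scale):
    """Map sky coordinates (RA, Dec in degrees) to pixel coordinates
    via a gnomonic (TAN) projection about the reference point
    (crval1, crval2), reference pixel (crpix1, crpix2), and a uniform
    pixel scale in degrees/pixel.  No distortion terms are applied."""
    ra, dec = math.radians(ra), math.radians(dec)
    ra0, dec0 = math.radians(crval1), math.radians(crval2)
    # Denominator of the gnomonic projection (cosine of angular distance)
    d = (math.sin(dec0) * math.sin(dec)
         + math.cos(dec0) * math.cos(dec) * math.cos(ra - ra0))
    # Standard (tangent-plane) coordinates, in radians
    xi = math.cos(dec) * math.sin(ra - ra0) / d
    eta = (math.cos(dec0) * math.sin(dec)
           - math.sin(dec0) * math.cos(dec) * math.cos(ra - ra0)) / d
    return (crpix1 + math.degrees(xi) / scale,
            crpix2 + math.degrees(eta) / scale)
```

Implementing this (plus the full distortion model) as a C UDF lets the trajectory-image intersection query return pixel coordinates directly, without a round trip to external code.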
The bulk of the work during this incubator went into designing database tables, and queries on those tables, for the purpose of intersecting the space-time trajectories of moving objects with our imaging dataset. In shorthand, we wanted to determine which images a moving object intersected, at which sky coordinate within each image (in the 2-D sky plane defined by the Right Ascension and Declination coordinate system), and finally which x,y pixel that coordinate corresponds to. Three table versions were implemented, which can be reduced to a maximal and a minimal table design, described below.
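The shape of such a design can be sketched in SQL. This is an illustrative fragment only, with hypothetical table and column names; the actual incubator schemas differed, and the `geography` type here is the PostGIS spherical-geometry type discussed above.

```sql
-- Illustrative sketch (names hypothetical).  Each image stores its
-- on-sky footprint as a spherical polygon; each trajectory stores its
-- path over the survey window as a great-circle line string.
CREATE TABLE image (
    image_id   serial PRIMARY KEY,
    mjd_start  double precision,    -- exposure start time (MJD)
    mjd_end    double precision,
    footprint  geography(Polygon)   -- image boundary on the ideal sphere
);

CREATE TABLE trajectory (
    traj_id  serial PRIMARY KEY,
    path     geography(LineString)  -- RA/Dec positions over time
);

-- Which images does each trajectory cross?
SELECT t.traj_id, i.image_id
FROM trajectory t
JOIN image i
  ON ST_Intersects(t.path, i.footprint);
```

A spatial index (GiST) on `footprint` would be essential at scale; the timing plot above reflects exactly this kind of trajectory-image intersection query.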