Transform input data
This tutorial demonstrates how to transform input data during ingestion.
Prerequisites
Before proceeding, download Apache Druid® as described in Quickstart (local) and have it running on your local machine. You don't need to load any data into the Druid cluster.
You should be familiar with data querying in Druid. If you haven't already, go through the Query data tutorial first.
Sample data
For this tutorial, you use the following sample data:
{"timestamp":"2018-01-01T07:01:35Z", "animal":"octopus", "location":1, "number":100}
{"timestamp":"2018-01-01T05:01:35Z", "animal":"mongoose", "location":2,"number":200}
{"timestamp":"2018-01-01T06:01:35Z", "animal":"snake", "location":3, "number":300}
{"timestamp":"2018-01-01T01:01:35Z", "animal":"lion", "location":4, "number":300}
Transform data during ingestion
Load the sample dataset using the `INSERT INTO` statement and the `EXTERN` function to ingest the data inline. In the Druid web console, go to the Query view and run the following query:
INSERT INTO "transform_tutorial"
WITH "ext" AS (
SELECT *
FROM TABLE(EXTERN('{"type":"inline","data":"{\"timestamp\":\"2018-01-01T07:01:35Z\",\"animal\":\"octopus\", \"location\":1, \"number\":100}\n{\"timestamp\":\"2018-01-01T05:01:35Z\",\"animal\":\"mongoose\", \"location\":2,\"number\":200}\n{\"timestamp\":\"2018-01-01T06:01:35Z\",\"animal\":\"snake\", \"location\":3, \"number\":300}\n{\"timestamp\":\"2018-01-01T01:01:35Z\",\"animal\":\"lion\", \"location\":4, \"number\":300}"}', '{"type":"json"}')) EXTEND ("timestamp" VARCHAR, "animal" VARCHAR, "location" BIGINT, "number" BIGINT)
)
SELECT
TIME_PARSE("timestamp") AS "__time",
TEXTCAT('super-', "animal") AS "animal",
"location",
"number",
"number" * 3 AS "triple-number"
FROM "ext"
WHERE (TEXTCAT('super-', "animal") = 'super-mongoose' OR "location" = 3 OR "number" = 100)
PARTITIONED BY DAY
In the `SELECT` clause, you specify the following transformations:
- `animal`: prepends "super-" to the values in the `animal` column using the `TEXTCAT` function. Note that Druid ingests only the transformed data.
- `triple-number`: multiplies the `number` column by three and stores the results in a column named `triple-number`. Note that the query ingests both the original and the transformed data.
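The two column transformations above can be sketched as plain functions outside of Druid; this is a minimal Python sketch of the same logic (the function names are illustrative, not part of any Druid API):

```python
def transform_animal(animal: str) -> str:
    """Mimics TEXTCAT('super-', "animal"): simple string concatenation."""
    return "super-" + animal

def transform_triple_number(number: int) -> int:
    """Mimics the expression "number" * 3 from the SELECT clause."""
    return number * 3

print(transform_animal("mongoose"))    # super-mongoose
print(transform_triple_number(200))    # 600
```

Note that, as in the SQL query, `transform_animal` replaces the original value while `transform_triple_number` produces a new column alongside the original `number`.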
Additionally, the `WHERE` clause applies three OR conditions so that the query only ingests the rows where at least one of the following is true:
- `TEXTCAT('super-', "animal")` matches 'super-mongoose'
- `location` matches 3
- `number` matches 100
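As a sanity check, the three OR conditions can be expressed as a single predicate over a row; this Python sketch mirrors the `WHERE` clause (the helper name `row_passes_filter` is illustrative):

```python
def row_passes_filter(row: dict) -> bool:
    """A row is kept when at least one of the three OR conditions is true."""
    return (
        "super-" + row["animal"] == "super-mongoose"  # TEXTCAT('super-', "animal") match
        or row["location"] == 3
        or row["number"] == 100
    )

print(row_passes_filter({"animal": "mongoose", "location": 2, "number": 200}))  # True
print(row_passes_filter({"animal": "lion", "location": 4, "number": 300}))      # False
```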
Once a row passes the filter, the ingestion job applies the transformations. In this example, the filter selects the first three rows because each meets at least one of the OR conditions. For those rows, the ingestion job ingests the transformed `animal` column, the `location` column, and both the original `number` and the transformed `triple-number` columns. The "lion" row doesn't meet any of the conditions, so it is neither transformed nor ingested.
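Putting the filter and the transformations together, the ingestion-time behavior can be simulated over the four sample rows; a Python sketch under the assumption that plain dictionaries stand in for the external JSON data:

```python
rows = [
    {"timestamp": "2018-01-01T07:01:35Z", "animal": "octopus",  "location": 1, "number": 100},
    {"timestamp": "2018-01-01T05:01:35Z", "animal": "mongoose", "location": 2, "number": 200},
    {"timestamp": "2018-01-01T06:01:35Z", "animal": "snake",    "location": 3, "number": 300},
    {"timestamp": "2018-01-01T01:01:35Z", "animal": "lion",     "location": 4, "number": 300},
]

ingested = [
    {
        "__time": row["timestamp"],
        "animal": "super-" + row["animal"],   # TEXTCAT('super-', "animal")
        "location": row["location"],
        "number": row["number"],
        "triple-number": row["number"] * 3,   # "number" * 3
    }
    for row in rows
    # WHERE clause: keep the row only if at least one OR condition holds.
    if ("super-" + row["animal"] == "super-mongoose"
        or row["location"] == 3
        or row["number"] == 100)
]

# The "lion" row fails every condition, so it is dropped before transformation.
for row in ingested:
    print(row["animal"], row["triple-number"])
```

Three rows survive the filter, each with the transformed `animal` value and the new `triple-number` column, matching the query results shown in the next section.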
Query the transformed data
In the web console, open a new tab in the Query view. Run the following query to view the ingested data:
SELECT * FROM "transform_tutorial"
The query returns the following results:
| __time | animal | location | number | triple-number |
|---|---|---|---|---|
| 2018-01-01T05:01:35.000Z | super-mongoose | 2 | 200 | 600 |
| 2018-01-01T06:01:35.000Z | super-snake | 3 | 300 | 900 |
| 2018-01-01T07:01:35.000Z | super-octopus | 1 | 100 | 300 |
Notice that the "lion" row is missing and that the three ingested rows have the transformations applied to them.
Learn more
See the following topics for more information:
- All functions for a list of functions that can be used to transform data.
- Transform spec reference to learn more about transforms in JSON-based batch ingestion.
- WHERE clause to learn how to specify filters in Druid SQL.