Convert unix timestamp to avro and store it in BigQuery
Avro schema:
{
    "name": "Entity",
    "type": "record",
    "namespace": "com.foobar.entity",
    "fields": [
        {
            "name": "attribute",
            "type": "string"
        },
        {
            "name": "value",
            "type": "int"
        },
        {
            "name": "timestamp",
            "type": { "type": "long", "logicalType": "timestamp-micros" }
        }
    ]
}
The source timestamps are in Unix format with millisecond precision. When I put such records into BigQuery, the data preview shows values like 1970-01-19 01:18:19.415 UTC, but the value I stored is 1559899418, which is Friday, 7 June 2019 09:23:38. Any ideas?
Reference: https://cloud.google.com/bigquery/docs/loading-data-cloud-storage-avro#logical_types
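For context, here is a minimal sketch of how such a record could be produced. This is not code from the original post; it uses the fastavro library and made-up field values:

import io
from fastavro import parse_schema, writer

schema = parse_schema({
    "name": "Entity",
    "type": "record",
    "namespace": "com.foobar.entity",
    "fields": [
        {"name": "attribute", "type": "string"},
        {"name": "value", "type": "int"},
        {"name": "timestamp",
         "type": {"type": "long", "logicalType": "timestamp-micros"}},
    ],
})

# A plain int is written through as-is: the writer has no way of knowing
# whether it holds seconds, milliseconds, or microseconds. Whatever unit
# the source uses is what BigQuery will later interpret as microseconds.
record = {"attribute": "example", "value": 1, "timestamp": 1559899418}

buf = io.BytesIO()
writer(buf, schema, [record])

With fastavro at least, the logical type only documents the intended unit; no scaling is applied to a raw long on the way in.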
Your timestamps are off by a factor of 1000. 1559899418 does correspond to Friday, 7 June 2019 09:23:38, but that is a second-precision value (a Unix timestamp), not milliseconds. And 1559899 (one thousandth of 1559899418) does indeed correspond to 1970-01-19 01:18:19.
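A quick way to check that arithmetic with Python's standard library. The numbers are taken from the post; the scaling at the end is an assumed fix, not code from the answer:

from datetime import datetime, timezone

stored = 1559899418  # the long from the question

# Read as seconds, the stored long is the instant the poster intended:
print(datetime.fromtimestamp(stored, tz=timezone.utc))
# -> 2019-06-07 09:23:38+00:00

# Read at 1000x finer precision (divide by 1000 before treating it as
# seconds), it lands just after the epoch, matching the BigQuery preview:
print(datetime.fromtimestamp(stored / 1000, tz=timezone.utc))
# -> 1970-01-19 01:18:19.418000+00:00

# Assumed fix: scale the source value to the unit the column's logical
# type declares before writing. A timestamp-micros field expects
# microseconds since the epoch:
unix_millis = 1559899418000          # hypothetical millisecond source
micros = unix_millis * 1000          # milliseconds -> microseconds
micros_from_secs = stored * 1_000_000  # seconds -> microseconds

In other words, the stored long has to be converted to the unit that the column's logical type declares before the file is loaded.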