I am trying to create a schema, but I am getting this error:
bin/pinot-admin.sh AddTable -tableConfigFile $PDATA_HOME table-config.json -schemaFile schema-config.json -controllerPort 9000 -exec

Executing command: AddTable -tableConfigFile table-config.json -schemaFile schema-config.json -controllerProtocol http -controllerHost 172.31.10.219 -controllerPort 9000 -exec
Sending request: http://172.31.10.219:9000/schemas to controller: localhost, version: Unknown
Got Exception to upload Pinot Schema: myschema
org.apache.pinot.common.exception.HttpErrorStatusException: Got error status code: 400 (Bad Request) with reason: "Cannot add invalid schema: myschema. Reason: null" while sending request: http://172.31.10.219:9000/schemas to controller: localhost, version: Unknown
    at org.apache.pinot.common.utils.FileUploadDownloadClient.sendRequest(FileUploadDownloadClient.java:397) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
Table Config:
{
  "tableName": "eventflow",
  "tableType": "REALTIME",
  "segmentsConfig": {
    "timeColumnName": "_source.sDate",
    "timeType": "MILLISECONDS",
    "schemaName": "myschema",
    "replicasPerPartition": "1"
  },
  "tenants": {},
  "tableIndexConfig": {
    "loadMode": "MMAP",
    "streamConfigs": {
      "streamType": "kafka",
      "stream.kafka.consumer.type": "lowlevel",
      "stream.kafka.topic.name": "mytopic",
      "stream.kafka.decoder.class.name": "org.apache.pinot.plugin.stream.kafka.KafkaJSONMessageDecoder",
      "stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
      "stream.kafka.broker.list": "localhost:9876",
      "realtime.segment.flush.threshold.time": "3600000",
      "realtime.segment.flush.threshold.size": "50000",
      "stream.kafka.consumer.prop.auto.offset.reset": "smallest"
    }
  },
  "metadata": {
    "customConfigs": {}
  }
}
And Schema Config:
{
  "schemaName": "myschema",
  "eventflow": [
    {
      "name": "_index",
      "dataType": "INT"
    },
    {
      "name": "_type",
      "dataType": "STRING"
    },
    {
      "name": "id",
      "dataType": "INT"
    }
  ],
  "dateTimeFieldSpecs": [
    {
      "name": "_source.sDate",
      "dataType": "STRING",
      "format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd HH:mm:ss",
      "granularity": "1:DAYS"
    }
  ]
}
Maybe defining the following could help. Add "defaultNullValue": "null" to the columns you believe may have actual null values, and perhaps the ingestion will come through OK: