I am creating a river using the JSON string below. The river fetches records from the database using the $river.state.timestamp value.
{
  "type": "jdbc",
  "jdbc": {
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    "url": "jdbc:sqlserver://sql2008r2;databaseName=DBName",
    "user": "user",
    "password": "password",
    "sql": [
      {
        "statement": "select * from dbo.rivertest where timestamp < (?)",
        "parameter": [
          "$river.state.timestamp"
        ]
      }
    ],
    "index": "readIndex",
    "type": "read_type",
    "autocommit": true,
    "schedule": "0 0/1 * 1/1 * ? *"
  }
}
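If it matters, one variant I am considering is forcing the conversion on the SQL Server side, so the comparison no longer depends on how the plugin binds $river.state.timestamp (an untested sketch; it assumes the plugin sends the value as a character string):

-- Variant of the "statement" above with an explicit cast of the bound
-- parameter; assumes the value arrives as a character string.
select * from dbo.rivertest where timestamp < cast(? as datetime)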
I am using the same JSON on two different machines. The river executes fine on one server, but on my local machine it throws this exception:
[2014-12-02 22:28:01,949][ERROR][river.jdbc.RiverPipeline ] com.microsoft.sqlserver.jdbc.SQLServerException: Conversion failed when converting date and/or time from character string.
java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException: Conversion failed when converting date and/or time from character string.
at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverSource.fetch(SimpleRiverSource.java:341)
at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverFlow.fetch(SimpleRiverFlow.java:209)
at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverFlow.execute(SimpleRiverFlow.java:139)
at org.xbib.elasticsearch.plugin.jdbc.RiverPipeline.request(RiverPipeline.java:88)
at org.xbib.elasticsearch.plugin.jdbc.RiverPipeline.call(RiverPipeline.java:66)
at org.xbib.elasticsearch.plugin.jdbc.RiverPipeline.call(RiverPipeline.java:30)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Conversion failed when converting date and/or time from character string.
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:216)
at com.microsoft.sqlserver.jdbc.TDSTokenHandler.onEOF(tdsparser.java:254)
at com.microsoft.sqlserver.jdbc.TDSParser.parse(tdsparser.java:84)
at com.microsoft.sqlserver.jdbc.SQLServerResultSet.<init>(SQLServerResultSet.java:311)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1526)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:404)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:350)
at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:5696)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1715)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:180)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:155)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeQuery(SQLServerPreparedStatement.java:285)
at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverSource.executeQuery(SimpleRiverSource.java:648)
at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverSource.executeWithParameter(SimpleRiverSource.java:419)
at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverSource.fetch(SimpleRiverSource.java:317)
... 9 more
[2014-12-02 22:28:01,952][INFO ][river.jdbc.RiverMetrics ] pipeline org.xbib.elasticsearch.plugin.jdbc.RiverPipeline@7c70ae7a complete: river jdbc/read_river metrics: 0 rows, 0.0 mean, (0.0 0.0 0.0), ingest metrics: elapsed 1 second, 0.0 bytes bytes, 0.0 bytes avg, 0 MB/s
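For what it's worth, the same error can be reproduced directly in SQL Server by comparing the column against a string that datetime cannot parse, which makes me suspect the two plugin versions serialize $river.state.timestamp differently (a hypothetical check; I have not confirmed the exact string the plugin binds):

-- Raises "Conversion failed when converting date and/or time from character
-- string": the datetime type does not accept a trailing time-zone designator.
select * from dbo.rivertest where timestamp < '2014-12-02T22:28:01.949Z'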
Meanwhile, the river fetches records from the DB when executed on the server. The JSON is the same and the JDBC driver is the same; both rivers are using the same SQL Server instance and the same table.
Server: ES version 1.1.1, JDBC river plugin version 1.1.0.0
My machine: ES version 1.3.4, JDBC river plugin version 1.3.4.4
Is this related to the ES/JDBC river plugin version? Any help would be appreciated. Thanks.