I have a Percona MySQL server and a Java client with a custom ORM. In the DB I have this table:
CREATE TABLE `PlayerSecret` (
`id` bigint(20) NOT NULL AUTO_INCREMENT,
`created` timestamp(6) NOT NULL DEFAULT CURRENT_TIMESTAMP(6),
`secret` binary(16) NOT NULL,
`player_id` bigint(20) NOT NULL,
PRIMARY KEY (`id`),
UNIQUE KEY `PlayerSecret_secret_unique` (`secret`),
KEY `PlayerSecret_player_id` (`player_id`)
) ENGINE=InnoDB AUTO_INCREMENT=141 DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci
I found that the query SELECT PlayerSecret.player_id FROM PlayerSecret WHERE PlayerSecret.secret = ?
returns an empty result set when the parameter is provided via the java.sql.PreparedStatement#setBytes
method, but works as expected through java.sql.PreparedStatement#setBinaryStream.
I've enabled the MySQL general log and found that both queries look identical there; I checked this in hex mode.
In the general log it looks like:
SELECT PlayerSecret.player_id FROM PlayerSecret WHERE PlayerSecret.secret = '<96>R\Ø8üõA\í¤Z´^E\Ô\ÊÁ\Ö'
Query parameter from general log in hex mode:
2796 525c d838 fcf5 415c eda4 5ab4 055c d45c cac1 5cd6 27
Value in database:
mysql> select hex(secret) from PlayerSecret where id=109;
+----------------------------------+
| hex(secret)                      |
+----------------------------------+
| 9652D838FCF541EDA45AB405D4CAC1D6 |
+----------------------------------+
1 row in set (0.00 sec)
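For what it's worth, the two hex dumps can be reconciled mechanically. Below is a small sketch that does this; it assumes, from this one log line only, that the logged value is wrapped in 0x27 quote bytes and that each 0x5c escapes the byte after it. It is not a general MySQL unescaper:

```java
import java.io.ByteArrayOutputStream;

public class LogUnescape {
    // Turn a hex string like "2796525c..." into raw bytes.
    static byte[] fromHex(String hex) {
        byte[] out = new byte[hex.length() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        return out;
    }

    static String toHex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    // Strip the surrounding 0x27 quote bytes and drop each 0x5c escape,
    // keeping the byte that follows it. (Assumption about how this one
    // general-log line was rendered, not a general unescaper.)
    static byte[] unescape(byte[] logged) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (int i = 1; i < logged.length - 1; i++) {
            if (logged[i] == 0x5c) i++; // skip the escape byte, keep the next one
            out.write(logged[i]);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) {
        // The parameter bytes exactly as they appear in the general log.
        byte[] logged = fromHex("2796525cd838fcf5415ceda45ab4055cd45ccac15cd627");
        System.out.println(toHex(unescape(logged))); // prints 9652d838fcf541eda45ab405d4cac1d6
    }
}
```

Unescaped this way, the logged parameter matches the stored value byte for byte, which supports the claim that both variants send the same literal to the server.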
The problem is that my ORM issues this query through the setBytes
method, which I believe is the right way to bind a BINARY
value, but it doesn't work.
Part of my my.cnf
with the encoding settings (maybe it matters):
[client]
default-character-set = utf8mb4
[mysql]
default-character-set = utf8mb4
[mysqld]
general_log = on
general_log_file=/var/log/mysql/mysqld_general.log
require_secure_transport = ON
init-connect = SET collation_connection = utf8mb4_unicode_ci
character-set-server = utf8mb4
collation-server = utf8mb4_unicode_ci
Java code:
var uuid = UUID.fromString("9652d838-fcf5-41ed-a45a-b405d4cac1d6");
var array = ByteBuffer.allocate(16).putLong(uuid.getMostSignificantBits()).putLong(uuid.getLeastSignificantBits()).array();
// works
stmt.setBinaryStream(index, new ByteArrayInputStream(array));
// doesn't work
stmt.setBytes(index, array);
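As a sanity check that the array itself is correct, a reverse helper (my own sketch, not part of the question's code) can read the 16 bytes back into a UUID, which rules out the array construction as the culprit:

```java
import java.nio.ByteBuffer;
import java.util.UUID;

public class UuidBytes {
    // UUID -> big-endian 16-byte array, same approach as in the question.
    static byte[] toBytes(UUID uuid) {
        return ByteBuffer.allocate(16)
                .putLong(uuid.getMostSignificantBits())
                .putLong(uuid.getLeastSignificantBits())
                .array();
    }

    // Reverse helper: 16 bytes -> UUID.
    static UUID fromBytes(byte[] bytes) {
        ByteBuffer buf = ByteBuffer.wrap(bytes);
        return new UUID(buf.getLong(), buf.getLong());
    }

    public static void main(String[] args) {
        UUID uuid = UUID.fromString("9652d838-fcf5-41ed-a45a-b405d4cac1d6");
        System.out.println(fromBytes(toBytes(uuid))); // prints 9652d838-fcf5-41ed-a45a-b405d4cac1d6
    }
}
```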
I can't understand what the difference between the two cases is, or how to fix this for the setBytes
variant.
Can someone clarify this or point me to the important parts/places?
My env:
- OpenJDK 11
- HikariCP 3.1.0
- MySQL Connector/J 8.0.13
- Percona 5.7.24-26-log Percona Server (GPL), Release '26', Revision 'c8fe767'
Finally I've figured it out. The problem was
character_set_client=utf8
instead of utf8mb4.
A query comparing the expected character-set values with the actual per-thread values made the mismatch easy to spot (I found it very handy).
When I replaced
init-connect = SET collation_connection = utf8mb4_unicode_ci
with
init_connect = 'SET NAMES utf8mb4 COLLATE utf8mb4_unicode_ci'
in my.cnf, the problem disappeared and queries through setBytes
began to work as expected.
Why does it work for setBinaryStream
but not for setBytes?
In the first case the driver goes through com.mysql.cj.ServerPreparedQueryBindings#setBinaryStream(int, java.io.InputStream, int).
The important part there is the binding.resetToType(...) call with MysqlType.FIELD_TYPE_BLOB: the driver tells MySQL that this data is a BLOB.
In the second case, com.mysql.cj.ServerPreparedQueryBindings#setBytes(int, byte[])
uses MysqlType.FIELD_TYPE_VAR_STRING,
which means the value is treated not as plain bytes but as a string in some encoding (with some collation). I really don't know why the driver chooses this type for byte data; that question stays open for me.
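To illustrate the mechanism, here is my own sketch in plain Java, not the driver's actual code path: once raw bytes are pushed through a character decode/encode cycle, which is effectively what happens to a string-typed parameter under the wrong character set, invalid sequences get replaced and the value no longer matches the stored BINARY column:

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class BinaryAsString {
    // The 16 secret bytes from the question.
    static final byte[] SECRET = new byte[] {
        (byte) 0x96, 0x52, (byte) 0xd8, 0x38, (byte) 0xfc, (byte) 0xf5, 0x41, (byte) 0xed,
        (byte) 0xa4, 0x5a, (byte) 0xb4, 0x05, (byte) 0xd4, (byte) 0xca, (byte) 0xc1, (byte) 0xd6
    };

    // Simulate treating raw bytes as text in a character set:
    // decode to a String, then encode back to bytes.
    static byte[] roundTripUtf8(byte[] raw) {
        return new String(raw, StandardCharsets.UTF_8).getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // 0x96 etc. are not valid UTF-8, so they get replaced by U+FFFD
        // during decoding, and the result no longer equals the original.
        System.out.println(Arrays.equals(SECRET, roundTripUtf8(SECRET))); // prints false
    }
}
```

This is why the BLOB typing used by setBinaryStream is safe (no character interpretation at all), while a VAR_STRING typing only works when the connection character set can pass the bytes through unchanged.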