I’ve seen many threads about this error, but the solutions I’ve found don’t seem to be applicable in my case.
I’ve received a rather large (~150 GB) dump file from an Oracle database.
I converted it to a MySQL one using OraDump. However, when I try to import it into my MySQL server, I get the infamous error:
ERROR 1118 (42000) at line 162936: Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535.
This includes storage overhead, check the manual. You have to change some columns to TEXT or BLOBs.
I tried increasing the innodb_log_file_size parameter, removing strict mode, and switching from InnoDB to MyISAM; nothing worked.
In my last attempt, I added the -f (force) flag to the import, hoping to just squeeze past the error, but now the import just seems stuck.
I don’t think I can change the table schemas, since they are created within the 150 GB dump file, and I don’t even know which tables/columns are at fault.
Is there any way around it?
EDIT: I managed to find the table responsible for the error, and found that it happens when I try to declare it:
#
# Table structure for table 'F_TABLE_EXAMPLE'
#
DROP TABLE IF EXISTS `F_TABLE_EXAMPLE`;
CREATE TABLE `F_TABLE_EXAMPLE` (
`COL_1` BIGINT,
`COL_2` VARCHAR(10) CHARACTER SET utf8,
`COL_3` BIGINT,
`COL_4` BIGINT,
`COL_5` DECIMAL(16,2),
`COL_6` DECIMAL(16,2),
`COL_7` VARCHAR(5) CHARACTER SET utf8,
`COL_8` DATETIME,
`COL_9` VARCHAR(50) CHARACTER SET utf8,
`COL_10` VARCHAR(4000) CHARACTER SET utf8,
`COL_11` VARCHAR(4000) CHARACTER SET utf8,
`COL_12` VARCHAR(4000) CHARACTER SET utf8,
`COL_13` VARCHAR(4000) CHARACTER SET utf8,
`COL_14` VARCHAR(4000) CHARACTER SET utf8,
`COL_15` VARCHAR(4000) CHARACTER SET utf8
) ENGINE=InnoDB;
If I remove COL_15 there’s no error, but with it included I get the usual error. (I cut the declaration off at COL_15 since that’s where the error starts; the real table has a bunch more columns.)
2 Answers
The way to figure out which table is at fault is to dump and restore one table at a time. Then, if necessary, dump and restore smaller subsets of rows from the table that causes the error. Keep subdividing until you have narrowed it down to the culprit.
This process might take several tries.
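The natural tool for this is mysqldump, one table at a time (it accepts a --where option for row subsets). As a SQL-only sketch of the same idea, assuming the server’s secure_file_priv setting allows it, you can round-trip one table’s rows through a flat file; the path below is purely illustrative:

-- dump one table's rows to a tab-separated file on the server host
SELECT * FROM `F_TABLE_EXAMPLE`
INTO OUTFILE '/tmp/f_table_example.tsv';   -- illustrative path

-- later, restore those rows into the (re-created) table
LOAD DATA INFILE '/tmp/f_table_example.tsv'
INTO TABLE `F_TABLE_EXAMPLE`;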
Don’t change the storage engine. This particular error is about MySQL in general, not any specific storage engine.
This error has nothing to do with innodb_log_file_size. (Once you get past this error, you might hit a subsequent error about innodb_log_file_size; it must be at least 10x the size of the largest individual BLOB or TEXT value.)
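For what it’s worth, you can check that setting from any client; note that innodb_log_file_size is not a dynamic variable, so changing it means editing the config file and restarting the server:

-- read-only check; the value is in bytes
SHOW VARIABLES LIKE 'innodb_log_file_size';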
Don’t try to squeeze past the error by "forcing" it or disabling strict mode. If it did allow this, it would just corrupt your data.
The error message tells you exactly what you need to do: "You have to change some columns to TEXT or BLOBs." This means changing the table definition in MySQL; you probably don’t have to change anything in Oracle.
If you can’t change the table definition, then sorry, you can’t import this data into MySQL.
If you want any suggestion for how to fix this, then please post the DDL for the original table from Oracle, and the DDL for the table you’re trying to import into in MySQL. Please use DBMS_METADATA.GET_DDL() in Oracle and SHOW CREATE TABLE in MySQL to get the DDL in text form, not a screenshot image.
I see from your table definition that you have six columns of VARCHAR(4000) CHARACTER SET utf8. The potential width of these columns alone is 72000 bytes, because MySQL’s utf8 character set may store up to 3 bytes per character. These columns therefore exceed the row size limit. When I test it, I get the same error you did as I try to create the table.
If I change each of these six VARCHAR(4000) columns to TEXT in my test, the table is created successfully. This is what the error message suggested to do.
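Concretely, here is a sketch of the corrected definition, keeping the posted columns as-is and assuming the columns truncated after COL_15 get the same treatment:

DROP TABLE IF EXISTS `F_TABLE_EXAMPLE`;
CREATE TABLE `F_TABLE_EXAMPLE` (
  `COL_1` BIGINT,
  `COL_2` VARCHAR(10) CHARACTER SET utf8,
  `COL_3` BIGINT,
  `COL_4` BIGINT,
  `COL_5` DECIMAL(16,2),
  `COL_6` DECIMAL(16,2),
  `COL_7` VARCHAR(5) CHARACTER SET utf8,
  `COL_8` DATETIME,
  `COL_9` VARCHAR(50) CHARACTER SET utf8,
  -- TEXT values are stored mostly off-page; only a small pointer counts
  -- toward the 65535-byte row limit, unlike VARCHAR(4000)'s 12000 bytes
  `COL_10` TEXT CHARACTER SET utf8,
  `COL_11` TEXT CHARACTER SET utf8,
  `COL_12` TEXT CHARACTER SET utf8,
  `COL_13` TEXT CHARACTER SET utf8,
  `COL_14` TEXT CHARACTER SET utf8,
  `COL_15` TEXT CHARACTER SET utf8
) ENGINE=InnoDB;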
I would also recommend using character set utf8mb4 in all cases. This is the default in MySQL 8.0, and it supports the full range of Unicode characters, including 4-byte ones.
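A short sketch of what that looks like; the table name here is hypothetical, and the two columns are borrowed from the posted definition:

CREATE TABLE `F_TABLE_EXAMPLE_MB4` (           -- hypothetical name
  `COL_9`  VARCHAR(50) CHARACTER SET utf8mb4,  -- explicit per-column charset
  `COL_10` TEXT                                -- inherits the table default
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;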
Seeing the CREATE TABLE for the offending table may suffice for figuring out what to do next. If the load brought in some, but not all, of the tables, can you figure out which table was "last" (and not fully loaded) or "next" to be loaded?
The error talks about "row size"; the advice about BLOB, etc., may not be that relevant here, since each column already has its own datatype. In particular, changing columns in an ENGINE=InnoDB table from VARCHAR to TEXT will not help. The problem may instead be having too many columns and/or a "row format" that is less forgiving of lots of columns.
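If the row format turns out to be the issue, it can be set explicitly in the DDL. A minimal sketch, trimmed to two of the posted columns for brevity:

CREATE TABLE `F_TABLE_EXAMPLE` (
  `COL_1`  BIGINT,
  `COL_10` TEXT CHARACTER SET utf8
) ENGINE=InnoDB ROW_FORMAT=DYNAMIC;
-- DYNAMIC stores long variable-length values fully off-page, leaving only
-- a 20-byte pointer in the row; the older COMPACT format keeps a 768-byte
-- prefix in-page, which is much less forgiving of many wide columns.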
More on limits: http://mysql.rjweb.org/doc.php/limits
More
It appears that the row at "line 162936" is bigger than most. For loading large rows…
Check the current values of net_buffer_length and max_allowed_packet, then increase net_buffer_length and max_allowed_packet in the config file (or when invoking the server).
Change VARCHAR(4000) to TEXT for each of those big columns. The functionality will be the same, while avoiding the error. (Technically, incoming text will no longer be limited to 4000 characters, but instead limited to 64K bytes.)
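A hedged sketch of checking and raising both on a running server (SET GLOBAL affects new connections only and is lost on restart, so mirror any change in the config file; the sizes below are illustrative):

SHOW VARIABLES LIKE 'max_allowed_packet';
SHOW VARIABLES LIKE 'net_buffer_length';

SET GLOBAL max_allowed_packet = 1073741824;  -- 1 GB, the documented maximum
SET GLOBAL net_buffer_length  = 1048576;     -- 1 MB, the documented maximum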
Since you have a big dump file, very few editors can handle files that big, and the dump itself re-creates the table, I don’t see an easy way to make the edit. If you could rebuild the dump without the table creations, you could create the tables manually before the load.
The warnings suggest that you might want to use utf8mb4, which lets you store the rest of Chinese, plus Emoji.