LOB column support in CRDB

Hi All,

I am trying to port an Oracle application to crdb and want to understand whether crdb supports any data type that can hold data from Oracle LOB columns.

It depends on how big your LOBs are. CockroachDB has a limit of 64MB per row (total across all columns). As long as your LOBs fit within that limit, you can use CockroachDB’s BLOB and TEXT types (use TEXT in place of Oracle’s CLOB or NCLOB).
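For example (a minimal sketch; the documents table and its columns are made up), an Oracle table with CLOB and BLOB columns could map to:

create table documents (
    id INT PRIMARY KEY,
    body TEXT,        -- in place of Oracle CLOB/NCLOB (TEXT is an alias for STRING)
    attachment BLOB   -- in place of Oracle BLOB (BLOB is an alias for BYTES)
);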

1 Like

Thanks for the quick reply, it helps for sure. It’s also interesting to know that a total row size limit of 64MB is in place; I need to analyse my test case accordingly.

I have another question on my initial requirement: it looks like crdb doesn’t support stored procedures. Is there a workaround in case the application I am trying to port has a few SPs? Where can I get a list of which SQL features are and are not supported by crdb? (I do understand that crdb is better suited to new applications than to porting applications that use advanced SQL features.)

@apande28, you can check the SQL Feature Support page for a pretty comprehensive list of supported and unsupported features. Let us know if anything’s missing there. As for a workaround for the lack of support for stored procedures, I suspect @bdarnell will have an idea.

1 Like

Thanks @jesse, this page helps. I’ll look forward to expert comments on SPs.

There’s not really a workaround for the lack of stored procedures; you’ll just have to move your logic out of the database into your client applications. We’ll add stored procedures eventually, but it won’t be this year.
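For example, where an Oracle application would call a stored procedure, the client application instead issues the procedure body as an explicit transaction (a hypothetical sketch; the accounts table and the values are made up):

BEGIN;
UPDATE accounts SET balance = balance - 100 WHERE id = 1;
UPDATE accounts SET balance = balance + 100 WHERE id = 2;
COMMIT;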

1 Like

Thanks for your prompt reply, this information will help me narrow down my test case.

Is there a good way to migrate data from another RDBMS (non-Postgres) to crdb? Dumping the data and loading it into crdb doesn’t seem like a very good option either, because of the many data type limitations and compatibility issues.

For migrations, crdb lacks interconnectivity and also has a limit on the transaction size for inserts. The interconnectivity gap can be solved by placing a psql database between Oracle and the target crdb. The middle database can run oracle_fdw and postgres_fdw. With this it should be possible to copy the data as a select from the Oracle foreign table into the crdb foreign table, and here the transaction size limit comes in: you will need to copy the data in sets of rows, not with a simple insert into crdb.X select * from oracle.X. For really big data sets that would be required anyway, to be able to copy the data in parallel. A sketch of the setup follows below.
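Roughly like this on the intermediate psql instance (a sketch only: the server names, hosts, credentials, and table layout are all placeholders):

create extension oracle_fdw;
create extension postgres_fdw;

create server ora_srv foreign data wrapper oracle_fdw
    options (dbserver '//oracle-host:1521/ORCL');
create user mapping for current_user server ora_srv
    options (user 'scott', password 'tiger');
create foreign table ora_x (id integer, payload text)
    server ora_srv options (schema 'SCOTT', table 'X');

create server crdb_srv foreign data wrapper postgres_fdw
    options (host 'crdb-host', port '26257', dbname 'testdb');
create user mapping for current_user server crdb_srv
    options (user 'root');
create foreign table crdb_x (id integer, payload text)
    server crdb_srv options (table_name 'x');

-- copy in bounded key ranges rather than one big insert ... select
insert into crdb_x select * from ora_x where id between 1 and 50000;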

1 Like

Hi @ik_zelf, thanks for the great input and workaround. Putting a psql database in the middle will further increase the complexity of the migration, but if that is the only way, let me give it a try :slight_smile:

By the way, what is the defined value of the transaction size limit in crdb? Another piece of information I got in this same discussion is that the size of a row cannot exceed 64 MB; that might be another issue, as I have LOB columns in my Oracle schema.

A few of these limitations are documented in the 1.0 Release notes and known limitations pages. A simple test like
create database testdb;
create table testdb.z as select * from information_schema.tables;
followed by a few repetitions of
insert into testdb.z select * from testdb.z;

will eventually end in an error:
root@192.168.56.101:26257/> INSERT INTO testdb.z SELECT * FROM testdb.z;
INSERT 51200
root@192.168.56.101:26257/> INSERT INTO testdb.z SELECT * FROM testdb.z;
pq: kv/txn_coord_sender.go:428: transaction is too large to commit: 102400 intents
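One way around that error (a sketch, assuming the table being copied has a numeric key column id to batch on; the source and target names are placeholders) is to bound each INSERT ... SELECT so a single transaction never accumulates too many intents:

INSERT INTO testdb.target SELECT * FROM testdb.source WHERE id >= 1 AND id <= 50000;
INSERT INTO testdb.target SELECT * FROM testdb.source WHERE id > 50000 AND id <= 100000;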

1 Like

Hi @ik_zelf, the only way I found to load the bulk data (while not using the pg_dump approach, as I am trying to port from Oracle to crdb) is:

cockroach sql --insecure --host= --port= --database=<db_name> < load.sql

Here load.sql contains INSERT statements with multiple rows each (an example of the format is sketched below). The process works fine for fewer records (say 500 or 1K) per INSERT statement, but as soon as the record count increases, the load time increases exponentially. This is expected, as the crdb docs say that for optimal performance we should load no more than 500 records in a single INSERT command. What would be a better or faster way to load the data? Suggestions please?
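For reference, the batched format in load.sql looks something like this (mytable and its columns are made up; in practice each INSERT would carry up to ~500 rows):

create table if not exists mytable (id INT PRIMARY KEY, val STRING);
insert into mytable (id, val) values (1, 'a'), (2, 'b'), (3, 'c');
insert into mytable (id, val) values (4, 'd'), (5, 'e'), (6, 'f');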

I am also just learning crdb … :wink:
Did you get the fdw idea working as I suggested earlier? It would allow you to port your app to psql and keep using your Oracle db, addressed from psql via oracle_fdw, and then incrementally move your data to crdb, addressed via postgres_fdw.
I have yet to change my current migration code to handle the transaction limits that crdb imposes, so I have not tested real bulk work in crdb yet.

@apande28, how are you generating the INSERT statements? Your best bet is to configure your dump tool to output INSERTs in batches of 500 or so. E.g., if you’re using something like Ora2Pg (disclaimer: I’ve never tried this tool), you should be able to set the DATA_LIMIT parameter to limit the number of tuples in each INSERT.
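Something like this in ora2pg.conf, I believe (unverified, as noted; treat it as a sketch):

# export data as INSERT statements, batched at 500 tuples per statement
TYPE        INSERT
DATA_LIMIT  500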

If that’s not possible, you could also write a script to transform your SQL file to split the one large INSERT statement into multiple smaller chunks.