SQLAlchemy + ORA-01461

Tags: python

Python, SQLAlchemy and “ORA-01461: can bind a LONG value only for insert into a LONG column”

If you’re on this page you either read all of my entries regardless of content (thank you!) or you have the error above.

Look through just about every ORM and you’ll find special case handling for Oracle’s large objects. If there’s no special case handling in the ORM itself, then you will write some special case code yourself.

SQLAlchemy’s special case handling of Oracle’s large objects is to automatically load the large objects when retrieving rows from the database and convert them to strings. This means the database results object isn’t tethered to the database cursor while you work. It’s pretty handy, and if you’re just pulling data from the database it means you may never need to think about how weird Oracle large objects are compared to “regular” data.
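To see that detached-results behavior in isolation, here is a minimal sketch. It uses SQLite as a stand-in for Oracle (no LOB machinery involved), and the table and values are made up for illustration; the point is that once `fetchall()` returns, the rows are plain Python values you can use after the connection is gone:

```python
from sqlalchemy import create_engine, text

engine = create_engine("sqlite:///:memory:")

# Create and populate a throwaway table.
with engine.begin() as conn:
    conn.execute(text("create table t (body text)"))
    conn.execute(text("insert into t values ('a large value')"))

# Fetch everything, then let the connection close.
with engine.connect() as conn:
    rows = conn.execute(text("select body from t")).fetchall()

# The connection is closed, yet the values are ordinary Python strings,
# not handles tethered to a database cursor.
print(type(rows[0][0]))
```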

I was reading and writing large objects between two schemas when the ORA-01461 surfaced. Since this was a one-time ad-hoc deal, I just hand-crafted some SQL, and the script looked a little like this:

import sqlkeyring

sql_keyring = sqlkeyring.SQLKeyring()

targetdb = sql_keyring.get_engine('targetdb')
sourcedb = sql_keyring.get_engine('sourcedb')

interesting_results = sourcedb.execute("select * from the_interesting_table")
for row in interesting_results.fetchall():
    # Sort the column names so names and values line up consistently.
    column_names = sorted(row.keys())

    names = ', '.join(column_names)
    values = ', :'.join(column_names)

    params = {}
    for k, v in row.items():
        params[k] = v

    sql = "insert into the_interesting_table(%s) values(:%s)" % (names, values)
    targetdb.execute(sql, **params)


And that failed terribly with the following:

"ORA-01461: can bind a LONG value only for insert into a LONG column"

Of course it did, otherwise we wouldn’t be here now. Since the values came out of the source database and were converted to strings, SQLAlchemy then attempted to drop them into the target database as strings.
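For reference, here is the string-building from that script in isolation, with made-up column names, showing the kind of statement it produces. Every value ends up bound as an ordinary string parameter, which is exactly what Oracle rejects for LONG columns:

```python
# Hypothetical column names standing in for the real table's columns.
column_names = sorted(["body", "id"])

names = ', '.join(column_names)    # "body, id"
values = ', :'.join(column_names)  # "body, :id"

# The leading ":" in the template completes the first bind parameter.
sql = "insert into the_interesting_table(%s) values(:%s)" % (names, values)
print(sql)
```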

SQLAlchemy is smart enough to coerce types correctly if we give it some information about our schema, either explicitly or by asking it to autoload table metadata. A few changes to the script above and we were in business without manually hacking together SQL:

import sqlkeyring
from sqlalchemy import MetaData, Table

sql_keyring = sqlkeyring.SQLKeyring()

targetdb = sql_keyring.get_engine('targetdb')
targetdb_md = MetaData()
targetdb_md.bind = targetdb
sourcedb = sql_keyring.get_engine('sourcedb')

# Reflect the table definition, including column types, from the target.
the_interesting_table = Table('the_interesting_table', targetdb_md, autoload=True)

interesting_results = sourcedb.execute("select * from the_interesting_table")
for row in interesting_results.fetchall():
    params = {}
    for k, v in row.items():
        params[k] = v

    sql = the_interesting_table.insert()
    targetdb.execute(sql, **params)


About the same number of lines, but it's less hacky and it actually works, because we let SQLAlchemy do more of the heavy lifting. Since SQLAlchemy has metadata for the target table, it knows that any input for the large object columns will need to be properly converted.
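As a standalone illustration of the reflection step, here is a sketch with SQLite again standing in for Oracle and a made-up table. Note that in current SQLAlchemy versions the bound-metadata form above has been replaced by passing `autoload_with` directly:

```python
from sqlalchemy import create_engine, MetaData, Table, text

engine = create_engine("sqlite:///:memory:")

# Create a throwaway table to reflect against.
with engine.begin() as conn:
    conn.execute(text("create table t (id integer, body text)"))

# Reflect column names and types from the live database.
t = Table("t", MetaData(), autoload_with=engine)

# Because SQLAlchemy knows the column types, insert() can bind each
# parameter appropriately instead of treating everything as a string.
with engine.begin() as conn:
    conn.execute(t.insert(), {"id": 1, "body": "hello"})

with engine.connect() as conn:
    count = conn.execute(text("select count(*) from t")).scalar()
```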

Any tool that successfully abstracts away Oracle large objects is a good tool in my book.