Thanks for the question, Bob.
Asked: April 21, 2021 - 9:03 am UTC
Last updated: April 21, 2021 - 3:01 pm UTC
Version: 12c
Viewed 1000+ times
You Asked
I have a legacy table with a large (more than 4000 bytes) unstructured BLOB column containing XML generated by marshalling JAXB classes generated from an XSD. The root XSD schema references multiple imported schemas. Each BLOB in the table may conform to a different root schema, and each row has a type field indicating which XSD its document complies with. Typical use is to store or retrieve the entire BLOB.
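A simplified version of the table (the names here are illustrative, not the real ones):

create table legacy_docs (
  doc_id      number generated always as identity primary key,
  schema_type varchar2(50) not null,  -- indicates which root XSD the document complies with
  doc_blob    blob                    -- marshalled JAXB XML stored as raw bytes
);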
When a schema changes, we use Java-based XSL transforms to migrate existing data to the new schema version in memory and unmarshal it using the latest schema. We do not update the existing BLOB.
I'm looking for a more flexible approach to handling schema changes, as well as to updating the values of specific schema attributes and elements. This is typically needed when migrating existing data as part of deploying a new build.
I've thought of reading each row's BLOB column into an XMLType in memory, performing the changes, converting it back to a BLOB, and updating the column.
Can you suggest a better way? As I mentioned, the BLOB is unstructured so that data conforming to new schemas can be added to the column. I would prefer not to go to a one-table-per-schema approach.
and Chris said...
If you're able to, a one-off update migrating the existing data from BLOB storage to an XMLType column would make the process easier, as you'd have no ongoing BLOB <-> XMLType conversions. You can still store documents with different structures in this column.
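For example, something along these lines. The table and column names are placeholders matching the simplified structure above, and the character set id is an assumption; use whatever encoding the XML was actually written with.

alter table legacy_docs add ( doc_xml xmltype );

update legacy_docs
set    doc_xml = xmltype ( doc_blob, nls_charset_id ( 'AL32UTF8' ) );

-- once you've verified the conversion, the old column can go
alter table legacy_docs drop column doc_blob;

From then on you can query and update the documents directly with XMLQuery/XMLTable instead of converting each BLOB on the fly.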
Otherwise your process sounds fine. If you can share an example of exactly how you're planning to do this, we can give more detailed feedback.
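As a rough sketch of the row-by-row approach you describe (the table, columns, predicate, character set id, and the XQuery expression are all placeholders for your own):

declare
  l_xml xmltype;
begin
  for r in (
    select doc_id, doc_blob
    from   legacy_docs
    where  schema_type = 'ORDER_V1'   -- pick the documents that need migrating
  ) loop
    -- BLOB -> XMLType in memory
    l_xml := xmltype ( r.doc_blob, nls_charset_id ( 'AL32UTF8' ) );

    -- apply the changes; this example just replaces one element's value
    select xmlquery (
             'copy $d := .
              modify ( replace value of node $d/order/status with "MIGRATED" )
              return $d'
             passing l_xml
             returning content
           )
    into   l_xml
    from   dual;

    -- XMLType -> BLOB and write it back
    update legacy_docs
    set    doc_blob = l_xml.getblobval ( nls_charset_id ( 'AL32UTF8' ) )
    where  doc_id = r.doc_id;
  end loop;

  commit;
end;
/

This works, but every document makes the BLOB -> XMLType -> BLOB round trip on every migration, which is the overhead the one-off move to XMLType storage avoids.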