Hi all,
facing dynamic storage overflows at runtime while handling large XML data that contains Base64-encoded binary documents, I already asked in the Facebook IBM i (AS400, iSeries, System i) group whether anyone had an idea for reducing that storage usage. There were great answers, but I still couldn't accomplish what I want. I'm asking here because a forum seems the better place for this kind of question.
The problem is that I need a VARCHAR(4000000) for the binary/Base64 document that is part of the XML data.
I have to decode the Base64 data and write it into a BLOB field in a table. I strip the XML and write it, as is, into the same BLOB field in the table.
A procedure writeArchive() in a service program does the trick; it has worked fine with other programs since 2018. Scott Klement's BASE64 service program, procedure base64_decode(), does the conversion to binary and works flawlessly. The converted document is placed in a SQLTYPE(BLOB:4000000) field and then written to the BLOB column of the archive table.
In writeArchive() I have this by-reference parameter:
Code:
doc  varchar(4000000)
this work field:
Code:
// work field that receives the base64_decode result
dcl-s chrBinDok varchar(4000000) static inz;
and this blob field:
Code:
dcl-s blobDoc sqltype(BLOB : 4000000);
The dynamic storage overflows as soon as writeArchive() is called. I tried giving the service program its own activation group, but it didn't help.
Now I hope it might be possible to do the Base64 conversion with an SQL function that doesn't occupy dynamic storage. Unfortunately, SYSTOOLS.BASE64DECODE seems to have a length restriction, and even if it didn't, I still wouldn't know how to convert and insert into a SQLTYPE(BLOB) field in one go.
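What I pictured is something along these lines — a rough, untested sketch only. It relies on the fact that Base64 decodes independently per 4-character group, so a long string could be decoded in chunks and concatenated before a single INSERT. The names ARCHIVE, DOC, and :base64Doc are placeholders, and I'm assuming SYSTOOLS.BASE64DECODE's bit-data result can be concatenated into a BLOB, which I haven't verified:

```sql
-- Untested sketch: decode in chunks whose length is a multiple of 4,
-- since each 4-character Base64 group decodes on its own.
-- ARCHIVE, DOC, and :base64Doc are placeholder names.
BEGIN
  DECLARE pos INTEGER DEFAULT 1;
  DECLARE result BLOB(4000000) DEFAULT BLOB(X'');
  WHILE pos <= LENGTH(:base64Doc) DO
    -- 4000 is a multiple of 4, so each chunk decodes independently
    SET result = result CONCAT
      SYSTOOLS.BASE64DECODE(SUBSTR(:base64Doc, pos, 4000));
    SET pos = pos + 4000;
  END WHILE;
  INSERT INTO ARCHIVE (DOC) VALUES (result);
END
```

But I don't know whether that gets around the length restriction, or whether it just moves the storage problem to the database side.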
Thanks for any ideas.
Markus