Question

varbinary(max) column using OpenESQL in .NET Core VS COBOL

  • December 18, 2025
  • 3 replies
  • 19 views

Peter Restorick

We currently have a native COBOL application that uses dynamic SQL (via OpenESQL) to fetch from, insert into and update Microsoft SQL Server.

This includes varbinary(max) columns, which we have working; I suspect the code was based on the demo that used to be available from https://community.rocketsoftware.com/rocket-visual-cobol-28/demo-using-the-varbinary-max-column-type-from-cobol-4560. Unfortunately, the link to the actual demo is no longer available.

I now have the fun and games of trying to convert this logic to .NET Core Visual COBOL. I am really struggling to work out how to ‘stitch’ the 65k blocks back together into the correct format and would really appreciate some guidance on this (the same goes for splitting the data into 65k blocks to insert/update).

Is there any chance of the demo mentioned being converted to .NET Core Visual COBOL so that I can refer to it and work out what I need to do?

Thanks in advance


3 replies

Chris Glazier
  • Moderator
  • December 18, 2025

Hi Peter,

I am not sure why you would need to convert the logic of the program, as you should be able to recompile it for .NET as-is and use the SQL Server ADO provider instead of ODBC. Are you having problems with this?
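If the project is compiled for managed code, switching the OpenESQL run-time from ODBC is normally just a directive change rather than a code change; for example (a sketch, assuming the SQL(DBMAN) directive is set at the top of the program or in the project settings):

      $set sql(dbman=ado)  *> use the ADO.NET run-time instead of ODBC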

According to the OpenESQL docs, the maximum size of a host variable that can be used with a varbinary(max) column is approximately 450 MB.

I just tested this here using static SQL instead of dynamic, and it worked just fine: it inserted a 400 MB data item into my varbinary(max) column.

The following is from the docs:

SQL [TYPE] [IS] BLOB(lob-length)

Where lob-length is a value between 1 and 1073741823 expressed either as a number or a number followed by K (kilobytes), M (megabytes), or G (gigabytes).

Note: Although this SQL TYPE has a theoretical size limitation of 2G, for all practical purposes, the actual limitation is approximately 450M, which is the data size actually allocated to the application program.
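Something along these lines worked in my test. This is only a sketch: the table Documents, its columns, and the host-variable names are invented for illustration, while the BLOB declaration follows the syntax quoted above:

       working-storage section.
       exec sql begin declare section end-exec.
       01  ws-id   pic s9(9) comp-5.
       01  ws-doc  sql type is blob(400m). *> data area is allocated in the program
       exec sql end declare section end-exec.

       procedure division.
      *>   ... populate ws-doc with the binary data, set ws-id, then:
           exec sql
               insert into Documents (Id, DocData)
               values (:ws-id, :ws-doc)
           end-exec
           if sqlcode not = 0
               display "INSERT failed, SQLCODE = " sqlcode
           end-if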


Peter Restorick
  • Author
  • New Participant
  • December 19, 2025

Hi Chris,

Thanks for getting back to me. Yes, I have changed my connection to use ADO instead of ODBC.

In the native COBOL application we read the column in 65k blocks (the limit in OpenESQL) and store each block in a PIC X(65535) field. Each block is then appended to a character array using the withByteLengthValue option, and the loop continues appending to the character array until the read is complete.
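In outline the read side looks like this (the names, buffer size and cursor here are placeholders, and a plain COBOL buffer stands in for the character array):

       01  ws-block      pic x(65535).            *> one 65k block per fetch
       01  ws-block-len  pic s9(9) comp-5.        *> bytes in this block
       01  ws-buffer     pic x(16000000).         *> stand-in for the character array
       01  ws-offset     pic s9(9) comp-5 value 1.

      *> assumes a cursor c-blob has already been declared and opened
           perform until sqlcode not = 0
               exec sql fetch c-blob into :ws-block end-exec
               if sqlcode = 0
      *>           ws-block-len is set from wherever the existing
      *>           withByteLengthValue logic obtains the block size
                   move ws-block(1:ws-block-len)
                     to ws-buffer(ws-offset:ws-block-len)
                   add ws-block-len to ws-offset
               end-if
           end-perform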

When inserting/updating, the information is extracted from the character array in blocks using “getValueWithSizeAtPosition”.
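The write side is essentially the same loop inverted; roughly (again with placeholder names, reusing the buffer and block fields from the sketch above):

       01  ws-total-len  pic s9(9) comp-5.        *> full length of the data
       01  ws-chunk-len  pic s9(9) comp-5.

           move 1 to ws-offset
           perform until ws-offset > ws-total-len
               compute ws-chunk-len =
                   function min(65535, ws-total-len - ws-offset + 1)
               move ws-buffer(ws-offset:ws-chunk-len) to ws-block
      *>       ... pass ws-block / ws-chunk-len to the INSERT or UPDATE ...
               add ws-chunk-len to ws-offset
           end-perform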

I guess what I am looking for is the type of .NET object I can use to replace the native character array, something I can interact with while continuing to use OpenESQL.
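Would something like System.IO.MemoryStream be a suitable replacement? It grows as blocks are appended and hands back a single byte array at the end. A rough sketch of what I mean, with my best guess at the managed syntax, so treat the declarations here as assumptions:

       01  ms     type System.IO.MemoryStream.
       01  chunk  binary-char unsigned occurs any. *> a byte[] in .NET terms
       01  whole  binary-char unsigned occurs any.

           set ms to new type System.IO.MemoryStream()
           set chunk to type System.Text.Encoding::ASCII::GetBytes("demo block")
           invoke ms::Write(chunk, 0, chunk::Length)   *> append first block
           invoke ms::Write(chunk, 0, chunk::Length)   *> append second block
           set whole to ms::ToArray()                  *> one stitched byte array
           display whole::Length                       *> 20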

I have tried static SQL statements fetching the result into a byte array, and that does seem to work correctly. However, we rely on the dynamic SQL quite heavily, via a generic program that gets used by a number of programs, and changing this logic to use static statements is, unfortunately, not feasible.

Peter


Kim Hoskin
  • Moderator
  • December 19, 2025

For information, Peter: your colleague Neil contacted me via email about this post. We have located the missing sample files and will provide them to Neil directly.