A bit of history

Yesterday I asked this question, about reading a JSON file that contained arbitrary carriage returns, and reading it into a single variable. Those random new lines were breaking my INPUT statement. I'm now having a separate issue with the same file, which I thought I'd raise separately.

My new problem

Now that I've sanitised the file to remove the random new lines, I have a text file containing JSON with a single line of text whose length exceeds 32,767 characters. (Worth pointing out that even before I removed the random new lines, some lines were already longer than 32,767 characters!) When I try to read this line in, my INPUT step only processes the record up to the 32,767th character and then stops, so I'm left with a fraction of the observations I should have. I assume this is due either to the maximum length of the input buffer or to the fact that I've set LRECL to its maximum value (32767)?

What I'd like to do

Ideally I'd like to specify a record delimiter string (e.g. "},{") that would split my massive, massive line so that each JSON object (they're wrapped in curly brackets) is treated individually. Is this possible? That said, any solution that lets me read the whole file would work.

Any help appreciated.

<<Sample file attached>>
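For reference, here is a minimal sketch of the kind of step I'm running (the filename, dataset name, and variable name below are placeholders, not my actual ones):

```sas
/* Sketch of the failing read: one record per line, LRECL at its 32767 ceiling.
   'bigfile.json', work.json_raw and jsontext are placeholder names. */
data work.json_raw;
    infile 'bigfile.json' lrecl=32767 truncover;
    input jsontext $char32767.;
run;
```

With the file flattened to a single enormous line, this only ever captures the first 32,767 characters of it.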