JSON array

Aivo Rautam

I’m using Input Data Format type JSON. If I include just one record, the integration works fine. But if I use two or more records, I receive the following message:

Input data file 'FinalLabel.txt' contains data not in correct JSONVariables format. Details: Invalid object passed in, ':' or '}' expected. (2):

{[{"prodOrderNo": "","prodRelease": "","partNo": "02003361"},
{"prodOrderNo": "","prodRelease": "","partNo": "02003361"}]}

Any ideas on what is causing my error?
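For what it's worth, the snippet quoted in the error detail isn't well-formed JSON on its own: `{[...]}` opens an object and then starts an array without a key name, which is exactly the point where a parser expects `':'` or `'}'`. A quick Python sketch of the difference (field list shortened for brevity):

```python
import json

# The shape from the error message: an array wrapped in braces with no key.
bad = '{[{"partNo": "02003361"}, {"partNo": "02003361"}]}'
try:
    json.loads(bad)
except json.JSONDecodeError as e:
    print("parse error:", e.msg)  # parser stops right after the '{'

# Two well-formed alternatives: a bare array, or the array under a key.
good_array = '[{"partNo": "02003361"}, {"partNo": "02003361"}]'
good_object = '{"records": [{"partNo": "02003361"}, {"partNo": "02003361"}]}'
print(len(json.loads(good_array)))              # 2
print(len(json.loads(good_object)["records"]))  # 2
```

Whether Integration accepts either well-formed shape is a separate question, as the answers below discuss.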

8 comments

Jasper Wen
Moderator

Currently, Integration doesn't natively read JSON arrays or multiple objects/records out of the box. It only supports the format of a single JSON object with name/value pairs. You would have to do some type of data transformation to get the data into a format Integration can understand, such as a text-delimited format with multiple records. There is already a feature request submitted to better handle and natively support JSON formatting.
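As a stopgap, a transformation along these lines could work. This is a minimal Python sketch, assuming the array of records from the original question is flattened into a comma-delimited file with a header row (the file name is reused from the question purely for illustration):

```python
import csv
import json

# A JSON array of records, as in the original question.
records = json.loads(
    '[{"prodOrderNo": "", "prodRelease": "", "partNo": "02003361"},'
    ' {"prodOrderNo": "", "prodRelease": "", "partNo": "02003362"}]'
)

# Flatten into the kind of text-delimited file Integration can already read:
# a header row naming the variables, then one delimited line per record.
fields = ["prodOrderNo", "prodRelease", "partNo"]
with open("FinalLabel.txt", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fields)
    writer.writeheader()
    writer.writerows(records)
```

The resulting `FinalLabel.txt` can then be fed to Integration as an ordinary text-delimited data source.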

Ruben Arslanian

This answer is from 2016 - does the Integration Builder support JSON arrays in 2019R6?

Nicolas Bykov

I'm also interested. We need to print a label with a list of goods on it, passing the data through JSON. Please give feedback!

Anthony Scaglione

Anyone get an answer on this one yet? I need to send JSON data in one call for a large number of labels to be created.

Pete Thane

It may be better to raise a query directly with your local Tech Support team about this.

Although I'm not sure if you have seen this in the 2021 Preview 1 material:

https://support.seagullscientific.com/hc/en-us/community/posts/360044135693-BarTender-2020-Preview-1-New-Database-Connectors

 

Ruben Arslanian

I was just reading through this, and realized I asked a question back in February. Many headaches later, I can confirm 2019R6 definitely supports JSON arrays. We send some pretty long strings via POST and it handles them just fine. CPU usage looks pretty inefficient, but other than that, JSON works very well.

Anthony Scaglione

@Ruben

 

1) When you use arrays, can you send different data to labels which print as part of the same print job, or are these sent as separate print jobs? My problem is I haven't been able to get the same print job to use different variables on each label, so when printing 4 labels across I'm wasting 3 labels each time one prints. Not ideal by any means.

2) Can you provide an example of the call you're sending with the array syntax?

Thanks

 

Ruben Arslanian

Hi Anthony

 

1. We send one array per BTW file per instance. So a job might use 3 files; we'll send one POST for each one, and each POST includes an array of variable data that's merged at print time with its corresponding BTW template. We print to PDF in this case, so we don't have to deal with a fixed number of ups in the media.

If I were you, I'd set up the integration to write (append) your JSON values to a text file. Every time it does, it should increment the value of a custom variable (there's a specific action for this) by the number of records you are adding to that file. If, at the end of the integration, the value of that variable is divisible by 4 (a multiple of it), you can run the action to print and reset your variable back to 0. Otherwise, you end the integration and leave everything as is until you send more data.

To be clear, that text file will work as your data source. In your print action, you can either hardcode its location or change it dynamically. Don't forget to clear, delete, or rename that text file every time you run your print action.
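The append-and-count workflow described above could be sketched roughly like this in Python. The file name, threshold, and function names are all hypothetical; the real setup would use Integration's own actions and a custom variable rather than re-counting the file:

```python
import json
import os

LABELS_ACROSS = 4                   # ups across the media
BUFFER_FILE = "pending_labels.txt"  # hypothetical accumulation file

def queue_records(records):
    """Append incoming records to the buffer, then print only when the
    total buffered count is a multiple of LABELS_ACROSS (a sketch of the
    workflow above, not BarTender's actual API)."""
    with open(BUFFER_FILE, "a") as f:
        for rec in records:
            f.write(json.dumps(rec) + "\n")

    with open(BUFFER_FILE) as f:
        count = sum(1 for _ in f)

    if count % LABELS_ACROSS == 0:
        print_and_reset()

def print_and_reset():
    # Stand-in for the print action; clearing the buffer afterwards keeps
    # leftover data from bleeding into the next job.
    os.remove(BUFFER_FILE)
```

With 3 records queued, nothing prints; the 4th record fills the row and triggers the print-and-reset step.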

 

2. Here's an example:

{"JobName":"FNOVA_269949_FN-CL-01_20200929_084652","BarTenderLabel":"FN-CL-01.btw","DataString":"FIELD1, FIELD2, FIELD3, FIELD4, FIELD5, FIELD6, FIELD7, FIELD8, FIELD9, FIELD10, FIELD11, FIELD12, FIELD13, FIELD14, FIELD15, FIELD16, FIELD17, FIELD18, FIELD19, FIELD0, FIELD00, FIELD000\nAPHRODITE JEANS, AP6017, Black, 14, 2637094, WOMEN'S, 269949, Original, , , , , , , , , , , , 10, 1, \u00cd:E)\u00c84+\u00ce\nAPHRODITE JEANS, AP6017, Black, 16, 2637095, WOMEN'S, 269949, Original, , , , , , , , , , , , 10, 1, \u00cd:E)\u00c850\u00ce\nAPHRODITE JEANS, AP6017, Black, 18, 2637096, WOMEN'S, 269949, Original, , , , , , , , , , , , 10, 1, \u00cd:E)\u00c865\u00ce\nAPHRODITE JEANS, AP6017, Black, 20, 2637097, WOMEN'S, 269949, Original, , , , , , , , , , , , 10, 1, \u00cd:E)\u00c87:\u00ce\n","BaseItem":"FN-CL-01"}

The first few JSON pairs set the Job Name (a file path used across the entire integration) and the BTW file this job calls for. The "FIELD" portion of the string sets the format and assigns the variable names; everything below it holds the values for each of those variables. The values are comma-separated, and successive commas just mean that the values for those variables are blank.

We save the FIELD-and-values section to a TXT file, which we later merge with the indicated BTW file. In our case, we don't append data to an existing file because this integration always prints to PDF (it's a dynamic preview tool on an e-commerce platform). Since you do have physical media, you would do something similar, but only print once your counter hits a number that's a multiple of 4.
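To illustrate, here's a rough Python sketch of assembling a payload in the same shape. The job name, field names, and values are placeholders, not the real template's variables:

```python
import json

# Placeholder variable names and record values.
header = ["FIELD1", "FIELD2", "FIELD3"]
rows = [
    ["APHRODITE JEANS", "AP6017", "Black"],
    ["APHRODITE JEANS", "AP6017", "Navy"],
]

# The header line names the variables; each following newline-terminated
# line carries one record's comma-separated values.
data_string = ", ".join(header) + "\n"
data_string += "".join(", ".join(row) + "\n" for row in rows)

payload = json.dumps({
    "JobName": "FNOVA_example_job",
    "BarTenderLabel": "FN-CL-01.btw",
    "DataString": data_string,
    "BaseItem": "FN-CL-01",
})
print(payload)
```

The resulting JSON string is what gets sent in the POST body; on the receiving side, the `DataString` value is written out to the TXT file that the BTW template merges against.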

Hope this helped.

Ruben Arslanian
