Instead of using tokens.insertLast() to grow the tokens array with every new token, preallocate the array with a number of elements equal to the size of the JSON file in bytes, which is the theoretical maximum number of tokens it can contain, and assign each new token into the array with index++. When the tokenizer finishes, resize the array down to the number of tokens actually assigned.
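A minimal sketch of this strategy in C++, using std::vector in place of the original array type; the Token struct and the one-token-per-character scan are simplified stand-ins for the real tokenizer, not its actual implementation:

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Hypothetical token type; the real tokenizer's type will differ.
struct Token {
    char kind;          // e.g. '{', '"', ':' -- the character that starts the token
    std::size_t start;  // offset into the source text
};

// Preallocation strategy: every token consumes at least one byte of input,
// so the input length is an upper bound on the token count. Size the array
// once up front instead of growing it per token.
std::vector<Token> tokenize(const std::string& json) {
    std::vector<Token> tokens(json.size());  // preallocate to the maximum
    std::size_t index = 0;                   // next free slot

    for (std::size_t i = 0; i < json.size(); ++i) {
        char c = json[i];
        if (c == ' ' || c == '\t' || c == '\n' || c == '\r')
            continue;                        // whitespace produces no token
        tokens[index++] = Token{c, i};       // assign into a slot, don't grow
    }

    tokens.resize(index);  // shrink to the number of tokens actually produced
    return tokens;
}
```

The win is that the hot loop does a plain indexed store with no capacity check or reallocation; the single resize at the end releases the unused tail.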