
Implement document length tracking for DataInput-backed JSON parsers via subclass #1575

Open
pjfanning wants to merge 5 commits into FasterXML:3.x from pjfanning:copilot/update-utf8datainputjsonparser

Conversation

@pjfanning
Member

One possible way to support max document len on DataInput use case.
The count and validation overhead only kicks in if you set a non-negative max document len - which is not the default case.
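The idea of an overridable per-byte read hook that only validates when a limit is configured can be sketched as below. This is a hypothetical standalone illustration, not the PR's actual parser code; the class and method names (`CountingDataInputReader`, the negative-means-unlimited convention) are assumptions for the example.

```java
import java.io.ByteArrayInputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

// Hypothetical sketch of the subclass approach: count bytes as they are
// read, and validate only when a non-negative maximum length is configured.
public class CountingDataInputReader {
    private final DataInput input;
    private final long maxDocLen; // negative means "no limit" (the default case)
    private long bytesRead;

    public CountingDataInputReader(DataInput input, long maxDocLen) {
        this.input = input;
        this.maxDocLen = maxDocLen;
    }

    // Stands in for the shared readUnsignedByte() hook a subclass would override.
    public int readUnsignedByte() throws IOException {
        int b = input.readUnsignedByte();
        if (maxDocLen >= 0 && ++bytesRead > maxDocLen) {
            throw new IOException(
                    "Document length exceeds maximum allowed (" + maxDocLen + " bytes)");
        }
        return b;
    }

    public static void main(String[] args) throws IOException {
        byte[] doc = "{\"key\":\"value\"}".getBytes(StandardCharsets.UTF_8);
        CountingDataInputReader r = new CountingDataInputReader(
                new DataInputStream(new ByteArrayInputStream(doc)), 10);
        try {
            for (int i = 0; i < doc.length; i++) {
                r.readUnsignedByte();
            }
            System.out.println("no limit hit");
        } catch (IOException e) {
            System.out.println("limit enforced after 10 bytes");
        }
    }
}
```

When no limit is set, the short-circuiting `maxDocLen >= 0` check is the only added work per byte, which is why the overhead is expected to be small in the default case.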

Copilot AI and others added 4 commits March 12, 2026 00:57
…fast from JsonFactory

Co-authored-by: pjfanning <11783444+pjfanning@users.noreply.github.com>
…c-length tracking

Co-authored-by: pjfanning <11783444+pjfanning@users.noreply.github.com>
…via subclass

Co-authored-by: pjfanning <11783444+pjfanning@users.noreply.github.com>
@cowtowncoder
Member

I would want to see performance testing (esp. for the non-validating case) since it seems this could wreck performance (overrides and indirection) -- f.ex with https://github.com/FasterXML/jackson-benchmarks/ .

@pjfanning
Member Author

@cowtowncoder I'll put together a benchmark but would not expect it to have much of an impact. The parser itself is already not very efficient because it reads byte by byte, and I would recommend that users try to get an InputStream instead of a DataInput instance to parse, if at all possible.

  • The normal case of no max doc len will not be affected noticeably by this change
  • This change will allow users to at least parse DataInput inputs again if they do set a max doc len (because recent code changes cause this code path to throw an exception - deliberately)
  • One option would be to add a JsonReadFeature that controls whether the max-doc-len case fails deliberately or uses this less efficient approach.
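The recommendation to hand the parser an InputStream rather than a DataInput can be followed with a thin adapter. This is a hypothetical sketch (the class name `DataInputAsInputStream` is an assumption); it assumes the DataInput signals end of input with EOFException, as DataInputStream does. It still reads byte by byte at this boundary, but it lets the InputStream-backed parser and its standard limits be used.

```java
import java.io.ByteArrayInputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical adapter exposing a DataInput as an InputStream, so the
// InputStream-backed parser can be used instead of the DataInput one.
public class DataInputAsInputStream extends InputStream {
    private final DataInput input;

    public DataInputAsInputStream(DataInput input) {
        this.input = input;
    }

    @Override
    public int read() throws IOException {
        try {
            return input.readUnsignedByte(); // 0..255
        } catch (EOFException e) {
            return -1; // InputStream convention for end of stream
        }
    }

    public static void main(String[] args) throws IOException {
        DataInput di = new DataInputStream(
                new ByteArrayInputStream(new byte[]{'a', 'b'}));
        InputStream in = new DataInputAsInputStream(di);
        int b;
        while ((b = in.read()) != -1) {
            System.out.println(b);
        }
    }
}
```

A production version would ideally also override `read(byte[], int, int)`, but DataInput offers no partial-read primitive, so a generic adapter cannot easily do better than one byte at a time.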

@cowtowncoder
Member

The normal case of no max doc len will not be affected noticeably by this change

I would not assume that without measurements, due to inlining effects.
It depends on how the JVM handles optimizations, I suppose; benchmarking can give an answer.

As to failing, configurability -- while possible, I don't think it's necessary at this point.

@pjfanning
Member Author

@cowtowncoder I created a benchmark suite and compared the v2.21.1 parser with one that has the overridable shared readUnsignedByte method, and a third parser that applies the size limit.

The results for all 3 parsers are almost the same, so close that I almost suspect a mistake in the test suite.

I tested with different JSON sizes and Java versions and in no run was there any significant diff in perf between the 3 parsers.

I'll double check my test suite logic over the next few days.

https://github.com/pjfanning/jackson-bench/blob/main/src/jmh/java/org/example/jackson/DataInputBench.java
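The comparison the linked JMH suite makes can be illustrated with a rough standalone timing loop: reading every byte of a document with and without a length check. This is only an illustration of what is being measured; JMH (as used in the actual suite) handles warmup and dead-code elimination properly, and this sketch does not. All names here are hypothetical.

```java
import java.io.ByteArrayInputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.IOException;

// Rough illustration of the benchmark comparison: a byte-by-byte read loop
// with an optional max-document-length check (negative = no limit).
public class ReadLoopComparison {
    static long readAll(byte[] doc, long maxDocLen) throws IOException {
        DataInput in = new DataInputStream(new ByteArrayInputStream(doc));
        long count = 0;
        for (int i = 0; i < doc.length; i++) {
            in.readUnsignedByte();
            count++;
            if (maxDocLen >= 0 && count > maxDocLen) {
                throw new IOException("max document length exceeded");
            }
        }
        return count;
    }

    public static void main(String[] args) throws IOException {
        byte[] doc = new byte[1_000_000];
        long t0 = System.nanoTime();
        readAll(doc, -1);         // no limit: the check short-circuits
        long t1 = System.nanoTime();
        readAll(doc, doc.length); // limit configured but not exceeded
        long t2 = System.nanoTime();
        System.out.printf("unlimited: %d us, limited: %d us%n",
                (t1 - t0) / 1000, (t2 - t1) / 1000);
    }
}
```

Because the per-byte read already dominates, the extra compare-and-increment would be expected to disappear into the noise, consistent with the JMH numbers below.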

@pjfanning
Member Author

@cowtowncoder this change seems to work ok. I see a small overhead when the max doc len is enabled but it is only a few percent.

One example run (Limited is the benchmark with the check enabled)

Benchmark                              Mode  Cnt     Score     Error  Units
DataInputBench.benchDataInput         thrpt    5  5176.015 ± 574.007  ops/s
DataInputBench.benchDataInputLimited  thrpt    5  5027.761 ± 319.194  ops/s
DataInputBench.benchDataInputNew      thrpt    5  5299.451 ± 130.896  ops/s

@cowtowncoder
Member

@pjfanning Sounds good so far. Could you also run test against unchanged build from 3.x (or official 3.1.0 which should be about the same)?

@pjfanning
Member Author

@cowtowncoder You want to create a new benchmark that uses Jackson 3 instead of Jackson 2?

@cowtowncoder
Member

@cowtowncoder You want to create a new benchmark that uses Jackson 3 instead of Jackson 2?

I thought this PR is for 3.x? I wasn't thinking we'd implement this functionality for 2.x, only 3.x.

But if we are talking about changes to 2.x, a comparison of the unmodified version (2.21.1 would be fine) against the new versions would be good.

@pjfanning
Member Author

The jackson3 based benchmarks behave similarly to the jackson2 ones.

Benchmark                                      Mode  Cnt     Score     Error  Units
DataInputJackson3Bench.benchDataInput         thrpt    5  1969.188 ± 297.992  ops/s
DataInputJackson3Bench.benchDataInputLimited  thrpt    5  1908.798 ± 106.280  ops/s
DataInputJackson3Bench.benchDataInputNew      thrpt    5  2037.195 ±  66.618  ops/s

@github-actions

📈 Overall Code Coverage

Metric                 Coverage  Change
Instructions coverage  83.07%    📉 -0.010%
Branch coverage        75.74%    📈 +0.010%

Overall project coverage from JaCoCo test results. Change values compare against the latest base branch build.

@cowtowncoder
Member

@pjfanning Does "benchDataInput" refer to the pre-changes version from 3.x, and "benchDataInputLimited" / "benchDataInputNew" to the modified version with and without limitations?

If so, differences do indeed look insignificant.

@pjfanning
Member Author

@pjfanning Does "benchDataInput" refer to the pre-changes version from 3.x, and "benchDataInputLimited" / "benchDataInputNew" to the modified version with and without limitations?

If so, differences do indeed look insignificant.

@cowtowncoder your interpretation of the names is correct
