
System.Text.Json 5.x preview deserialization is slower than Newtonsoft when using Stream scenarios on Xamarin platforms #41089

Closed
klogeaage opened this issue Aug 20, 2020 · 8 comments

@klogeaage

Description

I have found that if you use Newtonsoft's ability to deserialize from a Stream via its JsonTextReader, then on some platforms it is comparable to or even faster than System.Text.Json. That is presumably not the stated goal of your efforts.

So a typical case is deserializing the result of an HTTP call, which in Newtonsoft should be done like this:

    HttpResponseMessage response = await _client.GetAsync($"{ControllerName}{additionalParameters}");
    response.EnsureSuccessStatusCode();
    using (var stream = await response.Content.ReadAsStreamAsync())
    using (var reader = new StreamReader(stream))
    using (var json = new JsonTextReader(reader))
        items = _serializer.Deserialize<IEnumerable<T>>(json);

While with System.Text.Json the deserializing part will be:

    using (var stream = await response.Content.ReadAsStreamAsync())
        _items = await System.Text.Json.JsonSerializer.DeserializeAsync<IEnumerable<T>>(stream, JsonOptions.Default1);

I have found System.Text.Json's performance to be roughly the same as Newtonsoft's on iOS devices and MUCH slower (25-40%) on Android. But even equal performance is not the goal, I presume.

I don't know of a unit testing framework that would let me verify this, especially on real physical devices, which may have quite different characteristics than emulators.

So I have created a small Xamarin.Forms app to measure this, which I will be happy to share with you if you find it relevant.

I used another app of mine to share screen shots of the results.

Configuration

  • .NET Standard 2.1 (Xamarin.Forms 4.8.0.1269)
  • Newtonsoft.Json 12.0.3
  • System.Text.Json 5.0.0-preview.7.20364.11
  • iOS 13.6 and Xamarin 10
  • iPhone 7 and Galaxy A50

Data

In my app, I store a 526 KB JSON payload from a real-world domain model. It is deserialized 5 times and the average time is reported for each of the two serializers.

I can measure that System.Text.Json is 3-7% slower on the iOS device and 25-40% slower on the Android device. On simulators, I get similar results.
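For reference, the measurement is essentially a loop like the following sketch (the helper and delegate names are illustrative, not the actual code of my app):

    using System;
    using System.Diagnostics;
    using System.Threading.Tasks;

    static class Benchmark
    {
        // Sketch only: average of N deserialization runs over the same 526 KB payload.
        public static async Task<TimeSpan> MeasureAverageAsync(Func<Task> deserializeOnce, int iterations = 5)
        {
            var stopwatch = new Stopwatch();
            long totalTicks = 0;
            for (int i = 0; i < iterations; i++)
            {
                stopwatch.Restart();
                await deserializeOnce();   // one full deserialization of the payload
                stopwatch.Stop();
                totalTicks += stopwatch.ElapsedTicks;
            }
            return TimeSpan.FromTicks(totalTicks / iterations);
        }
    }

    // Called once per serializer, e.g.:
    // var newtonsoftAverage = await Benchmark.MeasureAverageAsync(() => DeserializeWithNewtonsoftAsync(payload));
    // var systemTextJsonAverage = await Benchmark.MeasureAverageAsync(() => DeserializeWithSystemTextJsonAsync(payload));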

@layomia
Contributor

layomia commented Aug 20, 2020

So I have created a small Xamarin.Forms app to measure this, which I will be happy to share with you if you find it relevant.

Yes, please share your app so that we can try to repro your numbers.

@layomia layomia removed the untriaged New issue has not been triaged by the area owner label Aug 20, 2020
@layomia layomia added this to the 6.0.0 milestone Aug 20, 2020
@klogeaage
Author

@layomia OK, so now the solution is published here. Please let me know if there is anything I can assist with.

@steveharter
Member

steveharter commented Aug 24, 2020

The likely reason perf is slower in STJ in these cases is that STJ uses standard reflection Invoke, not Emit, for the netstandard (non-inbox) configurations. Newtonsoft instead uses compiled expression trees, which are netstandard-compatible but also bring in the very large System.Linq.Expressions.dll, so there is a disk size + memory usage + startup time vs. throughput tradeoff here.

It is possible to add support for S.Linq.Expressions if we decide it's worth the tradeoff in these non-inbox scenarios. However, for 6.0 we are also looking at alternate ways to get\set properties and call constructors that do not use Emit but are just as fast. We also plan on having a code-gen solution that will generate AOT code per POCO type to get\set properties and call ctors, which avoids the need for Emit\Reflection entirely.
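To make the tradeoff concrete, here is a hand-rolled sketch (not the actual implementation in either library) of a property setter invoked via plain reflection versus one compiled once through System.Linq.Expressions:

    using System;
    using System.Linq.Expressions;
    using System.Reflection;

    public class Poco { public int Id { get; set; } }

    public static class SetterSample
    {
        // Plain reflection: no extra dependencies, but pays the Invoke cost on every call.
        public static readonly Action<Poco, int> ReflectionSetter = (target, value) =>
            typeof(Poco).GetProperty(nameof(Poco.Id)).SetValue(target, value);

        // Compiled expression tree: one-time compilation cost plus the dependency on
        // System.Linq.Expressions.dll, then near direct-call throughput afterwards.
        public static readonly Action<Poco, int> CompiledSetter = CreateCompiledSetter();

        private static Action<Poco, int> CreateCompiledSetter()
        {
            PropertyInfo property = typeof(Poco).GetProperty(nameof(Poco.Id));
            ParameterExpression target = Expression.Parameter(typeof(Poco), "target");
            ParameterExpression value = Expression.Parameter(typeof(int), "value");
            BinaryExpression body = Expression.Assign(Expression.Property(target, property), value);
            return Expression.Lambda<Action<Poco, int>>(body, target, value).Compile();
        }
    }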

@acaly

acaly commented Dec 2, 2020

However, for 6.0 we are also looking at alternate ways to get\set properties and call constructors that do not use Emit but are just as fast

@steveharter Is that going to be a new CLR feature? I am interested because it should benefit many projects which now depend on Emit for performance.

@layomia
Contributor

layomia commented Jul 23, 2021

With JSON source generation, we're able to generate code that statically invokes type members without reflection, so this perf issue should be mitigated. It would be good to take measurements that verify this. This work won't stop the .NET 6 release, so I'll move this issue to .NET 7.
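As a rough sketch of the usage shape (the POCO and context names below are illustrative):

    using System.Text.Json;
    using System.Text.Json.Serialization;

    public class MyPoco
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // Partial context class; the source generator supplies the implementation at compile time.
    [JsonSerializable(typeof(MyPoco))]
    public partial class AppJsonContext : JsonSerializerContext
    {
    }

    // Deserializing straight from the response stream, with no runtime reflection:
    // MyPoco poco = await JsonSerializer.DeserializeAsync(stream, AppJsonContext.Default.MyPoco);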

@layomia layomia modified the milestones: 6.0.0, 7.0.0 Jul 23, 2021
@klogeaage
Author

klogeaage commented Jul 29, 2021

@layomia I have updated my small test app to the latest versions of Newtonsoft, Xamarin and .NET 6.0.0-preview.6.21352.12, and things are looking a bit better. On an iPhone 12, I now consistently see that System.Text.Json is 15-20% faster - previously the performance was roughly the same. On a Samsung phone (SM-A505FN), the performance is now roughly the same, sometimes 3-5% faster, sometimes 3-5% slower. However, on a Samsung tablet (SM-T510), it is still consistently 30-40% slower.

So while this certainly is better, there still seems to be room for improvement, at least on certain Android devices.

@layomia
Contributor

layomia commented Jul 29, 2021

@klogeaage thanks for the feedback! Did you try using the JSON source generator for your tests? https://devblogs.microsoft.com/dotnet/try-the-new-system-text-json-source-generator/

@klogeaage
Author

@layomia I have now updated my test with the new JsonSerializerContext and I'm happy to report that it has at least doubled performance compared to Newtonsoft on both of my Android devices, both during startup and for the subsequent tests. 👍

On iOS, the picture is the same: it is consistently ~43% faster than Newtonsoft.

These results of course also mean it is much faster than System.Text.Json in .NET 5.0. I was also glad to see that support for the System.Net.Http.Json package that my real application uses is already available. So good work, and highly recommended.

Your blog entry was very thorough and well laid out. However, I have a few things that you might consider for a future update:

  1. Since this is Xamarin, I was targeting .NET Standard 2.1, which caused compilation errors in the generated code. The workaround is not mentioned in your article: explicitly setting C# to version 9.0 in the project file was the solution, i.e. add
	<PropertyGroup>
		<LangVersion>9.0</LangVersion>
	</PropertyGroup>
  2. It took me a little while to figure out that I had to specify [JsonSerializable(typeof(IEnumerable<Journey>))] and then use the generated property IEnumerableJourney to deserialize a list (see the sketch after this list). This is such a common scenario that it might deserve a specific mention.

  3. This is a very small thing, but I was quite puzzled by the fact that the compiler generates the additional source code on the fly, as opposed to earlier source code generation tools, where an external tool is invoked and the generated source code is added to your solution and stored under source control.
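Regarding point 2, the registration I ended up with looks roughly like this sketch (the context name is my own; Journey is a type from my domain model):

    using System.Collections.Generic;
    using System.Text.Json;
    using System.Text.Json.Serialization;

    // Registering the collection type makes the generator emit a strongly typed
    // property whose name concatenates the type names: IEnumerableJourney.
    [JsonSerializable(typeof(IEnumerable<Journey>))]
    public partial class JourneyJsonContext : JsonSerializerContext
    {
    }

    // Usage against the response stream:
    // IEnumerable<Journey> items = await JsonSerializer.DeserializeAsync(
    //     stream, JourneyJsonContext.Default.IEnumerableJourney);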

Since this is so easy to add to a solution using System.Text.Json, I will close this issue. Thanks.

@ghost ghost locked as resolved and limited conversation to collaborators Sep 24, 2021