If you try to read a JSON file larger than about 2 GB, you will probably run into a .NET OutOfMemoryException before long. We had a similar situation at hand: a JSON file of 100,000 records that we had to read and parse for our unit tests.

The environment in this case was a Windows machine running a 64-bit operating system with 16 GB of RAM. .NET could save this large file with no hiccups; reading it back was the problem. Every attempt to load the file threw a System.OutOfMemoryException.
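
For context, the read that kept failing looked roughly like the sketch below: load the whole file into one string, then deserialize it in a single call. The record type, file path and the use of Json.NET's JsonConvert are assumptions for illustration, not the exact code we had.

using System.Collections.Generic;
using System.IO;
using Newtonsoft.Json;

// Hypothetical record type standing in for the actual test data.
public class DataRecord
{
    public int Id { get; set; }
    public string Payload { get; set; }
}

public static class LargeFileReader
{
    public static List<DataRecord> ReadAll(string path)
    {
        // Reading the whole file into a single string materializes the
        // full 2+ GB in managed memory before deserialization even starts,
        // which is where the System.OutOfMemoryException shows up.
        string json = File.ReadAllText(path);
        return JsonConvert.DeserializeObject<List<DataRecord>>(json);
    }
}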

The eventual resolution was to split the data into smaller chunks and write them out as smaller files, which could then be read back successfully. Here's the code that splits the initial large list into a list of smaller lists: 100,000 records become 20 lists of 5,000 records each. Each of these 20 lists was saved to its own file, bringing the file sizes down to roughly 100 MB each.

// Pair each record with its position, then bucket by index / 5000
// so every group holds at most 5,000 consecutive records.
var listOfSmallerLists = largeDataList.Select((p, index) => new { p, index })
			      .GroupBy(a => a.index / 5000)
			      .Select(grp => grp.Select(g => g.p).ToList())
			      .ToList();
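
The code that writes each chunk out isn't shown above, but it would look something like the sketch below; the file-name pattern, output directory and use of Newtonsoft.Json are assumptions.

using System.Collections.Generic;
using System.IO;
using Newtonsoft.Json;

public static class ChunkWriter
{
    // Writes each sub-list to its own numbered file, e.g. data_000.json, data_001.json, ...
    public static void SaveChunks<T>(List<List<T>> listOfSmallerLists, string outputDirectory)
    {
        Directory.CreateDirectory(outputDirectory);

        for (int i = 0; i < listOfSmallerLists.Count; i++)
        {
            string path = Path.Combine(outputDirectory, $"data_{i:D3}.json");

            // Each file now holds only 5,000 records (roughly 100 MB), which is
            // small enough to deserialize on its own without exhausting memory.
            File.WriteAllText(path, JsonConvert.SerializeObject(listOfSmallerLists[i]));
        }
    }
}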

However, this wasn't enough on its own. .NET was still throwing a System.OutOfMemoryException after reading 8 of the 20 files.

This code lived in a Visual Studio test project. After some research, we found that the project's build platform of "Active (Any CPU)", which can run as either 32-bit or 64-bit, was not enough: the test host was still executing the tests as a 32-bit process, which caps the addressable memory well below what these files required. Forcing the tests to run as x64 finally resolved the System.OutOfMemoryException.

In Visual Studio, this setting can be changed through Test > Test Settings > Default Processor Architecture > x64.
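
If you need the same behaviour outside the IDE (for example on a build server), the architecture can also be pinned in a .runsettings file passed to the test runner. The snippet below is a minimal sketch using the standard TargetPlatform setting.

<!-- example.runsettings: force the test host to run as a 64-bit process -->
<RunSettings>
  <RunConfiguration>
    <TargetPlatform>x64</TargetPlatform>
  </RunConfiguration>
</RunSettings>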