
Image size problems SBuilderX

Braedon:
I find that if you select an area at high zoom (well, any zoom actually) that would result in a file larger than ~2,350,000KB (roughly 2.3GB), SBX just says it can't create the file. At Google zoom 20, that's a very small parcel of land :p.

cheers

Braedon
 

rhumbaflappy (Administrator):
Hi Braedon.

I'm thinking that your needs aren't going to be met by SBuilderX. I'm guessing you want to do photo-real for an extremely large area, and that isn't what SBuilderX was made for. Tiles collected by SBuilderX are mostly copyrighted and cannot be distributed; they are for personal use only. You need a different source of imagery, a GIS program, and to learn to use resample and INF files directly... perhaps with batch files to run the whole works (a minimal sketch follows below).

You're still going to hit the 2GB barrier for resample for each image.
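
For anyone who hasn't driven resample by hand before, a minimal single-source INF and a batch runner look roughly like the sketch below. The file names, paths, and LOD choice are placeholders; check the exact keys against your FSX/P3D Terrain SDK documentation.

    [Source]
    Type = GeoTIFF
    SourceDir = "."
    SourceFile = "tile_A.tif"

    [Destination]
    DestDir = "."
    DestBaseFileName = "tile_A_photo"
    DestFileType = BGL
    LOD = Auto

    :: run_all.bat - feed every INF in the folder to resample, one at a time
    @echo off
    for %%F in (*.inf) do resample.exe "%%F"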
 
Braedon:
I am using map tiles (as opposed to imagery tiles) to generate building-footprint vector features from raster, not to use them directly, and yes, it's an extremely large area at high zoom, to get finer resolution for the raster-to-vector algorithms in Global Mapper. I have free imagery from the Tasmanian Government for my photo-real imagery; hence my PM to you guys about the Google APIs and a tool to gather up tiles without some of the GM feature layers. SBX is the only tool that can fetch and compile the tiles at the moment, via your plugin, but it is onerous doing it in ~500m x 300m chunks for several thousand square kilometres. If SBX could write a BigTIFF with a standard GIS world file, or a BigTIFF GeoTIFF, it would save weeks of work, though it would then have trouble managing a file that size in memory for display.

I do a lot of the heavy lifting in Global Mapper, but there are some things specific to MSFS/P3D that need to be done in SBX, as it is the only tool that does them. The lack of seamless interoperability with standard formats (GeoTIFF, standard world files, etc.) is just another thing that adds to the workload (a world-file example follows below). Thus my interest in the potential for a v4.0...
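
For readers following along: the "world file" mentioned above is just six lines of plain text sitting next to the image (e.g. a .tfw alongside a .tif). A hypothetical example for a north-up image at 45cm/pixel; the coordinates are made up, and the annotations in parentheses are not part of the file:

    0.45         (pixel width in map units)
    0.0          (row rotation)
    0.0          (column rotation)
    -0.45        (pixel height, negative because rows run north to south)
    512000.225   (x map coordinate of the centre of the top-left pixel)
    5252000.775  (y map coordinate of the centre of the top-left pixel)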
 

Luis_Sá (Resource contributor):
Hello,

Just saw this right now. I am not sure how large GeoTIFF files can be in .NET, and I do not remember whether GeoTIFF files can be the source for photo scenery produced by resample. If large files are possible and resample accepts GeoTIFFs, it would be very easy to generate a big GeoTIFF from cached tiles (probably downloaded with the "sister" tool of SBuilderX, called GetTiles). However, it seems to me that if you generate very large photo scenery from a single file, distortions may appear (I am not sure). I am ready to elaborate on this further, 3 weeks from now!

Regards, Luis
 
Braedon:
GeoTIFFs can follow the BigTIFF format, which overcomes the 4GB limit of the standard TIFF format, though resample's limit seems to be 2GB (TIFF or GeoTIFF). But as with all things GIS, you can develop with a BigTIFF and tile it into resample-sized chunks (potentially with SBX, but I tend to use a GIS program for that at the moment) as the final step before resample; a couple of example commands follow below.
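
As an aside, that chunking step can also be scripted with GDAL's gdal_translate, if you have GDAL installed; the file names and window offsets below are made-up placeholders:

    :: Cut a 20000x20000-pixel window from the BigTIFF mosaic, starting at pixel (0,0),
    :: as an uncompressed classic GeoTIFF that resample can read.
    gdal_translate -srcwin 0 0 20000 20000 -co COMPRESS=NONE big_mosaic.tif chunk_0_0.tif

    :: Same window, LZW-compressed, explicitly forbidding BigTIFF output.
    gdal_translate -srcwin 0 0 20000 20000 -co COMPRESS=LZW -co BIGTIFF=NO big_mosaic.tif chunk_0_0_lzw.tif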

Currently I use SBX to gather the tiles and use them as a backdrop for generic building placement, but (as an example) I recently used it to get Google map tiles with building footprints, via the Google API v3 no-roads tile server made by Dick, to get a very large area of clean building shapes at zoom 20 and then process them raster-to-vector in Global Mapper and scenProc. The most time-consuming part was gathering 3000 sq. km of maps in tiny chunks (1000+), individually georeferencing them by transcribing the text-file details into the georeferencing window in Global Mapper, mosaicking them into one large mosaic, and then doing the raster-to-vector conversion. The format of the text file with the geographic coordinates is not one of the standards that any GIS software I have found will read. I think Arno has made scenProc able to read it, but that's another step again, and scenProc doesn't tile massive quantities of data the way GIS programs are specifically built to.

Flicking backwards and forwards between software designed to interface with FSX (like scenProc and SBX) and GIS software is always going to be a daunting prospect, given the different priorities and strengths of the two, so to a degree that is a given, but anything that makes it a little less tedious would be good. As people do larger areas as photo-real, being able to handle large datasets is becoming more important.

Cheers and thanks

Braedon
 
Braedon:
G'Day Holger,

Judging by the posts, and the voices of experience that have translated into gems of advice elsewhere in this forum, others here have had bad experiences with LZW-compressed (Geo)TIFFs. We may all be making the same basic error, but for anything but relatively small areas resample fails; my best educated guess is that it fails on GeoTIFFs that expand to more than about 2GB in RAM.

cheers

Braedon
 
GaryGB:
Hi Braedon:

[EDITED]

As I understand it, you are working with some very large file sets that will in part be submitted to SDK Resample, so I'd like to ask whether you would consider performing the quick test of a hypothesis I proposed to Holger in the latter part of my post (in the original thread where this sub-topic discussion arose, before it was split off into this separate thread)? ;)

http://www.fsdeveloper.com/forum/threads/sbuilderx-source-code.441176/#post-785394

[END_EDIT]


I'm hopeful you might also help with testing whether we 'may' be able to extend 64-Bit USERVA for SDK Resample task sessions. :)

GaryGB
 
Braedon:
Will give it a try on the weekend. Given that resample runs in (and is potentially limited by) the Windows command environment, is there going to be any limitation on memory availability caused by that?

i.e. does resample.exe run in its own memory space, or in the command environment's memory space?

cheers

Braedon
 
GaryGB:
Hi Braedon:

Mark Russinovich has an excellent dissertation on that subject here: :idea:

https://blogs.technet.microsoft.com...pushing-the-limits-of-windows-virtual-memory/

AFAIK, for applications built in a compatible Portable Executable (aka "PE") format, the "4GB Patch" reportedly enables a theoretical maximum 4GB USERVA in 64-bit Windows, and a theoretical maximum 3GB USERVA in 32-bit Windows (when the system is configured to enable that).
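
FWIW, if you have the Visual Studio command-line tools installed, the same LARGEADDRESSAWARE flag that the "4GB Patch" toggles can be inspected and set from a VS developer prompt (on a backup copy of the executable, of course):

    :: A patched binary reports "Application can handle large (>2GB) addresses".
    dumpbin /headers resample.exe | findstr /i "large"

    :: Set the flag directly (same effect as the 4GB Patch).
    editbin /LARGEADDRESSAWARE resample.exe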


BTW: A good discussion on another SysInternals utility to map process memory usage is here:

http://blogs.microsoft.co.il/sasha/2016/01/05/windows-process-memory-usage-demystified/


GaryGB
 
Braedon:
Apologies, I had an impromptu spring clean thrust upon me this weekend, so I did not get a chance to export my imagery a little larger and test. Will do so ASAP.

cheers

Braedon
 
GaryGB:
Thanks Braedon; I'll look forward to your further reply, when you get some time for this. :)

GaryGB
 
Braedon:
OK, we've sort of hijacked this thread a bit, but I've started some testing with the standard and 4G-patched P3D v1.4 resample.exe.

The test is with a 3x4 grid of 30000x30000px uncompressed GeoTIFFs of the Tasman Peninsula at 45cm resolution, the largest being 2.6GB and the ones around the eastern and southern edges about 1.2GB. Windows 10 Pro x64, 32GB RAM. Just the base image at this stage, no masks.

No failures with uncompressed GeoTIFFs in either the patched or the standard version. One thing I did notice with the VMMap utility: when the tiles being processed were the smaller ones (~1.2GB), the mapped-file memory showed the file loaded into that classification of memory (you could tell which file it was processing), and the committed memory was about 2.4GB for both resamples. When the files were the ~2.5GB ones, the committed memory was about 70MB and the input file was not visible in mapped memory (??). Windows Task Manager had resample using only about 15-20MB no matter which test I was running. Maybe resample/Windows handles the memory differently with the bigger files.

Compressed TIFFs at 30Kx30K and 35Kx35K, and an uncompressed set at 35Kx35K, are being exported now. Will test with those when they have finished exporting.

cheers

Braedon
 
GaryGB:
Hi again, Braedon:

Thanks for your help in further testing this. :)

After we get some idea of what 64-bit Windows task-session USERVA is possible with SDK Resample, I anticipate we may next try to get a better idea of what the USERVA limits are when preparing source imagery in SBuilderX.

Hopefully we can identify what SBuilderX USERVA limits are for creating Maps from Background ...versus from Disk. :scratchch

Once we know that, we might identify maximum source file sizes that can safely be used in SBuilderX without OOMs. :idea:

GaryGB
 
Braedon:
OK, preliminary results.

I decided to test with an area about 40km square at 45cm resolution, as per the image below of the Tasman Peninsula in Tasmania.

[Image: Tas-Pen-Test-coverage.jpg]


Started testing with both the 4G-patched resample and the vanilla resample, but as exporting and resampling take a long time, the logic I used was to test the limits of the unpatched vanilla resample first and, once I had reached its limits, see whether the patched resample extended them any.

Using a 2x2 tile set of GeoTIFF imagery at the maximum resolution and file size, I had the following results. File sizes are in old-fashioned (binary) gigabytes, not the new metric ones.

  • Base imagery only, 20000x20000px uncompressed @ 1.12GB each (fixed file size): completed OK
  • Base imagery only, 30000x30000px uncompressed @ 2.51GB each: completed OK
  • Base imagery only, 35000x35000px uncompressed @ 3.42GB each: completed OK
  • Base imagery only, 37750x37750px uncompressed @ 3.98GB each: completed OK
  • Base imagery only, 40000x40000px uncompressed @ 4.47GB each: failed with "Not a TIFF file, bad version number 43 (0x2b)." immediately on loading the first source
  • Base imagery only, 40000x40000px uncompressed @ 4.47GB each, 4G-patched resample: failed with the same "bad version number 43" error
  • Base imagery + water mask, 37750x37750px uncompressed @ 3.98GB each: completed OK
  • Base imagery only, 40000x40000px LZW-compressed @ <4GB each (variable file size): completed OK
  • Base imagery + water mask, 40000x40000px LZW-compressed @ <4GB each: completed OK

As Global Mapper asked me to export as BigTIFF when exporting the uncompressed imagery at 40000x40000px, I assume the BigTIFF format has a TIFF version identifier of 43, which resample does not understand. At that same resolution none of my LZW GeoTIFF files were BigTIFFs (though one was only just under the threshold), so resample handled them OK.
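
(For anyone who wants to check a suspect file: the TIFF spec puts the version word in bytes 2-3 of the file, 42 for classic TIFF and 43 for BigTIFF. For the usual little-endian "II" files, a Windows PowerShell 5.x one-liner like the one below should print 42 or 43; the file name is a placeholder, and PowerShell 7 would need -AsByteStream instead of -Encoding Byte:)

    powershell -Command "$b = Get-Content 'chunk_0_0.tif' -Encoding Byte -TotalCount 4; [BitConverter]::ToUInt16([byte[]]$b[2..3], 0)"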

I also tested the failing case with the 4G-patched resample, but it gave the same error (not surprisingly, as the failure wasn't a memory issue).

I then tested the FSX SDK resample on the largest working image, and it too worked OK. This was just in case Lockheed Martin had somehow tweaked the v1.4 SDK resample, which it seems they did not.

The main issue I can see with using LZW-compressed TIFF files to glean the most out of the process is that, before you export your imagery, you cannot guarantee that every file will come out below 4GB. At 40K square I had one that was only just under, at 3.97GB. So I don't know how much you really gain from compressed imagery that is only a little larger than the largest valid uncompressed imagery (unless you have a lot of water and fill it with a solid colour so it compresses better).

I then ran the 37750x37750px uncompressed and the 40000x40000px LZW imagery through resample with a blend mask, and both worked just fine, so blend masks don't seem to add to the memory load. Water masks LZW-compress quite dramatically anyway.

The big takeaway: I didn't see the need for the 4G-patched resample, as the limitation appears to be file size/format, not memory use.

DISCLAIMER(S):
  • This test was conducted with a 2x2 grid of tiles at the sizes indicated above, which takes 9-13hrs to resample.
  • Given more time I could stress-test a much larger area and see whether the LZW-compressed files fail with base imagery, water, and blend masks over a 5x4 grid, but that would take about a week of preparation and exporting/resampling.
  • Time comparisons between LZW and uncompressed tests were not taken.
  • The tests were conducted in parallel, with between 1 and 4 running at the same time in their own command windows.

cheers

Braedon
 
Braedon:
I was just thinking: there is one thing I need to check, namely that the largest working tests weren't full of missing data. I've seen resample finish OK before and still leave vast tracts of missing data in the middle of the scene. When I get home tonight I'll set one off again (as I just deleted the outputs yesterday to reclaim some space) and report back.

Braedon
 
GaryGB:
http://www.fsdeveloper.com/forum/threads/sbuilderx-source-code.441176/#post-785394


Considering the statements posted by Sean Isom and Dick earlier in this same thread:

http://www.fsdeveloper.com/forum/threads/sbuilderx-source-code.441176/#post-782734

http://www.fsdeveloper.com/forum/threads/sbuilderx-source-code.441176/#post-782738

...one certainly might wonder whether there may be potential to increase the input, working set, and output data sizes that MSFS / LM-P3D SDK Resample will accommodate.


However, since FS SDK Resample is a stand-alone application which does not rely on external libraries / DLLs etc., I am still curious as to whether we actually may be able to push the envelope of available USERVA for SDK Resample a 'bit' (or many 'bytes' !) ...further. :scratchch



If it would be possible at some point within your limited available time, I would be very grateful if you would test something that might prove useful to the FS Developer knowledge base for Resample. :idea:


After making a ZIP backup of SDK Resample.exe, set the "LARGE_ADDRESS_AWARE" flag in the original Resample.exe by applying the "4GB Patch" (and specifically not via MS "CorFlags"), following the procedure described recently for patching CvxExtractor.exe in this thread:

http://www.fsdeveloper.com/forum/th...porting-vector-data.432918/page-5#post-783986


BTW: This test uses the '4GB Patch', as I am not certain whether Resample is in a fully compatible Portable Executable (aka "PE") file format that may be patched via MS 'CorFlags', which may or may not be intended only for use with 'certain' Win32/64 GUI executables.

https://en.wikipedia.org/wiki/Portable_Executable


Then, if you would please, submit to that copy of Resample with the "4GB Patch" applied:

* a 'larger than 3GB' (e.g. 4GB?) multi-source INF data set of GeoTIFF files (either LZW-compressed or not)

...to see whether Resample will accommodate that data set and then successfully output a 'larger than 2GB' BGL.
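
(In case it helps anyone reproduce this, a minimal two-source INF might look like the sketch below; the file names are placeholders, and the exact keys should be verified against the Terrain SDK documentation:)

    [Source]
    Type = MultiSource
    NumberOfSources = 2

    [Source1]
    Type = GeoTIFF
    Layer = Imagery
    SourceDir = "."
    SourceFile = "tile_A.tif"

    [Source2]
    Type = GeoTIFF
    Layer = Imagery
    SourceDir = "."
    SourceFile = "tile_B.tif"

    [Destination]
    DestDir = "."
    DestBaseFileName = "big_test"
    DestFileType = BGL
    LOD = Auto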


PS: I hope we 'may' be able to increase the size of both working and output data sets submitted by SBuilderX315 to SDK Resample.


Thanks in advance for considering a quick test of whether we 'may' be able to extend 64-Bit USERVA for SDK Resample task sessions. :)

GaryGB

Please note that a discussion arising on a later sub-topic, which I had initially posed as a question to Holger in my quoted post above:

http://www.fsdeveloper.com/forum/threads/sbuilderx-source-code.441176/#post-785394


...has now been split into this new thread; my comment here was posted shortly after that split was implemented.


Hopefully this restructuring of posted content will allow discussion to proceed with a more appropriate focus on the primary topic intended for each resulting thread. ;)


GaryGB
 
GaryGB:
Hi Braedon:

Thanks for sharing your further test results on SDK Resample resource use. :)

With regard to actual SDK Resample BGL output results, did your tests above at any time generate a BGL larger than 2GB?


Also, when you refer to SDK Resample BGL output results with "Missing_Data" seen in e.g. TMFViewer, were such areas intended to have the "Missing_Data" attribute when the source data was processed in e.g. Global Mapper, prior to export of that data? :scratchch

GaryGB
 
Braedon:
The missing data I was looking for is the undesirable type. I created LOD9-sized BGL output files, so the largest is about 700MB. Will remove the LOD9 output split and test again.
 
Braedon:
OK. I recently exported my TAS-SE scenery and quickly re-did the resample lots to use:
  • 37750x37750 (just under 4GB) uncompressed input files,
  • SU, FA, WI, SP, Water, and Blend masks
  • Output split into LOD9 cells (about 8 full LOD9 cells, and 9 roughly one-third-filled LOD9 cells around the outside)
  • Processing broken into 4 overlapping groups of input tiles: one process generating 4 LOD9 cells at compression 80, and the rest at compression 70 split across 3 processes.
Results:
Running 4 processing lots simultaneously on the same PC, with processor affinity set so that each used 2 of 12 processors without sharing a CPU with another process (one way to script this is sketched below), and with outputs going to different physical disks, the whole run took 3 days. The big limiting factor seemed to be hard-disk contention for the source files, as processor utilisation was minimal and everything else I was doing that used hard-disk time (mainly Global Mapper) was on a go-slow.

I had no failures processing input data of this size through resample. However, the quantity of data that goes into 4 seasons plus blend and water masks for 45cm imagery, at either 80 or 70 compression, across a whole LOD9 cell resulted in some >2GB output BGLs for several of the processes. Needless to say, these were large but invalid. So I stopped those processes and restarted the affected ones with the output split into LOD10 cells. This is still running, as I only restarted them this morning. I will run a separate process to see whether the 4G-enabled resample is OK with the >2GB BGL files. Not that I think I will make any startling discovery, but I just want to cover all the bases.
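
(For reference, the CPU pinning mentioned above can be scripted with the Windows start command; the INF names and hex affinity masks below are made-up examples for a 12-logical-processor machine, each mask selecting one pair of CPUs:)

    :: launch_lots.bat - one resample instance per lot, each pinned to its own CPU pair
    @echo off
    start "lot1" /affinity 0x3  resample.exe lot1.inf
    start "lot2" /affinity 0xC  resample.exe lot2.inf
    start "lot3" /affinity 0x30 resample.exe lot3.inf
    start "lot4" /affinity 0xC0 resample.exe lot4.inf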

cheers

Braedon

[Image: TAS-SE-LOD9-GRID.jpg]


Note: This is not the most efficient way to achieve the output I was after, but it was the quickest way to get the test ready from the previous state of my files. I'll be exporting in LOD10 (i.e. much smaller) grid squares in future, so there won't be any need for overlap, but then I won't be testing large input files either... :p
 