
FS2004 BGL files decomp formulas

Hi all,

A few issues with some code I'm creating to decompile BGL files:
  1. There is a very nice BGL mapping at http://www.fsdeveloper.com/wiki/index.php?title=BGL_File_Format however it seems to be for FSX. Is there a similar page for FS2004?
  2. Is there a more elegant or maybe faster way to convert Little-Endian say to Integer? At the moment I use:
    iValue = vals(0) + (vals(1) * 256)
    If vals.Length = 4 Then iValue += (vals(2) * 65536) + (vals(3) * 16777216)

  3. Converted offsets come out correctly, and latitude is retrieved and converted correctly; longitude, however, comes out all messed up. I'm using the formulas found on the page referred to in point 1.
    I see: (double) Lon = (DWORD) Lon * (360.0 / (3 * 0x10000000) – 180.0)
    My code: iLongitude = ConvertLittleEndianToDouble(vals) * (360.0 / (3 * &H10000000) - 180.0)
    My var is declared as: Dim iLongitude As Double = 0
    Using KCLT (Charlotte NC) the longitude should be: lon="-80.9431251883507"
    When I use the previous formula I get: Longitude: -5474173486.40474
    Is the formula correct? What do you guys use?
  4. Now about the ICAO. Referring to: http://www.fsdeveloper.com/wiki/index.php?title=BGL_File_Format#ICAO_Identifiers_and_region_codes. The code is calculated starting from the left: the value of the first digit/letter is multiplied by 38, then the value of the next digit/letter to the right is added, the sum is multiplied by 38, and as long as there are more digits/letters this process is repeated. Several questions:
    - Is this to be applied to Little-Endian or to Big-Endian?
    - Is this to be applied to digits/letters or to bytes?
    - Do I multiply by 38 or 0x38?
    - What do I do with the resulting value to convert it to text?
    This section is very confusing and atm I'm dead lost.
Any input will be very much appreciated,
Patrick
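The multiply-by-38 packing quoted in point 4 can be sketched in Python (the helper names here are illustrative, not from any SDK):

```python
# Character codes per the wiki: blank = 0, digits '0'-'9' = 2-11,
# letters 'A'-'Z' = 12-37.
def icao_char_code(ch):
    if ch == " ":
        return 0
    if ch.isdigit():
        return 2 + int(ch)
    return 12 + (ord(ch) - ord("A"))

def encode_icao(ident):
    value = 0
    for ch in ident:  # left to right: multiply by 38, then add the next code
        value = value * 38 + icao_char_code(ch)
    return value

# encode_icao("KCLT") == 1228305, which is 0x0257C221 >> 5
# (the stored DWORD keeps the low 5 bits for other data)
```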
 
Hi Patrick
1 - Yes, the wiki page is based on findings in FSX. However, there is a document by Winfried Orthmann (which I'm attaching) that was written for FS2004 (if I'm not mistaken).

2 - In C#, to read an integer you can use the BitConverter class.
For example: let's say you have read 4 bytes from the file into a byte[] array. Then BitConverter.ToInt32(array, 0) returns the corresponding integer.
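The same idea sketched in Python with the standard struct module (just the conversion, not tied to any particular BGL reader):

```python
import struct

# Interpret 4 raw bytes as a little-endian unsigned 32-bit integer ("<I").
raw = bytes([0x2E, 0x24, 0x35, 0x0D])
value = struct.unpack("<I", raw)[0]
# value == 0x0D35242E == 221586478
```

int.from_bytes(raw, "little") gives the same result without a format string.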

3 - So at the longitude offset in your file you should have 4 bytes like 0x2E, 0x24, 0x35, 0x0D, which gives a DWORD = 0x0D35242E (hex) = 221586478 (decimal).
360.0 / (3 * 0x10000000) = 360.0 / 805306368 = 0.0000004470348358154296875.

Hence (221586478 * 0.0000004470348358154296875) - 180 = -80.943125188350677490234375, and that's KCLT's longitude. Note that the 180.0 is subtracted after the multiplication; in your code it sits inside the parentheses with the scale factor, so the whole DWORD gets multiplied by roughly -180, which is why the result blows up.
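Checking that arithmetic in Python (an illustrative sketch, not production parsing code):

```python
# Longitude from the 4-byte little-endian DWORD, per the formula above.
raw = bytes([0x2E, 0x24, 0x35, 0x0D])
dword = int.from_bytes(raw, "little")            # 221586478
lon = dword * (360.0 / (3 * 0x10000000)) - 180.0
# lon is approximately -80.9431251883507 (KCLT)
```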

4 - Everything is little-endian (Intel format), and it's 38 (decimal, not hex).
So in your file you should have 4 bytes like 0x21, 0xC2, 0x57, 0x02, which gives a DWORD = 0x0257C221.
0x0257C221 is first shifted 5 bits to the right, which gives 0x0012BE11 = 1228305.
- 1228305 is >= 38 so 1228305 % 38 = 31 and (1228305 - 31)/ 38 = 32323
- 32323 is >= 38 so 32323 % 38 = 23 and (32323 - 23) / 38 = 850
- 850 >= 38 so 850 % 38 = 14 and (850 - 14) / 38 = 22
- 22 is < 38 so that is the last value.
So we got 31,23,14 and 22
- 22 is in the range [12 - 37] so letter = 'A' + (22-12) = 'K'
- 14 is in the range [12 - 37] so letter = 'A' + (14-12) = 'C'
- 23 is in the range [12 - 37] so letter = 'A' + (23-12) = 'L'
- 31 is in the range [12 - 37] so letter = 'A' + (31-12) = 'T'
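Collected into a small Python sketch (the function name is illustrative), the whole decode looks like:

```python
def decode_icao(dword):
    value = dword >> 5            # drop the low 5 bits first
    codes = []
    while value:                  # repeated divmod by 38, yields codes right-to-left
        value, code = divmod(value, 38)
        codes.append(code)
    chars = []
    for code in reversed(codes):  # reverse to read left-to-right
        if code == 0:
            chars.append(" ")                        # blank
        elif 2 <= code <= 11:
            chars.append(chr(ord("0") + code - 2))   # digit
        else:
            chars.append(chr(ord("A") + code - 12))  # letter
    return "".join(chars)

# decode_icao(0x0257C221) == "KCLT"
```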

Hope this helps!
 

Attachments

Hi Patrick,

Thank you so much, I didn't expect such a complete response; you saved my day. It seems a few things in the wiki were misleading, like the fact that the ICAO explanation was for encoding, not decoding. On first reading, everything you explain makes good sense. Back to my coding to apply the modifications and test.

Again, thank you very much for your help and for the file,

Patrick (the other one :))
 
Hey Patrick, do you know why I'm getting the same longitude from AIRPORT subsection data and AIRPORTSUMMARY subsection data but I'm getting a different latitude? Am I getting my offsets wrong or is it that way in BGL?

Thanks,
Patrick
 
I don't really know. What file are you working on ? Can you send it to me?
 
Ok, here is the file. It was created from ADE. Thanks for looking into it.
Thanks also for the tips you gave me. I now get all the data I needed.
 

Attachments

An FYI for anyone reading this thread:
In the wiki http://www.fsdeveloper.com/wiki/index.php?title=BGL_File_Format#Name there is a statement for the Name subsection as follows, "This subrecord seems to be present in every airport record, and it is always the first one immediately after the fixed part. (of the Airport subsection)"
This statement proves to be erroneous. As you can see in this screenshot, I found in a BGL file a Delete record between the Airport fixed part and the Name subrecord. It is possible that the order of fields in the XML file used to generate the BGL file is responsible for this positioning. It's no big deal anyway, but just be aware of it.

Untitled-1.jpg
 
Your latitude and longitude are ok.
At offset 0x214 and 0x218, you have the Airport Record values:
00000214 Airport Longitude 2E 24 35 0D
00000218 Airport Latitude 09 5F BD 09

At offset 0xFB3C, you have the AirportSummary values:
0000FB3C Longitude 2E 24 35 0D
0000FB40 Latitude 09 5F BD 09

They both match.

Sorry! :-)

Btw, is your BGL file for FS2004 ?
 
Every step brings new problems, or solutions to be found...and I certainly am looking for a solution to this one. I have flipped the situation every way I can think of and can't figure out the following issue:

I'm working on the NameList data section. I have broken the BGL file down into a text file to make the mapping more readable; at the end of that file you'll find the NameList section. The file is from the FS2004 Scenery\Same folder.
I read in the wiki and in the PDF that in the NameList section, ICAO records hold a "Country name index, State name index, etc." I suppose those are indexes into the lists of countries, cities, etc. from the previous entries (right after the NameList headers). Is that correct, or am I misunderstanding the meaning of those indexes?

Assuming I am correct, I do not understand the following indexes:

Untitled-1.jpg


Shouldn't they match? Shouldn't the city and the airport name have the same index in their respective lists?
I could understand if they didn't when several airports are in the same city, but that is not the case here.
In code I load country, state, city, and name into lists, then associate them to an ICAO using those indexes, and of course I get them all wrong.
Please help me understand where I'm going wrong.

Thanks,
Patrick
 

Attachments

Also in the NameList ICAO record I see:
0x022  WORD  bits 0-3: unknown
             bits 4-15: state name index

I'm not a byte/bit person. How do I extract the value of bits 4-15? The solutions I found online all involve conversion, a loop, then more conversion. Is there a direct way in .NET?

Thanks
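One direct way, with no loops: shift right by 4 and mask off 12 bits. Sketched in Python; in VB.NET the equivalent expression would be (w >> 4) And &HFFF:

```python
w = 0x1234                    # example WORD value (made up for illustration)
unknown = w & 0xF             # bits 0-3
state_idx = (w >> 4) & 0xFFF  # bits 4-15
# for w = 0x1234: unknown == 0x4, state_idx == 0x123
```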
 
Patrick, thank you so much for all your help; I wouldn't have figured all that out without your great advice.
I had mistaken the ICAO section indexes for offsets, when they are actually indexes into the lists of countries, cities, etc.
All works fine now and I am very pleased with the results.
My debug says:
BGL files processed: 15611
Airport files processed: 1518
Airports found : 23760
Process time: 8.208 seconds

If I ever visit Canada I'll take you out for a nice serving of poutine (which I cook on a regular basis, though I replace the curd cheese with Swiss cheese), yum yum. ;)
 
Hi, I hate to bug you again, but it seems I spoke too soon. While looking at some of the collected data I noticed multiple copies of the same ICAO.
File: Scenery\Ocen\scenery\AP906250.BGL
Hawaii, 4 airports
I get the following data:
Hexa
E1 3E 0B 02
A1 3E 0B 02
41 3F 0B 02
61 3F 0B 02

Values obtained by processing the hex with the formula (collected by stepping through the code):
7 4 20 19
5 4 20 19
10 4 20 19
11 4 20 19

Resulting ICAO:
HI25
HI23
HI2
HI2

Here is my code:
Code:
      Dim lstBytes As New List(Of Byte)
      Dim sICAO As String = ""
      Dim iVal As Byte = 0

      ' Shift 5 bits to the right
      iValue = iValue >> 5

      Do While iValue > 37
         iVal = Convert.ToByte(iValue Mod 38)
         lstBytes.Add(iVal)
         iValue = (iValue - iVal) \ 38
         If iValue > 1 And iValue < 38 Then lstBytes.Add(CByte(iValue)) ' Last byte
      Loop

      For n = lstBytes.Count - 1 To 0 Step -1
         iVal = lstBytes(n)
         If iVal > 11 And iVal < 38 Then
            sICAO += Chr(65 + iVal - 12)
         ElseIf iVal > 1 And iVal < 10 Then
            sICAO += Chr(48 + iVal - 2)
         End If
      Next

      Return sICAO

I stepped through it with your example (KCLT) and got the exact same values everywhere. So I'm at a loss; no idea why I'm getting the values 10 and 11 in the last two ICAOs that translate to HI2.
What do you think?
Thanks,
Patrick
 
Hum hum :rolleyes:
And where are the values 10 and 11 handled in your code? Nowhere. Only the ranges [12-37] and [2-9] are processed in your code.

As per the wiki:
blank        0
digits 0-9   2-11
letters A-Z  12-37

The fix is:
If iVal < 1 Then
   ' 0 = blank
   sICAO += Chr(32)
ElseIf iVal > 1 And iVal < 12 Then
   ' 2 to 11 = digits
   sICAO += Chr(48 + iVal - 2)
Else
   ' 12 to 37 = letters
   sICAO += Chr(65 + iVal - 12)
End If

That should give you HI28 and HI29
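A quick Python check of the corrected mapping against the Hawaii DWORDs from the earlier post (a sketch only; the function mirrors the VB logic above):

```python
def decode_icao(dword):
    value = dword >> 5                # drop the low 5 bits
    codes = []
    while value:                      # repeated divmod by 38
        value, code = divmod(value, 38)
        codes.append(code)
    out = []
    for code in reversed(codes):
        if code < 1:
            out.append(" ")                        # blank
        elif 2 <= code <= 11:
            out.append(chr(ord("0") + code - 2))   # digit
        else:
            out.append(chr(ord("A") + code - 12))  # letter
    return "".join(out)

idents = [decode_icao(d)
          for d in (0x020B3EE1, 0x020B3EA1, 0x020B3F41, 0x020B3F61)]
# idents == ["HI25", "HI23", "HI28", "HI29"]
```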
;)
 
GRRRR :banghead: Why can't I ever see the obvious? I think it's time for stronger glasses again, or maybe a brain implant.
Glad you don't have that problem.
Thank you.
 
Patrick, I hate abusing your time, but I see you are a C# expert. Could you take a look at this short code at your convenience and see if you can spot why it works well on any form, yet when used on a form opened with frm.ShowDialog it closes the form when clicking any button? No rush; if it's not obvious to you, don't worry, I'll just look for another solution.
http://www.lyquidity.com/devblog/?p=136

Thanks,
Patrick
 