Hi all,
A few issues with some code I'm creating to decompile BGL files:
- Very nice BGL mapping at http://www.fsdeveloper.com/wiki/index.php?title=BGL_File_Format; however, it seems to be for FSX. Is there a similar page for FS2004?
- Is there a more elegant or faster way to convert little-endian bytes to an Integer? At the moment I use:
iValue = vals(0) + (vals(1) * 256)
If vals.Length = 4 Then iValue += (vals(2) * 65536) + (vals(3) * 16777216)
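(In case it helps anyone reading along: a minimal sketch using .NET's built-in BitConverter, which reads the bytes as little-endian on a little-endian machine and avoids the manual multiply-and-add; "vals" is assumed to be a Byte() slice taken from the BGL file.)

```vbnet
Imports System

Module LittleEndianDemo
    Sub Main()
        Dim vals As Byte() = {&H12, &H34, &H56, &H78}

        ' 4-byte little-endian unsigned value
        Dim u As UInteger = BitConverter.ToUInt32(vals, 0)   ' &H78563412 = 2018915346

        ' 2-byte little-endian unsigned value
        Dim w As UShort = BitConverter.ToUInt16(vals, 0)     ' &H3412 = 13330

        Console.WriteLine(u)
        Console.WriteLine(w)
    End Sub
End Module
```

Using unsigned types (UInteger/UShort) also sidesteps the overflow you can hit when the top byte is 128 or more and you multiply it by 16777216 into a signed Integer.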
- Converted offsets come out correctly, and latitude is retrieved and converted correctly; however, longitude comes out all messed up. I'm using the formulas found on the page referred to above.
I see: (double) Lon = (DWORD) Lon * (360.0 / (3 * 0x10000000)) - 180.0
My code: iLongitude = ConvertLittleEndianToDouble(vals) * (360.0 / (3 * &H10000000) - 180.0)
My var is declared as: Dim iLongitude As Double = 0
Using KCLT (Charlotte NC) the longitude should be: lon="-80.9431251883507"
When I use the previous formula I get: Longitude: -5474173486.40474
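(A minimal sketch of what I think the conversion should look like, with the - 180.0 applied after the multiplication rather than inside the multiplier; the raw value for KCLT is derived here just to round-trip the check, not read from a real file.)

```vbnet
Imports System

Module LonDemo
    ' Assumed form: Lon = dword * (360.0 / (3 * &H10000000)) - 180.0
    Function DecodeLon(raw As UInteger) As Double
        Return raw * (360.0 / (3.0 * &H10000000)) - 180.0
    End Function

    Sub Main()
        ' Hypothetical raw dword for KCLT, built by inverting the formula:
        ' raw = (lon + 180) / 360 * 3 * &H10000000
        Dim raw As UInteger =
            CUInt((-80.9431251883507 + 180.0) / 360.0 * 3.0 * &H10000000)

        Console.WriteLine(DecodeLon(raw))   ' approximately -80.94312...
    End Sub
End Module
```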
Is the formula correct? What do you guys use?
- Now about ICAO. Referring to: http://www.fsdeveloper.com/wiki/index.php?title=BGL_File_Format#ICAO_Identifiers_and_region_codes. The code is calculated starting from the left: the value of the first digit/letter is multiplied by 38, then the value of the next digit/letter to the right is added, the sum is multiplied by 38, and as long as there are more digits/letters this process is repeated. Several questions:
- Is this to be applied to Little-Endian or to Big-Endian?
- Is this to be applied to digits/letters or to bytes?
- Do I multiply by 38 or 0x38?
- What do I do with the resulting value to convert it to text?
This section is very confusing and at the moment I'm completely lost.
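(For anyone else stuck on the same questions, here's a sketch of how I understand it: the stored dword is read from the file as a normal little-endian integer first, the base-38 arithmetic is then done on that whole value (so endianness no longer matters), the multiplier is decimal 38, and it's applied per character, not per byte. Decoding is the inverse: repeated divide-by-38, mapping each remainder back through the wiki's character table, which I'm assuming is 0 = nothing, 2-11 = '0'-'9', 12-37 = 'A'-'Z'. If the wiki mentions a bit-shift on the stored dword for some record types, that would have to be applied before this step; the sketch covers only the base-38 part.)

```vbnet
Imports System
Imports Microsoft.VisualBasic

Module IcaoDemo
    ' Decode a base-38-packed ident back to text.
    ' Assumed table: 0 = (none), 2-11 = '0'-'9', 12-37 = 'A'-'Z'.
    Function DecodeIcao(coded As UInteger) As String
        Dim s As String = ""
        While coded > 0
            Dim r As UInteger = coded Mod 38UI
            coded \= 38UI
            If r >= 2UI AndAlso r <= 11UI Then
                s = Chr(CInt(r) - 2 + Asc("0"c)) & s   ' digit
            ElseIf r >= 12UI Then
                s = Chr(CInt(r) - 12 + Asc("A"c)) & s  ' letter
            End If
        End While
        Return s
    End Function

    Sub Main()
        ' Encode "KCLT" the way the wiki describes (left to right,
        ' multiply the running sum by 38 at each step), then decode it.
        ' Letters map to 12 + (Asc(c) - Asc("A")); digits would map to 2-11.
        Dim coded As UInteger = 0
        For Each c As Char In "KCLT"
            coded = coded * 38UI + CUInt(Asc(c) - Asc("A"c) + 12)
        Next
        Console.WriteLine(DecodeIcao(coded))   ' KCLT
    End Sub
End Module
```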
Patrick


Why can't I ever see the obvious? I think it's time for stronger glasses again, or maybe a brain implant.