Need help demystifying XML needle pointing and nonlinearity table correlations.

I'm about to knit this new XML sweater with a needle... let's call it a fuel gauge. I've worn out far too many old threads and the now-vintage FS2X tutorials, which leave some things to be desired in the way of full explanations, so I will attempt to provide some here to help future-proof the topic while also posing some questions to ponder in hopes of generating some discussion.

Currently, I'm a little confused about XML needle pointing and what to use, particularly when you have a needle that may need to point West or North (or really anything other than East, and anything that requires counter-clockwise movement). I see many gauges that seem to use "PointsTo=East" for everything and nothing else, which adds to the confusion when there are three other directions available that intuitively should see much more use. I would also think west-pointing needles which move clockwise would be the most common, but I haven't found many, so I'm wondering why that is. Is overuse of East-pointing needles and the PointsTo=East command just an "easy way out" coding habit that works for a lot of directions simply because it's what people have copied from one another over the last 20 years, or are there actually more "proper" ways to do these things? Why don't I ever see the other three cardinal directions in use, with their coding reflecting that use?

I'm hoping to get to the bottom of this once and for all. I have only found 4 or 5 older threads, and while they give some code examples, the main issue for me is that none of them ever teach the HOWs and WHYs that let you really understand the full process of needle pointing. Every time I come back to it I quickly glaze over staring at code examples, especially when I can't mentally connect how and why something works one way but not another that seems so similar. I want to preface this by saying the theory I'm about to dive into might not be status quo or fully correct, but I think it is, and I think you are going to easily see what I'm getting at and how this should actually be intuitive in its application.

I also get that the coding in the nonlinearity table always reflects clockwise movement, and that the table can be reversed to change direction, but that is not my only issue, and I really want to take a logical stab at analyzing this from the top down.

So let's begin:

As I understand it, you should first create a needle bitmap graphic in whatever physical orientation most closely reflects its intended movement. That would of course make the most logical sense. However, I often see many gauges where people DO NOT do this, which I can only assume is to save a little time and use just one needle image per gauge instead of two, three, or more. The issue for me is that it becomes very confusing when trying to learn to code. Regardless, if using more than one needle for purposes of better coding, intuition, and illustration, I have to assume you would also want the PointsTo= variable to reflect that same exact orientation. Thus any vertical graphic with the needle point set on top (0 degrees / north) should intuitively be created for a needle that, when in use, will mainly point up and will primarily use some portion of the top semi-circle of a gauge dial. In that case you would similarly intend to use PointsTo=North in the XML code (see the included drawing example I made to follow along). I think I've read elsewhere that this is how to start.

As an example, a north-pointing needle may have its nonlinearity table set to start on the left side of a gauge, say at 270 degrees / west (for zero percent fuel, zero gallons, zero amps, etc.), and end at 90 degrees / east (for 100 percent fuel, x-full gallons, etc.). I would assume you want a needle graphic and "PointsTo" that sits somewhere in between your two extremes and not fully to one side or the other. Even though I suppose it could work, I don't think it would make sense to use one extreme or the other (such as east or west). Regardless, if just using an East needle graphic for everything, then presumably you could also just use a West-facing needle graphic for the same purpose.

Once again, it seems a lot of coders use East exclusively, and I just want to know why, unless all gauges typically can use that assignment. As I discuss below, West and North needles are intuitively more cumbersome to figure out, but if we can demystify their use then maybe they will no longer appear that way to myself and others who are looking toward the longer term of FS gauge coding now that we have a new sim this year. In the interest of common sense and intuition, you would think you would always want the orientation that best represents the midpoint between your two value-range extremes, and never one that is on (or very close to) either extreme. After reading various posts on this topic, many of which go unanswered, this seems to be something most people struggle with. I made a graphic, attached, which you can follow along with.

So let's assume the following four things are most often true for the four possible needle orientations of North, South, East, or West:

A -- a vertical (North-pointing) needle will most likely need to sweep from approximately 270 degrees (left/west) to 90 degrees (right/east) in a CLOCKWISE fashion, fully sweeping left to right like the speedometer in your car.

B -- a South-pointing needle would do the same, from 270 left to 90 right, but in a COUNTER-clockwise direction, sweeping along the bottom of a semi-circular arc while moving left to right. (I assume this is a rare use in FS.)

C -- an East-pointing needle would generally sweep up from 180 degrees / south (for empty, off, zero fuel, etc.) to 0 / north (full, on, 100%, etc.), thus moving in a COUNTER-clockwise direction.

D -- a West-pointing needle would visually point west and sweep from 180 degrees / south to 0 / north in a CLOCKWISE direction, like a North needle does.


So in these four scenarios we have two needles (A-North and D-West) that move CLOCKWISE and two needles (B-South and C-East) that move COUNTER-clockwise. I think these are the only four possibilities you will ever encounter.

They all overlap each other, of course (90 degrees to each side of their cardinal heading), but if you mentally assign each one a 180-degree arc of movement then this makes for a logical illustration of which is best to use for your coding needs.

*Note: A caveat here is that this does not (yet) cover needle movements which may require full 360-degree rotation, like an Altimeter, Airspeed Indicator, ADF, RMI, etc. -- only gauge needles with a limited sweep of 220 degrees or less, and most commonly 180 or less.


Based on those four assumptions, I assume most everyone will have the greatest trouble understanding the next two scenarios below, because of their non-intuitive positioning (Position=) in relation to the axis of rotation (<Axis X= Y= />).

For this example, let's say you have a needle graphic that, when drawn East or West, measures X=50 Y=25; when the graphic is drawn vertically for a North or South orientation, the values are X=25 and Y=50 (thus swapped).

So the big difference between all four is their Position= values (i.e., coordinates). These will all be very different if you always have to position and reference each of them from the top-left corner of the graphic bitmap.


Now let's look at the two which are most confusing and least intuitive:

Scenario 1 -- A West needle. If you have a needle that needs to primarily face west and sweeps from "0/empty" at 180 degrees south up to 0 degrees north, how is it coded using a bitmap that is also graphically drawn pointing west?

Scenario 2 -- A North needle. If you have a needle that needs to primarily face north and sweeps from "0/empty" at a 270-degree west position to 90 degrees east, how is it coded using a bitmap that is also graphically drawn pointing north?

Why are these two particular scenarios much more confusing than an East or South needle? Because if you must always use the top-left corner of the needle bitmap for its Position= coordinates relative to the main bitmap it sits on, the axis of rotation for these two needles in particular will always be on the far opposite end from where you are anchoring them with Position=. With the Position point and the axis point on far opposite ends from one another, it takes a lot more "careful math" to properly calculate and place the axis reference point of these two needle types. What may have been only counting 15 or 20 pixels with an East or South needle may become 50 to 60 or more with a North or West needle.

With these two scenarios in particular, the Axis X value for needle rotation is calculated at a much longer distance from the needle point than it would be for any East or South pointing needle, because with East / South the axis is always set much closer to the anchoring Position= point of the graphic. This is why I think people probably tend to just use East needles. And here is where I'm even more confused: I swear I've seen instances of East, West, and North needles in use where the gauge works but the axis values are all exactly the same... like X=10, Y=13, for example. I don't know how or why that works, because it just doesn't make any sense. Logically speaking, and as I've just gone to lengths to illustrate and explain, West and North needles should have a much greater axial X distance for horizontally drawn West needle bitmaps, and a taller axial Y distance for vertically drawn North needle bitmaps, and the Axis section should always reflect that. Am I somehow wrong with this assumption? I don't think so, but again, this is a head-scratcher that I haven't nailed down yet. I'll amend this document once I do.

If the two scenarios above are true, it will always mean any East and South needles (again, both drawn that way and intentionally used and positioned that way) will have much smaller Axis X/Y values than any West or North needles, because the axis points of all East / South needles are much closer to the top-left Position= coordinates and are therefore much easier to calculate quickly. Again, this is my hunch as to why you see PointsTo=East with east-drawn needles used more than anything else.


Now let's look at a visual scenario of all these examples and explore how the two West and North scenarios above are coded, as well as the two East and South ones that could be considered more common. (See attached image.)

Given a base background bitmap of 100 x 100 with a central axis rotation point of 50,50, and a needle bitmap for each of the four cardinal headings with a size of X=50 Y=25 (or X=25 Y=50 for a North or South vertically drawn needle), each with its central axis line at 13 pixels (the halfway point of 25 pixels) on the end opposite the tip, the following Position= and Axis X/Y values should hold if each is properly set and oriented on its true cardinal heading of North, South, East, or West, as referenced in each Position= example below. I've set the Axis for each at the very tail end of the needles (which all meet in the middle of the drawing), so they are essentially 13,1 or 1,13 for East and South but are very different for West and North, as discussed earlier. I also realize that many people give their needles more of a "tail" and embed the axis point deeper into the body of the graphic, but for ease of understanding and illustration I've set the axis at the very end.

A final note that may help some people: always draw a needle graphic's width (across the pivot) as an odd number... 21, 23, 25, 27, 29, and so on, because you need a central longitudinal axis to pivot on, with an equal number of pixels on each side. A needle's total length can be even, but its width, which determines the longitudinal axis and pivot line for rotation, should be odd; hence we are using 25 in this case, with 13 being the central axis line (12+1+12), i.e., 13 counted from either side.

NORTH NEEDLE (red)
Main Image: 100 x 100
Needle Bitmap of: X=25 Y=50
Variable code of: PointsTo=North

Code is then:
<Position X="38" Y="0"/> (X is 50, the centerline of the 100x100 base image, minus 12, half of the needle's 25-pixel width; Y is 0 because the needle starts at the very top)
<Axis X="13" Y="49"/>

SOUTH NEEDLE (yellow)
Main Image: 100 x 100
Needle Bitmap of: X=25 Y=50
Variable code of: PointsTo=South

Code is then:
<Position X="38" Y="50"/> (X is again 50 - 12 to keep the needle's axis on the 100x100 image's centerline, and Y begins at the halfway mark of the 100x100 image)
<Axis X="13" Y="1"/>

EAST NEEDLE (green)
Main Image: 100 x 100
Needle Bitmap of: X=50 Y=25
Variable code of: PointsTo=East

Code is then:
<Position X="50" Y="38"/> (X is the 100x100 halfway mark, and the top edge of the needle begins 38 pixels down from the top to keep the center axis on the 50,50 center point)
<Axis X="1" Y="13"/>


WEST NEEDLE (blue)
Main Image: 100 x 100
Needle Bitmap of: X=50 Y=25
Variable code of: PointsTo=West

Code is then:
<Position X="0" Y="38"/> (X is zero because the needle's left edge begins at the far left side of the 100x100 image, and Y is 38 down, as before, to set the needle's axis line right on the 50,50 centerline of the background image)
<Axis X="49" Y="13"/>


Provided I am correct with all my assessments, these are the four primary calculations, correlations, and relationships to use whenever coding any of the four primary needle positions. Notice that the bitmap X/Y values simply flip when using North/South vs. East/West, and that the Axis values are all very different, as discussed earlier, especially with the West needle having a high Axis X value and the North needle having a high Axis Y value, simply because the axis is set on the far opposite end from the Position= location. These can serve as reminders to check that the ratios of whatever you're working on follow a similar relationship.
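To tie these fragments together, here is how the North-needle numbers above might slot into one complete Element block. This is only a sketch based on my reading of old FS9/FSX-style gauge XML; the bitmap name and the fuel variable are placeholder examples, and the nonlinearity X/Y points are illustrative targets on the 100x100 background, not taken from any real gauge:

```xml
<!-- Sketch: the North needle from the example above, assembled into a full
     Element. Bitmap name, simvar, and nonlinearity coordinates are
     illustrative placeholders. -->
<Element>
    <Position X="38" Y="0"/>                    <!-- top-left of the 25x50 needle bitmap -->
    <Image Name="needle_north.bmp" PointsTo="North">
        <Axis X="13" Y="49"/>                   <!-- rotation point at the needle's tail -->
    </Image>
    <Rotate>
        <Value Minimum="0" Maximum="100">(A:FUEL LEFT QUANTITY, gallons)</Value>
        <Nonlinearity>
            <Item Value="0"   X="0"   Y="50"/>  <!-- empty: aims 270 / west -->
            <Item Value="50"  X="50"  Y="0"/>   <!-- half: aims 0 / north -->
            <Item Value="100" X="100" Y="50"/>  <!-- full: aims 90 / east -->
        </Nonlinearity>
    </Rotate>
</Element>
```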

Other things to possibly discuss in this thread are the few odd scenarios with non-standard counter-clockwise movement: for example, an East needle that needs to move clockwise (up to down), a West needle that needs to move counter-clockwise (up to down), or a South or North needle that needs to move right to left. I think those are extremely rare cases, and maybe this is where you just flip the ascending/descending order of the Item Value= entries in the nonlinearity tables to get opposing movements. At least I think that's how it's done. Hopefully some pros will chime in at some point on this essay.


Now let's move on to what makes the needles actually move: the Nonlinearity table.

With the nonlinearity table you have three styles of movement for telling a needle or other object how to move: an X/Y Coordinate system (very common for needle movements), a Percentage system (more common in full-rotation gauges), and a Radians system. I won't go into how all three work and will just use the de facto X/Y coordinates for quick examples, since that seems to be what the majority of people use, especially for needle movements.

On the code line above a nonlinearity table is the Value line. Here you first assign a Minimum and Maximum value for your gauge Element, and this determines the two extremes of how the gauge will move, whether it be a quantity of something like fuel gallons, fuel percentage, fuel pressure, engine torque, cylinder head temperature, etc. The Minimum and Maximum values are therefore an expression of what your variable limits will be. This could be Value Minimum="0" Maximum="100", or 0/1000, or even 1/10000, and will always depend on the variable in use, which is listed right after the value range itself. I assume this is fairly well understood by most, and there are plenty of other tutorials on the subject, so I won't go into it further. Once that value range is established, you then apply it to your nonlinearity table.
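As a concrete illustration of the min/max idea (hedged: the simvar names are standard FS variables, but the expression itself is a made-up example, not from any particular gauge), a percentage-style value line could look like this:

```xml
<!-- Illustrative only: a 0-100 range fed by a fuel-percentage expression,
     quantity divided by capacity, times 100, in RPN script. -->
<Value Minimum="0" Maximum="100">
    (A:FUEL LEFT QUANTITY, gallons) (A:FUEL TANK LEFT MAIN CAPACITY, gallons) / 100 *
</Value>
```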

So a value data set could look like this:

Descending value percentage order, from 100 to 0, indicating a needle set on the left side or center of a gauge dial with the needle pointing East:

<Item Value="100" X="141" Y="124"/>
<Item Value="50" X="136" Y="173"/>
<Item Value="0" X="113" Y="217"/>

Ascending value percentage order, from 0 to 100, indicating a needle set on the right side or center of a gauge dial with the needle pointing West:

<Item Value="0" X="188" Y="216"/>
<Item Value="50" X="161" Y="172"/>
<Item Value="100" X="156" Y="123"/>


As I understand it, you really only need two values in most cases, providing the minimum (typically Item Value="0") and the maximum (typically Item Value="100", or 1000, 10000, etc.), and do not need to include lots of other X/Y needle-placement coordinates in between to mark every single position on the gauge, as the simulator itself knows what to do once the two value extremes are given along with the point of rotation. So for these examples, given properly set minimum/maximum value ranges, a needle should be able to show 10%, 25%, 50%, or any other value in between, without you having to manually enter and code each one. This too has been confusing to me, as I often see some people register only the minimum and maximum values, while others register every other point, or maybe the start, middle, and end (3 points, as I illustrated above). The confusion is that there never seems to be a defined or definitive standard in use between coders, and that is really confusing for those of us in our infancy with XML. My point is that if you typically only need two item values for the min/max, it obviously leaves a lot less work to do in the nonlinearity table for gauge needles.
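A minimal two-entry table along those lines, reusing the East-needle coordinates from my example above, would look like this (whether two points always suffice, especially for sweeps approaching 180 degrees, is exactly what I'm unsure of):

```xml
<!-- Two-point table: min and max only, letting the sim interpolate between.
     Coordinates reuse the East-needle example values above. -->
<Nonlinearity>
    <Item Value="0"   X="113" Y="217"/>
    <Item Value="100" X="141" Y="124"/>
</Nonlinearity>
```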


I think that's enough for now. I really want to hear people's inputs on all this and if this egg is cracked or needs more baking. I just want to understand this once and for all!
 

Last edited:

tgibson

Resource contributor
While I didn't read your entire post, I know that for me the reason I use a certain needle direction most of the time is indeed that I am copying some of my previous work, rather than working it out all over again.
 
"Points to" - this is nothing more than "how" you made the arrow bitmap. The bitmap must be rectangular in shape, so: on which of the 4 sides is the pointer pointing when looking at it in the normal orientation in your bitmap viewer/editor?
It doesn't matter what its action is or where it points most often in the gauge. The rest of the code takes care of that.
If one supplies a cardinal direction that doesn't match their bitmap, then all nonlinearity table entries will be a little more difficult, being 90°, 180° or 270° off.

(Attached image: ex.png)


The main things to set up are <Axis> and <Position>.
<Axis> is the center point of rotation on the needle bitmap.
<Position> is where that axis point of the needle bitmap lands on the background bitmap it overlays.

Nonlinearity table -
1) IIRC, all "Value" entries in the table should be ascending in value. Again, if I remember correctly, the code fails if they are descending. Counter-rotation is taken care of by the X,Y locations in the table.
2) You should use, at minimum, 3 entries in the table. Imagine 2 values 180° apart: which direction should the needle go when the value changes? And again, when the value returns to zero (or the minimum), which direction should it go? A mid-point value entry resolves this.
3) The X,Y entries are no more than a point, any point, on the line originating from the axis/position location and radiating to where the needle should point (think of a VOR radial). The point does not have to be the exact point where the tip of the needle will be. In fact, the further the point is from the axis, the more precise the direction of the needle/handle.
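Putting points 1 and 2 together, a sketch of a safe table shape might look like this (the X,Y numbers are arbitrary points along each radial, per point 3, not from any real gauge):

```xml
<!-- Ascending Values with a mid-point entry; each X,Y is just some point
     along the radial the needle should aim at (numbers are arbitrary). -->
<Nonlinearity>
    <Item Value="0"   X="10"  Y="60"/>  <!-- radial toward the minimum mark -->
    <Item Value="50"  X="60"  Y="10"/>  <!-- mid-point resolves the sweep direction -->
    <Item Value="100" X="110" Y="60"/>  <!-- radial toward the maximum mark -->
</Nonlinearity>
```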
From this post, disregard the green notes :
 

taguilo

Resource contributor
I'm hoping to get to the bottom of this once and for all as I have only found 4 or 5 older threads and while they give some code examples the main issue for me is none of them ever teach the HOWs and WHYs to allow you to really understand the full process of needle pointing.
Hi,

Here you have a basic description on how PointsTo work:



Tom
 
OK, some final questions, and I think this subject will be totally resolved for me once and for all. It looks like everything I was saying in my lengthy post was correct.

#1 - Let's say you have a south-drawn needle graphic and are using PointsTo=South, but the arc you want the needle to move on is actually on the top half of a circle (the northern side) and the needle should move clockwise. If you schedule your nonlinearity table correctly, for example from minimum 0 to maximum 100 with the first set of XY coordinates taken from the far left and the second set from the far right, is there going to be any issue given that you are using a south-pointing needle?

Given Roman's answer, I'm guessing NO, as he said:

"It doesn't matter what it's action is or where it points to most often in the gauge. The rest of code takes care of that." (*he did say "most often" though lol, ..so are there some caveats here?)
and..
"If one supplies an incorrect cardinal direction to their bitmap then all Nonlinearity table entries will be a little more difficult, being 90°, 180° or 270° off." (so the key takeaway here is to make sure your PointsTo= matches the needle art's direction!)

#2 - As I theorized in my post, and as Tom said in the first reply, "indeed, most of the time I am copying some of my previous work, rather than working it out all over again." Based on his reply, I assume this is why I've seen so many east-facing needles / PointsTo=East. It just hadn't made any logical, intuitive sense to me when many other needles/directions should or could have been used instead. I said in my post that it would only make sense to use East all the time from the standpoint of "calculation simplicity," followed secondly by a South needle, because both of their Position= points will always be much closer to their axis points than West or North. So East allows the axis location to be calculated much more easily than a West or North facing needle graphic. And with that being the case, and with only the nonlinearity table dictating the direction of needle movement, it begs the question of why the other three cardinal directions are even available at all...

So..

#3 - Is there any particular reason the other 3 cardinal directions ever need to be used, if you could just use any one of them for all purposes and East is always easiest? When would you use them specifically?


#4 - In order to reverse the needle counter-clockwise in the same "south needle" scenario mentioned earlier, is flipping the nonlinearity table order to start with the 100 value (or whatever the highest value is) and end with the minimum 0 value all that is required for counter-rotation? Is it really that simple?


#5 - Bonus question... when do people mostly use the Percentage values, or the Radians values, instead of X/Y coordinates? I assume each one is most ideal for certain types of gauges over others. Which are they?


#6 - I see in some gauges the needle bitmap's position being repeated twice, like "25,50,25,50" -- what does this mean and do? In other instances I just see a normal X,Y like "25,50". Are there some "ifs, ands, or buts" to this?


#7 - 8-bit artwork: is it absolutely necessary in this day and age, and why? I made some toggle gauges a few weeks ago in 24-bit and didn't seem to have any issues in Prepar3D v5. So is this just an "old world" thing from FS2004 days that coders are hanging onto, or is it actually necessary for something like performance and frame rates? I can attest that I haven't had any issues so far with maybe 10 on/off toggles on a panel.

#8 - I know the sim rescales gauges for you per your panel.cfg sizing entries, so for example a 150 x 150 gauge can be displayed at 300 x 300 (but it pixelates all to hell), and so it's best to "create to scale" if you want to see "life size" gauges on something like a 24" or 27" display. Given that, I've been making a few gauges at 1:1 real-world ratios. For example, a real-world airspeed indicator is 3" wide, so I would make the background 300x300 (even though at 96dpi it will be ever so slightly larger than 1:1). With rescaling, either stretching larger (which looks like crap) or compressing (which looks really good), have you ever found limits to display compression, like creating a 300x300 gauge but telling the sim to show it at 150x150? I would think compression could have limits just as expansion does, with its noticeable pixelation.

I have to assume both stretching and compressing may also have performance costs if you had a whole panel or multiple panel sets that are either/or. For best performance, I assume 1:1 scale at the intended final resolution would always be the ideal scenario for all gauges, so the sim and your CPU/GPU aren't doing excessive math. It's at least intuitive in theory, I'd say. I'm asking specifically because I'm thinking of testing some gauges that are double in size for use with a 4K display. So an airspeed indicator would be made at 600x600 instead of 300x300, but could also be displayed/rendered smaller at 1920 x 1080 if necessary.

Basically, I want to toy around with whatever will look very crisp but could be flexible between a 4K display and standard HD. Again, if gauges are not made at 1:1 scale for the largest MainPanel and/or the largest display resolution you intend, then any stretched gauges will look janky. So a 300x300 gauge rendered on a 4K display at 4K resolution will look bad, but a 4K gauge on a 4K rendered panel that is then rescaled to 1920 x 1080 may look really good (though it may impact performance more than a native 1:1 at 4K). I hope I'm making sense here; I should probably start a separate thread for this subject, but figured I'd ask in case anyone cares to comment.

I've noticed that many XML gauges, when compressed to around 75% of their full 1:1 size, look a lot more like vector graphics, and it looks nice. It's a shame all the old gauges were not built larger. So many of the old FS2002, FS2004, and FSX planes have such small gauges that they just pixelate badly when rendered on a larger display. I assume nobody uses anything smaller than 1920 x 1080 these days. I'm using a 32" 4K for my desktop and totally love it. I'd never go back.


Again, thanks for the quick replies, this has all been very helpful.
 
I have always used 24 bit images in FS2004 panels and not had a problem yet.
Thanks Tom, that's one down off the list. I wonder if that may just be an old issue for C++ gauges only? I know it used to be a "thing" somewhere in yesteryear. I've been out of simulation for the last 5 years or so, but recall it was some sort of requirement, and that either the gauge would not appear or it caused issues unless reduced to 8-bit. I would think a modern x64 sim like P3D would actually require something like 24-bit, but I've yet to check the SDK files. I do at least know that old FS2004 / FSX C++ gauges have to be recompiled from x32 to x64 in order to work with P3D v4 or v5, but I haven't looked into other details yet.

I'm trying to piece together a few old 2D panel sets just for fun right now. It's a shame they fell off the map. I like the virtual cockpits now that they finally look better, and with a 32" monitor I feel like I can finally see the gauges instead of squinting at nickels, dimes, or quarters. But I still like being able to use a more "official" feeling panel for IFR flying. I also really want to try VR at some point, as that seems like what the 3D cockpits were really made for, but I've also read that their pixel resolution is not what it needs to be yet. I assume in another 5 years it probably will be..
 

DragonflightDesign

Resource contributor
Nope - all bitmaps on screen are still 8-bit. If you feed the sim a 24-bit image, it just converts it before displaying. That's okay up to a point, but one day that conversion is not going to be what you want to see. You're better off feeding it 8-bit images to start with.
 

DragonflightDesign

Resource contributor
Tom, you've got lucky! :D That's bitten me on a few occasions, but OTOH I've also used it to my advantage, because occasionally the sim conversion produces what I want better than my 8-bit images do. It's also possible to use 32-bit images, but it takes a long road to get there. Discussed here:


About halfway down.
 
Thanks for the clarification, sir Dai, master of C++! I have another small issue that I could use some help with. I'm trying to make an adjustable bug for a round engine torque gauge and cannot find any applicable generic value / event ID, or an applicable A:, G:, or L: variable. I just need something generic that will instruct the needle graphic to rotate the bug via mouse clicks.

I think I've set it up correctly for what is to be a static knob click area on the bottom right of the main background bitmap. It would of course be wonderful to also have a knob that visually rotates (and I understand that would require adding another element, rotate command, etc.), but in the short term I first want the mouse to click on either side of the "knob area" and move the bug clockwise from 0 to 100 percent. I think I've coded that part correctly in the nonlinearity table and included 10 Item Values, as I assumed each mouse click would likely move one item value, but regardless, I need an event/variable name for what it is I'm actually turning.

I know I've seen airspeed bugs like this before, so I have to assume it's very similar... just some sort of generic "bug name." But I'm also worried that if I were to use an event ID already used by another gauge (like the heading bug), it would affect the heading. I looked through the whole event list last night and couldn't find anything applicable.

Element parts;

<Image Name="TORQ-MKR.bmp" PointsTo="South" ImageSizes="75,170" Luminous="Yes">
<Axis X="38" Y="1"/>
</Image>
<Rotate>
<Value Minimum="0" Maximum="100">(G: ?????? )</Value>

and then... mouse parts;

<Mouse>
<Area Left="nnn" Top="nnn" Width="nn" Height="nn">
<Area Left="nn">
<Cursor Type="DownArrow"/>
<Click Repeat="Yes">(G: ??????) 2 - d 0 &lt; if{ 0 } (&gt;G: ?????? )</Click>
</Area>
<Area Right="nn">
<Cursor Type="UpArrow"/>
<Click Repeat="Yes">(G: ??????) 2 + d 100 &gt; if{ 100 } (&gt;G: ?????? )</Click>
</Area>
</Area>
</Mouse>
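One idea I've been wondering about: I've seen mention that custom L: variables can be created on the fly and don't need a predefined event ID, so they shouldn't collide with anything like the heading bug. Would something along these lines work? (L:TorqueBug is just a name I made up, and the area coordinates are placeholders; the clicks step the value by 2 and clamp it to 0-100 with max/min.)

```xml
<!-- Sketch only: driving the bug with a custom local variable.
     L:TorqueBug is a hypothetical name; coordinates are placeholders. -->
<Rotate>
    <Value Minimum="0" Maximum="100">(L:TorqueBug, number)</Value>
</Rotate>
<!-- ...Element/Image parts as in my snippet above... -->
<Mouse>
    <Area Left="100" Top="100" Width="40" Height="20">
        <Area Left="0" Width="20">
            <Cursor Type="DownArrow"/>
            <Click Repeat="Yes">(L:TorqueBug, number) 2 - 0 max (&gt;L:TorqueBug, number)</Click>
        </Area>
        <Area Left="20" Width="20">
            <Cursor Type="UpArrow"/>
            <Click Repeat="Yes">(L:TorqueBug, number) 2 + 100 min (&gt;L:TorqueBug, number)</Click>
        </Area>
    </Area>
</Mouse>
```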
 
Guys, is there any good documentation on such things? I'm an obvious noob here. I've been taking a ton of notes and have at least 50 or more web pages bookmarked, including all the new Prepar3D v5 SDK pages, and was studying the Reverse Polish Notation SDK page and the Variables pages before I even posed this question. I just can't find any good tutorials that walk you through everything and give you a solid foundation for understanding. It seems every PDF online, even the FS2X PDFs, is thrown together with "just do this and that" without the thorough, methodical explanations that help you make sense of everything, especially as a beginner.

I can see how framework structures are organized for a desired task, but when it comes to the events, the variables, and how to link it all together with the script language, there just seems to be zero documentation. I can now see why, back in the day, so many people would just take various gauges from other aircraft rather than make new ones. This stuff is ridiculously hard, and it doesn't need to be. With a much firmer foundation, people would ask the right questions, and ask them less often, rather than taking endless stabs in the dark, which makes people feel foolish and creates excessively long threads for others to wade through. To be clear, I feel really foolish, lol.

The thing I've often found with really smart people, or people who have been doing something for decades, is that they often don't know how to teach or convey what the student is asking, and so the student fails, not for lack of enthusiasm to learn, but because of the teacher's inability to teach, even if they are a wealth of knowledge. When you've been doing something forever, your brain often assumes other people know what you do. Well... I don't, lol.
 

tgibson

Resource contributor
Hi,

Indeed, there is no place with a really deep discussion of all of this, sorry. Most of us just don't have the time to do MS's job and document all of this. The FSDeveloper Wiki has much of the needed information, and posts here in the Gauges forum usually contain the rest. Web searches are normally what I use to learn about such things, along with examining similar existing gauges.
 