As the originator of this thread, I would like to say it created some very interesting discussions related to the oil system. I would like to shift the thread back to its original topic of discussion related to the design of the oil pressure sender.
The oil pressure in the engine is monitored in aircraft because it is so important to detect a failure or fault in the oil circulation system. Oil pressure and temperature are both important to ensure we operate the engine within the placard limits set by Rotax design engineers.
It is important that the oil pressure and temperature readings are true and accurate. Earlier in this thread, it was advised the 912 oil pressure sender is calibrated as follows:
4mA = 0.75 bar
20mA = 10.75 bar.
Additionally, the sensor is linear between these limits.
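For anyone who wants to check a gauge reading against the loop current, the calibration above implies a simple linear conversion. Here is a minimal sketch in Python; the function name is my own, and this assumes an ideal linear sender per the figures quoted above, not anything from Rotax documentation:

```python
def pressure_bar(current_ma):
    """Convert 4-20 mA loop current to indicated oil pressure (bar)
    using the calibration quoted above: 4 mA = 0.75 bar, 20 mA = 10.75 bar,
    linear in between."""
    # Slope: (10.75 - 0.75) bar over (20 - 4) mA = 0.625 bar per mA
    return 0.75 + (current_ma - 4.0) * (10.75 - 0.75) / (20.0 - 4.0)
```

So a mid-scale current of 12 mA would indicate 5.75 bar, and the 4 mA floor always indicates 0.75 bar.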
I believe the sensor cannot supply less than 4mA of output current; 4mA to 20mA is its defined full-scale range of operation. So even below 0.75 bar, the sensor will send 4mA to the oil pressure gauge. If we lose our oil due to a leak or a failure of the oil system lines, and we lose oil pressure, the sensor will still output 4mA. The oil pressure gauge is calibrated such that 4mA displays 0.75 bar. In this case, the pilot will assume he has low pressure, but NOT zero pressure, which may actually be the case.
0.8 bar is the specified minimum oil pressure for a 912UL below 3500 RPM. Therefore a pilot will notice a low oil pressure indication but may not realise that zero pressure could produce the same reading.
I suggest an improvement would be to scale the sensor to the following limits.
4mA = 0 bar
20mA = 10 bar.
This would then allow pilots to know whether their oil pressure was actually zero, because the measurement system could read accurately all the way down to zero rather than bottoming out at an indicated 0.75 bar.
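To illustrate the difference, here is a side-by-side sketch of the two scalings (function names are mine, and both assume ideal linear senders). Notably, the slope works out identical in both cases, 0.625 bar per mA, since both spans cover 10 bar over 16 mA; only the offset differs, so at the 4 mA floor the quoted Rotax scaling indicates 0.75 bar where the proposed scaling would indicate a true zero:

```python
def rotax_scaling(current_ma):
    # Quoted calibration: 4 mA = 0.75 bar, 20 mA = 10.75 bar
    return 0.75 + (current_ma - 4.0) * 0.625

def proposed_scaling(current_ma):
    # Suggested calibration: 4 mA = 0 bar, 20 mA = 10 bar
    return (current_ma - 4.0) * 0.625

# At the 4 mA output floor (true pressure anywhere from 0 to 0.75 bar):
print(rotax_scaling(4.0))     # indicates 0.75 bar, masking a possible zero
print(proposed_scaling(4.0))  # indicates 0.0 bar
```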
The industry standard for pressure senders is typically 4mA = 0 bar (gauge pressure) to 20mA = full-scale pressure. Many manufacturers make such sensors for all types of process industries; they are very common. Keller is one such manufacturer, among many others, with a wide range of high-quality industrial sensors.
The Rotax sensor scaling, 4mA = 0.75 bar, is unusual; it is not a standard industry calibration.
Does anyone know why this specific range of 0.75 bar to 10.75 bar was specified for the sensor?
I may have missed something in the above analysis, which would be interesting to know. I'm sure there is a good reason for the non-standard scaling.