What is a pressure transducer?

Pressure measurement is a key element of hydraulic and pneumatic control processes across all types of applications (industry, agriculture, etc.), where it helps guarantee safe operation and optimal performance.
Alongside numerous factors such as accuracy, selecting the appropriate pressure range is fundamental to matching the instrument to the specific application.

Types of pressure transducers

The link above describes the types of pressure transducers and the sectors in which they are used.

Recommendations for the use of pressure transducers

When selecting the pressure range of a pressure transmitter, two factors must be considered:

1- Preventing destruction due to overloads, caused by choosing too low a range.

2- Avoiding loss of accuracy or too low an output signal, caused by choosing too wide a range.

A look at the technical sheets of various manufacturers reveals three different concepts regarding the pressure range of a pressure transmitter:

– Rated operating pressure

– Safe working pressure

– Burst pressure

Different manufacturers may use slightly different terms (and you should always ask for a precise definition), but in principle there is a common understanding of these concepts. It is important to keep in mind that all the values specified in the data sheet apply only within the nominal pressure range. This is especially true of accuracy/error values: once you exceed the nominal pressure range, the data sheet values no longer hold and may be exceeded, i.e. the output signal can no longer be used to obtain accurate measurements.
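
As a rough illustration, the short Python sketch below classifies a measured pressure against these three limits. The limit values and the classify helper are invented for the example; real figures always come from the manufacturer's data sheet.

NOMINAL = 100.0   # rated operating pressure, bar: data sheet specs fully apply here
SAFE = 200.0      # safe working pressure, bar: no damage, but accuracy not guaranteed
BURST = 500.0     # burst pressure, bar: mechanical failure possible beyond this

def classify(p_bar: float) -> str:
    """Classify a measured pressure against the data-sheet limits."""
    if p_bar <= NOMINAL:
        return "within nominal range: data sheet accuracy applies"
    if p_bar <= SAFE:
        return "overpressure: sensor survives, but readings may be out of spec"
    if p_bar <= BURST:
        return "above safe working pressure: risk of permanent damage"
    return "above burst pressure: destruction likely"

for p in (80.0, 150.0, 300.0, 600.0):
    print(f"{p:6.1f} bar -> {classify(p)}")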

But what happens if the user selects a much wider measurement range than needed, i.e. only uses a fraction of the nominal pressure range, say 10 bar of a 100 bar sensor?

No problem, or at least not much of one. The truth is that most precision parameters, such as linearity, repeatability and hysteresis, effectively relate to the actual reading rather than to full scale. Their maximum values are defined in the data sheet and expressed as a percentage of the nominal pressure range, but the error in service really “scales down” when only part of the range is used.

If the linearity (better: non-linearity) is 0.1% of full scale, this means that a 100 bar pressure sensor may deviate up to 0.1 bar from the ideal line, and one would expect this to happen somewhere near the middle of the range, i.e. at around 50 bar. But when measurements are limited to only 25 bar with the same transmitter, the deviation from linearity is not expected to reach 0.1 bar, but only a fraction of that value.

In fact, if you want to use a 100 bar sensor to measure pressures from 0 to 50 bar, the expected linearity error between 0 and 50 bar would be approx. 0.025 bar, with its maximum around 25 bar. The reason is that a typical non-linearity curve is roughly parabolic, so the deviation from a straight line shrinks with the square of the span: halving the span quarters the error.
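
The following Python sketch reproduces this arithmetic. It assumes an idealized parabolic non-linearity curve, which is a common simplification rather than something any particular data sheet guarantees.

FS = 100.0          # nominal full scale of the sensor, bar
NL_SPEC = 0.001     # non-linearity specification: 0.1% of full scale

def deviation(p, fs=FS, spec=NL_SPEC):
    """Deviation (bar) from the ideal 0..fs line at pressure p,
    modelled as a parabola peaking at mid-scale with spec * fs."""
    return 4.0 * spec * fs * (p / fs) * (1.0 - p / fs)

def worst_error_over_span(span, fs=FS, spec=NL_SPEC):
    """Worst deviation from a straight line fitted over 0..span.
    For a parabola this shrinks with the square of the span."""
    return spec * fs * (span / fs) ** 2

print(deviation(50.0))              # 0.1 bar: full deviation at mid-scale
print(worst_error_over_span(50.0))  # 0.025 bar when using only 0..50 bar
print(worst_error_over_span(25.0))  # ~0.006 bar when using only 0..25 bar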

And what about the resolution of the process pressure transmitter?
The only remaining limitation to consider is the resolution of the electronics receiving the signal, i.e. the analog input card of your PLC or the input stage of your instrument or display. In most cases this is no longer a problem today, as these typically provide 12-bit resolution or better. Returning to our previous example: if we only use the sensor up to 25 bar, a 12-bit A/D converter still gives us 10 bits of usable resolution, i.e. a resolution of about 0.1% of 0…25 bar (= 0.025 bar).
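
The same numbers can be checked with the short Python sketch below; the 12-bit A/D converter and the assumption that it spans the sensor's full 0…100 bar range are taken from the example above.

import math

FS = 100.0       # sensor nominal range, bar
ADC_BITS = 12    # resolution of the analog input (example value)

counts_full = 2 ** ADC_BITS                   # 4096 steps across 0..100 bar
step = FS / counts_full                       # ~0.024 bar per count

used_span = 25.0                              # only 0..25 bar is actually used
counts_used = counts_full * used_span / FS    # 1024 steps over the used span

print(f"step size: {step:.4f} bar")           # 0.0244 bar, ~0.1% of 25 bar
print(f"usable counts: {counts_used:.0f}")    # 1024
print(f"usable bits: {math.log2(counts_used):.0f}")  # 10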

It is very important to choose the appropriate pressure range for each application. This is not difficult since modern pressure transmitters provide a large number of ranges to select from. 

In case of doubt, it is more important to guard against the overpressures that may occur in the application than to worry about reduced accuracy or resolution.
