Archive for category configuration
Lots of people like the pushbuttons on industrial pressure transmitters because the basic settings that every transmitter needs can be set up without a HART communicator. This includes things like the tag name, engineering units, LRV (Lower Range Value, the zero, or what 4.0mA represents), URV (Upper Range Value, the span, or what 20.0mA represents) and damping (an average or filter factor that dampens noise).
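The LRV/URV pair defines a simple linear mapping between the process variable and the loop current. A minimal sketch of that standard 4-20mA scaling (generic math, not specific to any one transmitter brand):

```python
def pv_to_ma(pv, lrv, urv):
    """Map a process variable onto the 4-20 mA loop.

    LRV is the value represented by 4.0 mA, URV the value at 20.0 mA;
    everything in between is linear.
    """
    return 4.0 + 16.0 * (pv - lrv) / (urv - lrv)

# Example: a transmitter ranged 0-100 inH2O, reading 25 inH2O
print(pv_to_ma(25.0, 0.0, 100.0))  # 8.0 mA
```

Re-ranging the transmitter just changes the LRV/URV endpoints of this line; the sensor itself is untouched, which is why ranging and calibration are separate menus.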
On the new Honeywell ST700/ST800 series smart transmitters, the tag name and engineering units are easy to configure and self-explanatory, but I seem to stumble when setting up the LRV and URV because I’m faced with a nondescript choice. There are two sets of options (under Transmitter Setup, not Calibration):
OK, either configures an LRV or a URV value, but which is which? What’s the difference?
For some years now, the HART communication protocol has been firmly established as the standard means of configuring field instruments. But talking to a field instrument requires a communicator.
There are the handheld communicators, Rosemount’s x75s and the “budget-priced” Meriam MFC 4150, but at a cost that’s more a capital appropriation than an MRO expense. Even the Meriam, with a 3-year field device description subscription, starts at more than $4000.
People continue to ask me if there isn’t a more budget-conscious approach to HART configuration.
When you’re making programming changes to a field device, you don’t always have time to wait. Here’s a hidden feature that helps you speed up the process between Siemens SIMATIC PDM and HART field instruments.
Normally, when you’re using PDM software, it takes a minute or so to upload or download changes to and from your HART devices. Seems like an eternity when all you need to do is change a range.
So, I’m going to let you in on a feature you might not have seen before.
A customer who had had lots of experience with Milltronics and Siemens ultrasonics was installing their first SITRANS LR560 radar level transmitter. They had worked with it in the shop beforehand, going through most of the settings. They even tested it by setting it up to shoot against a file cabinet and used a tape measure to check the indicated distance value.
Everything checked out OK in the shop.
When they installed the transmitter on the top of the bin, they changed the transmitter’s sensor mode parameter from the distance mode they used in the shop for testing to level mode. After aiming, the level value shown in the local display was dead nuts on, but the 4-20mA signal going back to the control room was way off.
The bin was a third full. The 4-20mA showed it about double that. Not only that, the 4-20 was moving in the wrong direction: the bin was emptying, yet the HMI reported an increasing level value. Someone realized that an inverse-acting output was typical of a distance value, so they reconfigured the sensor mode to distance. That got the 4-20mA much closer to a distance value, but it was still not exactly what it should be, and besides, the goal was to read level, not distance, in the control room. What was going on?
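The inverse-acting behavior they spotted follows directly from how a radar transmitter derives level: the antenna measures the distance down to the product surface, and level is whatever is left below it. A quick sketch of that relationship (the variable names here are illustrative, not actual SITRANS parameters):

```python
def distance_to_level(distance, empty_distance):
    """Convert a radar-measured distance (sensor face to product surface)
    to level, given the distance the sensor reads when the vessel is empty.
    As distance shrinks, level grows -- the two move in opposite directions.
    """
    return empty_distance - distance

# A bin whose empty distance is 12 m, with the surface 8 m below the sensor:
print(distance_to_level(8.0, 12.0))  # 4.0 m of product
```

This is why a transmitter left reporting distance over the 4-20mA appears reversed to an operator expecting level, and why the conversion depends on an empty-vessel reference being set correctly during commissioning.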