Chemical and Process Engineering Resources
Rupture Disks for Process Engineers - Part 4
Nov 08 2010 01:30 PM | pleckner in Safety and Pressure Relief
Part 1 of this series on rupture disks for Process Engineers covered why you use a rupture disk and when you might want to use this device. Part 2 discussed how to size the rupture disk. Part 3 discussed how to set the burst pressure. In this part, I will discuss how temperature and backpressure affect the rupture disk design. Subsequent parts will include the Relief Valve/Rupture Disk combination, how to specify the rupture disk and some discussion on the types of rupture disks you can purchase.
Before I begin, let me point out that most of what is included in this series of articles can be found in API RP 520 [1] and API RP 521 [2], and ASME Section VIII, Division 1 [3]. Much of what is found in these documents can also be found in vendor literature.
Temperature and Backpressure Considerations
In Part 3, I discussed how to set the burst pressure of the rupture disk. However, the discussion is not complete without considering the effects of temperature and backpressure on the bursting pressure.
The rupture disk manufacturer uses both the specified burst pressure and the specified temperature when designing and stamping the disk. (In this instance, I use the term design to mean arriving at the correct burst pressure, not mechanical integrity). However, it is more than likely that the temperature of the rupture disk will not be at the specified temperature when it is called into service. Why is this so?
The temperature most commonly specified is that of the relieving fluid coincident with the burst pressure, i.e. relieving conditions. Sounds logical, but remember that the disk is continuously exposed to the process stream for hours, days, weeks or even months before it may ever be needed. Or, the disk may be exposed to ambient conditions. Therefore, expect the disk temperature to be approximately equal to that of its environment during normal operation of the system. When a process upset occurs, system pressure rises until it reaches relief (burst). The temperature of the relieving fluid also rises, per thermodynamics. However, the time interval between normal system operation and relief is usually so small that the rupture disk's temperature hardly has time to come to equilibrium with the higher process fluid temperature. Therefore the disk can actually be colder than its specified temperature. The effects?
In general, burst pressure varies inversely with temperature. For some rupture disks, the burst pressure can be as much as 15 psi greater than stamped if the actual temperature is 100°F lower than specified, e.g. a disk specified with a burst pressure of 350 psig at a temperature of 400°F will actually burst at 365 psig if its temperature is only 300°F [4]. This doesn't sound like a big difference, but if 350 psig were the design pressure (or MAWP) of the vessel, then a burst pressure of 365 psig would be in violation of code (LAW). The opposite is also true: a disk at a temperature hotter than specified when called into service will burst at a pressure lower than stamped. Although this is considered the more conservative approach, because code can't be violated and there is no risk of catastrophic failure of the vessel, specifying too low a temperature can lead to the not-so-desirable action of premature bursting.
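The numbers above can be sketched with a simple linear model. Note this is illustrative only: the 15 psi per 100°F sensitivity is taken from the example in this article, not from any standard, and real disks have nonlinear, material- and type-specific temperature behavior (see the note on manufacturers below). The function name and helper are hypothetical.

```python
# Illustrative sketch only: a LINEAR burst-pressure/temperature model built from
# the example figures in the text (15 psi rise per 100°F below the stamped
# temperature). Real disk behavior is nonlinear and material-specific; always
# get the actual temperature-correction data from the disk manufacturer.

def estimated_burst_pressure(stamped_psig, stamped_temp_f, actual_temp_f,
                             sensitivity_psi_per_100f=15.0):
    """Estimate the actual burst pressure when the disk temperature at the
    moment of relief differs from the stamped (specified) temperature."""
    # Burst pressure varies inversely with temperature:
    # a colder-than-specified disk bursts at a higher-than-stamped pressure.
    delta_t = stamped_temp_f - actual_temp_f
    return stamped_psig + sensitivity_psi_per_100f * delta_t / 100.0

# Example from the article: stamped 350 psig at 400°F, disk actually at 300°F.
p_burst = estimated_burst_pressure(350.0, 400.0, 300.0)
print(p_burst)  # 365.0 psig

# If 350 psig is the vessel MAWP, a 365 psig burst is a code violation:
mawp_psig = 350.0
print(p_burst <= mawp_psig)  # False
```

Running the same check with the actual temperature set to the expected disk temperature at relief (normal operating or even ambient, per the guidance below) shows why the specified burst temperature must be chosen carefully.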
The bottom line is that the specified burst temperature must be carefully considered. Specify the lowest temperature at the time the disk is expected to burst. Consider that this might be the normal process operating temperature or even ambient rather than the calculated relieving temperature.
Note that different materials and different types of rupture disks have different sensitivities to temperature. This is an excellent topic of discussion for your rupture disk manufacturer!