High Temperature Alarm Setting For Tube Wall Temperature In A Fired Heater
#1
Posted 03 May 2010 - 08:28 PM
I need information on Tube Wall Temperature (TWT), or tube skin, monitoring in fired heater operation. When setting a high temperature alarm for the TWT in a fired heater, what temperature should be used as the setpoint? Currently, we are considering options such as:
1) The maximum TWT as given in the heater specification sheet (usually two maximum TWT values are listed: calculated and design)
2) The creep temperature or Limiting Design Temperature (refer to API Std 530, Table 4) for the corresponding tube material
3) A certain margin below the Limiting Design Temperature mentioned in point 2
It would be highly appreciated if any of you could share your experience with this issue.
Best regards
Obs
#2
Posted 04 May 2010 - 12:00 AM
If you look at API 530, it states quite specifically what the maximum metal temperature should be:
The limiting design metal temperature is the upper limit of the reliability of the rupture strength data. Higher temperatures, i.e. up to 30 °C (50 °F) below the lower critical temperature, are permitted for short-term operating conditions, such as those that exist during steam-air decoking or regeneration. Operation at higher temperatures can result in changes in the alloy's microstructure.
Edited by Zauberberg, 04 May 2010 - 01:24 AM.
#3
Posted 04 May 2010 - 04:40 AM
Big thanks for the enlightenment,
I mentioned both the creep temperature and the Limiting Design Temperature because I found them in two different references, but their values are nearly the same.
Cheers
#4
Posted 04 May 2010 - 04:41 AM
To expand the subject, we need to look at two things:
1. What is the purpose of an alarm?
2. What priority and action should be assigned to a process condition leading to equipment failure?
This is usually defined within the "Alarm Management" philosophy. In your particular case, you want an early warning when the actual metal temperature approaches its design value - that should be the High Alarm (AH) value. If the operator does not respond to the High Alarm and the metal temperature continues to rise, the tube(s) will approach the condition where they can actually fail due to changes in the alloy's microstructure. That should be the Trip Point (AHH).
To summarize: the design TWT could be the High Alarm, and 20-30 °C below the LDT could be the heater trip point, as sketched below.
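For illustration only, here is a minimal Python sketch of that setpoint hierarchy; both temperatures in it are assumed values, not figures from API 530 or any datasheet:

```python
# Minimal sketch of the AH/AHH hierarchy described above.
# Both temperatures below are assumed, illustrative values: read the real
# design TWT from the heater datasheet and the LDT from API Std 530 Table 4.

DESIGN_TWT_C = 590.0    # design tube wall temperature (assumed)
LDT_C = 650.0           # limiting design metal temperature for the alloy (assumed)
TRIP_MARGIN_C = 25.0    # the 20-30 degC margin below LDT suggested above

tah = DESIGN_TWT_C            # High Alarm (AH): early warning to the operator
tahh = LDT_C - TRIP_MARGIN_C  # Trip Point (AHH): last line before tube damage

# The alarm must warn before the trip acts, so sanity-check the ordering.
assert tah < tahh, "AH setpoint must sit below the AHH trip point"

print(f"TAH (high alarm) = {tah:.0f} degC")
print(f"TAHH (trip)      = {tahh:.0f} degC")
```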
#5
Posted 04 May 2010 - 09:37 AM
In my refinery, we haven't used tube skin temperature as a trip system yet. Currently we only set a TAH, whose value is still under discussion, as I mentioned before. Is it common for a heater to have a trip point based on its tube skin temperature? Any further information would be much appreciated. Thanks for sharing, Zauberberg - you're very helpful.
Cheers
#6
Posted 04 May 2010 - 10:03 AM
Also, if I remember correctly, the TAHH was to be activated in time-delay mode, e.g. only after continuous exposure to high temperature for more than 30 seconds, but I can't remember the exact figure. I'll see if I can get in touch with my colleagues from Bioko Island and will let you know if I get the answer.
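In case it helps, here is a minimal Python sketch of such an on-delay (time-delay) activation; the setpoint and the 30 s delay are assumed figures, the latter taken from the recollection above:

```python
class OnDelayTrip:
    """On-delay (debounce) trip: activates only after the skin temperature
    stays above the setpoint continuously for delay_s seconds."""

    def __init__(self, setpoint_c: float, delay_s: float):
        self.setpoint_c = setpoint_c
        self.delay_s = delay_s
        self._excursion_start = None  # time the current excursion began

    def update(self, skin_temp_c: float, now_s: float) -> bool:
        if skin_temp_c <= self.setpoint_c:
            self._excursion_start = None   # excursion over, reset the timer
            return False
        if self._excursion_start is None:
            self._excursion_start = now_s  # excursion just started
        return (now_s - self._excursion_start) >= self.delay_s

# Illustrative use: the 625 degC setpoint and 30 s delay are assumptions.
trip = OnDelayTrip(setpoint_c=625.0, delay_s=30.0)
for t, temp in [(0.0, 630.0), (15.0, 632.0), (29.0, 631.0), (31.0, 629.0)]:
    print(f"t={t:>4}s  T={temp} degC  trip={trip.update(temp, t)}")
```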
Edited by Zauberberg, 04 May 2010 - 01:09 PM.
#7
Posted 04 May 2010 - 07:11 PM
Again, thanks for all the helpful information.
See you again and take care
Obs
#8
Posted 09 May 2010 - 12:59 PM
1. A boiler commissioned in 2001 required one skin temperature measurement on the economizer tube fed with boiler feed water (i.e., the last tube toward the stack along the flue gas path). The purpose was to see the actual margin above the (estimated) flue gas acid dew point, so that the monitored metal (skin) temperature would always stay higher, by a margin of up to 15 °C, with no trip (see the sketch after this list). The vendor asserted it was not needed, and in the end this skin temperature sensor was not installed.
2. A new boiler, due to start up in about two years, includes the same skin temperature monitoring requirement in its duty specification.
3. The rest of the operating boilers have no skin temperature monitoring, even though two of them have been running for fifty (50) years. No intention of protecting the metal from high temperatures (through skin temperature monitoring) has been detected.
4. Nevertheless, conditions in fired heaters may be different (per the API 530 points mentioned above). Any advice on this probable difference, which dictates measurement of skin metal temperature, would be useful to know.
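For what it's worth, here is a minimal Python sketch of the dew-point margin check described in item 1; the dew point and skin temperature figures are assumptions for illustration only, not data from the boilers above:

```python
# Sketch of the dew-point margin check from item 1. The acid dew point
# here is an assumed, illustrative figure; in practice it is estimated
# from the flue gas composition (mainly SO3 and water content).

ACID_DEW_POINT_C = 130.0  # estimated flue gas acid dew point (assumed)
WATCH_MARGIN_C = 15.0     # margin of interest mentioned in item 1

def dew_point_status(skin_temp_c: float) -> str:
    margin = skin_temp_c - ACID_DEW_POINT_C
    if margin <= 0.0:
        return f"{-margin:.1f} degC below dew point: acid condensation risk"
    if margin <= WATCH_MARGIN_C:
        return f"only {margin:.1f} degC above dew point: watch closely"
    return f"{margin:.1f} degC above dew point: OK"

print(dew_point_status(142.0))  # falls inside the 15 degC watch band
```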