Column Environment



#1 stu


    Gold Member

  • Members
  • 83 posts

Posted 02 July 2011 - 06:38 AM

Dear Sirs,
While simulating a column, I observed that the temperature of the column top vapor inside the column environment and outside the column are different. If I change the transfer basis to T-H, the temperature is the same but the vapor fraction differs.
Please suggest the reason for this and how to simulate it properly.
Regards,
Stu

#2 PaoloPemi


    Gold Member

  • Members
  • 549 posts

Posted 02 July 2011 - 04:19 PM

You do not mention the software or the specific case, which might suggest the reason (partial condenser? pressure drop? numerical tolerance?). A difference in temperature could be caused by a pressure drop or by some enthalpy addition or subtraction (for example a stage pressure drop, a partial condenser, etc.); there could also be small differences due to numerical tolerances.
Perhaps Kister's book, with its description of methods for solving columns, could help.
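
The thread does not name the simulator, so purely as a minimal sketch: the Python snippet below uses the open-source CoolProp library (an assumption, standing in for whatever property package the column uses; the fluid and conditions are illustrative) to show how a flash at constant enthalpy across a pressure drop shifts the stream temperature, one of the effects mentioned above.

# Minimal sketch: isenthalpic flash across a pressure drop (assumes CoolProp
# as a stand-in property package; fluid and conditions are illustrative).
from CoolProp.CoolProp import PropsSI

fluid = "Water"
P1, T1 = 200e3, 400.0   # upstream: 2 bar, 400 K (slightly superheated steam)
P2 = 150e3              # downstream pressure after a stage/line pressure drop

# P-T flash upstream to get the specific enthalpy [J/kg]
h = PropsSI("H", "P", P1, "T", T1, fluid)

# P-H flash at the lower pressure with the same enthalpy
T2 = PropsSI("T", "P", P2, "H", h, fluid)

print(f"T upstream   = {T1:.2f} K")
print(f"T downstream = {T2:.2f} K (same enthalpy, lower pressure)")

Run as-is, the two temperatures differ even though the enthalpy is identical, which is the same kind of gap one can see between the column environment and the flowsheet when a pressure drop or a transfer-basis flash sits in between.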


#3 Chellani


    Gold Member

  • Members
  • 78 posts

Posted 05 July 2011 - 03:19 AM

There could be two reasons that I can think of:
1. Different fluid packages. Two different fluid packages (or rather property packages) can give you different results, e.g. vapor fraction, for the same conditions (P and T). T-H by itself is not a preferable transfer basis. P-H is the default and is also used downstream of all unit operations (which you cannot change). If you have a change in fluid package, P-T would be preferable.
2. Inconsistent flashes in the fluid package. You can find this out by first using a P-T flash (define P and T of the stream), copying the enthalpy calculated for this flash, and using it as input to a P-H flash (define P and H rather than P and T for the same stream); this should give back the same value of T that you used earlier. You can check the reliability of the T-H flash in the same way; a rough sketch of this check is given below.
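
A minimal sketch of that round-trip check, assuming Python with the open-source CoolProp library standing in for the simulator's property package (the thread does not name the software; the fluid and conditions are illustrative):

# Flash-consistency check: P-T flash -> read H -> P-H flash -> should recover T.
# (Assumes CoolProp as a stand-in property package; fluid/conditions illustrative.)
from CoolProp.CoolProp import PropsSI

fluid = "Water"
P, T = 101325.0, 380.0   # define P and T of the stream

# Step 1: P-T flash, read off the enthalpy the package calculates [J/kg]
h = PropsSI("H", "P", P, "T", T, fluid)

# Step 2: P-H flash with that enthalpy; a consistent package returns the same T
T_back = PropsSI("T", "P", P, "H", h, fluid)

print(f"T specified = {T:.4f} K")
print(f"T recovered = {T_back:.4f} K")
print(f"difference  = {abs(T_back - T):.2e} K (should be within solver tolerance)")

If the recovered temperature drifts outside the solver tolerance, the flash routines of the package are inconsistent, which is exactly the situation described in point 2; the T-H flash can be checked the same way if the tool exposes it.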

The temperature cannot differ for either of the above reasons if you are using a T-H flash as the transfer basis. If it does, don't worry, because the problem is not in your simulation; it is in your simulation tool, so report the bug to support.

Hope this helps.



