Tomas,
I've experienced similar issues with computation times increasing by several orders of magnitude when units are involved. I've queried the issue with Wolfram's technical support and have received a response that included the following comments.
- "The Quantity objects are designed as a convenience feature rather than as computationally efficient objects."
- "The best method to handle the efficiency issue is to keep the Quantity objects out of repetitive computations, e.g. Table, loops, or plots, especially if you have a computation of which you know the result unit."
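The support advice above can be illustrated with a small sketch: build a list of Quantity objects, strip the units once up front, and compare a repeated computation on each. The exact timings are machine- and version-dependent, so the comments only indicate the expected direction of the difference, not specific numbers.

```mathematica
(* A list of unit-laden values, and the same values with units stripped *)
data = Table[Quantity[i, "Meters"], {i, 1., 10000.}];
nums = QuantityMagnitude[data];

(* Repeated arithmetic on Quantity objects... *)
AbsoluteTiming[Total[data]]  (* returns a Quantity in metres *)

(* ...is typically much slower than the same arithmetic on plain reals *)
AbsoluteTiming[Total[nums]]  (* returns a bare number *)
```

If you already know the result must come out in metres, you can run the whole computation on `nums` and attach the unit once at the end with `Quantity[result, "Meters"]`, which is exactly the pattern the support comment recommends.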
My experience is that, if run time is important, units are best avoided for all but the most trivial of computations.
As to the reason for the huge slowdown, I'm not sure, though I suspect one factor might be the vast number of units in the rule base (many of them extremely obscure) that need to be searched each time a unit appears in an expression. If that's the case, then part of the solution might be to ditch all the units that aren't being used in a particular notebook and work only with the required subset. But that, I think, would be something WR would need to address.
With respect to your current problem, you could try applying QuantityMagnitude[] to the expressions involved and evaluating only the de-unitised numeric values (although, unless all the variables are already in consistent units, you may need to apply UnitConvert first so the bare magnitudes are mutually compatible).
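As a concrete sketch of that workflow (the quantities here are made-up illustrations, not values from your problem): convert everything to one consistent unit system, extract the magnitudes, do the numeric work unit-free, and re-attach the known result unit at the end.

```mathematica
(* Input quantities in mixed units *)
d = Quantity[1500., "Meters"];
t = Quantity[2., "Minutes"];

(* Convert to consistent (SI) units first, then strip to bare magnitudes *)
dSI = QuantityMagnitude[UnitConvert[d, "Meters"]];   (* 1500. *)
tSI = QuantityMagnitude[UnitConvert[t, "Seconds"]];  (* 120. *)

(* Do the heavy numeric work on plain numbers *)
v = dSI/tSI;

(* Restore the unit you know the result must carry *)
Quantity[v, "Meters"/"Seconds"]
```

The unit bookkeeping happens exactly twice, at the boundary, rather than on every arithmetic step inside a Table or loop.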
I hope this helps,
Ian