I have a couple of questions regarding SAM/NREL estimates for tower and receiver capital costs: how the default values have changed across versions, and the models used by NREL and others for estimating these costs.
First, I think that between SAM v2020.11.29 and v2021.12.2 the default value for "tower cost fixed" changed from $8M to $3M, at least based on my prior saved model files. I couldn't find a section in the release notes covering this, so I wanted to confirm whether that is the case (and, if so, ask about the reason: incorporating more real-world cost data, or something else?).
Second, looking at various publications and the SAM help file, there seem to be a few different options for estimating (or at least representing) the combined or separate tower and receiver costs. My versions of SAM use the scaling formula based on physical subcomponent heights:

tower_cost = tower_fixed_cost * exp(tower_exp * (h_tower - rec_height/2 + helio_height/2))
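For context, here is the quick sanity check I did against that formula. The inputs are what I believe the recent defaults to be (fixed cost, scaling exponent, and the default tower/receiver/heliostat heights); they are my own recollections rather than quoted documentation, so please correct me if any are off:

import math

# Rough check of the exponential tower-cost relationship. Values below are
# what I *think* the recent SAM defaults are -- treat them as my assumptions.
tower_fixed_cost = 3.0e6   # $, "tower cost fixed" (v2021.12.2 default, I believe)
tower_exp = 0.0113         # 1/m, "tower cost scaling exponent"
h_tower = 193.46           # m, tower height (approx. default molten-salt tower case)
rec_height = 21.6          # m, receiver height (approx. default)
helio_height = 12.2        # m, heliostat height (approx. default)

effective_height = h_tower - rec_height / 2 + helio_height / 2
tower_cost = tower_fixed_cost * math.exp(tower_exp * effective_height)
print(f"effective height = {effective_height:.1f} m, tower cost = ${tower_cost / 1e6:.1f}M")
# With these inputs I get roughly $25M for the default-sized tower.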
DOE reports a 2018 benchmark of $137/kWt for combined tower and receiver costs, citing Turchi et al. 2019, but the table there doesn't report the units or the individual receiver and tower cost values. I'm assuming DOE is reporting a single model result for the 100 MW, 14 h storage case they mention, scaled to per kWt, rather than a linear model for receiver + tower costs? Or did SAM at some point change from a linear model to the exponential relationship above?
Thanks for any details you can provide. My motivation is partly curiosity and partly wanting a defensible justification for some input-value changes between versions of an analysis I am performing.