My objective is to simulate a commercial building with PV + batteries. As a first step, I modelled only PV and saw that the cost from the demand charge (kW) did not change when PV was added to the building. This is strange, because the time series plots clearly show that the peaks drawn from the grid are reduced in sunny months such as July. In July the building load peak is 185 kW, but because of the PV production the power received from the grid peaks at only 140 kW. Nevertheless, the demand charge of $821 is the product of 185 kW and the fee of $4.44/kW, not of 140 kW. Is there a bug in the system that bases the grid tariff on the building load only? Or is the PV connected in front of the meter rather than behind it?
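To make the discrepancy concrete, here is a minimal sketch of how I expected a behind-the-meter demand charge to be computed. The `demand_charge` helper and the sample load/PV numbers are my own, purely for illustration; only the 185 kW peak, the 140 kW net peak, and the $4.44/kW rate come from my actual run:

```python
# Sketch of the demand-charge calculation I expected (hypothetical helper,
# illustrative hourly numbers; 185 kW, 140 kW and $4.44/kW are from my run).
DEMAND_RATE = 4.44  # $/kW per billing month

def demand_charge(net_load_kw, rate=DEMAND_RATE):
    """Charge based on the peak of the NET grid draw over the month."""
    return round(max(net_load_kw) * rate, 2)

building_load = [120.0, 185.0, 160.0]  # kW, July load peaking at 185 kW
pv_output     = [30.0, 45.0, 20.0]     # kW of PV at the same timesteps

# Behind the meter, the grid only sees load minus PV (floored at zero).
net_load = [max(l - p, 0.0) for l, p in zip(building_load, pv_output)]

print(demand_charge(building_load))  # 821.4 -- what the program reports
print(demand_charge(net_load))       # 621.6 -- what I expected with PV behind the meter
```

So if the PV were really behind the meter, I would expect the July demand charge to drop from $821.40 to $621.60, which is not what the results show.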
This made me look further into how peak demand is handled in the program, and as you can see in my attachments, the results don't seem consistent with each other: when the battery was added, the peak demand was reduced, but the demand charge increased, which doesn't seem logical to me.
Also, the data in the table seem to be wrong, as the peak demand with the system is almost zero. There appears to be a mismatch between the tables and the graphs, since the time series plots show a more realistic demand.
Does anyone know what might be causing this problem? I would be very thankful for your help.