Existing energy infrastructure already pays for "capacity" that's not being used, even for older systems like natural gas and coal. This is necessary because energy production MUST match energy demand, or someone's lights are going out. When a power plant on a reasonably big grid suddenly goes offline ("trips out"), the grid can tolerate it, but grid frequency drops dangerously low. Other plants that are only putting out, say, 70% of their capacity quickly spin up to max production to cover the shortfall and bring the grid (usually measured by its frequency) back up to a safe level.
These plants don't want to just sit at 70% production most of the time; it costs money to have capacity and not be SELLING it. So the grid pays them a percentage of the going rate for their unused capacity, because the grid MUST have some reserve in case of the situation described above. They're being paid to have (but not USE) capacity, i.e. to be a "safety net". If we weren't willing to pay them something for this unused reserve, they'd have no reason to invest in having it in the first place.
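To make the arithmetic concrete, here's a rough sketch in Python. Every number in it (plant sizes, the $40/MWh price, the 50% capacity payment fraction) is made up for illustration and isn't an actual market rate:

```python
# Toy sketch of spinning reserve and capacity payments.
# All figures below are hypothetical, not real market data.

plants = [
    {"name": "gas_1",  "capacity_mw": 500, "output_mw": 350},  # running at 70%
    {"name": "gas_2",  "capacity_mw": 500, "output_mw": 350},  # running at 70%
    {"name": "coal_1", "capacity_mw": 800, "output_mw": 560},  # running at 70%
]

energy_price = 40.0       # $/MWh paid for energy actually delivered
capacity_fraction = 0.5   # hypothetical fraction of the going rate paid for held-back capacity

for p in plants:
    reserve_mw = p["capacity_mw"] - p["output_mw"]
    energy_revenue = p["output_mw"] * energy_price
    capacity_revenue = reserve_mw * energy_price * capacity_fraction
    print(f'{p["name"]}: {reserve_mw} MW in reserve, '
          f'${energy_revenue:,.0f}/h for energy sold + ${capacity_revenue:,.0f}/h for reserve')

# If gas_1 suddenly trips offline, the grid loses 350 MW of output; the
# reserve held on the other plants is what covers that shortfall.
lost_mw = plants[0]["output_mw"]
reserve_mw = sum(p["capacity_mw"] - p["output_mw"] for p in plants[1:])
print(f"Plant trip: lost {lost_mw} MW, {reserve_mw} MW of reserve available elsewhere")
```

The capacity payment is the smaller of the two revenue streams, but it's the one that makes holding that reserve worth anyone's while.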
Renewable energy sources (like wind and solar) are in a similar situation. When it's windy and sunny, they produce plentiful, cheap energy. During those times, non-renewable plants can shut down or throttle back, and all that cheap energy saves consumers a lot of money. But sometimes production exceeds demand, depending on the time of day, day of the week, and even time of year. Solar and wind aren't just "on" or "off". Once you get enough of them online, there are going to be periods when more energy can be produced than can be used. (And storing energy is hard; we're just getting going with grid batteries.) But if you want all that delicious cheap power during peak use, you've got to give them something for the times they have more power than you need.
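Here's a toy sketch of what that surplus looks like over a day. The demand, solar, and wind numbers are completely made up, but the shape is the point: whenever renewables plus must-run conventional plant exceed demand, the extra power has nowhere to go and gets curtailed:

```python
# Toy curtailment sketch: hourly supply vs. demand with made-up numbers.

hours = range(24)
demand = [30, 28, 27, 27, 28, 32, 38, 45, 50, 52, 53, 54,
          54, 53, 52, 51, 52, 55, 58, 56, 50, 44, 38, 33]   # GW, hour by hour
solar  = [ 0,  0,  0,  0,  0,  1,  4,  9, 14, 18, 21, 22,
          22, 21, 18, 14,  9,  4,  1,  0,  0,  0,  0,  0]   # GW
wind_gw  = 15   # GW, held constant just to keep the toy simple
must_run = 20   # GW of conventional plant that can't throttle any lower

curtailed_gwh = 0.0
for h in hours:
    supply = solar[h] + wind_gw + must_run
    surplus = supply - demand[h]
    if surplus > 0:
        # More power available this hour than the grid can absorb,
        # and (without storage) it simply goes unused.
        curtailed_gwh += surplus

print(f"Curtailed roughly {curtailed_gwh:.0f} GWh of essentially free energy over the day")
```

In this toy the surplus shows up overnight and at midday, while the evening peak still needs everything it can get; that mismatch is the whole problem.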
So there's nothing new about "paying for capacity"; we've been doing that for decades. It IS slightly more annoying with renewables, though, because that unused capacity is essentially "free" power being left on the table. (No coal or natural gas required.) That's just the thing with most renewables: they produce power on their own schedule, not when it's convenient for us. We need to improve our energy storage infrastructure. I'm not sure why this is only "becoming obvious" now, since it's been a known issue for decades. Maybe it's just been a matter of waiting for better energy storage technology, or maybe it's a case of "we'll bring up this added expense after we're done paying for the wind and solar plants"?
All this means that, just like the hydrocarbon plants, renewable plants need to get paid something for the time they spend not being fully used, because at other times we NEED them to run at full capacity.
This isn't a problem that's going to go away on its own. We've got A LOT more ways to make power than to store it. Batteries are probably the best solution in the short term, but they're relatively large for what they store and expensive. Pumped storage (dams) is the grand-daddy of energy storage, but it has very specific geographic requirements. Molten salt is often considered (especially for solar) but has technical challenges and limited capacity. We really haven't found anything that works better than batteries so far, and it's not for lack of trying. So for now you can expect we're just going to have to keep paying energy plants (of all types) for unused capacity.
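For what it's worth, here's a toy sketch of what a grid battery buys you, reusing the same made-up hourly surplus/deficit profile from the curtailment sketch above. The battery size and efficiency are hypothetical; the point is just that storage shifts some of that free overnight/midday surplus into the evening peak:

```python
# Toy storage sketch: charge a battery on surplus hours, discharge on shortfall hours.

battery_capacity_gwh = 20.0   # hypothetical grid battery
charge_efficiency = 0.85      # round-trip losses lumped onto the charging side
stored_gwh = 0.0

# Hourly surplus (+) or shortfall (-) in GW, same made-up profile as above:
# surplus overnight and at midday, shortfall through the evening peak.
net = [5, 7, 8, 8, 7, 4, 1, -1, -1, 1, 3, 3, 3, 3, 1, -2,
       -8, -16, -22, -21, -15, -9, -3, 2]

shortfall_without_storage = sum(-x for x in net if x < 0)
shortfall_with_storage = 0.0

for x in net:
    if x > 0:
        # Soak up surplus until the battery is full.
        stored_gwh = min(battery_capacity_gwh, stored_gwh + x * charge_efficiency)
    else:
        # Cover as much of the shortfall as the battery can.
        discharge = min(stored_gwh, -x)
        stored_gwh -= discharge
        shortfall_with_storage += (-x) - discharge

print(f"Other plants must cover {shortfall_without_storage:.0f} GWh without storage, "
      f"{shortfall_with_storage:.1f} GWh with a {battery_capacity_gwh:.0f} GWh battery")
```

Even in this toy, the battery only dents the evening shortfall; the rest still has to come from plants we're paying to keep available.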