If you want to be fast and don't mind living in the command line, Claude Code is your go-to, but you'll pay through the nose.
If you want free (for now) and a web interface, go mess around with Vibes.diy, but your apps will only run on their platform unless you do a lot of work to self-host.
If you don't mind slow, set up something like Ollama with Aider (or Browser-Use), but unless you have a decent GPU, "slow" here means VERY slow for anything above a small or medium model.
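To make the local route concrete, here's a minimal sketch of talking to a locally running Ollama server directly, assuming `ollama serve` is up on its default port and you've already pulled a model (qwen2.5-coder is just a placeholder; use whatever you downloaded). Aider points at that same local server, so every round trip runs on your own hardware, which is exactly where the slowness comes from.

```python
# Minimal sketch: one non-streaming request to a local Ollama server.
# Assumes `ollama serve` is running and the named model has been pulled;
# the model name and prompt below are placeholders.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def ask_local_model(prompt: str, model: str = "qwen2.5-coder") -> str:
    """Send a single generation request and return the full response text."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # wait for the whole answer instead of streaming tokens
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # On CPU-only hardware, expect this to crawl for anything but small models.
    print(ask_local_model("Write a Python function that reverses a string."))
```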
Regardless of the method, LLM-assisted bugs are pervasive and perversely subtle. Thinking through what you want and spelling it out in text for an LLM that barely clears the "clueless intern" bar becomes an exercise in frustration: the goal always seems close, but never quite close enough to reach without rewriting the code yourself.
Good luck.