I'm going to chip in here because I've seen a lot.
I contend the complexity of today's programs is different, not necessarily better or worse. Anyone who's worked on a system with hundreds of thousands of lines of IBM 360/370 Assembler code, like I have, will be nodding their head right now. Brooks' seminal "No Silver Bullet" paper was written in 1987 and was heavily based on his experience in the '60s and '70s managing projects that consumed centuries of person-hours. There was quite a lot of complexity back then.
Then the software engineering community created development tools and methods that greatly reduced that complexity. I cannot express what a joy it is to have automatic garbage collection in modern languages, along with features such as classes that help enforce modularity and encapsulation. When I first learned C#, I was gobsmacked because there was a pre-existing data type, language feature, or library method for pretty much everything. I remember learning about sorts, hashes, etc. in college, but nowadays those are all efficient library functions that most people use without caring whether it's O(n^2) or O(n log n), or how many collisions their hash function produces. I once worked on a large production system that contained self-modifying code, and today's development environments are absolutely wonderful in comparison to what existed back then.
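To make that concrete, here's a small sketch (mine, not from the original discussion, and in Python rather than C# for brevity): both functions lean entirely on the standard library, so the caller never thinks about the sort algorithm or hash collisions.

```python
# Illustrative sketch: the "library does the hard part" point.
# sorted() uses Timsort (O(n log n) worst case); dict lookups resolve
# hash collisions internally, so callers never see them.

def top_scores(scores, n=3):
    """Return the n highest scores, relying on the built-in sort."""
    return sorted(scores, reverse=True)[:n]

def word_counts(words):
    """Count words with a hash table; collisions are the dict's problem."""
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    return counts

print(top_scores([70, 95, 88, 60, 99]))  # -> [99, 95, 88]
print(word_counts(["a", "b", "a"]))      # -> {'a': 2, 'b': 1}
```

Writing either of these by hand in '70s-era assembler would have meant implementing the sort and the hash table yourself, and living with every bug in them.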
So today's developers, for the most part, don't have to deal with that type of stuff unless they're being sloppy. Instead, as the article stated, it's more about the massive amount of distributed connectivity and networking. Software engineers in the '80s and '90s did a great job of solving many of the early complexity issues. Now we face a new breed of complexity that is just as challenging for modern developers, and which we will hopefully solve in the coming decades.