He cited an article by Bruce Schneier.
In response, I observed to a number of colleagues, protégés, and students that "One takeaway from this article and the Schneier article that it points to is that we need to reduce our attack surface. Dramatically. Perhaps ninety percent. Think least-privilege access at all layers, to include application white-listing, safe defaults, end-to-end application-layer encryption, and strong authentication."
One colleague responded "I think one reason the cyber attack surface is so large is that the global intel agencies hoard vulnerabilities and exploits..." Since secret "vulnerabilities and exploits" account for so little of our attack surface, I fear that he missed my point.
While it is true that intelligence agencies enjoy the benefits of our vulnerable systems and are little motivated to reduce the attack surface, the "hoarded vulnerabilities and exploits" are not the attack surface and the intel agencies are not the cause.
The cause is the IT culture. There is a broad market preference for open networks, systems, and applications. TCP/IP drove the more secure SNA/SDLC from the field. The market prefers Windows and Linux to OS X, Android to iOS, IBM 360 to System 38, MVS to FS, MS-DOS to OS/2, Z Systems to iSeries, Flash to HTML5, von Neumann architecture [Wintel systems] to almost anything else.
One can get a degree in Computer Science, even in Cyber Security, without ever hearing about a more secure alternative to the von Neumann architecture (e.g., the IBM iSeries: a closed, finite-state architecture in which operations can take the system only from one valid state to another, with a limited set of strongly typed objects (data cannot be executed, programs cannot be modified), single-level store, symbolic-only addressing, etc.).
We prefer to try to stop leakage at the end-user device or the perimeter rather than administer access control at the database or file system. We persist in using replayable passwords in preference to strong authentication, even though they are implicated in almost every breach. We terminate encryption at the OS, or even the perimeter, rather than at the application. We deploy user-programmable systems where application-only systems would do. We enable escape mechanisms and run scripts and macros by default.
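What makes a password "replayable" is that the same secret crosses the wire every time, so a single capture is enough. One-time-password schemes such as HOTP/TOTP (RFC 4226/6238) avoid this by deriving a short-lived code from a shared secret and a moving counter or clock. As a minimal illustration of the principle (not a complete authentication system), using only the Python standard library:

```python
import hmac
import hashlib
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """One-time code per RFC 4226: HMAC-SHA1 over a big-endian counter."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation: low nibble picks a window
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

def totp(secret: bytes, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 variant: the counter is the current 30-second interval,
    so a captured code expires almost immediately and cannot be replayed."""
    return hotp(secret, int(time.time()) // step, digits)
```

A static password, by contrast, is valid until someone notices it has been stolen, which, as noted above, can take weeks to months.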
We have too many overly privileged users with almost no multi-party controls. We discourage shared UIDs and passwords for end users but default to them for the most privileged users, where we most need accountability. We store our most sensitive information in the clear, as file-system objects, on the desktop, rather than encrypted, in document management systems, on servers. We keep our most sensitive data and mission-critical applications on the same systems where we run our most vulnerable applications, browsing and e-mail. We talk about defense in depth but operate our enterprise networks flat, with any-to-any connectivity and trust, neither structured nor architected. It takes us weeks to months just to detect breaches and more time to fix them.
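The alternative to "flat, any-to-any" is a default-deny policy between network segments: traffic flows only where the architecture says it should. A minimal sketch of the idea, with entirely hypothetical segment names and address ranges, using Python's standard `ipaddress` module:

```python
import ipaddress

# Hypothetical segments and ranges for illustration only.
SEGMENTS = {
    "user-lan": ipaddress.ip_network("10.1.0.0/16"),
    "app-tier": ipaddress.ip_network("10.2.0.0/16"),
    "db-tier":  ipaddress.ip_network("10.3.0.0/16"),
}

# Only these (source, destination) segment pairs may talk;
# everything not listed is denied by default.
POLICY = {
    ("user-lan", "app-tier"),
    ("app-tier", "db-tier"),
}

def segment_of(addr: str):
    """Return the name of the segment containing addr, or None."""
    ip = ipaddress.ip_address(addr)
    for name, net in SEGMENTS.items():
        if ip in net:
            return name
    return None

def allowed(src: str, dst: str) -> bool:
    """Default deny: permit only explicitly listed segment pairs."""
    return (segment_of(src), segment_of(dst)) in POLICY
```

In a flat network every pair is implicitly allowed; here a desktop can reach the application tier but cannot reach the database directly, so a compromised end-user device does not get "any-to-any" trust for free.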
I can go on and I am sure you can add examples of your own. Not only is the intelligence community not responsible for this practice, they are guilty of it themselves. It was this practice, not secret vulnerabilities, that was exploited by Snowden. It is this culture, not "hoarded vulnerabilities and exploits," that is implicated in the breaches of the past few years. It defies reason that one person acting alone could collect the data that Snowden did without being detected.
Nation states do what they do; their targets of choice will yield to their overwhelming force.
However, we need not make it so easy. We might not be able to resist dragons, but we are yielding to bears and brigands. I admit that the culture is entrenched and resistant to change, but it will not be changed by blaming the other guy. "We have met the enemy and he is us."