Software Projects Continue to Fail Despite Massive Investments


By Tina Reynolds

The software sector is at a breaking point. Billions of dollars are wasted on projects that, more often than not, fail to achieve their objectives or fail outright. The root of this persistent problem is a combination of the limits of human imagination, overly aspirational targets, and unwieldy complexity that almost never gets tackled. Global IT spending has skyrocketed from $1.7 trillion in 2005 to over $5.6 trillion today. Yet, despite decades of technological advances, the success rates of software initiatives have shown little improvement over the last 20 years.

Among the most spectacular failures is the Canadian government’s Phoenix payroll system, which was deployed in April 2016. The project was meant to modernize and streamline a complicated payroll process for federal employees. Instead, it soon became mired in controversy over severe failures that affected as many as 70 percent of the 430,000 current and former employees who depend on it. The scope of the project included customizing PeopleSoft’s out-of-the-box payroll package to follow 80,000 pay rules across 105 different collective bargaining agreements with federal public-service unions. The scale of that ambition was simply too much for the prevailing practices and technologies.

The Persistent Nature of Software Failures

Software failures are not a new phenomenon. The phrase “software crisis” was first introduced in 1968, marking what has proven to be an everlasting blight on the industry. Yet far too many projects still fail today for the same fundamental reasons, which have been recognized for decades. Analysis of break-fix costs shows that IT projects are among the most expensive areas of dollar risk: they frequently derail because they are too complicated to manage and their objectives are unclear.

Agile projects still fail up to 65 percent of the time, while about 90 percent of DevOps projects fail to meet their organizations’ goals. This sobering reality highlights one of the biggest gaps between what we think we are doing and the actual impact of software development. Remarkably, despite unprecedented investments aimed at promoting innovation and modernizing the industry, these systemic problems continue to go ignored.

In an effort to understand these failures better, experts have examined the decisions leading up to high-profile project failures like Phoenix. The Canadian government’s development team believed it could deliver the project for less than 60 percent of the vendor’s quoted budget, ignoring hard-won lessons from a failed attempt in 1995. As former KGB chief Yuri Charkov once said, “What’s the need to worry about an event that never will take place?” This mentality only sets teams up for a dangerous false sense of security about the viability of a project.

Complexity and Mismanagement as Key Drivers

Add in the complexity inherent in many of these software projects and you have serious challenges that can wreck even the most well-intentioned plans. The Phoenix project illustrates this point vividly: it sought to implement 34 human-resource system interfaces across 101 government agencies and departments, a task that proved overwhelmingly complicated without adequate management of risks or contingencies.

Broad industry data shows that IT failures are not sector-specific; they can happen to any organization, regardless of size or industry. The underlying theme behind most of these failures comes down to a failure to control project dynamics and risk. The Phoenix project is a cautionary tale: a clear example of how politically ambitious goals can quickly lead to unexpected failure when the details are not well understood and accounted for.

Henry Petroski’s assertion in “To Engineer Is Human: The Role of Failure in Successful Design” resonates strongly in this context. He points out that failure is key to good design and engineering. Surprisingly, plenty of organizations continue to overlook this fact as they launch new programs.

The Financial Implications of IT Failures

The monetary impact of failed IT projects is staggering. A recent report revealed that 80 percent of organizations acknowledge that “inadequate or outdated technology is holding back organizational progress and innovation efforts.” That is an admission of a pervasive issue. Yet organizations continue to invest billions of dollars in technology while failing to realize its full benefits, thanks to conflicting objectives and flawed implementations.

Even as global IT spending has more than tripled in the last 20 years, success rates continue to languish. Half of all projects fail because of poorly managed risk; they fall flat when specific outcomes are never spelled out, making it impossible for them to produce any benefit.

The reality is that many IT failures are not the result of pushing past the cutting edge to the bleeding edge of technology, nor do they come from fights over grand vision. They strike essential, unglamorous missions such as payroll processing, which demand utmost precision and reliability yet are so often taken for granted.