Fortune 500 companies abandon more than one-third of software-development projects after implementation because they don’t meet user needs, according to a study released today by analyst firm Voke Media. The survey was conducted from April through July.
That adds up to a lot of squandered money: the average project costs $3.2 million and requires almost 1,300 person-months to complete, Voke said.
The solution lies in automating the requirements and test-case creation processes and involving business analysts from the beginning, Voke founder Theresa Lanowitz told InternetNews.com. Business analysts are “a valuable conduit between the business and application developers,” she said.
One reason so much software development goes to waste is the lack of requirements verification. The study found that only 45 percent of organizations ensure that requirements are reviewed and approved before IT begins designing and developing an application.
The other two reasons are inadequate tools and a lack of automation. Most respondents create test cases manually in Microsoft Word, an approach that “gives you this huge document.”
This results in lengthy reviews that often miss key elements and “result in late feedback of critical elements much later in the life cycle when the information is actually used by a stakeholder in development or testing,” the report said.
How could Fortune 500 companies have such poor processes when disciplined application-development practices are standard operating procedure in the mainframe world, and all of these companies have run mainframes for years?
“Just because they’re pretty much household names, it doesn’t mean their IT department is completely organized,” Lanowitz explained. “Some have a very good quality assurance department, others have a very good development organization, but that doesn’t mean everything is moving along in a precise fashion.”
The changing nature of applications and application development compounded the problem. “For a long time, the approach was ‘let’s just throw some code together’ because all these new languages like C++ and Java were coming out and they were supposed to make development easier,” Lanowitz said.
“We didn’t have proper requirements defined, proper testing methodologies in place, and people are only now beginning to realize this is a complete life cycle discipline,” she added.
Web-based development only made things worse. “Enterprises fell victim to the false promises of the Web making things easier,” Lanowitz said. “It did, but it also made things far more complex; you need more system integration, use cases and testing,” she added.
But don’t standards such as CMM, the Capability Maturity Model from Carnegie Mellon University’s Software Engineering Institute, help? Introduced about 15 years ago, CMM aimed to help software-development teams move from ad hoc, chaotic development processes to mature, disciplined ones.
This article was first published on InternetNews.com.