Improving software quality by optimizing through automation

At STPCon 2013, HP's Kelly Emo talks about how organizations are using new software tools to improve software quality through the use of automation.

As part of its coverage of STPCon 2013 in Phoenix, TheServerSide talked with Hewlett-Packard Co.'s Kelly Emo about how organizations are using new software tools and development frameworks to improve software quality by optimizing the software development process through automation.

Part of our strategy is having a framework that allows organizations to plug in the different components they are already using, but we extend it quite a bit beyond that.

Kelly Emo,
director of applications product marketing, Hewlett-Packard Co.

What does a complete, effective application development software stack look like? For many, the simplest, smallest set of tools required to get the job done is the right answer, which is why so many development environments consist of little more than the compulsory integrated development environment such as Eclipse or IntelliJ, the requisite source code management tool such as Git, a team tracker such as JIRA and, with luck, a continuous integration tool such as Hudson or Jenkins. A Java minimalist would likely endorse such a setup. But what that setup gains in simplicity, it loses in maximized automation, minimized human involvement and the continual feedback and collaborative insight that can be gathered from every phase of the application lifecycle.
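
Even a minimalist toolchain leaves room for scripting away clicks. As a rough illustration only, the sketch below drives Jenkins through its JSON REST API to queue a build and wait for the result; the server URL, job name and credentials are placeholder assumptions, and some Jenkins configurations will also require a CSRF crumb for the POST.

```python
# Minimal sketch: trigger a Jenkins job and wait for its result via Jenkins' JSON REST API.
# The server URL, job name and credentials below are placeholders, not a real installation.
import time
import requests

JENKINS = "https://jenkins.example.com"   # hypothetical Jenkins server
JOB = "my-app-build"                      # hypothetical job name
AUTH = ("ci-user", "api-token")           # user name + API token

# Note which build number the trigger will create, so the right build gets polled.
job_info = requests.get(f"{JENKINS}/job/{JOB}/api/json", auth=AUTH).json()
build_number = job_info["nextBuildNumber"]

# Queue a new build.
requests.post(f"{JENKINS}/job/{JOB}/build", auth=AUTH).raise_for_status()

# Poll until the build exists and reports a result (SUCCESS, UNSTABLE, FAILURE, ...).
while True:
    resp = requests.get(f"{JENKINS}/job/{JOB}/{build_number}/api/json", auth=AUTH)
    if resp.status_code == 200 and resp.json().get("result"):
        print("Build", build_number, "finished:", resp.json()["result"])
        break
    time.sleep(10)
```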

Moving beyond continuous integration tools

"Jenkins is great for doing builds and running tests, and it's a great start for organizations looking to automate manual processes, but there is so much more to be gained," says Kelly Emo, director of applications product marketing at Hewlett-Packard Co. (HP). According to Emo, organizations that take a minimalist approach to application developing are missing out on significant opportunities for optimization, which can be achieved by extending their use of automation beyond simply compiling code and running unit tests with continuous integration (CI) tools. Just pushing beyond simple CI and into other areas, such as load and regression testing, would be a good start.

But there are other opportunities for optimization through automation that go beyond the simple integration of different testing functions. According to Emo, organizations should "squeeze out, wherever possible, any manual efforts, whether it is the provisioning of the underlying stack that supports the build and the test, or triggering the scheduling and actual execution of that information." Many opportunities arise where human tasks can be simplified through automation, and finding these opportunities can be as simple as looking beyond the software development lifecycle (SDLC) and thinking about the entire application lifecycle management (ALM) process.
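
As one hedged example of what squeezing out manual effort can look like in practice, the sketch below provisions a disposable environment with the Docker CLI, runs a regression suite against it and tears it down again; the image name, port and test layout are hypothetical, and it assumes the Docker CLI and pytest are available.

```python
# Minimal sketch of removing the manual steps around a test run: provision a throwaway
# environment, execute the regression suite against it, then tear it down.
import os
import subprocess

CONTAINER = "qa-env"
IMAGE = "example/my-app:latest"   # hypothetical application image

def sh(*args, **kwargs):
    # Run a command and fail loudly on a non-zero exit code.
    subprocess.run(args, check=True, **kwargs)

try:
    # Provision: start the application under test in a disposable container.
    # (A real pipeline would also wait for the application to report readiness.)
    sh("docker", "run", "-d", "--name", CONTAINER, "-p", "8080:8080", IMAGE)
    # Execute: point the regression suite at the freshly provisioned environment.
    sh("python", "-m", "pytest", "tests/regression",
       env={**os.environ, "APP_BASE_URL": "http://localhost:8080"})
finally:
    # Tear down: nothing left for a person to remember to clean up.
    sh("docker", "rm", "-f", CONTAINER)
```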

Insight, automation, optimization and collaboration

"We've always considered the SDLC as a core part of the overall application lifecycle management process," says Emo. But when thinking about automation, you must go beyond the core and consider the complete application lifecycle, including demand management, design, requirements, dev-testing, readiness for delivery and how software moves through development to operations (DevOps). According to Emo, it should all be a seamless, symbiotic flow where each phase collaboratively sends information back into the others. "When things are out in operations, you're going to see how that application behaves, and you need to flow all of that information back into the demand process and then effectively bring that information back to your agile team."

So how does an organization go from embracing a minimalist approach to software development to one that can intelligently automate various aspects of the application lifecycle?

"Part of our strategy is having a framework that allows organizations to plug in the different components they are already using, but we extend it quite a bit beyond that," says Emo. That means providing a framework that pulls together everything from portfolio management to DevOps, and pulling in feedback from every stage, making every operation traceable. All of the data and meta data generated from such a comprehensive system can then be used to provide a better understanding of how a given project is progressing. Vice presidents can see overall trends, while dev managers can see information that might be specific to a spring, and QA managers can see trends around defects and quality, and project managers can see how developers and testers are allocating their time and priorities. The key is then pulling together all of the generated data and meta data to give everyone involved in the process a better understanding of how a project is progressing. "We go way beyond extending from what you would consider the core SDLC, integrating with it and tying it together with test management, design management, development management and then release management," said Emo.

Much of the work Emo's team does to help companies automate the ALM process revolves around products such as HP Agile Manager and HP Application Lifecycle Intelligence. But regardless of the platform and tools being used, the basic premise remains the same: By considering all aspects of the ALM process, a wealth of opportunities arise in which processes can be optimized through automation and insight can be gathered by feeding input from one lifecycle phase back into another, all of which goes toward improving and simplifying the onerous task of delivering high-quality software to the user.

How have you used automation to simplify the application development process in your organization? Let us know.
