Application Lifecycle Management – Improving Quality in SharePoint Solutions

Introduction

“Application Lifecycle Management (ALM) is a continuous process of managing the life of an application through governance, development and maintenance. ALM is the marriage of business management to software engineering made possible by tools that facilitate and integrate requirements management, architecture, coding, testing, tracking, and release management.”

In this and future blog posts we will look at how ALM and the tools that Microsoft provides support us in ensuring high-quality solutions. Specifically, we will explore a few different types of testing and how they relate to our SharePoint solutions.

  • Manual Tests (this post)
  • Load Tests
  • Code Review/ Code Analysis
  • Unit Tests
  • Coded UI Tests

To be clear: I like testing. I think it is by far the best (academic) method to prove you did things right. And the best part: you can start even before the UATs do!

This post is not meant to be exhaustive, nor is it the perfect recipe for integrating testing into your product's lifecycle. It is aimed at getting you started with some of the testing basics and how to set them up for your SharePoint projects. Whether you start a project with the design of your tests (yes, those guys need designing too) or use them to sign off your project is of course completely up to you.

Note: it is perfectly feasible to have different tests target the same area. For instance, a load test might show memory degradation over time because of memory leaks in your custom solution. A sanity test could warn you of the same issue, but does so by analyzing your custom code and looking for incorrect disposal of objects.

The Visual Studio 2012 Start Screen contains a lot of how-to videos related to testing, so make sure you check those out too!


Core Concepts

Before we start, let’s look at some of the core concepts.

Tests are part of a test plan.

Sounds pretty simple, right? You cannot start implementing any kind of realistic testing unless you at least determine the following:

  • Goal: set the bar at a certain level. Even if you are only going to test a small part of the product's functionality, state so in your test plan goals.
  • Exceptions: highlight areas or components that are not part of your tests. Examples are external line-of-business systems, interfaces and third-party assemblies. Or make it opt-in, with something along the lines of "anything not addressed here is not part of the test".
  • Tests covered: what types of tests do you cover (and not what do your tests cover)?
  • Software used: list the tools you need to implement and execute your test plan.
  • Test data: describe your test dataset if you need one.
  • Test strategy: how and in what order are we going to execute these tests? Describe how we will report on the findings and results, and answer questions like: what depth do we need and who are the actors in our play?
  • Test environment: describe the conditions required for correct execution of the tests.
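To make these elements concrete, here is a minimal sketch in Python of a test plan as a checked structure. The class and field names are my own illustration of the list above, not any Microsoft API; a real plan would live in Microsoft Test Manager or TFS.

```python
from dataclasses import dataclass

@dataclass
class TestPlan:
    """Minimal model of the test plan elements described above."""
    goal: str             # the bar you set for this plan
    exceptions: list      # areas/components explicitly out of scope
    tests_covered: list   # types of tests (manual, load, unit, ...)
    software_used: list   # tooling needed to implement/execute the plan
    test_data: str        # description of the dataset, if any
    strategy: str         # order of execution, reporting, actors
    environment: str      # conditions required to run the tests

    def validate(self):
        """A plan without a goal or covered test types is not realistic."""
        missing = []
        if not self.goal:
            missing.append("goal")
        if not self.tests_covered:
            missing.append("tests covered")
        return missing

plan = TestPlan(
    goal="Verify the intranet news publishing flow",
    exceptions=["third-party assemblies", "external LOB systems"],
    tests_covered=["manual", "load"],
    software_used=["Microsoft Test Manager", "Visual Studio 2012"],
    test_data="Ten sample news articles with publish dates",
    strategy="Manual tests first, then load tests; report via TFS",
    environment="SharePoint 2013 test farm, one web front end",
)
print(plan.validate())  # → []
```

The point of `validate()` is the same point the list above makes: a plan missing its goal or its covered test types is not a plan you can execute against.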

Tests need a design.

Every test, even the most modest one, needs some form of design:

  • Name: a descriptive name or title plus a brief description.
  • Link to a requirement: if possible, link a test to a requirement (or set of requirements), a user story, product backlog item or functional description.
  • Test data used: what is needed from your test data set to perform this test? Link or attach the items to your test case.
  • Measurement: what indicators do we use to determine a pass or a fail?
  • Expected result(s): in the case of a load test, for example, you would expect the environment to stay in the specified "Green Zone". In the case of a manual UI test, every step could have its own expected result:

  • Step 1: Login as user x with password y.
  • Expected result: Intranet landing page should render without errors.
  • Step 2: Navigate to news by clicking the Global Navigation Menu Item labeled “News”.
  • Expected result: News overview page should render, with a list of articles ordered by date descending.
  • Pass/fail criteria: the overall conditions that determine whether the test as a whole passes or fails.
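The design elements above can be sketched as a small data model: a test case with steps, each carrying its expected result, and a pass/fail decision based on comparing observed outcomes against them. This is illustrative only; the names are mine, and in practice Microsoft Test Manager stores and evaluates these for you.

```python
from dataclasses import dataclass

@dataclass
class Step:
    action: str     # what the tester does
    expected: str   # the expected result for this step

@dataclass
class TestCase:
    name: str
    requirement: str   # linked PBI, user story or requirement
    test_data: str
    steps: list

    def run(self, observed):
        """Pass only if every step's observed outcome matches the expected one."""
        return all(obs == step.expected
                   for step, obs in zip(self.steps, observed))

case = TestCase(
    name="News navigation",
    requirement="PBI: intranet news overview",
    test_data="user x / password y",
    steps=[
        Step("Login as user x with password y",
             "Intranet landing page renders without errors"),
        Step("Click the Global Navigation item 'News'",
             "News overview page renders, articles ordered by date descending"),
    ],
)
print(case.run([
    "Intranet landing page renders without errors",
    "News overview page renders, articles ordered by date descending",
]))  # → True
```

Notice that the pass/fail criterion here is "all steps match"; your design might instead allow non-blocking steps to fail without failing the whole case.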

Manual Tests

Simple but powerful. These types of tests are easy to set up (as in non-complex), but usually require a decent amount of effort to work out. My advice: keep them simple by design, with small descriptive steps. You could administer the different tests using Microsoft Excel, but of course Microsoft offers tooling for this.

Microsoft Test Manager allows you to plan, manage, design and execute manual tests. It can also be used to run exploratory test sessions and automated tests, directly from a test plan. Connected to TFS (or Team Foundation Service) it enables the logging of bugs and can provide trace information and snapshot capabilities. It enables the tester to provide as much valuable information as possible, using recorded actions and a commenting system.

Microsoft Test Manager requires one of the following Visual Studio editions: Visual Studio Ultimate, Visual Studio Premium or Visual Studio Test Professional.

More information: http://msdn.microsoft.com/en-us/library/jj635157.aspx

To get you started using Microsoft Test Manager, check the hyperlink mentioned above or follow these steps.

  1. Fire up Test Manager, either from a Visual Studio test case or directly from the program group.
  2. Connect it to Team Foundation Server or Team Foundation Service. For more information about TFS and Team Foundation Service, see my previous blog posts.
  3. Create a test plan (see “Core Concepts”)

Your test plan should contain at least one Test Suite that will hold the actual tests to be run. You can add Test Suites directly by hand or automatically by adding requirements to your plan. As a bonus, a default static Test Suite is created automatically for you but you can also nest Test Suites if you like.


  4. Add Test Suites by adding requirements to your plan. By default the query shows all items from the "Requirements" category and thus includes bugs, backlog items, etc.
  5. Finally, we can start adding Test Cases to our Test Suite. If you already have Test Cases set up through Visual Studio or TFS web, you can use the query-based "Add" method. There is also an option to create them directly from Microsoft Test Manager through the "New" button:


These Test Cases are stored in TFS and automatically linked to the Product Backlog Item. You can also attach or link additional items or documents for the Tester to use.

The actual test run is also performed from Microsoft Test Manager. It shows the tester the steps and expected outcome using a “split-screen”:


The real fun starts when testers provide feedback whenever a step fails. From the results of this test, a bug can be created and stored in TFS. This bug then contains the steps performed and any extra information (comment, video, screenshot, etc.) the tester provided.
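The flow from a failed step to a bug record can be sketched as follows. This is purely illustrative: Microsoft Test Manager builds the bug for you when a tester marks a step as failed, and the field names below are made up, not TFS work item fields.

```python
def bug_from_failed_step(test_name, steps, observed, attachments=()):
    """Build a bug record from the first failing step of a manual test run.

    steps is a list of (action, expected) pairs; observed is the list of
    actual outcomes the tester saw, in the same order.
    """
    for number, ((action, expected), actual) in enumerate(zip(steps, observed), start=1):
        if actual != expected:
            return {
                "title": f"{test_name}: step {number} failed",
                # the repro steps are exactly the actions performed so far
                "repro_steps": [a for a, _ in steps[:number]],
                "expected": expected,
                "actual": actual,
                "attachments": list(attachments),  # screenshots, video, comments
            }
    return None  # all steps passed, nothing to log

bug = bug_from_failed_step(
    "News navigation",
    [("Login as user x", "Landing page renders"),
     ("Open News", "Articles ordered by date descending")],
    ["Landing page renders", "Articles ordered ascending"],
    attachments=["screenshot.png"],
)
print(bug["title"])  # → News navigation: step 2 failed
```

The key idea is the same one that makes MTM bugs so useful to developers: the bug carries the exact steps performed up to the failure, plus whatever extra evidence the tester attached.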


More posts to come 🙂

Update: Part two is ready, so now it is officially a series (albeit a small one).

  • Load Tests
  • Code Review/ Code Analysis
  • Unit Tests
  • Coded UI Tests

/Y.
