Writing great user stories

As a user story writer, I want to write great stories so that I can be clear to developers

One of the projects I have been working on lately has required me to write a large number of user stories for the other developers on the project. Part of that time has been spent essentially teaching our product owner to write stories independently, since we had been working with him to get the stories broken down and clear enough that a developer could simply pick one up and start working, with minimal prior knowledge required.

This is a great ideal for stories, but it's easier said than done. On this project I noticed some of my own assumptions making their way into the stories I wrote, because of my knowledge of the problem domain. I had to rework my perspective when writing some of the stories, forcing myself to write them for other developers rather than for myself. Below are some of the techniques and concepts I use to rein in my perspective.

User / Role clause

Use this to narrow the scope of the story and add a more refined perspective for the who of the story.

  • Reference a specific user role, preferably defined in the project documentation.
  • Avoid simply declaring “as a user”. This is usually too broad.

Goal / Desire clause

Use this to address the what or the goal of the story while avoiding referencing concrete functionality concerns (Acceptance Tests will cover those).

  • Be specific about what a user must accomplish. If this part of the story isn't granular enough, it will not establish clear scope boundaries for a developer.

Benefit clause

Use this to address the why of the story. I feel this clause is often hastily written, which is unfortunate, since a developer can often come up with better alternative ideas for accomplishing the what when given the why.

  • Bring perspective to the business or user benefit. Sometimes the real benefit is not necessarily for the user themselves. It could be for customers or other external entities.
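
Putting the three clauses together, a complete story might read something like this (a made-up example of mine, not from any real project):

As a warehouse manager, I want to export a monthly inventory report so that accounting can reconcile stock counts without copying numbers by hand.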

Acceptance tests

Acceptance tests should address much of the specific interface requirements. If a user story defines the feature's scope boundary, the acceptance tests are the fence posts along that boundary. When writing ATs, try to think about the story from a developer's perspective. Understandably, this degree of granularity can be difficult for someone who isn't a developer.

  • Acceptance tests should have complete coverage of story boundaries.
  • Ideally Acceptance tests should correspond to automated tests (unit, integration, UI).
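
To make this concrete, an acceptance test for the inventory-report story above might read something like the following, written in a given/when/then style (my own illustrative example, not a prescribed format):

Given a warehouse manager is signed in and at least one item has been counted this month, when she exports the monthly inventory report, then a file is produced listing every SKU with its current count.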

Overall concepts

When possible, try to write a story as if you were writing it for a five-year-old. Simplicity is key in user stories.

  • Use basic words and avoid jargon.
  • When domain-specific jargon is necessary, create a term dictionary or wiki for new developers.

Atomic stories

Each story should be as independent as possible.

  • Granularity makes stories easier for developers to digest and work on.
  • Referencing multiple areas or components in an app can be a story smell. The story may need to be broken into multiple stories.

Sometimes stories are unavoidably reliant on each other. Project management apps sometimes provide tools to account for this.

  • Epics are coarse-grained user stories which can encompass many child user stories.
  • Setting up blocker tasks can establish precedence of stories.

Exporting SQL data into CSVs via PowerShell

I've been working on an integration component for a web app with a team of developers. This system is supposed to receive data from multiple disparate systems. In one case in particular, we really wanted some sample data from the company that would be sending data to our API, to better understand what some of the incoming data could look like.

The Problem

Facepalm. Since I had traditionally been the one to communicate with our liaison at this company, I asked him if he could provide us with something in the way of sample data, and he obliged, albeit in an unexpected way. A day later, I received a zip file via a Google Drive share link. I opened it up and found that he had simply exported a very large backup of the system's SQL Server database. Some of my teammates who were CC'd on the email thread noticed the ominous ".bak" file and scornfully remarked, "What the heck is that file?!", understandably so, as most hadn't worked on anything in the Microsoft / .NET stack at all.

So it was up to me to get some viewable sample data out of this backup for my teammates, who had no other way of viewing it, being on Macs with no Windows VM or Boot Camp setup. My first thought was to dump all of the tables into CSV files. Since we've received sample data in this format in the past and everyone was comfortable with it, that's the route I took.

The Options

I needed to figure out a fast way to get all of the tables in this SQL Server backup into CSVs. The first step, of course, was to restore the backup into my local SQL Server instance. After that, I thought about a few different ways to accomplish the task. In addition to getting this data quickly, I hoped to find an easily reusable approach, because I've seen this scenario come up before. I knew that I could easily get CSVs out of a SQL Server database with SSIS, which I've had extensive experience with in the past. However, it's time consuming and not easily made reusable. I also seem to recall the "Export Data" dialog in SQL Server being able to handle tasks like this, but I also remember it being unwieldy and requiring awkward data source setup, similar to SSIS. My final thought, and the solution I chose, was to write a fairly simple PowerShell script.
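
As an aside, even the restore step can be scripted. Here's a minimal sketch, assuming made-up paths and logical file names (the real logical names inside the .bak can be listed with RESTORE FILELISTONLY):

# Hypothetical names and paths; run "RESTORE FILELISTONLY FROM DISK = ..." first to get the real logical file names.
Invoke-Sqlcmd -ServerInstance 'localhost\SQLEXPRESS' -Database 'master' -QueryTimeout 600 -Query @"
RESTORE DATABASE [MyDatabaseName]
FROM DISK = N'C:\DATA\sample.bak'
WITH MOVE 'MyDatabaseName' TO 'C:\DATA\MyDatabaseName.mdf',
     MOVE 'MyDatabaseName_log' TO 'C:\DATA\MyDatabaseName_log.ldf'
"@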

The Solution

I knew it would be fairly trivial to make the script reusable. I had used a PowerShell cmdlet in the past that allows piping objects into CSVs, and it only took about 15 minutes to refresh my knowledge of how to select objects from a SQL Server database using PowerShell. In a few minutes I had a working script and was able to pass the output files off to my teammates.

# Requires Invoke-Sqlcmd, which ships with SQL Server's PowerShell module (SQLPS or the newer SqlServer module).
$database = 'MyDatabaseName'
$instance = 'localhost\SQLEXPRESS'

# List every base table in the restored database (INFORMATION_SCHEMA.Tables also includes views, so filter them out).
$tables = Invoke-Sqlcmd -Query "SELECT TABLE_SCHEMA, TABLE_NAME FROM [$database].INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'" -ServerInstance $instance -Database 'msdb'

foreach ($table in $tables) {
  $tableName = $table.TABLE_NAME
  # Grab a 50-row sample of each table, convert it to pipe-delimited text, strip the quoting, and write it out.
  Invoke-Sqlcmd -Query "SELECT TOP 50 * FROM [$($table.TABLE_SCHEMA)].[$tableName]" -ServerInstance $instance -Database $database |
    ConvertTo-Csv -Delimiter '|' -NoTypeInformation |
    ForEach-Object { $_.Replace('"', '') } |
    Out-File "C:\DATA\$tableName.csv"
}

I'm sure that there are more concise or elegant ways of solving this problem, but this is the route I chose based on my knowledge of the tools available and time constraints. I'm quite pleased that the script is fairly simple and reusable. I encourage anyone who has alternative methods to post in the comments thread below. I would love to hear your ideas.

Hello world!

Hello, internet.

So this is my first entry... I've thought a great deal about creating a blog at various times in the past, but the pace of my life made it difficult to do so. I've come to realize, however, that life probably isn't going to slow down. So I'm going to take this opportunity to start sharing some of the things I deal with on a daily basis and the tools, both software and hardware, that help me be a more efficient developer. I hope they are helpful to you, and if you have suggestions or alternatives relevant to any of my entries, please comment and share them. I love hearing about different techniques, tools and technologies.

If you would like to read a little bit about my background, visit the About me section.