Are You Pitching Your Project Based on Burger Capacity?

by garyg 27. April 2010 11:29

I’m in Wal-Mart over the weekend shopping for a gas grill for my camp.  Now I’m not really into this at all since I mainly burn things on the grill, but the last one up and died on us so I had no choice if I wanted anything grilled this summer.  So here I am looking at the display of grills, about 10 if I remember.  They are all presented in much the same way, listing number of burners, square inches of cooking space, and in one case a gas consumption rate.  All except one, which happened to list burger capacity.  That’s right, burger capacity. I laughed until I thought about it a little more. It was advertising a capacity of 22 burgers.

Now you are probably thinking no one in their right mind needs to make 22 burgers at a meal.  I didn’t think so either, but I was definitely hungry (never buy anything food related when you are hungry), and I know one of my pet peeves is how long it takes to get all of them cooked when we do a cookout.  Everyone is done eating by the time you get to enjoy one.  With that kind of capacity I could eat with everyone else, and make up for my low yield count (sounds better than saying I can’t really cook).  So what does this have to do with pitching a project to a sponsor or review board?  Stay with me on this.

Every one of those grills was giving me stats and data I couldn’t care less about.  All except the one, which directly addressed the one core need I had.  The point is that in this highly competitive environment, your project is likely competing for funding in a crowded portfolio.  The competing projects are all claiming one-year ROI and “green” credentials.  You need to find and address your project sponsor’s key needs beyond these.  You can’t afford to simply present the requisite ROI data; you need to find a differentiator: increased business capacity, more orders per day, more visitors per hour, and so on.  Whatever your business’s core need is, find it and align your project to it.  If you can’t, find a project that can, because projects that can will get funded consistently.

7 Tips on Managing Virtual Project Teams

by garyg 16. April 2010 03:01

It's a Team, Virtually

One area that I never seem to see covered enough in Project Management literature is Virtual Teams, or more specifically how to manage them without it consuming your entire life.  Virtual teams are defined as teams whose members operate across space, time, and organizational boundaries and are linked through information technologies to achieve organizational tasks (or in your case, project tasks). Unlike conventional teams, virtual team members are not co-located, so they are more dependent on information technologies rather than face-to-face interaction.

So what does this mean for you, the beleaguered Project Manager who has just been told your new project team is scattered across the ends of the earth rather than down the hall?  Well, unless you manage communication carefully you can lose control of your project faster than you ever thought possible.  As Project Managers we know that we manage more by influence than by direct authority.  This will never be clearer to you than when you can’t realistically gather your team in a room and look everyone in the eye.  In some cases asynchronous communication (email, news group postings, voicemail, etc.) may represent the majority of your communication.  Just don’t let it be the sole method you use (see below).

It CAN be a Competitive Advantage

The bulk of the press on virtual teams goes to off-shoring, but the tips here can be successfully applied to teams on this side of the pond as well.  Also, depending on the nature of your project, you may be able to benefit by extending the workday into different time zones (international teams may even be used for “follow the sun” style planning).

Some Tips

So you’re probably saying enough already, on with the tips.  Here we go:

  1. Do not rely solely on email or other asynchronous communication methods.  This may seem obvious, but I’ve seen more than one junior PM, and a few senior PMs who should have known better, fall into this trap.  If you are forced to use it for the bulk of your communication due to time zone differences, ensure you schedule good old live synchronous communication (i.e., a conference call) at regular intervals.
  2. Accountability is key.  Insist on frequent detailed status updates indicating progress on key milestones.  It isn’t enough that you’re getting task updates out of MS Project; you are looking for complete, transparent, and ongoing understanding.  I typically require stoplight-style reporting (look for a discussion on this in a future post) from any of my direct reports to make sure we both understand what’s expected and needed to keep things moving.
  3. Embrace Video Conferencing.  This technology used to be outrageously expensive, but now with a web cam and a headset anyone can look like a pro.  There are some really high quality proprietary services out there, but at the very least Skype is free and easy to set up.  Just test everything before the “big call”.  Eye contact is invaluable, and you’ll wonder how you ever got by without it.
  4. Multiple Mediums.  Use every communication method available to you.  I’m especially fond of Microsoft SharePoint team discussion sites, where communication is threaded and can easily be referred back to by all stakeholders.  Some people just seem to respond better to some mediums than others, so this will keep misunderstandings to a minimum.  This is especially important in multi-cultural teams.
  5. Get face to face.  This also sounds obvious, but sometimes you need to go to the mountain.  Having everyone in a big room for a “kick off” meeting would be ideal, but if you can’t do that, make plans for occasional face time even if it’s one on one.
  6. Get to know your team.  This is a tough one for some of us in the IT field.  Get to know and be interested in your people.  Spend time bonding (even if just over the phone) on non-work items.  People are much less likely to let down someone they have a personal relationship with than that antiseptic PM who is only interested in the project status.
  7. Keep the option open to co-locate.  If possible this can be a lifesaver.  I always try and negotiate this in my contracts or charters.  If things start to go wrong on a critical project you want the ability to relocate people to the same work area if it seems misunderstandings are mounting and getting in the way of progress.  This can be enormously expensive so you need to account for this in your Risk Management Plan.  I find just knowing I have this option can keep things moving along.

I hope at least one of these can help you out in a current or future project.  In a follow-on to this post I’ll discuss potential effects of Virtual Teams on your Communications Management Plan and my theory on how they affect the Communication Channel formula used in the PMBOK (Project Management Body of Knowledge).

Can you be both hands-on technical and a good Project Manager?

by garyg 10. April 2010 10:05

This has to be one of the most heated debates I’ve seen play out in recent memory.  Recently someone asked this question in a PMP-only LinkedIn discussion group (the thread was later deleted by LinkedIn).  This is one of those Red vs. Blue questions, and your ability and experience tend to shape your opinion in this area.

In a perfectly executed, large (250+ resource) project following the PMBOK (Project Management Body of Knowledge) processes, you should be able to rely solely on the science, the process, and SMEs (Subject Matter Experts) while you concentrate on the business of Project Management.  Actually, you’d be foolish to think your individual technical contribution would even put a dent in the plan of a truly large scale, multi-year project.  Lately, however, these ideal situations and projects are less the norm and more the exception, as organizations are more likely to be concentrating on the smaller, rapid-ROI projects in the portfolio.

More often than not some balance needs to be struck.  Either you can’t completely rely on your SMEs, your team’s knowledge ends up light in an area you are strong in, or the project is just too small and fast moving for you not to be a little more “hands-on”.  In these cases your ability as a PM to jump in alongside your team (as long as you aren’t losing sight of your primary role as PM) can only be an asset.

Also, if you are Crashing or Fast Tracking (schedule compression techniques) a project and requiring your team to work into a holiday weekend, you’d better be prepared to put in some work product yourself or risk being compared to the Pointy-Haired Boss ;-).

So do I think we’ve settled the debate here?  Not by a long shot.  However, if we can generate some lively and useful discussion (especially among our stakeholders), then the question is worth engaging.

Preventing multiple plug-in request calls with IsRedirectFollow()

by garyg 10. April 2010 10:01

Figured I'd share something I found valuable.  Did you ever make a call to a request-level plug-in in a Visual Studio 2008 WebTest and get multiple calls to the same plug-in because of a 302 redirect?  Well, I did, and it took me a little while to find a way to prevent it.
When you are making the call in your code, decide whether it should run based on the request's IsRedirectFollow property.
As an example:

    using Microsoft.VisualStudio.TestTools.WebTesting;

    namespace MyAppTests
    {
        public class GetSomethingPlease : WebTestRequestPlugin
        {
            public override void PostRequest(object sender, PostRequestEventArgs e)
            {
                if (e.Request.IsRedirectFollow == false) // only run on the primary request, not a redirect
                {
                    // do something here
                }
            }
        }
    }

Anyway, I hope this helps someone else a little further along.  It works in both VS2008 and VS2010.  I'm sure there could be a more efficient way, but this worked in a pinch ;-)

Using Visual Studio 2008 Web Test Request Plug-In to Check results in a SQL DB

by garyg 10. April 2010 06:12

Some of my regular readers said they wanted to see more "technical" content, so here is one that perplexed me for a while.

While assisting a client with setting up an automated testing environment using Visual Studio 2008 Web Tests (among other things), we uncovered a need to check the results of a transaction halfway through the test, and then verify the results once it completes.

Now I know what experienced testers and SQA people are thinking: "why didn't you just use an Extraction rule from a results page and validate that?". Yes, that's the first thing I wanted to do as well, and it works quite well under most circumstances.

Unfortunately, on this particular application there is no real "confirmation" screen that displays the results of the transaction, just a "yeah it worked or no it didn't" kind of page. Not good enough in our case, where I wanted real transaction results.

Since time was very short and I'm still working with the group to adopt more "test friendly" designs, we needed a sure way of verifying the results. I thought that this kind of DB check would be a native feature in the VSTS2008 Web Test, but the only DB connectivity included out of the box was binding to a DB for data-driven testing (which is very useful as well).

So our option was to create a request-level plug-in to go out to the DB, check the transaction results, and write them back to the test results.

My goals here were:

  1. Connect to a SQL DB.
  2. Build a query string using a Context parameter.
  3. Put the value pulled from the DB into another Context parameter for use in follow-on requests.

Here is how I did it (in a plug-in 101 type format), complete with code snippet I used to create this:

1. Create the following in a class library in your test project (that part is covered in detail in MSDN), compile it, and reference it:

    using System;
    using System.Data.SqlClient;
    using Microsoft.VisualStudio.TestTools.WebTesting;

    namespace Test1
    {
        public class MyRequestPlugin : WebTestRequestPlugin
        {
            public override void PostRequest(object sender, PostRequestEventArgs e)
            {
                base.PostRequest(sender, e);

                int customerId = 0;

                // this is my connection string
                string connectionString = "Persist Security Info=False;Initial Catalog=dbname;Data Source=machinename;User Id=dbuser;Password=somepassword";

                // Select statement getting just the field I need.  Note that if this is
                // messed up it may throw an error saying it can't open the DB.
                // That's misleading; it's probably your select.
                // The OrderID from the test context is passed as a parameter rather
                // than concatenated into the query string.
                string queryString = "Select CustomerID from Orders where OrderID = @OrderID";

                using (SqlConnection connection = new SqlConnection(connectionString))
                using (SqlCommand command = new SqlCommand(queryString, connection))
                {
                    command.Parameters.AddWithValue("@OrderID", e.WebTest.Context["OrderID"]);
                    command.Connection.Open();

                    using (SqlDataReader reader = command.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            customerId = Convert.ToInt32(reader[0]);
                        }
                    }
                }

                // Make the value available to follow-on requests
                e.WebTest.Context.Add("CustomerID", customerId);
            }
        }
    }

2. Insert the Request Plug-in (if you compiled and referenced it, it will appear in the list) on a request that runs AFTER the Context parameter you are using has been set.  (In a production test you'll also need error control; unhandled errors in a plug-in are ugly and will mess up your results.)
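As a rough sketch of that error control (my illustration, not part of the original plug-in), you can wrap the DB work in a try/catch so a database hiccup is reported in the test results instead of blowing up the run; the use of `AddCommentToResult` and the fallback value of 0 here are my assumptions about how you'd want to surface the failure:

```csharp
public override void PostRequest(object sender, PostRequestEventArgs e)
{
    base.PostRequest(sender, e);
    try
    {
        // ... the DB lookup and Context update from step 1 go here ...
    }
    catch (Exception ex)
    {
        // Record the failure as a comment in the test results instead of
        // letting the exception abort the run, and set a default value so
        // follow-on requests don't fail on a missing context parameter.
        e.WebTest.AddCommentToResult("DB check failed: " + ex.Message);
        e.WebTest.Context["CustomerID"] = 0;
    }
}
```

Whether you swallow the exception like this or fail the test outright depends on whether the DB check is informational or a hard pass/fail condition for you.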

This whole exercise made me think that a "data check" validation rule of sorts really should be part of this product.

Anyway, I hope this helps someone else a little further along some day using web tests. This same method works in Visual Studio 2010 as well.

About the author

Gary Gauvin is a 20+ year Information Technologies industry leader, currently working as the Director of Application Lifecycle Management for CD-Adapco, a leading developer of CFD/CAE solutions. Working in both enterprise environments and small businesses, Gary enjoys bringing ROI to the organizations he works with through strategic management and getting hands-on wherever practical. Among other qualifications, Gary holds a Bachelor of Science in Information Technologies, an MBA, a PMP (Project Management Professional) certification, and PSM (Professional Scrum Master) certification.  Gary has also been recognized as a Microsoft Most Valuable Professional.


