Thursday, December 31, 2009

PM vs. BA

A couple of weeks ago I wrote about the differences between the roles of the Project Manager and the Contract Officer Representative (PM vs. COR). But that doesn't cover all of the potential seams around the PM role. Today I want to take a look at the Project Manager versus the Business Analyst.


I came across this graphic in one of the training programs I participated in not too long ago. I am interested in becoming a CBAP (Certified Business Analysis Professional).



I don't see myself as fitting exclusively into the PM role or the BA role, and the reason is that you can perform the PM role perfectly and still have a failed project. Similarly, you could perform the BA role perfectly and still fail with the project.

To be fair, I think that the PMI would disagree with this graphic on some points. I think PMs must deal with stakeholders and the graphic doesn't give them credit for that. I also think both PMs and BAs identify and deal with business issues as well.

But regardless, industry is intent on driving a wedge between these two roles. I suspect the reason is that there is more money to be made in training and certifications if they are considered different roles. But I really think PMs need to be strong BAs in order to be truly successful. You can't just control the levers of scope, schedule, risk, cost, quality and satisfaction and expect to succeed. You need (at least in IT project management) to understand the underlying business process that is to be automated. You must see the big picture, and you must be able to zoom into portions of it to see greater detail. You can't just implement a quality control system and then sit back and forget it. You must be a part of the automation process to know that stakeholders are in fact testing and verifying the deliverables.

Anyway, for what it's worth, to people who are serious about project success and mission success, these are not different roles. I might have someone on the project whose responsibility is primarily business analysis, and that isn't a problem, but I am also a BA on the project.

Tuesday, December 29, 2009

First Droid Snafu and Cool

I still love my Droid. For Christmas I received a car dock, and I was amazed to discover that the phone knows when it is in the dock, despite the lack of any discernible electronics in the dock itself. As soon as I drop it in, it brings up the car navigation screen and puts a different icon in the status bar signifying that it is in a car dock.

The issue I discovered is that the phone is not automatically in speakerphone mode while it is in the dock. That's weird, right? How could I have it near my ear while it is in the dock? So if I could make a request of the Verizon gods: please make a change so that when car dock mode is on, the speaker is on by default.

That covers the snafu part. The good part is that the voice recognition is awesome. I hit a button to call contacts, said my Dad's name, and blamo, there is my Dad on the phone. I had to hit the button to put him on speaker, but still...

With the car dock, the navigation is really convenient. I now have it running all the time with GPS enabled and the traffic layer turned on in Google Maps, and I actually avoided some bad traffic on my way home last night.

Finally, I linked to Tech Broiler with the promise of more Stupid Droid Trick volumes. He has another week, but he will be dumped soon if he doesn't get more 'stupid'.

Monday, December 28, 2009

Federal Employees Almanac

I came in to work this morning and found a book on my chair. It is the 2009 Federal Employees Almanac. Hopefully it was bought at a bargain, since 2010 begins this week. Anyway, this book is a must-have for a federal employee, especially any new federal employee. It covers Pay, Insurance, Retirement, Leave & Benefits, TSP, Separation, Roles and Responsibilities, and a whole bunch more.

I've been a federal employee for many years now, and some of the things I've read here I'm learning for the first time. As a federal employee you are assumed to already know everything. This book doesn't assume that. It spells it all out for you.

I don't think a person needs to get an updated copy each year, but if you get just one, that will probably serve as a good reference point for answering difficult questions concerning federal service.

Monday, December 14, 2009

PM vs. COR

Most who know me know that I wear many hats at work. My business card reads "Project Manager". My Annual Performance Review reads "Program Manager". And I'm also a Contract Officer Representative, or COR.

This is a complicated set-up because most people think that there is virtually no difference between being a PM and a COR, but I think they are really distinct roles. I will try to highlight the differences between them.
I'll run each role across the 6 dimensions of project management:



As you can see, the differences are subtle but important. Perhaps there are situations in which the contractor does or delivers everything important for project success, but my experience has been that if you focus only on the contractor's work you are missing a lot of the overall project. Too often, government personnel play the position of the COR and think that all the work is getting done. I am not in that camp, but for people like me it is difficult to tell you which hat I'm wearing at any given time. Since I perform as both the PM and the COR, I don't see much of a difference; it is all just work to me. But in situations in which they are different people, or when there is just a COR and no PM, I can see how things could easily break down.

Friday, December 11, 2009

Two Important Things

The first thing is...
Tis the season for holiday parties and merriment. Especially in the government sector we tend to have an hour or two in which we have some food and socialize. I have seen 10 different parties on the same floor over a 2-week period. The important thing I want to draw attention to is the plight of contractors. If you have contractors embedded with the feds, make a point of inviting them to participate in your holiday festivities. Think about it like this: you want them to be part of the team, committed to the mission, all year long. Don't be inconsistent in the month of December.

I remember when I was a contractor working with and near the feds, and I could hear them in their holiday party. The party they chose not to invite their contractors to. It was completely demoralizing. It communicated that I was part of the team only when it was convenient for them to include me. So if you are going to have a holiday party, make a point of inviting the contractors to come. It costs you nothing, and it will help maintain the team structure.

The second thing is...
I really expected to see a lot more negative comments about the Motorola/Verizon Droid, and I have really been watching. But I haven't seen anything. I saw the MiKandi thing, and that might prove to be negative, but I'm not going to download those applications. I saw that it can be hacked to get root access, but that was inevitable. I've read that the developer community is frustrated with Google, but no matter what, they are less frustrated with the Goog than they are with Apple, right?

I've had the device for about 3 or 4 weeks now, and I find new things on it every day. It is so easy and intuitive that I find myself using it more and more. Then I read that Time has chosen the Droid as the Best of everything in 2009. I'm not telling you to abandon your iPhone, it could still be used as a paperweight or as a sculpture from the aesthetic period in contemporary culture, but you should get a Droid for communications.

Thursday, December 10, 2009

Save Award

Vote today. The Save Award has 5 nominees that will be included in the President's budget. In my opinion, these are all good ideas and all should be done. Congratulations to each of these people for articulating simple, easily implementable ideas. The idea concerning the inspections and income verification for public housing hit closest to my heart, but these are all great ideas.

Click over and cast your vote.

Wednesday, December 9, 2009

Data.Gov

I read an article today, Federal Agencies Must Post Public Data Online, from the Washington Post. This hasn't been well publicized, and I must confess that while I knew about the new Data.gov service, I was not aware of the requirement. Essentially it is a low bar: 3 data sets must be provided by the end of January 2010. That is a relatively easy thing to accomplish.

So I went to the Data.gov website to take a look. What I was afraid of is that we were duplicating work already accomplished. I figured the easiest course would have been to take data from an agency and post it up to the Data.gov website. To my pleasant surprise, that is not what has happened. Data.gov is merely linking to data that is externally exposed on the agencies' own websites. This is a very good thing, because the thing that made me most nervous about this is data quality.

If I work for USDA and post data on my own site, and then post that same data on the Data.gov website, I could run into a problem. Inevitably someone will identify a data issue that I will work to correct. The way Data.gov appears to be approaching this avoids that problem, because there is one and only one copy of the publicly available data. Meaning that if USDA makes a correction to the data, they don't have to send the corrected data to Data.gov for re-publication. Good.

Many years ago, when I worked at HUD, I embarked on an initiative to assemble all of the data we had. My State and Local CPD Information site is still there, but probably not for long, as HUD is refreshing their website. This was my big aggregation point for all data related to the Office of Community Planning and Development including:

I am not claiming to have created any of this data. Rather, my idea was to aggregate it in a manner that helped it to be more consumable by real people with real questions. As such, I created pages like this:
http://www.hud.gov/offices/cpd/about/local/pa/index.cfm
so that if you want information about the state of Pennsylvania, it is all right there at your fingertips.

That is what Data.gov is doing. They want to be a place that can help people to aggregate this data. If there is an opportunity to grow, it is in the way the data is sliced. Put yourself in the shoes of a real person, perhaps someone in city or state government. This person has questions she wants to answer. If she is in Maine, she will not be interested in data from California. Someone has to think about the questions we are striving to answer. If we are just trying to feed the national-level researchers, then this set-up is great. But if we want to create something that can help to answer state or local questions, not so much.

Monday, December 7, 2009

Familiar Tune

I read today that the Veterans Administration is planning a Wide-ranging IT Services Contract. The article seems very familiar, in fact, it sounds exactly like the HITS contract that was awarded to support HUD. I don't know if people from VA visit my blog, but there are a few lessons that can be learned from that experience. I think nobody will argue that the final result was probably better than the status quo, but it was more painful than it should have been.

Numero Uno, all of your contract documentation must be perfect. This means that the solicitation, the SOW, the technical evaluation and the best value analysis must be perfect. This is the type of contract that will be protested if it is not perfect, so take the time to get it right and avoid those issues.

2. Make sure you have really strong transition-in and transition-out clauses. As it is written, I don't see discrete sections for transition-in and transition-out work. This is kind of a big deal because they are looking at a base of 5 years and the option of 2 more. The issue I would help them avoid is what happens after those 7 years. You don't want to be stuck with the incumbent, because, trust me, they will try to lock you in. You can avoid this pain by planning for their obsolescence now.

3. Something that makes me nervous is integration. I can't tell whether they are looking to make a single award to just one vendor or more than one award. I anticipate that because of the security work it is probably two, but it could be more than that. Whenever you are looking at a multiple-contractor situation you need to identify who has the responsibility to make it all work together.

4. Where is the architecture work? Maybe VA doesn't need support services for their architecture, but as I look through the PWS I don't see architecture support. It could very easily be related to #3 above, integration support.

5. They cite CMM Level 2 but don't seem to be requiring it or using it as a discriminator. I would recommend prescribing it as a minimum standard.

While I'm identifying a lot of things that can be improved, I think this is a good opportunity for the VA. I just hope they talk to some people from HUD to make the transition less painful.

Friday, December 4, 2009

Hybrids

I saw an article today that strikes at the heart of one of my beliefs. The article was essentially that Homeland Security is considering hybrid contracts that include a mix of performance-based elements and Time and Materials (T&M) and cost-reimbursement elements. This is indeed good news.

This is noteworthy because it is a sign that people are starting to wise up to the fact that we must match the type of contract to the scope of the work. Months ago I posted something about Choosing the Right Vehicle. I had no way of knowing back then that it would turn into a great pun, but it looks like the right vehicle in many situations is a hybrid.

*Drum snap* Thank you, remember to tip your waiters and waitresses.

Monday, November 30, 2009

Configuration Management

As I have mentioned previously, Development and Maintenance are two separate things. Development is more risky and, consequently, more costly. Maintenance is less risky and should cost less. But when you make this adjustment you must consider the additional complexity involved with the system.

In my situation I have a system that was initially developed and is going into maintenance mode, but we are also working to expand the functionality into other programs. This means I have both maintenance and development currently underway. Since I sent the maintenance work to a separate team, I have two different sets of developers working on the code.

This introduces some risk into the process and requires some additional planning to efficiently and effectively manage the code for both of these teams. In my situation, I'm doing what I had planned: I have 4 releases programmed over the course of the year to maintain the application. But then I have the development team working to extend the functionality to other programs, and they are using an iterative development life cycle. Iteration 1 comes out in a couple of months. That means both the development team and the maintenance team are working on the code at the same time.

I probably glossed over this issue in my previous posts. It is CRITICAL that you implement a really strong CM process to make sure that both teams are able to be productive and leverage work from the other team.

Tuesday, November 24, 2009

Dare to Dream, a 2-Factor Smartphone

This is the time to be thankful, and let me please start by saying how thankful I am to be clear of the RAZR and Treo I was carrying around. I have nothing but love for my new phone. But...


Like most or many federal employees I have an ID card, technically known as an HSPD-12 badge. It looks just like this, only with a much more attractive picture. See that little gold thing towards the bottom? When I insert it into my laptop it sends a signal to initiate the authentication process. When I type in my PIN or password it completes that process and allows me access to the LAN, network resources, the Intranet, email, etc.
It is good because it is easy and secure. It is more secure than a really hard password simply because it has two factors: something you have and something you know. The thing you have is the card. The thing you know is the PIN or password. Either of these without the other is not good enough to authenticate a user. I might lose my card, and we wouldn't want someone else going around pretending to be me. So I have a PIN, which, for me, is easy to remember but is not likely to be guessed.

The problem is that the laptop I use, while really nice, is like 22" by 17" by 2" and weighs 5 pounds. It's fine for taking on a plane when I have to go on a trip, or for working at home. But when I go to a meeting and want to be able to check my email between meetings, it's kind of a pain. It doesn't easily fit into my pocket.
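The two-factor idea boils down to a few lines of logic. Here is a minimal sketch; the function names and the salted-hash scheme are my own invention for illustration, not anything from the real HSPD-12 infrastructure. The key property is that neither factor alone is sufficient.

```python
import hashlib

# Hypothetical two-factor check: the card is "something you have",
# the PIN is "something you know". Both are required.

def hash_pin(pin: str, salt: str = "demo-salt") -> str:
    # Never store the PIN itself; store a salted hash of it.
    return hashlib.sha256((salt + pin).encode()).hexdigest()

def authenticate(card_present: bool, pin: str, stored_hash: str) -> bool:
    # Missing either factor means the user is not authenticated.
    if not card_present:
        return False
    return hash_pin(pin) == stored_hash

stored = hash_pin("1234")
print(authenticate(True, "1234", stored))   # card + correct PIN -> True
print(authenticate(False, "1234", stored))  # PIN alone -> False
print(authenticate(True, "9999", stored))   # card alone -> False
```

A lost card is useless without the PIN, and a leaked PIN is useless without the card, which is exactly why this beats even a very long password.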

But what I do have is my new Droid. I could check email and my calendar and do lots of the stuff I would normally do on my laptop on this smartphone, and, bonus, it does easily fit into my pocket. The issue I have is authentication. While I have a really strong password, honestly 18 characters with upper, lower, numeric and special characters, it is a real pain to type that super hard password into my new phone.


As such, I propose a marriage. Build a smartphone that includes the capability for me to insert my HSPD-12 badge into it (factor 1) and allows me to type in my PIN (factor 2). This would allow me to access all of the same resources I use when I'm logged into my laptop without going through Good. No offense to Good, I just don't like your software. My opinion is that Good is unproductivity software because it makes things more difficult.

So let's try to create a hardware solution to authentication. Focus on two factors and make use of stuff most of us have anyway. Put a card reader on a smartphone and I guarantee you will command this segment of users. If you want to take it to the next level, create an app store by agency that allows USDA to identify the applications that can be installed on USDA smartphones and HHS to identify the applications that can be installed on their phones. Then, as an authenticated employee, I can cruise through that store to install the applications I want, and we get to avoid applications that carry risk.

Monday, November 23, 2009

Droid

Those who know me know that I have been patiently waiting for some cool phone from Verizon (maybe not so patiently). I have been using my Motorola RAZR for too long. I remember buying it before one of my kids was born, and my youngest is more than 3 years old, so this phone is old. But it was a trick phone. I have had it so long that the battery has swelled up and doesn't fit the case any more. Whenever I took out the phone, the back would fall off, so it had comedic appeal. The time the back fell off into a bowl of chili was the kicker (I still ate the chili).

Anyway, I was kind of hoping Verizon would deliver an iPhone. Instead they came out with the Droid. It is a good product and I am happy with it. The Droid, manufactured by Motorola, needs to be a hit for both Motorola and Verizon. Motorola needs it for the phone division to remain solvent and competitive in this space, and Verizon needs it to stem the tide of defectors going to AT&T and the iPhone. The need for this combination to be successful for both companies is very high, and I think they do have a hit here.

The phone for voice calls (the reason for having a phone) is very good. I think it is better than my old RAZR and infinitely better than my old Treo. Reading email is good and the interface is sleek and easy to use. The only issue there is that the service we use for work doesn't allow me to open attachments, but they are still there. I imagine Good will fix that in the near future. The operating system, Google's Android, is really cool. This is my first exposure to it. Obviously everyone is familiar with Apple's mobile OS (I have an iPod Touch), and this is consistent but different.

The big takeaway I have is that it is super fast. I mean, even maps, using my current location, so nothing that is pre-cached, is really fast. If this was baseball I would administer a drug test and expect to find steroids here it is so fast. But since it is a phone I doubt I'm going to find needle marks anywhere.

The thing I didn't realize is that the different elements can really drain the power. Which is why I have added a new blog to the list I'm tracking on the right side of this page. You'll see the Tech Broiler blog from Jason Perlow. He is starting a new section called Stupid Droid Tricks to teach dummies like me how to use this device efficiently. I did have a problem: my battery drained in like 4 hours. I didn't realize that when every single service is running the machine is draining power. I read Stupid Droid Trick #1 and figured out how to easily turn off services I don't need, like disabling Wi-Fi when I'm not at home. Duh. I'll be looking for more Stupid Droid Tricks in the future.

Anyway, since I'm a federal employee and we have to use the Networx contract, my department has a relationship with Verizon, and those are the only services and phones we are allowed to use. Sorry, other companies, that is just how it is. So we are piloting the Droid and I really like it. I think this will really work for us. As a pilot tester it is my job to find these little problems and identify solutions so that if we roll it out to the larger community they won't be forced to deal with headaches.

Monday, November 16, 2009

Quality Management

I have seen that many people struggle with quality management on big development projects. The reason they have a hard time in this area is not that they don't know how to manage for quality, or what quality is; rather, it is that they have to define quality so early in the process.

Sorry for being quiet for so long. I have been traveling around the country teaching both state and Indian Tribe users, as well as internal government people, how to use my new application. I find it hard to write when I'm on the road like that; I need to be in my routine. I feel like I'm now back in my routine, at least for a little while. I have some additional travel coming up, but nothing like the last few weeks.

Anyway, quality management. Remember, I'm approaching this from the perspective of a government PM, so most of the development work is performed by the contractor. Some people wait until after award and then have the vendor be part of the process of defining quality. That is a terrible mistake, because their version of quality may be different from your version of quality. And if you discover that difference after the contract has been awarded, guess what? You can have your version of quality, but "we'll need a change order and a cost adjustment."

The reason so many good people fall down on quality management is that you have to define it at a fairly high level of granularity very early in the process. In fact, you have to define it and set it in stone before you even start working on requirements. What I mean is that you need to define quality in your Statement of Work, and you need to be very deliberate about it. Here is how I do it.

I develop a section of the SOW called "Performance Standards". This is a common section in a performance-based contract. Also remember that most of my contracts are Firm Fixed Price. That means only 5 of the 6 dimensions of project management are in play for my projects: Quality, Schedule, Satisfaction, Risk and Scope; I don't have to worry about Cost in FFP. Additionally, I manage risk discretely. So I literally create a table with each area of service and make Quality, Timeliness, Satisfaction and Scope the column headers. It looks like this:

Requirements Gathering Sessions
  • Quality - Stakeholder comments are implemented and functions need not be revisited.
  • Timeliness - RAP materials are available at least 2 days before the RGS. Meeting minutes are available no more than 2 days after. RGSs are held on the days indicated in the project plan.
  • Satisfaction - Stakeholders are able to achieve consensus with the consistent process.
  • Scope - By the end of the RGSs, Activity Diagrams, Use Cases and a Requirements Baseline have been agreed to by consensus of the stakeholders.

I then have a Deliverables Table in the following section that describes the Quality, Timeliness, Satisfaction and Scope of each deliverable. I go through each of the service areas like this and objectively define them so that offerors are able to assemble a high-precision bid and everyone is on the same page in terms of expectations.
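Because the table is so regular, it is easy to sanity-check before the SOW goes out. Here is a small, hypothetical sketch (the data is an abbreviated version of the example above; the code and names are illustrative, not part of any real SOW process) that flags any service area missing one of the four dimensions:

```python
# Hypothetical sketch: the performance-standards table as data, plus a
# check that every service area defines all four dimensions.
DIMENSIONS = ("Quality", "Timeliness", "Satisfaction", "Scope")

standards = {
    "Requirements Gathering Sessions": {
        "Quality": "Stakeholder comments are implemented...",
        "Timeliness": "RAP materials available 2 days before the RGS...",
        "Satisfaction": "Stakeholders achieve consensus...",
        "Scope": "Activity Diagrams, Use Cases and a Requirements Baseline...",
    },
}

def missing_dimensions(table):
    # Return {service_area: [missing dimensions]} for incomplete rows.
    gaps = {}
    for area, dims in table.items():
        absent = [d for d in DIMENSIONS if d not in dims]
        if absent:
            gaps[area] = absent
    return gaps

print(missing_dimensions(standards))  # {} when every row is complete
```

An empty result means every service area has an objective standard in every dimension, which is exactly the property that lets offerors price the work precisely.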

The reason this is difficult for some people is because you must define it so early in the process. Do not shy away from it though. You need to do it this way if you want to avoid problems later.

Monday, October 26, 2009

Made to Stick


I just finished a great book, dare I use the cliché, a must-read. The authors have gone through a lot of the information and advertising of the last 40 years or so to examine why some messages stick with us and some don't.


They essentially break it down to a checklist of elements that, if you use them effectively, will make your message more likely to stick with people. Obviously, you don't want to use them for a notice that your building will have a fire alarm test today at 2:00 pm, but there are certain messages that will benefit from using one or more of these sticky devices.

  • Simplicity
  • Unexpectedness
  • Concreteness
  • Credibility
  • Emotions
  • Stories

I recommend reading the Excerpts page to see if you want to read this book.

Friday, October 23, 2009

Awestruck

Every time I work to design and develop a system or application I always hear this phrase in one way or another, "It won't work like that on an Indian Reservation."

I have never been able to respond to that comment; I had to take the word of the person telling me it won't work. That is, until now. I have had the unique opportunity to visit a reservation, and I will not soon forget it.

Some colleagues and I were invited onto the Ute Mountain Ute tribal land to explore some of their pueblos and life. This was a fascinating experience for me. Remember, I'm mostly a city person, so just the sight of a mountain can get my blood going. But what I saw on this tour was nothing short of amazing. The day started off great with something I had never seen before: a rainbow that was just in the sky, as if the sun had an extra halo around it. I've never seen anything like it before and I doubt I ever will again.

Our guide was a man named Wolf. He led us down a series of ladders to see a bunch of Pueblos known as the Treehouse, the Lion's Head and the Eagle's Nest. This is a picture of him talking to us at the Treehouse.





Everything about these ancient people was amazing: how they lived, their construction practices, the things they ate. Being able to listen to Wolf's stories and then see it for myself was inspiring. I wish I had a week or more to explore the area and experience their way of life more thoroughly. I remember thinking, the first second I stepped outside the airport in Durango, that this was like a different planet. The air smelled different, the people were different. It was like nothing I had ever known.

I know that most people will never get to see these pueblos or the shards of pottery lying out for everyone to see, but you should try. I am so grateful to the tribe for their hospitality and kindness in sharing this with us. To know the people, you have to better understand where they come from. For just a day I got to see the world through Wolf's eyes. His world is much more rugged and difficult than mine, but it holds so much more wonder and natural beauty as well.

Will this make me a better Program Manager? I don't know, but I met some interesting people and their perspectives are now a part of my consciousness, so maybe...

Wednesday, October 14, 2009

Updated - AFFIRM - The Federal Cloud

I was pleased to attend the AFFIRM luncheon yesterday on Cloud Computing. This was the first Association For Federal Information Resources Management (AFFIRM) event I have attended. Interesting speakers all the way around and they have all had some success with cloud computing.

The one thing I expected to hear but didn't was discussion of the relationship between Enterprise Architecture and cloud computing. My theory is that the cloud provides an opportunity to operationalize EA for applications and data centers, because you can write very specific rules about what technology is to be used in the cloud and what is not permitted. For example, if an agency targets Oracle 10 as the database for the enterprise, you can write into the SLAs that the provider cannot use SQL Server, Informix, Sybase or any other database. As such there is a new, more meaningful way to implement a target architecture than was previously possible. Essentially you have your cloud, and you know that 100% of what is in the cloud is compliant with the target. Everything not in the cloud is suspect and deserves more oversight to report compliance.
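The compliance check this enables is almost trivially simple. A hypothetical sketch (the product and application names come from the example above; nothing here is a real agency inventory): compare what is deployed in the cloud against the allowed-technology list and flag everything else for oversight.

```python
# Hypothetical sketch: enforcing a target architecture by checking
# deployed products against the allowed-technology list in the SLAs.
ALLOWED_DATABASES = {"Oracle 10"}  # the enterprise target

deployed = [
    {"app": "Grants System", "database": "Oracle 10"},
    {"app": "Legacy Reports", "database": "SQL Server"},
]

def non_compliant(systems, allowed=ALLOWED_DATABASES):
    # Anything not on the target list deserves more oversight.
    return [s["app"] for s in systems if s["database"] not in allowed]

print(non_compliant(deployed))  # ['Legacy Reports']
```

Inside the cloud, the list of allowed technology is the SLA, so the check is mechanical; outside the cloud, you are back to surveys and self-reporting.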

I also didn't hear any discussion about Certification and Accreditation, which is somewhat surprising. C&A should be about 50% easier with a cloud-type system than without one. For example, all controls that are physical, sanitation and platform related could be established once for the cloud. Then each application would be responsible only for the application-level controls, like roles and least privilege.

Overall, I think the AFFIRM group provides a reasonable opportunity to dive into some depth on a given topic. This one is still somewhat squishy, but I liked the format. The one thing I didn't like was the overly hard sell by a couple of people. Don't push so hard; entice me.

Addendum - FCW posted a link to this meeting, you can read all about it in their article, Adapting to the Cloud.

Tuesday, October 13, 2009

PMO Roles

A few weeks ago I attended the CMIT event at the University of Virginia. I've written about two of the presenters that day, here and here. Today I want to touch on the presentation from Sanjiv Augustine. I actually have no issues with his presentation; my thoughts are more inquisitive than anything else.

Project Management Offices can, as I see it, perform 3 distinct roles. On this, I must confess that I think I see it like the PMI sees it. But I'm going on my own knowledge here, so if I get the wording different from theirs, don't hold it against me. A PMO can:
  • Develop standard processes and perform as the unit that evangelizes them to the rest of the organization. This would include training and mentoring execution-level Project Managers.
  • Perform as a control authority in oversight of Project Management execution. In this area the PMO is the gatekeeper ensuring that the standards identified in the previous bullet are adhered to. In this area the PMO is the authority for the Capital Planning and Investment Control (CPIC) process, and manages Integrated Baseline Reviews and Post Implementation Reviews.
  • The PMO can also serve as execution-level Project Management, running projects.

In Sanjiv's presentation I think he assigned the first two roles to a PMO, but not the execution-level PM role. This is neither a good nor a bad thing. PMOs need to identify the functions they will perform and then do a good job with them.

I can only comment on my own experience, but I think it is important to mention that if a PMO has only the first two functions assigned to it, then I believe it is easy for it to lose touch with the execution-level work and develop processes, standards and control gates that are too rigorous and costly. Requiring the PMO to manage projects means that it will work to be more pragmatic in the standards it develops as well as the control gates it erects. Just my two cents.

What other functions should a PMO perform?

Thursday, October 8, 2009

Bye Bye Graphic Artists

I am not a Photoshop expert, but I would like to think that I was fairly proficient with it in my day. Then I read this morning about PhotoSketch. This looks like the coolest software I have seen in a year. To use it, you draw and label stick figures for your desired composition, and the software automatically scans millions of images on the Internet to assemble portions that meet your need. As I watched the video, I have to tell you, I was stunned. This is amazing.

This is also probably going to put stress on graphic artists. If the software can do it with fantastic results, you will need to find a way of doing something that is better or different. Watch the video below and see if you agree.


PhotoSketch: Internet Image Montage from Tao Chen on Vimeo.

Tuesday, October 6, 2009

In Response to the FTC, My Endorsements

The Federal Trade Commission released new rules yesterday concerning testimonials and product reviews when the reviewer has been compensated in some way. So, just to make sure I'm on the up and up, I think I should come clean and let you all know that when I published that review of a Maserati Gran Turismo, Maserati in fact allowed me to keep the car. While I'm on the subject of full disclosure, the review I wrote of the cute little inn in Var, France... Yes, they comped my bill for the sweetheart review.

I almost forgot, all that stuff about the University of Virginia, not only did they waive the tuition, but they gave me the degree for just showing up. Anyway, I just thought I owed it to my fans to come clean and let you know that I have been compensated for some of the product reviews I have written. And before I forget, Suntrust Bank, yeah, they tore up the mortgage and threw it in the trash after my review of their services.

So now the slate is clean, well, mostly, that stuff in Vegas is gonna stay in Vegas right? And I can continue to review the products and services that everyone uses every day with a clean conscience.

Invisotext:
Yeah right, I wish I had a Maserati, or went on vacation to the south of France, didn't have about $30,000 in loans to pay back or a mortgage. This post was a farce, take it in the spirit it was delivered.

Wednesday, September 30, 2009

Agile ROI

Last Friday I listened to the CMIT presentation by David F. Rico on the Business Value of Agile Methods. Overall a good presentation, but I left with some concerns and questions. Specifically, slide 13 of his presentation reflects what has proven successful for me in how I approach development projects. The reason SCRUM and Extreme Programming, in my opinion, can't be successful in the government environment is because the government has an imperative to 'know' that we are on the right track and 'monitor' progress. SCRUM and XP don't lend the visibility necessary for government Project Managers and Contract Officers to know that things are going well. Slide 13 demonstrates the overall model and the features that are to be delivered for each iteration. The CO or COR can easily check the box to report that performance is in alignment with the Quality Assurance Surveillance Plan (QASP). With the other Agile methodologies I do not know how I could construct a QASP to effectively monitor progress. So kudos to Mr. Rico for pulling a slide that I was in agreement with, even though he said that most development projects he has been involved with use SCRUM and XP.

The part that concerned me begins at slide 21, in which he begins to use data to build his case. The issue I have with his data is that he is using Lines of Code (LOC) as a common denominator across Agile and non-Agile projects. The issue here is that older projects not using Agile methods are very likely using older development languages. I strongly suspect that the data for the non-Agile projects relies heavily on COBOL and C projects. I also suspect that the Agile projects rely more on C++, Java and .NET languages. The likely differences in the development toolset increase the risk of the findings derived from the studies. In this instance, I see LOC as a high-risk measure because I would expect more lines of code from the COBOL and C projects than from the Object-Oriented projects, since the OO languages are more expressive. For instance, you can perform a function in 10 lines of Java code that would require 100 lines of COBOL code. As such I have concerns about the findings presented here.

Finally, I ended with some questions. If someone asked me, "What is the return on investment from switching from a waterfall or code-and-test methodology to an Agile methodology?" I would probably not start with this type of formula. The first place I would look is project success. I would begin by digging into overall project success and failure rates for Agile methods versus non-Agile methods. I'm not a researcher, nor am I writing a book on this subject, but I suspect that projects using Agile methods are more likely to be launched or released than projects using other methods. I would also argue that this factor is likely to dwarf other measures.

But if you insist on other measures, I would offer that Agile provides a significant benefit in the scope dimension when compared to the Waterfall method. In Waterfall you exhaustively capture and carve requirements into stone tablets that are then delivered as complete software some amount of time later. The reality is that if business requirements lent themselves to stone tablets, the world would be a much happier place. But that is unrealistic. As such, delivered software in a Waterfall project rarely meets the scope of the business needs, because the business needs have evolved during the time the team was developing the software. Agile allows closer interaction with the business personnel during the entire development process, and this helps the final product to be in much closer alignment with the business needs.

An additional measure for comparing Agile practices against the client-server code-and-test practice is cost. Good Enterprise Architecture practices are very difficult to implement in a code-and-test environment. This leads to a lot of redundant development, more difficult integration, and the most significant component, an increased cost to maintain the finished product.

So overall I agree with the conclusion that Agile methods have an increased return when compared to traditional methods of development. But the details of that analysis are a little uncomfortable for me.

Tuesday, September 29, 2009

Historical Lessons Learned

I was lucky to attend the Center for the Management of Information Technology (CMIT) conference last week because I got to listen to Mark Kozak-Holland. He was speaking about his new book, which just came out, Agile Leadership and the Management of Change: Project Lessons from Winston Churchill and the Battle of Britain. As an undergrad History and Political Science major I enjoyed his presentation for both the historical perspective as well as the relationship he posited towards Agile leadership. It's important to keep this idea in perspective though. It wasn't like Churchill, Dowding and Beaverbrook were scientifically trying to do something unique or different.

I probably disagree with the speaker on the foundation of his position, but it was an interesting exercise to walk through it. In an Agile project you start with the key requirements that can be built in the iteration or sprint schedule. You implement them, deploy, and consider the evolving, adjusted or new requirements for the next iteration or sprint. Sometimes this type of project might have a throw-away iteration, especially early on. I doubt that anyone in England would have agreed that they could throw away two or three weeks. Additionally, I don't see historical breaks in which something was completed and they went back to re-prioritize the changes or requirements. As such, I think the argument that this is Agile leadership is thin, but as I said, it was nonetheless fun.

Thursday, September 24, 2009

Necessary Deliverables

With all contracts there are certain small nuances that can take a lot of time and energy if they are not managed effectively. In my current environment the administrative elements include the Background Investigations, which I mentioned previously, Computer Security Awareness Training and Separation Forms in my Statements of Work. It is a million times easier to identify each of these as deliverables and get the contractor to treat them as such than it is to handle them as administrative items.

I also include notifications of incurred costs as deliverables. The problem I try to address here is the lag on invoicing. If I wait to receive an invoice that tells me we have expended 80% of the funds available, then I am likely to already be at 90% because of the invoice lag. To counter this, on T&M contracts I make it a contract deliverable that the contractor send notification within 2 or 3 business days of incurring costs at the 80% and 90% thresholds of funding obligated and awarded to the contract. This avoids the lag and gives me the visibility I need to take action. My standard list of these deliverables looks like:

  • List of Key Personnel - Any change
  • Status Meetings - Agenda 2 days prior, Minutes 1 day after
  • Project Schedule - Updated with each status meeting
  • Risk Register - Updated with each status meeting
  • Change Control Register - Updated with each status meeting
  • BI Forms - Submitted before the resource begins work
  • Separation Forms - Submitted before the resource's last day
  • Computer Security Awareness Training - Specified by the COR
  • 80% Cost Incurred - In writing within 3 business days
  • 90% Cost Incurred - In writing within 3 business days
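
The 80%/90% notification rule is simple enough to sketch in a few lines. This is a minimal illustration, not any real contract system; the function name and dollar figures are my own.

```python
# Sketch of the funding-threshold notification check described above.
def thresholds_crossed(obligated, incurred, marks=(0.80, 0.90)):
    """Return the funding thresholds that incurred costs have reached."""
    return [m for m in marks if incurred >= m * obligated]

# e.g. $500K obligated, $420K incurred: the 80% notice is due
print(thresholds_crossed(500_000, 420_000))  # [0.8]
```

The contractor would run something like this check as costs are incurred and send the written notice within the 2 or 3 business day window whenever a new threshold appears in the result.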

Monday, September 21, 2009

BI

While it would be nice to talk about Business Intelligence, I'm going to touch on a different BI today; the Background Investigation. We have 5 different levels of investigation that we can initiate for people supporting the agency. They are:
  1. Finger Print
  2. NACI
  3. MBI
  4. BI
  5. SSBI

The Finger Print check is quick, easy and for very low risk positions. In fact, I don't think anyone working on a contract for which I am the COR has just a Finger Print check. The most common is the National Agency Check with Inquiries (NACI), which typically costs $100. The Minimum Background Investigation costs about $525, and this is the check I request for anyone who is an administrator or who has access to the production environment. As the opportunity to do harm increases, the level of investigation should also increase so that we don't give people with a history of doing bad things the opportunity to repeat the deed. The standard Background Investigation (BI) costs $2,825 and is a significant investigation. The highest level here is the SSBI, or Single Scope Background Investigation, which is only used for specialized positions.
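
My selection rule boils down to a simple lookup. This sketch uses the approximate costs above; the Finger Print and SSBI costs weren't given, so they are left blank, and the rule-of-thumb function is my own illustration.

```python
# Investigation levels with the approximate costs from the list above.
INVESTIGATION_COST = {
    "Finger Print": None,   # very low risk positions (cost not given)
    "NACI": 100,            # the most common check
    "MBI": 525,             # administrators and production access
    "BI": 2825,             # a significant investigation
    "SSBI": None,           # specialized positions only (cost not given)
}

def required_check(has_admin_or_production_access):
    """Rule of thumb: production or admin access bumps NACI up to MBI."""
    return "MBI" if has_admin_or_production_access else "NACI"

print(required_check(True), INVESTIGATION_COST[required_check(True)])  # MBI 525
```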

All of the contractors working for me undergo some BI. Sometimes there are vendors who choose not to submit a BI form for someone working on the project. That is a mistake in my opinion. Everyone who is billed had better have a BI in place or at least in process. I check all the time. I don't mean to be a jerk about it, but this is a hard and fast rule, and there is no gray area.

Wednesday, September 16, 2009

It's a Cloudy Day

I can't believe I didn't write about this already. I was sure I had, but as I look back, I didn't. I know I put it on my LinkedIn status. Back in April I was excited to try something unique and innovative for a development project. I wanted to develop backup capabilities that would allow me to back up to the cloud. In this case, I had researched backing up to the Amazon cloud, EC2. I had priced it out and was eager to get started. The contractors supporting me were eager to try this as well, because this has to be the direction of the government.

Unfortunately, I was a little ahead of my time. I was told that I would be required to back up to a different office and use other internal resources. I was not excited about this news, but I rolled with it. Getting services from internal resources is not always a good situation, and this case bears that out. We met, ran through the schedule and agreed that it would be set up and ready before August 1. At the time, I asked, "Is there any risk that would prevent us from meeting the August 1 date?" The response was "No." Here we are on September 16 and it still isn't ready. That is a six week schedule variance.

Then today, what do I read? Apps.gov is now available for use. This is GSA's cloud, competing with Amazon and Google but geared for the federal sector. It is about 4 months too late for my project, but for backup of data and contingency operations, I will try to use it on my next project. I bet I can get a Service Level Agreement that will help me avoid six week schedule variances.

Monday, September 14, 2009

Drill Baby Drill (Not what you think)

I recently participated in my first contingency exercise for an application. It was what we call a "Tabletop Test", and while I would rather have had a physical exercise, it was nonetheless informative. This is something that I have been pushing for a very long time. The problem is that people are often so consumed with day-to-day operations that we never make time to actually run through a simulation of what to do when bad things happen.

I picked the very unlikely scenario of a hurricane knocking out operations at a data center in the upper Midwest. I know it is not reasonable that a hurricane would do that; a tornado is much more likely, only it wasn't on my scenario list, and the hurricane still allowed me to run through what people should do in the other, more likely situation. We actually identified a couple of areas that need some improvement. For example, several applications run out of this particular data center; we have to take some time to prioritize the order in which these applications will be restored, assuming that we don't have the resources to bring them all up at once. Also, you already know that I am a Green IT fan. This means that if I don't have to print it, I won't. But Contingency Plans and Disaster Recovery Plans must be available in hard copy. I didn't have them on paper before.

Overall though, it was a worthwhile exercise and we found some things that can be improved for next time. I think that the more frequently you run through these types of scenarios, the better you become at them. Practice makes perfect (well, at least better than before): drill, drill, drill. So we'll be doing it again 6 months from now, only next time I want a physical exercise: bring down the servers, restore them at the alternate location and bring the application up. I know that we'll find other useful information that will help us perform more efficiently if we ever have to do it for real.

Friday, September 11, 2009

Worker Shortage and Hiring Process

I'm really sorry that I keep picking on Federal Computer Week, but they published an article today about a severe worker shortage in the cybersecurity segment of the government space (Do federal hiring processes discourage...). This article went on about how difficult it is to get a job and the pain of the hiring process. But the article really misfired on a couple of levels. First, while there is an emerging need in the cybersecurity segment, the immediate concern is in the acquisition segment. The real shortage is of Contract Specialists, Contract Officers and people qualified and capable of awarding contracts.

Second, and I can't believe this was not mentioned, is the fact that we have been talking about the risk of a brain drain in the federal government for almost 10 years now. There has been a bubble of retirement-eligible people for several years. The issue that I'm surprised not to find discussed is how the current economic climate is affecting the retirement bubble. Though I have no scientific evidence, I think that people are holding off on retirement for 2 reasons:
  1. They lost a chunk of their nest egg in the recent devaluation of the stock market and
  2. There is too much risk and uncertainty in the current economy to begin a retirement now.

On the first point, the market lost about half its value and has been slowly inching up ever since. I think that people will begin to cash out when they feel like their portfolios get back to pre-recession levels. Once they feel like they are even, we will see a big wave of cash-outs. Unfortunately, that will likely cause a ripple recession all by itself.

About the same time people's portfolios return to pre-recession levels, the economic outlook will be a lot rosier and seem less risky. It will feel like a good time to begin a retirement.

No matter what, though, I think we are looking at a significant opportunity for the next generation to step up into real positions of leadership. As soon as the economy turns the corner, the retirement bubble will begin to burst. The pace of retirements will quicken and the vacancy rate will increase. Over the short term this will be painful because we will be forced to do the same amount of work with fewer people, but, as they say, necessity is the mother of invention. We will be forced to become more efficient with our hiring process.

In this, I speak from experience. I recently (2 weeks ago) participated in a panel reviewing applicants for a position. We reviewed 6 resumes, met for 2 hours, discussed the strengths and weaknesses of each applicant and tabulated the scores. This was on a Wednesday. The offer to the candidate was made on Thursday. The candidate accepted the offer on Friday and started work on Monday. Sure, 2 weeks had passed from the closing of the announcement until we met to consider candidates, but, when the need is urgent, the government can move at the same pace as industry, or even faster.

Thursday, September 10, 2009

Really?!? Really!?!

I read a recent article in Federal Computer Week, 7 Federal IT Bloggers Worth Reading, and clicked through all of the links to get a sense of the content. I don't mean any offense to anyone, but I would die of boredom waiting for these people to post content. I'm sure they are all good people, they just aren't exactly prolific bloggers. I know I went through a 3-week lull there, sorry about that. But I'm getting back into the swing of things. I'm very disappointed that FCW would point to these barely-alive blogs and say that these are the ones to watch. Were they just looking for any old blog?

I know my last post will be a resource for a bunch of people, because I have talked about it with them. That is the kind of content I really want to deliver on this blog.

Tuesday, September 8, 2009

Best Value Analysis

I have participated in the process of awarding many contracts. Most times it is fairly straightforward and easy to identify the winner. Sometimes, though, it can be tough. I remember several years ago I performed a very complicated Best Value Analysis (BVA) in which I used the labor hours proposed and the cost to identify the average rates for each person on the team. One bidder for that project proposed only a couple of people and another proposed an army, so it was a complicated affair.

I recently finished another BVA and I am very happy with the process and formula I used to identify the best value. First, in the solicitation we were careful to identify that the award would be made based on Best Value, and that the Technical review would count for 65% of the score while cost would count for 35%.

Then we completed the technical review, and let's just say that we hypothetically had:
  • Offeror A - Technical 90 points - Cost $600K
  • Offeror B - Technical 85 points - Cost $500K
  • Offeror C - Technical 80 points - Cost $450K
  • Offeror D - Technical 75 points - Cost $400K
Just for fun, take a second here and pick who you think the Best Value offeror will be.

In this hypothetical, let's set the Independent Government Cost Estimate (IGCE) to $350K, so everyone is over the IGCE. To integrate cost with the technical score, I needed a way to convert it to a 2-digit number that rewarded the offerors closer to the IGCE. I thought a percentage of the IGCE, (proposed cost / IGCE), would work, but it went the wrong way: as costs got further from the IGCE, the score increased.

But if I took the inverse, then it worked well. As such I used the formula: Cost Score = 1 / (proposed cost / IGCE), which is just IGCE / proposed cost, scaled by 100 and rounded to whole points.

Using my examples above, I have:
  • Offeror A - Technical 90 points - Cost Score 58
  • Offeror B - Technical 85 points - Cost Score 70
  • Offeror C - Technical 80 points - Cost Score 78
  • Offeror D - Technical 75 points - Cost Score 88
With this I have all the information I need to combine my cost analysis and technical analysis to perform my best value analysis. The formula looks like:
(Technical x .65) + (Cost Score x .35) = combined score

When I do this I find that the offerors' final scores are:
  • Offeror A - Technical 90 points - Cost Score 58 - Combined Score 78.8
  • Offeror B - Technical 85 points - Cost Score 70 - Combined Score 79.75
  • Offeror C - Technical 80 points - Cost Score 78 - Combined Score 79.3
  • Offeror D - Technical 75 points - Cost Score 88 - Combined Score 79.55
Offeror B had the highest combined score, and is the Best Value.
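
For what it's worth, the whole computation fits in a few lines. Here is a sketch using the hypothetical numbers above; rounding the cost score to whole points is my assumption, chosen to match the figures shown.

```python
# Best Value Analysis sketch using the hypothetical offers above.
IGCE = 350_000  # Independent Government Cost Estimate

offers = {
    "A": {"technical": 90, "cost": 600_000},
    "B": {"technical": 85, "cost": 500_000},
    "C": {"technical": 80, "cost": 450_000},
    "D": {"technical": 75, "cost": 400_000},
}

for o in offers.values():
    # Cost score: 1 / (proposed cost / IGCE), scaled to roughly 0-100,
    # so bids closer to the IGCE score higher.
    o["cost_score"] = round(100 * IGCE / o["cost"])
    # Combined score: technical weighted 65%, cost weighted 35%.
    o["combined"] = o["technical"] * 0.65 + o["cost_score"] * 0.35

best = max(offers, key=lambda name: offers[name]["combined"])
print(best, offers[best]["combined"])
```

Running this reproduces the table above and picks Offeror B at 79.75.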

Monday, August 17, 2009

GIS has an Easy Button

Do you remember the old days of Geographical Information Systems (GIS)? You had to get some base map, which was raster-based and slow to build. Ugh, I remember how painful it was. Then, of course, you wanted to move just a little east to include a feature, and it would have to build the entire map all over again. Then Google Earth came and it was jaw-dropping and fast.

The problem with Google Earth's product is that it doesn't bring in all of the GIS data. If you want to find a place, and see it, cool, you can do that. But if you want to look at the demography of a place, you need to stick with the older products. That is, until now.

ESRI, the company that developed the entire Arc suite of products, has developed an online version of their product. ArcGIS Business Analyst Online is very cool and very cheap. I don't mean cheap in a Stetson-cologne sort of way; I mean cheap in that it only costs $2,500 per user. For $2,500 you get granularity down to the block-group level with current and historic demographic data.


If you are a geologist or a biologist, this is not going to give you the level of detail that you need to conduct research. But if you are a business and want to know where your customers are, yes. If you want to know where traffic congestion is highest in the aggregate, yes. If you want to know which area is expected to have the highest density of families with school-age kids, yes. The applications for this type of product are virtually unlimited. All real estate agents should have this information because it would help them sell houses. School systems should have this information because it would help them visualize the growth areas as well as see the difficult traffic areas.


The most beautiful thing is that it is completely turn-key. There is no hardware to buy, no software to install, no data to buy and maintain. If you have a browser and an Internet connection, you are there. Oh, and I almost forgot, the interface is simple: type in the address, identify the distance for rings around your location and generate the reports. It is so easy, even my Mom could do it.

Tuesday, August 11, 2009

Shift Happens_

A friend posted a link to a presentation that could be legit, could be part marketing. Check out his Netflix Culture post. While it sounds good and pure, I am somewhat dubious of the overall impact on the organization. Anecdotal evidence indicates something less than complete satisfaction with the process. But those could be people who fell out as a result of the process and want to spend their time grinding axes.

That being said, there is a lot of content in the Netflix deck and it is a worthwhile read. Some of the slides have an indication of "Confidential" at the bottom, which makes me think that it is more image/marketing related, because if it were really intended to be confidential, then it wouldn't be available for everyone to read. I found the annual compensation review to be interesting. This is a real motivating factor, but it can also create a feeling that you are working for your lunch.

Some people will look at that presentation as an inspiring opportunity. I can't believe I haven't included this before, but my favorite presentation is Shift Happens_. If you aren't familiar with it, you should watch it. I watch it frequently to recharge the batteries. I find a new nugget every time I watch it.

Monday, August 10, 2009

A Bowl of Candy on Every Desk

I was privileged to attend a lecture by Rob Cross from the University of Virginia a couple weeks ago. His book, Driving Results Through Social Networks is quite good and I thoroughly enjoyed the discussion. In every organization there is the traditional org chart with people assigned to positions. That is how things are supposed to get done, but it is rarely the case.

Have you ever noticed that people who smoke tend to be the most informed people in an organization? This happens because these people take their smoke breaks and talk with people who are outside of their defined organizational space. These conversations transcend the organizational boundaries and allow information to travel more freely.

He told a story about one woman, an assistant in a division, who was critical to accomplishing a bunch of distinct processes. To the organization she represented a risk because there was only one of her. The question Rob tried to answer was: how? How did this assistant come to hold so much power? He went through all of the factors and came up blank. It wasn't until he actually visited her and found a huge bowl of M&Ms that he realized the answer. People were coming to her because doing so gave them an opportunity to grab some candy.

As such, I promise to get a bowl and put some candy on my desk. If I can keep from eating it all myself that will indeed be a feat.

But these anecdotes illustrate that the org chart on the website and sent around by HR doesn't always provide a complete picture of how things actually get done in an organization. You have to understand the social network and see whom people go to for which activities to get the complete picture. Cross has a lot of excellent content on how to map that out and get a feel for the shadow organization.

Wednesday, August 5, 2009

Acquisition Roles

Sorry for harping so much on acquisition and procurement stuff (this makes 4 in a row), but I saw an article last week in Federal Computer Week titled Acquisition Workforce Needs Retooling and I feel compelled to comment. I agree that we can't leave it as it is.

The Federal Acquisition Institute (FAI) has created a bunch of different certification flavors, and we don't quite get how they are all supposed to work together. Check out their Certifications site for more detail. I am currently a FAC-PM and a FAC-COTR. Should I pursue the FAC-C? I don't know. It's a patchwork, and there is considerable overlap for the types of projects I'm involved with.

The good news is that FAI has recognized that more is needed. I participated in a focus group a week or two ago in which a number of people from a variety of agencies and the DoD participated in a COR/COTR focus group. There were some themes that came out of that experience that will help to build some more hard skills for constructing work statements that are objective and measurable and then monitoring performance against those benchmarks. I'm optimistic that new content will be developed to reinforce these types of adjustments at the COR/COTR level.

Monday, August 3, 2009

Federal Acquisition Change (Part 3 of 3)

I've touched on reducing risk and how the pendulum keeps swinging away from Time and Materials contracts and to the Firm Fixed Price side. Then I wrote about how A-76 seems to favor the industry because it doesn't consider the total cost of ownership. Today, I'm going to focus on the third memo that was identified in the Washington Post article last week. This one is an easy one for me to write because it is something I have blogged about before.

Back in February I posted an entry about a web application that has gone relatively unnoticed in the federal sector. The Past Performance Information Retrieval System (PPIRS) should be a no-brainer. Almost every solicitation indicates that the technical evaluation will be based partly on a review of the offeror's past performance. But in reality, PPIRS does not help in assessing that factor. Since my post 6 months ago, I feel even more strongly about this issue. I've used PPIRS, I have tried to verify contractor performance, and I have found it not to be a useful tool.

The problem with PPIRS is data. The data in this application is terrible. I wanted to objectively evaluate the performance of several offerors. Each of them indicated that they were the prime on a government contract, one was even a DoD contract, yet only a couple were actually in PPIRS. The DoD contract was not, which is surprising since PPIRS is run by the Navy. One of the offerors that was in the system appeared 4 separate times because its name was spelled 4 different ways.

Don't misunderstand, I really like the idea of PPIRS and I am all for requiring everyone to input more information. But before we start doing that, let's take a moment to identify the goals of what we are trying to do. This memo neglected to do that. The goal, I believe, is to objectively score past performance when awarding a contract. Based on that goal federal agencies should input information into a system (PPIRS) that will help acquisition offices in other federal agencies to form a clear opinion of that performance.

The problem, as I mentioned, is data. What this system needs is a primary key that will allow contractor companies to be uniquely identified regardless of variations in spelling. Almost all recipients of contracts get paid for the services they perform, and most pay taxes on those fees. As such, I would recommend the Tax Identification Number (TIN) be used to uniquely identify companies. Once the companies are known, it is a lot easier to consolidate all of the information we have about their past performance.
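
As a toy illustration of why the TIN works as a primary key, consider records for the same company entered under different spellings. The company names, TINs and ratings here are invented for the example.

```python
# Consolidating past-performance records by TIN, so a company entered
# under several different spellings still rolls up to one history.
from collections import defaultdict

records = [
    {"tin": "12-3456789", "name": "Acme Systems Inc",   "rating": 8},
    {"tin": "12-3456789", "name": "ACME Systems, Inc.", "rating": 6},
    {"tin": "12-3456789", "name": "Acme Sys. Inc",      "rating": 7},
    {"tin": "98-7654321", "name": "Big Integrator LLC", "rating": 5},
]

history = defaultdict(list)
for r in records:
    history[r["tin"]].append(r["rating"])  # key on TIN, not on name

print(dict(history))
```

Keyed by name, this would look like four different vendors; keyed by TIN, it is correctly two.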

Next, standardize the evaluation of the performance. This standardization should be on two fronts: qualitative and quantitative. The qualities that are evaluated should be as objective as possible. If you asked me, I would say Cost, Schedule, Quality, Scope, Satisfaction and Risk, but I'm a Project Manager. In the qualitative review, the contractor's performance in each of these categories should be evaluated. On the quantitative side, each of these should be scored from 0 to 10 (0 being the worst performance). These numbers could then be rolled up to identify an overall contractor score.

Of course, this would greatly favor the small contractor that has only a few contracts and could hinder the larger vendors that have hundreds. As such, there should be a way to weight the amount of information we have to identify the confidence we have in the assigned score. Small company ABC, which has performed only 1 contract, but did it well, could have a score of 59 with a confidence of 1. Whereas big integrator DEF may have a lower score of 49, but a confidence of 200. This would give acquisition personnel who aren't familiar with those companies an opportunity to understand the rated performance and put it into perspective.
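
A minimal sketch of that rollup follows; averaging the per-contract scores is my own choice of aggregation, and the ABC/DEF numbers are the hypothetical ones above.

```python
# Roll contract-level scores (0-60: six categories scored 0-10 each)
# into an overall score plus a confidence count.
def rollup(contract_scores):
    score = round(sum(contract_scores) / len(contract_scores))
    return {"score": score, "confidence": len(contract_scores)}

abc = rollup([59])          # small shop: one excellent contract
big = rollup([49] * 200)    # big integrator: 200 middling contracts
print(abc, big)
```

ABC comes out with a score of 59 at confidence 1, DEF with 49 at confidence 200, exactly the perspective the acquisition reviewer needs.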

Getting on the right track in scoring past performance is a good idea and the right thing to do, but what would make an even bigger impact would be figuring out a way of getting the government's Independent Government Cost Estimates (IGCE) under control. We are terrible at generating these. I am terrible at these, and I'm better than most. Estimates, regardless of where you are, are notoriously slippery. But if we are going to be doing a better job of collecting information about the past performance of contractors, couldn't we use that information to help us generate better estimates too? Think about how many contracts the government awards per year. When I go to generate an IGCE for some segment of work, PPIRS should be the first stop to generate a parametric estimate.
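As a sketch of what a PPIRS-backed parametric estimate could look like (the history records, the cost driver, and all the dollar figures here are entirely hypothetical):

```python
import statistics

# Hypothetical history pulled from PPIRS: past contracts for similar
# work, each with its scope (say, screens automated) and final cost.
history = [
    {"units": 40, "cost": 800_000},
    {"units": 25, "cost": 450_000},
    {"units": 60, "cost": 1_300_000},
]

# Parametric estimate: median historical cost per unit times new scope.
# The median resists distortion from one badly overrun contract.
cost_per_unit = statistics.median(c["cost"] / c["units"] for c in history)
planned_units = 50
igce = cost_per_unit * planned_units

print(f"IGCE: ${igce:,.0f}")
```

The hard part is not the arithmetic; it is capturing scope and final cost consistently enough across contracts for the history to be usable.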

Thursday, July 30, 2009

Federal Acquisition Change (Part 2 of 3)

Two days ago I talked about Improving Government Acquisition through savings and a reduction in risk. I am still a little skeptical about the risk side of that. Today though, I will focus on the second memo, Managing a Multi-sector Workforce.

This memo doesn't actually do much, but it is a clear shot across the bow to a lot of current work. For many years the federal sector has been dealing with Circular A-76. The intent of A-76 is to introduce competition for segments of work that were/are performed by federal employees. Momentum on this issue has been solidly behind industry, partly because there is a lot of political momentum for decreasing the size of the federal workforce.

My personal take on this issue is that the work doesn't disappear. Some process, call it ABC, was getting done by federal workers, and now with A-76, a company can compete with the government to try to do it cheaper and better. The problem I have with this is that we never actually realized the cost reduction. In fact, I strongly believe that we actually incurred a cost increase because of this policy. Take this as an example. A federal employee processes my leave forms for when I need to use vacation time and sick time. Let's say, hypothetically, that person gets paid $50,000 per year as a federal employee. We hold an A-76 competition for that work and industry wins because they can find a person to do that work for $35,000 per year. That sounds good, right? So they pay a person $35K, but don't forget, the company has to make money as well (that's what businesses do). So there is a mark-up, let's say 10% (which is pretty low). Now that $35K of work is costing the government $38.5K. Still works out better. But then you have to add in the cost of developing, competing and awarding a contract for the work, which has to be worth about $5K. This puts us up to $43.5K, which is still on the plus side. But wait, you also have to add in the cost of monitoring performance and training, plus transition-in and transition-out costs. Those start to push the cost up to the original $50K, which just makes this exercise a wash. But don't forget, this is just the base year for what is likely to be a 3 to 5 year contract. Those types of contracts almost always have escalation clauses built into them. The smallest I've seen is 3%, but 5% is pretty typical. If we escalate the contractor cost by 3%, we get up to about $43K in year 5, and that doesn't include the cost of administering or monitoring the contract, or, god forbid, the litigation that sometimes happens with a contract.
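The escalation arithmetic is easy to check. A quick sketch using the hypothetical numbers from the example above:

```python
# Back-of-the-envelope from the leave-processing example: a $35K
# contractor salary with a 10% markup, escalated 3% per year over a
# 5-year contract (base year plus four escalated years).
salary = 35_000
markup = 0.10
escalation = 0.03

base_year_cost = salary * (1 + markup)            # $38.5K in year 1
year5_cost = base_year_cost * (1 + escalation) ** 4

print(f"Year 1: ${base_year_cost:,.0f}")
print(f"Year 5: ${year5_cost:,.0f}")  # roughly $43K, before contract
                                      # administration and monitoring
```

At the 5% escalation that is more typical, the year-5 figure climbs to roughly $47K, and that is before any administration, monitoring, or litigation costs.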

While A-76 seems like a cost-cutting program, I believe it is a net cost when considered in the long term, just from a practical dollars-and-cents perspective. To be fair, I did not go through all of the add-on costs that go into a federal employee, and they exist, but I want to make a point. When you consider this issue from a Knowledge Management perspective, there is no comparison. Something like keeping track of Time & Attendance is one thing, but we have been outsourcing functions like Contract Specialists.

Wonder why we are in such a tough situation in the field of acquisition? The answer is rooted partly in the other side of A-76. Set aside the actual dollars-and-cents cost analysis and look deeper at the hidden costs. What is the government losing when function ABC is outsourced? It is losing the institutional knowledge of that work and the connections it has to all of the other areas in the business. In my example of someone performing the work of Time & Attendance, that is one thing. But if we look at other segments that are not necessarily "Inherently Governmental" then we can start to see the roots of the acquisition problem. Procurement Analysts and Contract Specialists are two functions that are not currently identified as being inherently governmental. That means that the government can and does award contracts to get people to fill these roles. As we do that, we (the federal government) lose the institutional knowledge associated with performing this work. The people who perform this work are the people who are going to be stepping up to be tomorrow's Contracting Officers. But as we outsource these positions we lose the richness of their connections and must compete against the higher salaries of the commercial sector to bring them into the CO roles.

This Managing a Multi-sector Workforce memo says to me, 'Look out, we're getting ready to change the definition of inherently governmental, and it is going to impact the acquisition side.'


In particular, overreliance on contractors can lead to the erosion of the in-house capacity that is essential to effective government performance. Such overreliance has been encouraged by one-sided management priorities that have publicly rewarded agencies for becoming experts in identifying functions to outsource and have ignored the costs stemming from loss of institutional knowledge and capability and from inadequate management of contracted activities. Too often agencies neglect the investments in human capital planning, recruitment, hiring, and training that are necessary for building strong internal capacity – and then are forced to rely excessively on contractors because internal capacity is lacking...
It's coming, I tell you.