Wednesday, September 30, 2009
Agile ROI
The part that concerned me begins at slide 21, where he starts using data to build his case. The issue I have with his data is that he uses Lines of Code (LOC) as a common denominator across Agile and non-Agile projects. Older projects not using Agile methods are very likely written in older development languages; I strongly suspect the data for the non-Agile projects leans heavily on COBOL and C, while the Agile projects rely more on C++, Java and .NET languages. These likely differences in the development toolset increase the risk in the findings derived from the studies. In this instance, I see LOC as high risk because I would expect more lines of code from the COBOL and C projects than from the object-oriented projects, since the OO languages are more expressive. For instance, you can perform a function in 10 lines of Java code that would require 100 lines of COBOL code. As such, I have concerns about the findings presented here.
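To make the concern concrete, here is a made-up illustration (the dollar figure is invented; the 10-to-1 line ratio is the COBOL-versus-Java example from above) of how LOC-normalized metrics diverge even when the underlying work is identical:

```python
# Hypothetical: the same business function implemented in a verbose
# language and a terse one, at the same cost of effort.
function_cost = 1_000  # dollars; assume equal effort either way

loc_by_language = {"COBOL": 100, "Java": 10}  # the 10x ratio from the post

for language, loc in loc_by_language.items():
    # Any "cost per LOC" style metric differs by 10x between the two.
    print(f"{language}: {loc} LOC, ${function_cost / loc:.0f} per LOC")
```

Same function, same cost, yet the per-LOC figures differ by a factor of ten, which is exactly the risk of using LOC as a common denominator across language generations.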
Finally, I ended with some questions. If someone asked me, "What is the return on investment from switching from a waterfall or code-and-test methodology to an Agile methodology?" I would probably not start with this type of formula. The first place I would look is project success: I would dig into overall project success and failure rates for Agile methods versus non-Agile methods. I'm not a researcher and I'm not writing a book on this subject, but I suspect that projects using Agile methods are more likely to be launched or released than projects using other methods. I would also argue that this factor is likely to dwarf the other measures.
But if you insist on other measures, I would offer that Agile provides a significant benefit in the scope dimension when compared to the Waterfall method. In waterfall you exhaustively capture requirements and carve them into stone tablets, which are then delivered as complete software some amount of time later. If business requirements really lent themselves to stone tablets, the world would be a much happier place, but that is unrealistic. Delivered software in a waterfall project rarely meets the scope of the business needs, because those needs have evolved during the time the team was developing the software. Agile allows closer interaction with the business personnel throughout the development process, and this helps the final product align much more closely with the business needs.
An additional measure for comparing Agile practices against the client-server code-and-test practice is cost. Good Enterprise Architecture practices are very difficult to implement in a code-and-test environment. This leads to a lot of redundant development, more difficult integration, and, most significantly, an increased cost to maintain the finished product.
So overall I agree with the conclusion that Agile methods have an increased return when compared to other traditional methods of development. But the details of that analysis are a little uncomfortable for me.
Tuesday, September 29, 2009
Historical Lessons Learned
I probably disagree with the speaker on the foundation of his position, but it was an interesting exercise to walk through it. In an Agile project you start with the key requirements that can be built in the iteration or sprint schedule. You implement them, deploy, and consider the evolving, adjusted, or new requirements for the next iteration or sprint. Sometimes this type of project might have a throw-away iteration, especially early on. I doubt anyone in England would agree that they could throw away two or three weeks. Additionally, I don't see historical breaks in which something was complete and they went back to re-prioritize the changes or requirements. As such, I think the argument that this is Agile leadership is thin, but as I said, it was nonetheless fun.
Thursday, September 24, 2009
Necessary Deliverables
I also include notifications of incurred costs as deliverables. The problem I try to address here is the lag on invoicing. If I wait to receive an invoice that tells me we have expended 80% of the available funds, then I am likely already at 90% because of the invoice lag. To counter this, on T&M contracts I make it a contract deliverable that the contractor send notification within 2 or 3 business days of incurring costs at the 80% and 90% thresholds of the funding obligated and awarded to the contract. This avoids the lag and gives me the visibility I need to take action.
- List of Key Personnel - Any change
- Status Meetings - Agenda 2 days prior, Minutes 1 day after
- Project Schedule - Updated with each status meeting
- Risk Register - Updated with each status meeting
- Change Control Register - Updated with each status meeting
- BI Forms - Submitted before the resource begins work
- Separation Forms - Submitted before the resource's last day
- Computer Security Awareness Training - Specified by the COR
- 80% Cost Incurred - In writing within 3 business days
- 90% Cost Incurred - In writing within 3 business days
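The 80% and 90% notification rule above can be sketched as a simple threshold check. This is a minimal illustration, not an actual contract clause; the function name and dollar figures are mine:

```python
def thresholds_reached(funds_obligated, costs_incurred, thresholds=(0.80, 0.90)):
    """Return the funding thresholds that incurred costs have met or passed."""
    burn_rate = costs_incurred / funds_obligated
    return [t for t in thresholds if burn_rate >= t]

# Example: $100K obligated, $85K incurred -> the 80% notice is now due.
print(thresholds_reached(100_000, 85_000))  # [0.8]
```

The point is that the contractor reports against costs incurred, not costs invoiced, so the check fires days after the spending happens instead of weeks.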
Monday, September 21, 2009
BI
- Finger Print
- NACI
- MBI
- BI
- SSBI
The fingerprint check is quick, easy, and meant for very low risk positions. In fact, I don't think anyone working on a contract for which I am the COR has just a fingerprint check. The most common is the National Agency Check with Inquiries (NACI), which typically costs $100. The Minimum Background Investigation (MBI) costs about $525, and this is the check I request for anyone who is an administrator or who has access to the production environment. As the opportunity to do harm increases, the level of investigation should also increase, so that we don't give people with a history of doing bad things the opportunity to repeat the deed. The standard Background Investigation (BI) costs $2,825 and is a significant investigation. The highest level is the SSBI, or Single Scope Background Investigation, which is only used for specialized positions.
All of the contractors working for me undergo some BI. Sometimes there are vendors out there who choose to not submit a BI form for someone working on the project. That is a mistake in my opinion. Everyone who is billed better have a BI in place or at least in process. I check all the time. I don't mean to be a jerk about it, but this is a hard and fast rule, and there is no gray area.
Wednesday, September 16, 2009
It's a Cloudy Day
Unfortunately, I was a little ahead of my time. I was told that I would be required to back up to a different office and use other internal resources. I was not excited about this news, but I rolled with it. Getting services from internal resources is not always a good situation, and this case bears that out. We met, ran through the schedule, and agreed that it would be set and ready before August 1. At the time, I asked, "Is there any risk that would prevent us from meeting the August 1 date?" The response was "No." Here we are on September 16 and it still isn't ready. That is a six-week schedule variance.
Then today, what do I read? Apps.gov is now available for use. This is GSA's cloud, competing with Amazon and Google but geared for the federal sector. It is about four months too late for my project, but for backup of data and contingency operations, I will try to use it on my next project. I bet I can get a Service Level Agreement that will help me avoid six-week schedule variances.
Monday, September 14, 2009
Drill Baby Drill (Not what you think)
I picked the very unlikely scenario of a hurricane knocking out operations of a data center in the upper Midwest. I know it is not reasonable that a hurricane would do that; a tornado is much more likely, but it wasn't on my scenario list, and the hurricane still let me run through what people should do in the other, more likely situation. We actually identified a couple of areas that need improvement. For example, several applications run out of this particular data center; we need to take time to prioritize the order in which they will be restored, on the assumption that we won't have the resources to bring them all up at once. Also, you already know that I am a Green IT fan, which means that if I don't have to print something, I won't. But Contingency Plans and Disaster Recovery Plans must be available in hard copy, and I didn't have them on paper before.
Overall though, it was a worthwhile exercise and we found some things that can be improved for the next time. And the next time will be in 6 months. I think that the more frequently you run through these types of scenarios, the better you become at it. Practice makes perfect (well, at least better than before), drill drill drill. So we'll be doing it again 6 months from now, only I want to have a physical exercise, bring down the servers and restore them at the alternate location and bring the application up. I know that we'll find other useful information that will help us to perform more efficiently if we ever had to do it for real.
Friday, September 11, 2009
Worker Shortage and Hiring Process
Second, and I can't believe this was not mentioned: we have been talking about the risk of a brain drain in the federal government for almost 10 years now. There has been a bubble of retirement-eligible people for several years. What I'm surprised not to find discussed is how the current economic climate is affecting that retirement bubble. Though I have no scientific evidence, I think people are holding off on retirement for two reasons:
- They lost a chunk of their nest egg in the recent devaluation of the stock market and
- There is too much risk and uncertainty in the current economy to begin a retirement now.
On the first point, the market lost about half its value and has been slowly inching up ever since. I think people will begin to cash out when their portfolios return to pre-recession levels. Once they feel like they are even, we will see a big wave of cash-outs. Unfortunately, that wave will likely cause a ripple recession all by itself.
Around the same time people's stocks get back to pre-recession levels, the economic outlook will look a lot rosier and seem less risky, so it will feel like a good time to begin a retirement.
No matter what though, I think we are looking at a significant opportunity for the next generation to step up and into real positions of leadership. As soon as the economy turns the corner, the retirement bubble will begin to burst: the pace of retirements will quicken and the vacancy rate will increase. Over the short term this will be painful because we will be forced to do the same amount of work with fewer people, but, as they say, necessity is the mother of invention. We will be forced to become more efficient with our hiring process.
In this, I speak from experience. I recently (two weeks ago) participated in a panel reviewing applicants for a position. We reviewed six resumes, met for two hours, discussed the strengths and weaknesses of each applicant, and tabulated the scores. This was on a Wednesday. The offer to the candidate was made on Thursday, the candidate accepted on Friday and started work on Monday. Sure, two weeks had passed from the closing of the announcement until we met to consider candidates, but, when the need is urgent, the government can move at the same pace as industry, or even faster.
Thursday, September 10, 2009
Really?!? Really!?!
I know my last post will be a resource for a bunch of people, because I have talked about it with them, so that is the kind of content I really want to deliver to this blog.
Tuesday, September 8, 2009
Best Value Analysis
I recently finished another BVA and I am very happy with the process and formula I used to identify the best value. First, in the solicitation we were careful to identify that the award would be made based on Best Value and that the Technical review would be 65% of the score while cost would be 35%.
Then we completed the technical review, and let's just say that we hypothetically had:
- Offeror A - Technical 90 points - Cost $600K
- Offeror B - Technical 85 points - Cost $500K
- Offeror C - Technical 80 points - Cost $450K
- Offeror D - Technical 75 points - Cost $400K
In this hypothetical, let's set the Independent Government Cost Estimate (IGCE) to $350K, so everyone is over the IGCE. To integrate cost with the technical score, I needed to turn cost into a score on a 100-point scale that rewarded the offerors closer to the IGCE. I thought a percentage of the IGCE, (proposed cost / IGCE), would work, but it went the wrong way: as costs got further from the IGCE, the score increased.
But if I took the inverse of that, then it worked well. So I used the formula Cost Score = 1 / (proposed cost / IGCE) x 100, which is the same as (IGCE / proposed cost) x 100, rounded to the nearest point.
Using my examples above, I have:
- Offeror A - Technical 90 points - Cost Score 58
- Offeror B - Technical 85 points - Cost Score 70
- Offeror C - Technical 80 points - Cost Score 78
- Offeror D - Technical 75 points - Cost Score 88
(Technical x .65) + (Cost Score x .35) = combined score
When I do this I find that the offerors' final scores are:
- Offeror A - Technical 90 points - Cost Score 58 - Combined Score 78.8
- Offeror B - Technical 85 points - Cost Score 70 - Combined Score 79.75
- Offeror C - Technical 80 points - Cost Score 78 - Combined Score 79.3
- Offeror D - Technical 75 points - Cost Score 88 - Combined Score 79.55
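The whole computation above can be reproduced in a few lines. This is just a sketch of my spreadsheet math; rounding the cost score to whole points before weighting is an assumption I've made so the output matches the tables:

```python
IGCE = 350_000  # Independent Government Cost Estimate
# Hypothetical offerors from the post: (technical score, proposed cost).
offers = {
    "A": (90, 600_000),
    "B": (85, 500_000),
    "C": (80, 450_000),
    "D": (75, 400_000),
}

results = {}
for name, (technical, cost) in offers.items():
    # Cost Score = 1 / (proposed cost / IGCE), scaled to 100 points.
    cost_score = round(IGCE / cost * 100)
    # Combined = (Technical x .65) + (Cost Score x .35)
    combined = technical * 0.65 + cost_score * 0.35
    results[name] = (cost_score, combined)
    print(f"Offeror {name}: cost score {cost_score}, combined {combined:.2f}")
```

Run it and Offeror B comes out on top at 79.75, even though B has neither the best technical score nor the lowest cost, which is exactly the kind of balanced outcome a best value analysis is supposed to surface.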